
Join Canonical at NVIDIA GTC 2025


Canonical, the company behind Ubuntu and the trusted source for open source software, is thrilled to announce its presence at NVIDIA GTC again this year. Join us in San Jose from March 18th to the 21st and explore what’s next in AI and accelerated computing.

Register for NVIDIA GTC 2025

Unlock AI potential: innovate on trusted open source

The world demands AI everywhere, pushing data processing from the cloud to the edge. While cloud resources are essential, many applications require real-time insights directly at the source. Issues like latency, bandwidth, and privacy make centralized AI impractical in many scenarios.

Canonical, in collaboration with NVIDIA, provides a powerful AI solution built on Ubuntu’s trusted open-source foundation. This comprehensive stack empowers innovation from cloud to edge, offering a consistent experience across any infrastructure. Benefit from simplified management, modular design, and optimized performance on NVIDIA’s accelerated computing platform. 

Unlock the full potential of AI and transform data into actionable intelligence, wherever it resides, with Ubuntu and NVIDIA.

Schedule a meeting at GTC

What to anticipate from Canonical at GTC

At GTC, discover how Canonical and NVIDIA are partnering to accelerate AI. Explore our new reference architectures for open-source LLM chatbots and edge AI, built on NVIDIA technology for scalability, security, and efficiency. Learn about our reference architecture for open-source machine learning infrastructure.

See Ubuntu and NVIDIA Jetson™ deliver powerful AI at the edge, managed seamlessly with Canonical Landscape. Join us to explore how Canonical simplifies AI development and deployment.

NVIDIA Jetson, Ubuntu and Landscape: streamlining Edge AI

NVIDIA Jetson is a leading platform for AI at the edge. Canonical and NVIDIA have collaborated to bring optimized Ubuntu images to these powerful devices, enabling a wide range of applications.

At our booth, we’re showcasing the power of Ubuntu on NVIDIA Jetson, managed seamlessly with Canonical Landscape. See how you can easily deploy, monitor, and manage your entire Jetson fleet through Landscape’s centralized interface. This combination delivers a streamlined and secure edge AI solution, ready for everything from robotics to industrial automation.
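To give a flavour of what fleet enrollment can look like, here is a minimal, hypothetical sketch that registers a Jetson device with a Landscape account by driving the landscape-config tool non-interactively. The account name, registration key, and tags are placeholders, and flag names should be checked against the landscape-client version shipped on your image.

```python
# Hypothetical enrollment helper: registers a Jetson device with a Landscape
# account so it appears in the centralized fleet view. Assumes the
# landscape-client package is already installed on the device; the account
# name and registration key below are placeholders.
import socket
import subprocess

ACCOUNT_NAME = "example-account"    # placeholder Landscape account name
REGISTRATION_KEY = "example-key"    # placeholder registration key


def enroll_device() -> None:
    """Run landscape-config non-interactively to register this device."""
    hostname = socket.gethostname()
    subprocess.run(
        [
            "sudo", "landscape-config",
            "--silent",                          # non-interactive mode
            "--account-name", ACCOUNT_NAME,
            "--registration-key", REGISTRATION_KEY,
            "--computer-title", f"jetson-{hostname}",
            "--tags", "jetson,edge-ai",          # tags for grouping the fleet
        ],
        check=True,
    )


if __name__ == "__main__":
    enroll_device()
```

Once registered, the device can be monitored, patched, and managed alongside the rest of the fleet from Landscape’s interface.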

Open Source LLM chatbots with NVIDIA

We’re launching a new reference architecture for building chatbot applications using open-source LLMs and NVIDIA technology. This solution streamlines LLM application development with a scalable and reliable platform.

Leveraging NVIDIA NIM technology, it supports models like LLaMA 3, utilizes OpenSearch for vector search, and features an optimized RAG pipeline. Integrated with Kubeflow and KServe, it offers a production-ready AI pipeline on Kubernetes.
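To make the flow concrete, the sketch below shows one way a retrieval-augmented query could be wired together: a k-NN search against an OpenSearch index followed by a call to a NIM-hosted Llama 3 model through its OpenAI-compatible chat endpoint. The endpoint URL, index and field names, and model identifier are placeholders rather than values from the reference architecture.

```python
# Minimal RAG query sketch (illustrative only): retrieve context passages from
# an OpenSearch k-NN index, then ask a NIM-hosted Llama 3 model to answer
# using that context. Hosts, index name, and model name are placeholders.
import requests
from opensearchpy import OpenSearch

NIM_URL = "http://localhost:8000/v1/chat/completions"  # assumed local NIM deployment
INDEX = "docs"                                          # hypothetical vector index

client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])


def retrieve(query_vector: list[float], k: int = 4) -> list[str]:
    """k-NN search over the 'embedding' field; returns the stored text chunks."""
    body = {
        "size": k,
        "query": {"knn": {"embedding": {"vector": query_vector, "k": k}}},
    }
    hits = client.search(index=INDEX, body=body)["hits"]["hits"]
    return [h["_source"]["text"] for h in hits]


def answer(question: str, query_vector: list[float]) -> str:
    """Build a context-grounded prompt and send it to the NIM chat endpoint."""
    context = "\n\n".join(retrieve(query_vector))
    payload = {
        "model": "meta/llama3-8b-instruct",   # placeholder model identifier
        "messages": [
            {"role": "system", "content": f"Answer using only this context:\n{context}"},
            {"role": "user", "content": question},
        ],
    }
    resp = requests.post(NIM_URL, json=payload, timeout=60)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]
```

In the full architecture, embedding generation, document ingestion, and serving are handled by the pipeline components described above rather than hand-rolled code.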

The architecture is cloud-native, optimized for NVIDIA GPUs, and includes lifecycle management and security features. Kubeflow Pipelines, OpenSearch, KServe, and Canonical Observability Stack (COS) deliver a complete, powerful, and open-source chatbot platform.

Unleashing Edge AI

Edge AI is transforming industries, but building robust edge architectures remains challenging. We’re introducing a new reference architecture for AI inference at the edge, designed for scalability, maintainability, security, and efficiency.

This solution addresses the complexities of deploying AI to thousands of devices, ensuring autonomous operation and protection against threats. It includes a comprehensive approach for both model training and edge deployment. An MLOps platform and GPU scheduling optimize model development, while secure packaging ensures safe deployment to resource-constrained edge devices, bringing powerful AI capabilities closer to the source.
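As a rough illustration of the inference side (not the reference architecture itself), the snippet below loads a packaged ONNX model on an edge device with onnxruntime and runs it locally, falling back to CPU when no GPU execution provider is available; the model path and input shape are hypothetical.

```python
# Illustrative edge-inference loop: load a packaged ONNX model on a
# Jetson-class device and run inference locally. The model path and the
# input tensor shape are placeholders.
import numpy as np
import onnxruntime as ort

MODEL_PATH = "model.onnx"   # placeholder path to the deployed model artifact

# Prefer the GPU execution provider when present, otherwise run on CPU.
session = ort.InferenceSession(
    MODEL_PATH,
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)
input_name = session.get_inputs()[0].name


def infer(frame: np.ndarray) -> np.ndarray:
    """Run a single forward pass on one preprocessed input frame."""
    return session.run(None, {input_name: frame.astype(np.float32)})[0]


if __name__ == "__main__":
    dummy = np.random.rand(1, 3, 224, 224)   # hypothetical NCHW image tensor
    print(infer(dummy).shape)
```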

Open Source ML infrastructure with NVIDIA, MicroK8s, and Dell PowerEdge

Introducing a reference architecture for open-source machine learning infrastructure, combining NVIDIA NIM, MicroK8s, Charmed Kubeflow, and Dell PowerEdge R7525 servers. This solution accelerates AI development.

Leveraging Canonical’s MicroK8s and Charmed Kubeflow, validated with NVIDIA, this architecture delivers faster iteration, improved scalability, and enhanced security through Ubuntu. It meets the demanding requirements of AI workloads.

This end-to-end stack empowers data scientists to run ML on powerful hardware. The guide provides a clear blueprint for setup, offering a valuable resource for those looking to advance their AI strategies.
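For a sense of what working on this stack might look like, here is a minimal, hypothetical pipeline built with the Kubeflow Pipelines SDK (kfp v2) that a data scientist could compile and upload to Charmed Kubeflow running on MicroK8s; the component body, names, and base image are placeholders.

```python
# Sketch of a tiny training pipeline defined with the Kubeflow Pipelines SDK
# (kfp v2). A real component would train a model; this one is a placeholder.
from kfp import compiler, dsl


@dsl.component(base_image="python:3.11")
def train(epochs: int) -> str:
    """Placeholder training step; a real component would fit a model here."""
    return f"trained for {epochs} epochs"


@dsl.pipeline(name="demo-training-pipeline")
def training_pipeline(epochs: int = 3):
    train(epochs=epochs)


if __name__ == "__main__":
    # Produces a pipeline spec that can be uploaded through the Kubeflow UI.
    compiler.Compiler().compile(training_pipeline, "training_pipeline.yaml")
```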

Schedule a meeting at GTC

Speaking session: Building enterprise sovereign AI

Nicholas Dimotakis, Global Field Engineering VP at Canonical, will join our partner PNY to speak about the future of data sovereignty and secure AI infrastructure.

This testimonial session will delve into the critical aspects of designing and deploying sovereign AI data centers, emphasizing the importance of national security, data governance, and technological independence in the age of rapid AI advancement. 

PNY has designed a full AI training and inference stack from scratch, handling the financial and investment aspects and providing the technological expertise to ensure the data centers are optimized for AI and high-performance computing workloads, while also investing in the necessary energy infrastructure to power these facilities.

The session will cover:

  • Sovereign AI infrastructure and design requirements
  • Location, power infrastructure, and energy efficiency
  • Importance of collaboration and partnerships in driving success on this scale
  • Project planning and use case selection

Add it to your schedule

Canonical and NVIDIA: a powerful partnership for the future of computing

Canonical and NVIDIA are collaborating to drive innovation across AI, cloud, and edge computing. By optimizing Ubuntu for NVIDIA’s GPUs, we provide developers and enterprises with a robust platform for building and deploying cutting-edge applications.

This collaboration gives users access to NVIDIA’s powerful tools within the familiar Ubuntu environment. Enterprises benefit from enhanced performance, security, and manageability for AI and data science workloads. Together, Canonical and NVIDIA are making high-performance computing more accessible than ever.

Learn more about our partnership with NVIDIA

Getting to the Event

Join the Canonical team at GTC to discuss how you can take advantage of solutions that combine the benefits of open source software with security and compliance.

Location
San Jose Convention Center
150 West San Carlos Street, San Jose, CA 95113

Dates
March 17 – 21

Expo Hours
Tuesday, 1 PM – 7 PM PT
Wednesday, 12 PM – 7 PM PT
Thursday, 12 PM – 7 PM PT
Friday, 11 AM – 2 PM PT

You can meet with the Canonical team on-site in San Jose and pick our technical experts’ brains about your particular open source scenario.

Register for NVIDIA GTC 2025

Want to learn more?
Please stop by booth 2021 to speak to our experts and check out our demos.

Are you interested in setting up a meeting with our team?
Reach out to our Alliances team at partners@canonical.com
