
Let’s meet at AI4 and talk about AI infrastructure with open source


Date: 11 – 13 August 2025

Booth: 353

You know the old saying: what happens in Vegas… transforms your AI journey with trusted open source. On August 11-13, Canonical is back at AI4 2025 to share the secrets of building secure, scalable AI infrastructure to accelerate every stage of your machine learning lifecycle – whether you’re creating a sovereign AI cloud or deploying with hyperscalers.

Here’s a taste of what you can expect if you come by booth 353.

Building AI on a solid open source foundation

Canonical is the publisher of Ubuntu, the leading Linux distribution. With over 20 years of experience maintaining open source software, we deliver the same stability and support you’re familiar with from Ubuntu for your machine learning operations. 

Our MLOps stack delivers all the open source solutions you need to streamline the complete machine learning lifecycle. These tools are tightly integrated to ensure a smooth MLOps journey, from experimentation to production.

  • Data science stack for beginners: Data Science Stack (DSS) is an easy-to-deploy solution that runs on any workstation, giving you an out-of-the-box ML environment for data science on your AI workstation.
  • Cloud-native MLOps: Charmed Kubeflow is our fully open source, end-to-end MLOps platform, used to develop and deploy models at scale. It integrates with our portfolio of Data and AI applications, including MLflow, Feast, Spark, and OpenSearch. Our platform runs on any CNCF-conformant Kubernetes, including Canonical K8s, AKS, EKS, and GKE.
  • Ubuntu Core and KServe for edge AI: Devices at the edge often support mission-critical applications, so running a securely designed operating system is crucial to protect all artefacts, regardless of the architecture. We support deploying and scaling models at the edge with our portable, hardened application-packaging technology and our enterprise support for KServe.

AI infrastructure with trusted open source

This year at AI4, we’ll also be showcasing how you can build your AI infrastructure with trusted open source. From sovereign cloud to public cloud, we’ll demonstrate how to leverage securely designed open source building blocks for any environment.

  • Build your AI sovereign cloud with fully open source technologies. Canonical enables private and air-gapped AI deployments with full control over your stack. From infrastructure to models, everything runs on securely designed, certified components – no vendor lock-in.
  • Run AI workloads on any public cloud, with consistent experience and cost control. Canonical’s AI stack is cloud-agnostic, optimized for all major providers. Deploy and scale with confidence – from development to production – using the same stack everywhere.
  • Confidently deploy and maintain AI/ML workloads with hardened containers. We build, sign, and maintain securely designed containers for AI/ML – optimized for performance and compliance. Our long-term support and CVE patching ensure your infrastructure stays protected and up to date.

If any of these topics catch your attention, bring your questions to booth 353, where our team will be happy to explore them further and dive deep into your challenge or use case. You can pre-book some time with us via the link below.

Monarchs in the sky: building sovereign AI clouds with open source tools

As AI workloads scale and regulations tighten, organizations are under pressure to deploy machine learning with full control over data, infrastructure, and compliance. In this keynote, Stefano Fioravanzo, AI Product Manager at Canonical, will share how to build a sovereign AI cloud using Canonical’s open-source MLOps platform and infrastructure stack. 

By the end of the session, you will learn how to orchestrate end-to-end ML pipelines, integrate secure container technologies, and automate operations across hybrid and multi-cloud environments. We’ll dive into real-world strategies for achieving data sovereignty, avoiding vendor lock-in, and running production-grade AI at scale with open source at the core.

The session is scheduled for Monday, August 11, 3:10–3:30 pm, in the MLOps & Platforms technical track.

To continue exploring the topic, we will run demos at booth 353 on achieving data sovereignty in your GenAI applications with RAG (retrieval-augmented generation). Juan Pablo Norena, our AI Field Engineer, will demonstrate how to build an open source RAG application on Ubuntu infrastructure, walking through:

  • Efficient GPU resource management with Canonical K8s,
  • Self-hosted vector database,
  • Kafka streaming events on the knowledge sources,
  • Local inference of LLMs at scale on Kubeflow inference services.
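The core retrieval-and-grounding pattern behind a demo like this can be illustrated with a minimal, self-contained sketch. Note the assumptions: the bag-of-words "embeddings," the in-memory index, and the sample documents below are toy stand-ins for illustration only; the demo itself uses a real embedding model, a self-hosted vector database, Kafka-fed knowledge sources, and LLMs served via Kubeflow inference services.

```python
import math
import re
from collections import Counter

def embed(text):
    """Toy bag-of-words 'embedding' (word -> count). A real RAG app
    would call an embedding model hosted on the inference stack."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Sample knowledge sources (hypothetical). In the demo, these would be
# kept fresh via Kafka events and stored in a self-hosted vector database.
docs = [
    "Charmed Kubeflow deploys end-to-end MLOps pipelines on Kubernetes",
    "Ubuntu Core packages edge AI applications as hardened snaps",
    "KServe serves machine learning models with autoscaling inference",
]
index = [(d, embed(d)) for d in docs]

def retrieve(query, k=1):
    """Return the top-k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(index, key=lambda p: cosine(q, p[1]), reverse=True)
    return [d for d, _ in ranked[:k]]

def build_prompt(query):
    """Ground the prompt in retrieved context (the 'RAG' step);
    the result would then go to a locally hosted LLM."""
    context = "\n".join(retrieve(query))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

print(retrieve("serve models for inference")[0])
```

Swapping the toy pieces for a production embedding model, vector store, and local LLM endpoint is exactly what keeps the whole pipeline, and therefore your data, inside infrastructure you control.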

Finally, unwind at AI4 with the Ubuntu team and network with fellow open source enthusiasts and industry leaders over drinks and treats. We're holding our self-hosted event on the first evening of the conference, Monday, August 11, 7:30–9:30 pm PT.

This informal gathering is a chance to connect with peers and the Ubuntu team, hear what others are building, share insights, and enjoy some tasty treats and refreshments. We will talk about:

  • Building infrastructure that doesn’t hate AI workloads
  • Why open source is the friend you want in your compliance strategy
  • Streamlining complex environments with open source tools

Secure your spot here: https://ubuntu.com/engage/ubuntu-canonical-event-ai4

Join us at Booth 353 

We look forward to meeting old friends and new at this nexus of the global AI community. If you're attending AI4 2025 in Las Vegas from August 11–13, make sure to visit booth 353!

You can also book a meeting with one of our team members in advance using the link below.

See you soon in Las Vegas and have safe travels! 
