Self-host AI on Kubernetes: GPU clusters, private models, and the GitOps Catalog
Duration: 1 hr


With John Dietz & M R Rishi

Spin up a GPU workload cluster using Konstruct's new GPU cluster templates, deploy a self-hosted LLM, and use it in your organization — all live on stream. This hands-on session shows how shipping AI workloads to GPU clusters is just as easy as deploying to Konstruct physical or virtual clusters, and how open source apps in the GitOps Catalog make it even faster. Walk away knowing how to cut your token spend by running models privately on your own infrastructure.
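To make the idea concrete before the session: serving a self-hosted LLM on a GPU cluster typically comes down to a Deployment that requests a GPU resource. The sketch below is illustrative only and not from the webinar materials — it uses Ollama as an example open source model server, and the names, namespace defaults, and replica count are assumptions; a GitOps Catalog app would wrap something similar in an Argo CD-style manifest.

```yaml
# Illustrative sketch: a minimal Deployment for a self-hosted LLM server.
# Ollama is used as an example server; image tag and names are assumptions.
# Requires a GPU node pool with the NVIDIA device plugin installed.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: llm-server
spec:
  replicas: 1
  selector:
    matchLabels:
      app: llm-server
  template:
    metadata:
      labels:
        app: llm-server
    spec:
      containers:
        - name: ollama
          image: ollama/ollama:latest
          ports:
            - containerPort: 11434   # Ollama's default API port
          resources:
            limits:
              nvidia.com/gpu: 1      # schedules the pod onto a GPU node
```

Pointing your organization's tools at this in-cluster endpoint, instead of a hosted API, is what keeps prompts private and token costs on your own hardware.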
Register for this webinar
