Serverless LLM inference for everyone: learnings from an AI beginner

Speaker: Engin Diri

In this Civo Navigate 2024 session, Engin Diri shares his journey into running open-source large language models (LLMs) on cloud infrastructure. He walks through three key strategies he proposed to clients, weighing the pros and cons of each, and discusses the challenges he encountered along the way and the solutions he implemented to overcome them.