Intro and demo: LLM models and Ollama on AKS

Simply put, this is how you can deploy LLM/SLM models such as Mistral, Microsoft Phi, or Llama on your preferred environment, whether that is your local machine, a Docker container, or Kubernetes.
The demo uses Ollama as the platform for deploying and managing the models.
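
As a rough sketch of what the demo covers, the commands below show one way to run Ollama in a Docker container and pull a model into it. The image name and default port come from Ollama's published Docker image; the model choice (mistral) is just an example, not necessarily the one used in the demo.

```bash
# Start Ollama in a container, exposing its default API port (11434)
# and persisting downloaded models in a named volume
docker run -d --name ollama -p 11434:11434 -v ollama:/root/.ollama ollama/ollama

# Pull and chat with a model inside the running container
docker exec -it ollama ollama run mistral
```

For the Kubernetes/AKS case, a minimal Deployment and Service along these lines would expose the same Ollama API inside the cluster. Treat this as an illustrative sketch: replica count, storage (an emptyDir here, so models are re-downloaded on pod restart), and resource sizing are assumptions you would adjust for your own cluster.

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ollama
spec:
  replicas: 1
  selector:
    matchLabels:
      app: ollama
  template:
    metadata:
      labels:
        app: ollama
    spec:
      containers:
        - name: ollama
          image: ollama/ollama:latest
          ports:
            - containerPort: 11434   # Ollama's default API port
          volumeMounts:
            - name: models
              mountPath: /root/.ollama   # where Ollama stores pulled models
      volumes:
        - name: models
          emptyDir: {}   # placeholder; use a PersistentVolumeClaim for real workloads
---
apiVersion: v1
kind: Service
metadata:
  name: ollama
spec:
  selector:
    app: ollama
  ports:
    - port: 11434
      targetPort: 11434
```

Applying the manifest with `kubectl apply -f ollama.yaml` and port-forwarding the service would let you reach the Ollama API from your workstation in the same way as the local or Docker setup.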

Disclaimer: This is part of…
