
🐳 Run LLMs Locally with Docker Model Runner — Project Series

This repo explores how to build GenAI apps locally using Docker Model Runner with step-by-step examples.


✅ Part 1: Hello LLMs from Docker

Run a simple LLM (e.g., llama2 or mistral) locally using Docker Model Runner.

View Article ➜


⚙️ Part 2: Call LLM from Python

A simple Python script that sends prompts to the running model and prints the response.

Run the Model

docker model pull ai/mistral
docker model run ai/mistral

Run the Script

cd docker-model-runner-llm-demo
pip install -r requirements.txt
python app/main.py
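
For orientation, here is a minimal sketch of what such a prompt script can look like. It assumes Model Runner's OpenAI-compatible API is reachable on the host at http://localhost:12434/engines/v1 (host-side TCP enabled, see Prerequisites) and that ai/mistral has been pulled; the actual app/main.py in this repo may differ.

# Minimal sketch (not necessarily identical to app/main.py): send one prompt to
# Docker Model Runner's OpenAI-compatible chat endpoint and print the reply.
# Assumes host-side TCP is enabled so the API is reachable at localhost:12434.
import json
import urllib.request

BASE_URL = "http://localhost:12434/engines/v1"  # assumed Model Runner endpoint
MODEL = "ai/mistral"                            # model pulled in the step above

payload = {
    "model": MODEL,
    "messages": [{"role": "user", "content": "Say hello from Docker Model Runner."}],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

print(body["choices"][0]["message"]["content"])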

View Article ➜


🚀 Part 3: Build a FastAPI GenAI Backend

Wrap your prompt logic in a FastAPI app and serve it as a local GenAI API using Docker Compose.

Folder: docker-llm-fastapi-app

cd docker-llm-fastapi-app
docker compose up --build

Access the API at http://localhost:8000/generate
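
For context, below is a minimal sketch of what such a /generate endpoint can look like, assuming httpx is installed and using the same OpenAI-compatible Model Runner API as in Part 2. The app shipped in docker-llm-fastapi-app may be structured differently; inside Compose, the endpoint is typically injected via an environment variable rather than hard-coded.

# Minimal sketch of a /generate endpoint (assumed shape; the real app may differ).
# MODEL_RUNNER_URL would normally come from docker-compose.yml.
import os

import httpx
from fastapi import FastAPI
from pydantic import BaseModel

MODEL_RUNNER_URL = os.getenv("MODEL_RUNNER_URL", "http://localhost:12434/engines/v1")
MODEL = os.getenv("MODEL_NAME", "ai/mistral")

app = FastAPI()


class GenerateRequest(BaseModel):
    prompt: str


@app.post("/generate")
async def generate(req: GenerateRequest):
    # Forward the prompt to Model Runner's chat completions API and return the text.
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": req.prompt}],
    }
    async with httpx.AsyncClient(timeout=120) as client:
        resp = await client.post(f"{MODEL_RUNNER_URL}/chat/completions", json=payload)
        resp.raise_for_status()
        data = resp.json()
    return {"response": data["choices"][0]["message"]["content"]}

With the service up, POSTing a JSON body such as {"prompt": "Hello"} to http://localhost:8000/generate returns the model's reply as JSON.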

View Article ➜


🌐 Part 4: Add Prompt Templates and Roles (Frontend UI)

Build a web UI with role-based templates using FastAPI + Jinja2. Run it locally with Docker and connect to Model Runner via OpenAI-compatible APIs.

Folder: docker-genai-ui

cd docker-genai-ui
docker compose up --build

Open in browser: http://localhost:8000

Note: The first LLM response may take 1–2 minutes while the model warms up.
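
To illustrate the role-template idea, here is a small sketch of how a role can be mapped to a system prompt with Jinja2 and combined with the user's prompt into OpenAI-style chat messages. The role names and template strings are illustrative and need not match what docker-genai-ui ships.

# Illustrative sketch only: render a role-specific system prompt with Jinja2
# and pair it with the user's prompt as OpenAI-style chat messages.
from jinja2 import Template

# Hypothetical role templates; the real UI may load these from files instead.
ROLE_TEMPLATES = {
    "teacher": Template("You are a patient teacher. Explain {{ topic }} simply."),
    "reviewer": Template("You are a strict code reviewer. Review {{ topic }} for bugs."),
}


def build_messages(role: str, topic: str, user_prompt: str) -> list[dict]:
    system_prompt = ROLE_TEMPLATES[role].render(topic=topic)
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]


print(build_messages("teacher", "Docker networking", "How do containers talk to each other?"))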

View Article ➜


🔜 Coming Next: Part 5

How to Set Up Your Company for Success with Docker — standardize dev environments, improve security, and streamline team adoption of Docker.


📦 Prerequisites for All Parts

  • Docker Desktop v4.41+ with Model Runner enabled
  • Enable host-side TCP support (Docker Settings → Beta Features → Model Runner → TCP)
  • Enable experimental features in Docker Desktop
  • Python 3.10+ for running the scripts
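
As a quick sanity check that the Model Runner API is reachable from the host after enabling TCP support, a small script like the following can be used (assuming the default port 12434 and the /engines/v1/models listing route of the OpenAI-compatible API):

# Quick check (assumes host-side TCP enabled on the default port 12434):
# list the models that Model Runner currently has available.
import json
import urllib.request

URL = "http://localhost:12434/engines/v1/models"

try:
    with urllib.request.urlopen(URL, timeout=10) as resp:
        models = json.load(resp)
    for m in models.get("data", []):
        print(m.get("id"))
except OSError as err:
    print(f"Model Runner not reachable at {URL}: {err}")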

Stay tuned! 🚀
