Jocimar Lopes
Flask API with DeepSeek-R1 via Ollama in Python

🧠 Flask API with DeepSeek-R1 via Ollama

This is an API developed with Flask in Python, connecting to the LLM model DeepSeek-R1 using the Ollama platform.

It lets you run LLM inference locally in a simple and efficient way, with a basic UI and streamed responses.
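Before diving into setup, here is a minimal sketch of what such a streaming Flask app might look like. This is an illustration, not the repository's actual code: the `/chat` route name is an assumption, and it talks to Ollama's local HTTP API (`http://localhost:11434/api/generate`) using only the standard library.

```python
import json
import urllib.request

from flask import Flask, Response, request

app = Flask(__name__)

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint
MODEL = "deepseek-r1:latest"


@app.route("/chat", methods=["POST"])  # route name is an assumption for this sketch
def chat():
    prompt = request.get_json(force=True).get("prompt", "")

    def stream():
        # Ask Ollama to stream the completion; it replies with one JSON object per line
        payload = json.dumps({"model": MODEL, "prompt": prompt, "stream": True}).encode()
        req = urllib.request.Request(
            OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
        )
        with urllib.request.urlopen(req) as resp:
            for line in resp:
                yield json.loads(line).get("response", "")

    # Relay the chunks to the client as they arrive
    return Response(stream(), mimetype="text/plain")


# app.run(port=5000)  # uncomment to serve locally
```

A generator passed to `Response` is what makes Flask stream: each `yield` is flushed to the client immediately instead of buffering the whole answer.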

βš™οΈ Prerequisites

Before running the project, make sure you have the following installed:

  - Python 3.x and pip
  - Git (to clone the repository)

πŸš€ Installation

  1. Clone the repository
git clone https://github.com/jocimarlopes/ollama-llm-deepseek-server.git
cd ollama-llm-deepseek-server
  2. Install the dependencies
pip install -r requirements.txt
  3. Install and start Ollama. If you don't already have Ollama:
# On Linux, via curl
curl -fsSL https://ollama.com/install.sh | sh
  4. Download the DeepSeek-R1 model. If you don't have the model yet:
ollama pull deepseek-r1:latest
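To confirm the model was pulled, you can query Ollama's `/api/tags` endpoint, which lists locally available models. A small sketch (the endpoint and its `{"models": [{"name": ...}]}` response shape are Ollama's documented local API):

```python
import json
import urllib.request


def has_model(tags_json: dict, name: str) -> bool:
    """Return True if a model whose name starts with `name` appears in /api/tags data."""
    return any(m["name"].startswith(name) for m in tags_json.get("models", []))


def check_local_ollama(name: str = "deepseek-r1") -> bool:
    """Ask the local Ollama server (default port 11434) whether the model is pulled."""
    with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
        return has_model(json.load(resp), name)


# check_local_ollama()  # requires Ollama to be running locally
```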

πŸ§ͺ Running the API

Just run the app.py script:

python app.py
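Once the server is up, you can consume the stream from a small client. This sketch assumes the app listens on port 5000 and exposes a `/chat` endpoint taking a JSON `prompt` field; adjust both to match the actual app:

```python
import json
import urllib.request

API_URL = "http://localhost:5000/chat"  # assumed port and route; adjust to the app


def build_request(prompt: str) -> urllib.request.Request:
    """Build a POST request carrying the prompt as a JSON body."""
    data = json.dumps({"prompt": prompt}).encode()
    return urllib.request.Request(
        API_URL, data=data, headers={"Content-Type": "application/json"}
    )


def ask(prompt: str) -> None:
    """Send the prompt and print the answer chunk by chunk as it streams in."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        for chunk in resp:
            print(chunk.decode(), end="", flush=True)


# ask("Explain what an LLM is in one sentence.")  # requires the API to be running
```

Reading the response iteratively, rather than with a single `resp.read()`, is what lets the client print tokens as they arrive.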

πŸ“ Basic structure

The project layout is:

β”œβ”€β”€ app.py
β”œβ”€β”€ requirements.txt
└── README.md

πŸ‘€ Credits

Project developed by Jocimar Lopes.
Feel free to contribute or use in your own projects.

πŸ“„ License

This project is licensed under the terms of the MIT License.
