# 🧠 Flask API with DeepSeek-R1 via Ollama
This is an API built with Flask in Python that connects to the DeepSeek-R1 LLM through the Ollama platform. It lets you run LLM inference locally in a simple and efficient way, with a basic UI and streamed responses.
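For orientation, here is a minimal sketch of what such a streaming endpoint can look like. The route name `/api/generate`, the JSON payload shape, and Ollama's default port 11434 are assumptions for illustration, not necessarily what `app.py` does:

```python
# Hypothetical sketch: a Flask route that streams tokens from a local
# Ollama server. Route and payload shape are illustrative only.
import json

import requests
from flask import Flask, Response, request, stream_with_context

app = Flask(__name__)
OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

@app.route("/api/generate", methods=["POST"])
def generate():
    prompt = request.json.get("prompt", "")

    def stream_tokens():
        # Ollama streams newline-delimited JSON objects by default;
        # each line carries a partial completion in its "response" field.
        with requests.post(
            OLLAMA_URL,
            json={"model": "deepseek-r1", "prompt": prompt},
            stream=True,
        ) as r:
            for line in r.iter_lines():
                if line:
                    chunk = json.loads(line)
                    yield chunk.get("response", "")

    return Response(stream_with_context(stream_tokens()), mimetype="text/plain")

if __name__ == "__main__":
    app.run(debug=True)
```

Streaming the chunks through a generator lets the client see tokens as they are produced instead of waiting for the full completion.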
## ⚙️ Prerequisites
Before running the project, make sure you have the following items installed:
- Python 3.8+
- Ollama
- The `deepseek-r1` model downloaded via Ollama
## 🚀 Installation
- Clone the repository:

```bash
git clone https://github.com/jocimarlopes/ollama-llm-deepseek-server.git
cd ollama-llm-deepseek-server
```
- Install the dependencies:

```bash
pip install -r requirements.txt
```
- Install and start Ollama (if you don't already have it):

```bash
# On Linux, via curl
curl -fsSL https://ollama.com/install.sh | sh
```
- Download the DeepSeek-R1 model (if you haven't pulled it yet):

```bash
ollama pull deepseek-r1:latest
```
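To verify the pull succeeded, you can query Ollama's local REST API for the installed models; a quick sketch, assuming Ollama's default port 11434:

```python
# Sanity check: list locally available models via Ollama's REST API.
import requests

tags = requests.get("http://localhost:11434/api/tags").json()
names = [m["name"] for m in tags.get("models", [])]
print("deepseek-r1 available:", any(n.startswith("deepseek-r1") for n in names))
```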
## 🧪 Running the API
Just run the `app.py` script:

```bash
python app.py
```
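Once the server is up (Flask serves on http://127.0.0.1:5000 by default), you can stream a response from another terminal. The endpoint path and payload below follow the assumptions from the sketch above, so adjust them to match the actual routes in `app.py`:

```python
# Hypothetical client: stream the API's response chunk by chunk.
import requests

with requests.post(
    "http://127.0.0.1:5000/api/generate",  # assumed route, see sketch above
    json={"prompt": "Explain what Ollama is in one sentence."},
    stream=True,
) as r:
    for chunk in r.iter_content(chunk_size=None, decode_unicode=True):
        print(chunk, end="", flush=True)
```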
## 📁 Basic structure
The project layout is minimal:
```
├── app.py
├── requirements.txt
└── README.md
```
## 🤝 Credits
Project developed by Jocimar Lopes.
Feel free to contribute or use it in your own projects.
## 📝 License
This project is licensed under the terms of the MIT License.