A simple client implementation for interacting with Model Context Protocol (MCP) servers using Ollama models.
This project provides a client that connects to an MCP server and uses Ollama to process queries, enabling tool use capabilities. The client establishes a connection to an MCP server, sends queries to Ollama, and handles any tool calls that the model might make.
This branch provides a simple example implementation of an MCP client. For a more feature-rich and advanced version, please refer to the main branch.
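For orientation, the core flow can be sketched as below. This is a minimal, hypothetical illustration assuming the official `mcp` Python SDK and the `ollama` Python package; the names and structure are illustrative rather than the actual internals of `client.py`.

```python
# Minimal sketch of the query/tool-call loop, assuming the `mcp` SDK
# and the `ollama` package. Illustrative only; not client.py itself.
import asyncio

import ollama
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def ask(server_script: str, model: str, query: str) -> str:
    params = StdioServerParameters(command="python", args=[server_script])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Advertise the server's tools to the model in Ollama's format.
            tools = [
                {
                    "type": "function",
                    "function": {
                        "name": t.name,
                        "description": t.description,
                        "parameters": t.inputSchema,
                    },
                }
                for t in (await session.list_tools()).tools
            ]

            messages = [{"role": "user", "content": query}]
            response = ollama.chat(model=model, messages=messages, tools=tools)

            # If the model requested tools, execute them on the MCP server
            # and hand the results back for a final answer.
            if response.message.tool_calls:
                messages.append(response.message)
                for call in response.message.tool_calls:
                    result = await session.call_tool(
                        call.function.name, dict(call.function.arguments)
                    )
                    messages.append(
                        {"role": "tool", "content": str(result.content)}
                    )
                response = ollama.chat(model=model, messages=messages)

            return response.message.content


if __name__ == "__main__":
    print(asyncio.run(ask("server.py", "llama3.2:3b", "What is 2 + 2?")))
```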
- Connect to any MCP-compliant server
- Support for multiple Ollama models
- Support for Python and JavaScript MCP servers
- Simple interactive chat interface
- Tool usage capabilities
- Easy to extend and modify for custom use cases
- Example MCP server implementation included
- Python 3.10+
- Ollama running locally with the `llama3.2:3b` model (if using the default model)
- UV package manager
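If the default model is not already available locally, it can be pulled with `ollama pull llama3.2:3b` before running the client.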
Set up the environment and install dependencies with UV:

```bash
uv venv
source .venv/bin/activate
uv pip install .
```
To start the client with the default `llama3.2:3b` model and connect to the included MCP server implementation:

```bash
uv run client.py --mcp-server server.py
```
Run the client with:

```bash
uv run client.py --mcp-server <path_to_mcp_server> --model <ollama_model>
```
- `--mcp-server`: Path to the MCP server script (`.py` or `.js`)
- `--model`: Ollama model to use (default: `llama3.2:3b`)
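As a rough sketch, these options could be parsed like this (hypothetical; the flag handling in `client.py` may differ in detail):

```python
import argparse

# Hypothetical argument parsing matching the documented flags.
parser = argparse.ArgumentParser(description="MCP client for Ollama models")
parser.add_argument("--mcp-server", required=True,
                    help="Path to the MCP server script (.py or .js)")
parser.add_argument("--model", default="llama3.2:3b",
                    help="Ollama model to use")
args = parser.parse_args()
# args.mcp_server selects the server script; args.model picks the Ollama model.
```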
Using the default model (`llama3.2:3b`):

```bash
uv run client.py --mcp-server server.py
```
Using a different model:

```bash
uv run client.py --mcp-server server.py --model qwen2.5:7b
```
Using a JavaScript MCP server:

```bash
uv run client.py --mcp-server server.js --model llama3.2:3b
```
This repository includes a sample MCP server implementation in `server.py` that you can use for testing or as a reference.
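For reference, a minimal server of this kind can be written with the MCP Python SDK's FastMCP helper. The snippet below is a hypothetical sketch, not necessarily what the bundled `server.py` contains:

```python
from mcp.server.fastmcp import FastMCP

# Hypothetical example server; the bundled server.py may differ.
mcp = FastMCP("example-server")


@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b


if __name__ == "__main__":
    # Speak MCP over stdio so a local client can launch this
    # script as a subprocess and call its tools.
    mcp.run(transport="stdio")
```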
While this project was inspired by the official MCP client quickstart guide, it has been adapted specifically for Ollama models rather than Anthropic models. The official guide focuses on Claude and other Anthropic models, whereas this implementation provides a similar experience but tailored for open-source models running through Ollama.
If you're looking for the official MCP implementation for Anthropic models, please refer to the official documentation.