Query models running with Ollama from within Claude Desktop or other MCP clients.
A Model Context Protocol (MCP) server for integrating Ollama with Claude Desktop or other MCP clients.
Requires a running Ollama installation with at least one model pulled, for example:

ollama pull llama2
Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json on macOS, %APPDATA%\Claude\claude_desktop_config.json on Windows):
{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": [
        "mcp-ollama"
      ]
    }
  }
}
Install in development mode:
git clone https://github.com/yourusername/mcp-ollama.git
cd mcp-ollama
uv sync
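To point Claude Desktop at a local checkout instead of the published package, a configuration along the following lines may work; note that the /path/to/mcp-ollama placeholder and the mcp-ollama entry-point name are assumptions about the project layout, not confirmed details:

{
  "mcpServers": {
    "ollama": {
      "command": "uv",
      "args": [
        "run",
        "--directory",
        "/path/to/mcp-ollama",
        "mcp-ollama"
      ]
    }
  }
}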
Test with MCP Inspector:
mcp dev src/mcp_ollama/server.py
The server provides the following tools:

list_models - List all downloaded Ollama models
show_model - Get detailed information about a specific model
ask_model - Ask a question to a specified model
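For use outside Claude Desktop, these tools can also be called through the official MCP Python SDK. The sketch below is illustrative only: the ask_model argument names ("model" and "question") are assumptions inferred from the tool descriptions above, so check the input schema returned by list_tools for the actual parameters.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the server over stdio, the same way Claude Desktop does.
    params = StdioServerParameters(command="uvx", args=["mcp-ollama"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the available tools and their input schemas.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            # Assumed argument names -- verify against the schema above.
            result = await session.call_tool(
                "ask_model",
                {"model": "llama2", "question": "Why is the sky blue?"},
            )
            print(result.content)

asyncio.run(main())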
License: MIT