Reliable LLM Memory for AI Applications and AI Agents
Clone the cognee repo
Install dependencies
brew install uv
cd cognee-mcp
uv sync --dev --all-extras --reinstall
source .venv/bin/activate
The file should be located here: ~/Library/Application\ Support/Claude/
cd ~/Library/Application\ Support/Claude/
You need to create claude_desktop_config.json in this folder if it doesn't exist. Make sure to add your paths and LLM API key to the file below. Use your editor of choice, for example Nano:
nano claude_desktop_config.json
{
  "mcpServers": {
    "cognee": {
      "command": "/Users/{user}/cognee/.venv/bin/uv",
      "args": [
        "--directory",
        "/Users/{user}/cognee/cognee-mcp",
        "run",
        "cognee"
      ],
      "env": {
        "ENV": "local",
        "TOKENIZERS_PARALLELISM": "false",
        "LLM_API_KEY": "sk-"
      }
    }
  }
}
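Before restarting Claude, it can help to sanity-check the config file. A minimal sketch using only the standard library (the required keys are taken from the example above; the helper name is ours, not part of cognee or Claude Desktop):

```python
import json

def check_claude_config(text):
    """Parse claude_desktop_config.json text and verify the cognee entry
    has the keys Claude Desktop expects (command, args, env)."""
    config = json.loads(text)  # raises ValueError on malformed JSON
    server = config["mcpServers"]["cognee"]
    # Return any expected keys that are missing; empty list means OK.
    return [k for k in ("command", "args", "env") if k not in server]

example = """
{
  "mcpServers": {
    "cognee": {
      "command": "/Users/{user}/cognee/.venv/bin/uv",
      "args": ["--directory", "/Users/{user}/cognee/cognee-mcp", "run", "cognee"],
      "env": {"ENV": "local", "LLM_API_KEY": "sk-"}
    }
  }
}
"""
print(check_claude_config(example))  # [] — nothing missing
```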
Restart your Claude desktop.
To install Cognee for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install cognee --client claude
Define the cognify tool in server.py, then restart your Claude desktop.
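An MCP tool is a named, typed function registered with the server. The stand-in sketch below shows only the shape of that registration pattern in plain Python; the decorator, registry, and cognify body here are illustrative stand-ins, not the actual mcp SDK API that server.py uses:

```python
# Stand-in tool registry illustrating the shape of an MCP tool definition;
# the real server registers tools through the mcp SDK instead.
TOOLS = {}

def tool(name):
    """Register a function under a tool name, as an MCP server does."""
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("cognify")
def cognify(text: str) -> str:
    # Placeholder body: the real tool builds cognee's knowledge graph.
    return f"cognified {len(text)} characters"

print(TOOLS["cognify"]("hello"))  # cognified 5 characters
```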
To use the debugger, run:
mcp dev src/server.py
Open the inspector with a timeout passed:
http://localhost:5173?timeout=120000
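The timeout query parameter is presumably in milliseconds, so 120000 keeps the inspector session alive for two minutes. A quick standard-library check of how that URL parses:

```python
from urllib.parse import urlparse, parse_qs

url = "http://localhost:5173?timeout=120000"
# parse_qs maps each query key to a list of values.
params = parse_qs(urlparse(url).query)
timeout_ms = int(params["timeout"][0])
print(timeout_ms // 1000)  # 120 seconds
```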
To apply new changes while developing cognee, you need to:
1. Run poetry lock in the cognee folder
2. uv sync --dev --all-extras --reinstall
3. mcp dev src/server.py