DeepView MCP is a Model Context Protocol server that enables IDEs like Cursor and Windsurf to analyze large codebases using Gemini 2.5 Pro's extensive context window.
To install DeepView for Claude Desktop automatically via Smithery:

```bash
npx -y @smithery/cli install @ai-1st/deepview-mcp --client claude
```

Alternatively, install from PyPI:

```bash
pip install deepview-mcp
```
Note: you don’t need to start the server manually; your IDE launches it with the parameters configured in your MCP setup (see below).
```bash
# Basic usage with default settings
deepview-mcp [path/to/codebase.txt]

# Specify a different Gemini model
deepview-mcp [path/to/codebase.txt] --model gemini-2.0-pro

# Change the log level
deepview-mcp [path/to/codebase.txt] --log-level DEBUG
```
The codebase file parameter is optional. If not provided, you’ll need to specify it when making queries.
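Conceptually, a query boils down to placing the packed codebase file and the question into a single large-context Gemini prompt. A minimal illustrative sketch of that pattern (not DeepView's actual code; assumes the `google-generativeai` package and a `GEMINI_API_KEY` environment variable):

```python
import os

def build_prompt(question: str, codebase: str) -> str:
    """Assemble one prompt that places the entire codebase before the question."""
    return f"Here is a codebase:\n\n{codebase}\n\nQuestion: {question}"

def ask_codebase(question: str, codebase_file: str,
                 model: str = "gemini-2.0-flash-lite") -> str:
    """Load a packed codebase file and ask Gemini a single question about it."""
    # Requires `pip install google-generativeai` and GEMINI_API_KEY in the environment.
    import google.generativeai as genai
    with open(codebase_file, encoding="utf-8") as f:
        prompt = build_prompt(question, f.read())
    genai.configure(api_key=os.environ["GEMINI_API_KEY"])
    return genai.GenerativeModel(model).generate_content(prompt).text
```

This is only a sketch of the idea: the whole codebase rides along in the model's context window, so answers can reference any file.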
`--model MODEL`
: Specify the Gemini model to use (default: gemini-2.0-flash-lite)

`--log-level {DEBUG,INFO,WARNING,ERROR,CRITICAL}`
: Set the logging level (default: INFO)

Configure the server in your IDE's MCP settings:

```json
{
  "mcpServers": {
    "deepview": {
      "command": "/path/to/deepview-mcp",
      "args": [],
      "env": {
        "GEMINI_API_KEY": "your_gemini_api_key"
      }
    }
  }
}
```
Setting a codebase file is optional. If you are working with the same codebase, you can set the default codebase file using the following configuration:
```json
{
  "mcpServers": {
    "deepview": {
      "command": "/path/to/deepview-mcp",
      "args": ["/path/to/codebase.txt"],
      "env": {
        "GEMINI_API_KEY": "your_gemini_api_key"
      }
    }
  }
}
```
Here’s how to specify the Gemini version to use:
```json
{
  "mcpServers": {
    "deepview": {
      "command": "/path/to/deepview-mcp",
      "args": ["--model", "gemini-2.5-pro-exp-03-25"],
      "env": {
        "GEMINI_API_KEY": "your_gemini_api_key"
      }
    }
  }
}
```
The server provides one tool:

`deepview`
: Ask a question about the codebase

Parameters:

- `question` - The question to ask about the codebase
- `codebase_file` (optional) - Path to a codebase file to load before querying

DeepView MCP requires a single file containing your entire codebase. You can use repomix to prepare your codebase in an AI-friendly format.
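For reference, an IDE invokes this tool with a standard MCP `tools/call` request. A sketch of what that request looks like (the question and path are placeholders):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "deepview",
    "arguments": {
      "question": "Where is request authentication implemented?",
      "codebase_file": "/path/to/codebase.txt"
    }
  }
}
```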
```bash
# Make sure you're using Node.js 18.17.0 or higher
npx repomix
```

This will generate a `repomix-output.xml` file containing your codebase.

To customize the packing, create a configuration file:

```bash
npx repomix --init
```

This creates a `repomix.config.json` file that you can edit to control which files are included or excluded and how the output is written.
Here’s an example `repomix.config.json` file:

```json
{
  "include": [
    "**/*.py",
    "**/*.js",
    "**/*.ts",
    "**/*.jsx",
    "**/*.tsx"
  ],
  "exclude": [
    "node_modules/**",
    "venv/**",
    "**/__pycache__/**",
    "**/test/**"
  ],
  "output": {
    "format": "xml",
    "filename": "my-codebase.xml"
  }
}
```
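With this config, `npx repomix` writes the packed codebase to `my-codebase.xml`, and the MCP configuration can then point at that file. A sketch reusing the placeholder paths from the earlier examples:

```json
{
  "mcpServers": {
    "deepview": {
      "command": "/path/to/deepview-mcp",
      "args": ["/path/to/my-codebase.xml"],
      "env": {
        "GEMINI_API_KEY": "your_gemini_api_key"
      }
    }
  }
}
```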
For more information on repomix, visit the repomix GitHub repository.
License: MIT

Author: Dmitry Degtyarev ([email protected])