A Model Context Protocol (MCP) server implementation for LanceDB vector database operations. This server enables efficient vector storage, similarity search, and management of vector embeddings with associated metadata.
The server exposes vector database tables as resources:
table://{name}
: A vector database table that stores embeddings and metadata
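
As an illustration of how a client might read such a resource, here is a minimal sketch using the MCP Python client SDK. This is not part of the project: the launch command mirrors the Claude Desktop configuration shown later, and `my_table` is a hypothetical table name.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from pydantic import AnyUrl


async def main() -> None:
    # Launch the server over stdio; command and arguments are assumptions
    # taken from the configuration example later in this README.
    params = StdioServerParameters(
        command="uv",
        args=["run", "python", "-m", "lancedb_mcp", "--db-path", "~/.lancedb"],
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # "my_table" is a hypothetical table name for illustration.
            result = await session.read_resource(AnyUrl("table://my_table"))
            print(result)


asyncio.run(main())
```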
The server provides the following operations:

Create a new vector table:

```
POST /table
{
  "name": "my_table",      # Table name
  "dimension": 768         # Vector dimension
}
```
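
For a sense of what table creation involves underneath, here is a minimal sketch using the lancedb Python library directly. The storage path, table name, and field names are illustrative assumptions, not the server's actual implementation.

```python
import lancedb
import pyarrow as pa

# Storage path is an assumption for illustration.
db = lancedb.connect("~/.lancedb")

# A fixed-dimension vector column plus a text metadata column,
# mirroring the "dimension" parameter above.
schema = pa.schema(
    [
        pa.field("vector", pa.list_(pa.float32(), 768)),
        pa.field("text", pa.string()),
    ]
)
db.create_table("my_table", schema=schema)
```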
Add a vector to a table:

```
POST /table/{table_name}/vector
{
  "vector": [0.1, 0.2, ...],    # Vector data
  "text": "associated text"     # Metadata
}
```
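
Again as a rough illustration with the lancedb library itself (storage path and table name are assumptions), adding a row with its metadata looks like this:

```python
import lancedb

db = lancedb.connect("~/.lancedb")   # storage path is an assumption
tbl = db.open_table("my_table")      # table name is an assumption

# Each row carries the embedding plus its text metadata,
# matching the request body shown above.
tbl.add([{"vector": [0.1] * 768, "text": "associated text"}])
```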
Search for similar vectors:

```
POST /table/{table_name}/search
{
  "vector": [0.1, 0.2, ...],    # Query vector
  "limit": 10                   # Number of results
}
```
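
And a similarity search against the same hypothetical table, again sketched with the lancedb library directly:

```python
import lancedb

db = lancedb.connect("~/.lancedb")   # storage path is an assumption
tbl = db.open_table("my_table")      # table name is an assumption

# Nearest-neighbour search for the 10 rows closest to the query vector.
query = [0.1] * 768
for row in tbl.search(query).limit(10).to_list():
    print(row["text"], row["_distance"])
```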
## Clone the repository

```bash
git clone https://github.com/yourusername/lancedb_mcp.git
cd lancedb_mcp
```

## Install dependencies using uv

```bash
uv pip install -e .
```
## Add the server to your claude_desktop_config.json

```json
"mcpServers": {
  "lancedb": {
    "command": "uv",
    "args": [
      "run",
      "python",
      "-m",
      "lancedb_mcp",
      "--db-path",
      "~/.lancedb"
    ]
  }
}
```
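
To sanity-check the same command outside Claude Desktop, one option is a small script using the MCP Python client SDK that lists the tools the server advertises. This is a sketch, not part of the project, and reuses the arguments from the configuration above.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Same command and arguments as the claude_desktop_config.json entry above.
    params = StdioServerParameters(
        command="uv",
        args=["run", "python", "-m", "lancedb_mcp", "--db-path", "~/.lancedb"],
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())
```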
## Install development dependencies

```bash
uv pip install -e ".[dev]"
```

## Run tests

```bash
pytest
```
## Format code

```bash
black .
ruff check .
```
## Environment variables

`LANCEDB_URI`
: Path to LanceDB storage (default: ".lancedb")

## License

This project is licensed under the MIT License. See the LICENSE file for details.