A Model Context Protocol (MCP) server that provides an AI search tool to enhance AI model responses with real-time search results from various search engines through the Higress ai-search feature.
https://github.com/user-attachments/assets/60a06d99-a46c-40fc-b156-793e395542bb
https://github.com/user-attachments/assets/5c9e639f-c21c-4738-ad71-1a88cc0bcb46
The server can be configured using environment variables (a sketch of the request these variables drive is shown after the configuration examples below):

- HIGRESS_URL (optional): URL for the Higress service (default: http://localhost:8080/v1/chat/completions).
- MODEL (required): LLM model to use for generating responses.
- INTERNAL_KNOWLEDGE_BASES (optional): Description of internal knowledge bases.

Using uvx will automatically install the package from PyPI; there is no need to clone the repository locally.
{
"mcpServers": {
"higress-ai-search-mcp-server": {
"command": "uvx",
"args": [
"higress-ai-search-mcp-server"
],
"env": {
"HIGRESS_URL": "http://localhost:8080/v1/chat/completions",
"MODEL": "qwen-turbo",
"INTERNAL_KNOWLEDGE_BASES": "Employee handbook, company policies, internal process documents"
}
}
}
}
Using uv requires cloning the repository locally and specifying the path to the source code.
{
"mcpServers": {
"higress-ai-search-mcp-server": {
"command": "uv",
"args": [
"--directory",
"path/to/src/higress-ai-search-mcp-server",
"run",
"higress-ai-search-mcp-server"
],
"env": {
"HIGRESS_URL": "http://localhost:8080/v1/chat/completions",
"MODEL": "qwen-turbo",
"INTERNAL_KNOWLEDGE_BASES": "Employee handbook, company policies, internal process documents"
}
}
}
}
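
For illustration, the path /v1/chat/completions suggests Higress exposes an OpenAI-compatible chat-completions endpoint. The sketch below shows how the three environment variables might be combined into such a request; the use of httpx, the exact payload shape, and the way INTERNAL_KNOWLEDGE_BASES is surfaced to the model are assumptions for this example, not the server's actual implementation.

```python
# Illustrative sketch only: how the configured environment variables might be
# turned into an OpenAI-compatible chat-completions request to Higress.
import os

import httpx  # assumed HTTP client for this sketch

HIGRESS_URL = os.getenv("HIGRESS_URL", "http://localhost:8080/v1/chat/completions")
MODEL = os.environ["MODEL"]  # required, e.g. "qwen-turbo"
INTERNAL_KNOWLEDGE_BASES = os.getenv("INTERNAL_KNOWLEDGE_BASES", "")


def ai_search(query: str) -> str:
    """Send a query through Higress, which augments the response with search results."""
    messages = []
    if INTERNAL_KNOWLEDGE_BASES:
        # Assumption: the internal knowledge base description is passed to the
        # model as context so it knows which internal sources are available.
        messages.append({
            "role": "system",
            "content": f"Internal knowledge bases available: {INTERNAL_KNOWLEDGE_BASES}",
        })
    messages.append({"role": "user", "content": query})

    response = httpx.post(
        HIGRESS_URL,
        json={"model": MODEL, "messages": messages},
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(ai_search("What are the latest developments in large language models?"))
```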
This project is licensed under the MIT License - see the LICENSE file for details.