Mcp Client

Created by rakesh-eltropy • 6 months ago

Developer Tools

Tags: MCP, API-client, CLI

MCP REST API and CLI Client

A simple REST API and CLI client to interact with Model Context Protocol (MCP) servers.

Key Features

1. MCP-Compatible Servers

  • Supports any MCP-compatible server.
  • Pre-configured default servers:
    • SQLite (test.db has been provided with sample products data)
    • Brave Search
  • Additional MCP servers can be added in the mcp-server-config.json file

2. Integrated with LangChain

  • Leverages LangChain to execute LLM prompts.
  • Enables multiple MCP servers to collaborate and respond to a specific query simultaneously.

3. LLM Provider Support

  • Compatible with any LLM provider that supports APIs with function capabilities.
  • Examples:
    • OpenAI
    • Claude
    • Gemini
    • AWS Nova
    • Groq
    • Ollama
    • Essentially any LLM provider is supported, as long as it exposes a function-calling API. Please refer to the LangChain documentation for more details.
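As a concrete illustration of what a "function-based API" means here, the sketch below builds a tool definition in the OpenAI-style tools format. The `get_weather` tool and the `make_tool_schema` helper are hypothetical examples for illustration, not part of this project:

```python
def make_tool_schema(name: str, description: str, parameters: dict) -> dict:
    """Wrap a JSON-Schema parameter spec in an OpenAI-style tool envelope.

    Providers with function-calling APIs accept definitions of this shape,
    which is what lets an LLM decide to call an MCP tool with structured
    arguments instead of free text.
    """
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": parameters,
        },
    }

# Hypothetical example tool: ask for the weather in a given city.
weather_tool = make_tool_schema(
    "get_weather",
    "Return the current weather for a city.",
    {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
)
```

LangChain normalizes this envelope across providers, which is why the client can swap OpenAI, Claude, Ollama, and others without changing tool definitions.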

Setup

  1. Clone the repository:

    git clone https://github.com/rakesh-eltropy/mcp-client.git
    
  2. Navigate to the project directory:

    cd mcp-client
    
  3. Set the OPENAI_API_KEY environment variable:

    export OPENAI_API_KEY=your-openai-api-key
    

    You can also set the OPENAI_API_KEY in the mcp-server-config.json file.

    You can also set the provider and model in the mcp-server-config.json file, e.g. provider ollama with model llama3.2:3b.

  4. Set the BRAVE_API_KEY environment variable:

    export BRAVE_API_KEY=your-brave-api-key
    

    You can also set the BRAVE_API_KEY in the mcp-server-config.json file. You can get a free BRAVE_API_KEY from the Brave Search API.
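Taken together, the API keys and the provider/model settings above live in mcp-server-config.json. The exact schema is defined by this repository, so the fragment below is only a hypothetical sketch (key names such as `llm` and `mcpServers`, and the launch commands for the default SQLite and Brave Search servers, are assumptions, not verified against the repo):

```json
{
  "llm": {
    "provider": "ollama",
    "model": "llama3.2:3b"
  },
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "test.db"]
    },
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": { "BRAVE_API_KEY": "your-brave-api-key" }
    }
  }
}
```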

  5. Running from the CLI:

    uv run cli.py
    

    To explore the available commands, use the help option. You can chat with the LLM using the chat command. Sample prompts:

      What is the capital city of India?
    
      Search the most expensive product from database and find more details about it from amazon?
    
  6. Running from the REST API:

    uvicorn app:app --reload
    

    You can use the following curl command to chat with the LLM:

    curl -X POST -H "Content-Type: application/json" -d '{"message": "list all the products from my local database?"}' http://localhost:8000/chat
    

    You can use the following curl command to chat with the LLM with streaming enabled:

    curl -X POST -H "Content-Type: application/json" -d '{"message": "list all the products from my local database?", "streaming": true}' http://localhost:8000/chat
    
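For programmatic access, the same request body can be built in Python. This is a minimal sketch based solely on the curl examples above; `build_chat_payload` is a hypothetical helper, and posting it assumes the server is running locally on port 8000:

```python
import json

def build_chat_payload(message: str, streaming: bool = False) -> str:
    """Build the JSON body the /chat endpoint expects, per the curl examples.

    The endpoint takes a "message" string and an optional "streaming" flag.
    """
    payload = {"message": message}
    if streaming:
        payload["streaming"] = True
    return json.dumps(payload)

# POST this body to http://localhost:8000/chat with
# Content-Type: application/json (e.g. via requests.post).
body = build_chat_payload(
    "list all the products from my local database?", streaming=True
)
```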

Contributing

Feel free to submit issues and pull requests for improvements or bug fixes.

Prerequisites

  • Familiarity with the server's domain
  • Basic understanding of the related technologies
  • Knowledge of Developer Tools


Details

Created: June 11, 2025

Last updated: June 11, 2025

Category: Developer Tools

Author: rakesh-eltropy