Compresto MCP server
A Model Context Protocol (MCP) server for Compresto, giving AI assistants real-time access to Compresto's usage statistics.
Compresto is a file compression app that helps users reduce file sizes. This MCP server allows AI assistants to access current statistics about Compresto’s usage.
The Model Context Protocol (MCP) is a standard that connects AI systems with external tools and data sources. This MCP server extends AI capabilities by providing access to Compresto’s usage statistics.
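For illustration, MCP clients invoke a server's tools over JSON-RPC 2.0. A `tools/call` exchange with this server might look roughly like the following (the tool name `get_total_users` is hypothetical, shown only to sketch the wire format):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_total_users",
    "arguments": {}
  }
}
```

and the server would reply with a text content block:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "content": [{ "type": "text", "text": "12345" }]
  }
}
```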
```shell
git clone https://github.com/dqhieu/compresto-mcp
cd compresto-mcp
npm install
npm run build
```
Add the following to your MCP settings file:

```json
{
  "mcpServers": {
    "compresto": {
      "command": "node",
      "args": [
        "/ABSOLUTE/PATH/TO/PARENT/FOLDER/compresto-mcp/build/index.js"
      ]
    }
  }
}
```
When integrated with compatible AI assistants, this MCP server provides real-time data about Compresto’s usage.
The Compresto MCP server provides the following tools:
- Returns the total number of Compresto users. Example response: `12345`
- Returns the total number of files processed by Compresto. Example response: `Processed 67890 files`
- Returns the total amount of file size reduced by Compresto. Example response: `Reduced 1234567890 bytes`
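As a rough, dependency-free sketch of how a server like this might map tool names to handlers (the real implementation in src/index.ts uses the MCP SDK and fetches live data; the tool names below are hypothetical and the return values are stubbed from the example responses above):

```typescript
// Hypothetical tool registry; names and stub values are illustrative only.
type ToolHandler = () => Promise<string>;

const tools: Record<string, ToolHandler> = {
  // The real server would fetch these numbers from Compresto's backend.
  get_total_users: async () => "12345",
  get_total_files_processed: async () => "Processed 67890 files",
  get_total_size_reduced: async () => "Reduced 1234567890 bytes",
};

// Dispatch a tool call by name, as an MCP server does for tools/call requests.
async function callTool(name: string): Promise<string> {
  const handler = tools[name];
  if (!handler) throw new Error(`Unknown tool: ${name}`);
  return handler();
}

callTool("get_total_files_processed").then(console.log); // → "Processed 67890 files"
```

In the actual server, each handler's string would be wrapped in an MCP text content block before being returned to the client.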
Project structure:

- src/index.ts - Main entry point containing the MCP server implementation
- package.json - Project dependencies and scripts
- tsconfig.json - TypeScript configuration

MIT License