MCP server to download entire websites
This MCP server provides a tool to download entire websites using wget. It preserves the website structure and converts links so they work locally.
The server requires wget to be installed on your system.

Using Homebrew (macOS):

```
brew install wget
```

Using apt (Debian/Ubuntu):

```
sudo apt-get update
sudo apt-get install wget
```

Using dnf (Fedora):

```
sudo dnf install wget
```

Using Chocolatey (Windows):

```
choco install wget
```
The server provides a tool called `download_website` with the following parameters:

- `url` (required): The URL of the website to download.
- `outputPath` (optional): The directory where the website should be downloaded. Defaults to the current directory.
- `depth` (optional): Maximum depth level for recursive downloading. Defaults to infinite. Set to 0 for just the specified page, 1 for direct links, and so on.

Example arguments:

```
{
  "url": "https://example.com",
  "outputPath": "/path/to/output",
  "depth": 2 // Optional: download up to 2 levels deep
}
```
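For context, here is a minimal sketch of how these parameters could translate into a wget invocation. The actual command the server builds is not documented here, so the helper `buildWgetArgs` and the flag mapping (`--level` for depth, `-P` for the output path, `--convert-links` and `--page-requisites` for local browsing) are assumptions based on standard wget options.

```typescript
import { spawn } from "node:child_process";

interface DownloadArgs {
  url: string;
  outputPath?: string; // defaults to the current directory
  depth?: number;      // undefined means unlimited recursion
}

// Hypothetical helper: build a wget argument list from the tool parameters.
function buildWgetArgs({ url, outputPath = ".", depth }: DownloadArgs): string[] {
  const args = [
    "--convert-links",   // rewrite links so the local copy is browsable offline
    "--page-requisites", // also fetch CSS, images, and other page assets
    "-P", outputPath,    // destination directory
  ];
  if (depth !== 0) {
    args.push("--recursive", "--no-parent");
    // wget's --level controls recursion depth; "inf" means unlimited.
    args.push("--level", depth === undefined ? "inf" : String(depth));
  }
  args.push(url);
  return args;
}

// Example: mirror example.com two levels deep, as in the JSON arguments above.
const child = spawn("wget", buildWgetArgs({ url: "https://example.com", depth: 2 }), {
  stdio: "inherit",
});
child.on("exit", (code) => console.log(`wget exited with code ${code}`));
```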
To install dependencies and build the website downloader server:

```
npm install
npm run build
```
Then add the server to your MCP settings configuration:

```json
{
  "mcpServers": {
    "website-downloader": {
      "command": "node",
      "args": ["/path/to/website-downloader/build/index.js"]
    }
  }
}
```
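An MCP host (such as Claude Desktop) normally launches and calls the server for you, but for testing it can also be driven directly. Below is a minimal client sketch assuming the `@modelcontextprotocol/sdk` TypeScript package; import paths and method names may differ between SDK versions, and the server path is the placeholder from the configuration above.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the server over stdio, mirroring the "command"/"args" entry above.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["/path/to/website-downloader/build/index.js"],
  });

  const client = new Client(
    { name: "example-client", version: "1.0.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // Invoke the download_website tool with the parameters described earlier.
  const result = await client.callTool({
    name: "download_website",
    arguments: { url: "https://example.com", outputPath: "./mirror", depth: 1 },
  });
  console.log(result);

  await client.close();
}

main().catch(console.error);
```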
GitLab MCP
An open-source library enabling AI models to control hardware devices over serial communication using MCP. Initial support for Raspberry Pi Pico.
MCP server for RAG-based document search and management
An MCP implementation for Selenium WebDriver