# n8n MCP Server

A Model Context Protocol (MCP) server that provides tools and resources for interacting with the n8n API. It acts as a bridge between AI assistants and n8n, a popular workflow automation tool, allowing assistants to manage and control n8n workflows and executions through natural language.
## Installation

### From npm

```bash
npm install -g @leonardsellem/n8n-mcp-server
```

### From source

```bash
# Clone the repository
git clone https://github.com/leonardsellem/n8n-mcp-server.git
cd n8n-mcp-server

# Install dependencies
npm install

# Build the project
npm run build

# Optional: install globally
npm install -g .
```
### Docker Installation

You can also run the server using Docker:

```bash
# Pull the image
docker pull leonardsellem/n8n-mcp-server

# Run the container with your n8n API configuration
docker run -e N8N_API_URL=http://your-n8n:5678/api/v1 \
  -e N8N_API_KEY=your_n8n_api_key \
  -e N8N_WEBHOOK_USERNAME=username \
  -e N8N_WEBHOOK_PASSWORD=password \
  leonardsellem/n8n-mcp-server
```
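If you prefer Docker Compose, the same configuration can be expressed as a compose file. This is a hypothetical sketch, not a file shipped with the project; substitute your own URL and credentials:

```yaml
# docker-compose.yml — illustrative example, not included in the repository
services:
  n8n-mcp-server:
    image: leonardsellem/n8n-mcp-server:latest
    environment:
      N8N_API_URL: http://your-n8n:5678/api/v1
      N8N_API_KEY: your_n8n_api_key
      N8N_WEBHOOK_USERNAME: username
      N8N_WEBHOOK_PASSWORD: password
    restart: unless-stopped
```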
## Updating

How you update the server depends on how you initially installed it.

### If installed via npm

```bash
npm install -g @leonardsellem/n8n-mcp-server@latest
```

### If installed from source

1. Navigate to your local clone:

   ```bash
   cd path/to/n8n-mcp-server
   ```

2. Stash any local changes (you can apply them later with `git stash pop`):

   ```bash
   git stash
   ```

3. Pull the latest changes (assuming you are on the `main` branch; if you are on a different branch, replace `main` with your branch name):

   ```bash
   git pull origin main
   ```

4. Reinstall dependencies and rebuild:

   ```bash
   npm install
   npm run build
   ```

5. If you previously installed globally with `npm install -g .`, you might want to run this command again to update the global link:

   ```bash
   npm install -g .
   ```

If you use `node build/index.js` in your AI assistant's MCP configuration, ensure the path is still correct. Using `npm install -g .` and then `n8n-mcp-server` as the command should keep this consistent.

### If running via Docker
1. Pull the latest image:

   ```bash
   docker pull leonardsellem/n8n-mcp-server:latest
   ```

2. Find your running container with `docker ps`, then stop and remove it:

   ```bash
   docker stop <your_container_name_or_id>
   docker rm <your_container_name_or_id>
   ```

3. Start a new container with the same `docker run` command you used previously, including all your necessary environment variables (refer to the "Docker Installation" section for an example command). Ensure you use `:latest` or the specific version tag you intend to run. For instance:

   ```bash
   docker run -e N8N_API_URL=http://your-n8n:5678/api/v1 \
     -e N8N_API_KEY=your_n8n_api_key \
     -e N8N_WEBHOOK_USERNAME=username \
     -e N8N_WEBHOOK_PASSWORD=password \
     leonardsellem/n8n-mcp-server:latest
   ```

## Configuration

Create a `.env` file in the directory where you'll run the server, using `.env.example` as a template:

```bash
cp .env.example .env
```
Configure the following environment variables:

| Variable | Description | Example |
|---|---|---|
| `N8N_API_URL` | Full URL of the n8n API, including `/api/v1` | `http://localhost:5678/api/v1` |
| `N8N_API_KEY` | API key for authenticating with n8n | `n8n_api_...` |
| `N8N_WEBHOOK_USERNAME` | Username for webhook authentication (if using webhooks) | `username` |
| `N8N_WEBHOOK_PASSWORD` | Password for webhook authentication | `password` |
| `DEBUG` | Enable debug logging (optional) | `true` or `false` |
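A filled-in `.env` might look like the following; all values here are placeholders to replace with your own:

```bash
# .env — example values only
N8N_API_URL=http://localhost:5678/api/v1
N8N_API_KEY=n8n_api_your_key_here
N8N_WEBHOOK_USERNAME=username
N8N_WEBHOOK_PASSWORD=password
DEBUG=false
```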
## Usage

After configuring your `.env` file, run the server from the installation directory:

```bash
node build/index.js
```

Or if installed globally:

```bash
n8n-mcp-server
```

## Integrating with AI Assistants

After building the server (`npm run build`), you need to configure your AI assistant (such as VS Code with the Claude extension or the Claude Desktop app) to run it. This typically involves editing a JSON configuration file.

Example configuration (e.g., in VS Code `settings.json` or Claude Desktop `claude_desktop_config.json`):
```jsonc
{
  "mcpServers": {
    // Give your server a unique name
    "n8n-local": {
      // Use 'node' to execute the built JavaScript file
      "command": "node",
      // Provide the *absolute path* to the built index.js file
      "args": [
        "/path/to/your/cloned/n8n-mcp-server/build/index.js"
        // On Windows, use double backslashes:
        // "C:\\path\\to\\your\\cloned\\n8n-mcp-server\\build\\index.js"
      ],
      // Environment variables needed by the server
      "env": {
        "N8N_API_URL": "http://your-n8n-instance:5678/api/v1", // Replace with your n8n URL
        "N8N_API_KEY": "YOUR_N8N_API_KEY" // Replace with your key
        // Add webhook credentials only if you plan to use webhook tools
        // "N8N_WEBHOOK_USERNAME": "your_webhook_user",
        // "N8N_WEBHOOK_PASSWORD": "your_webhook_password"
      },
      // Ensure the server is enabled
      "disabled": false,
      // Default autoApprove settings
      "autoApprove": []
    }
    // ... other servers might be configured here
  }
}
```
Key points:

- Replace `/path/to/your/cloned/n8n-mcp-server/` with the actual absolute path where you cloned and built the repository.
- Use the correct path separators for your OS: forward slashes `/` for macOS/Linux, double backslashes `\\` for Windows.
- Set `N8N_API_URL` (including `/api/v1`) and `N8N_API_KEY` to match your n8n instance.
- The server must be built (`npm run build`) before the assistant can run the `build/index.js` file.

## Available Tools

The server provides the following tools:
### Using Webhooks

This MCP server supports executing workflows through n8n webhooks. To use this functionality, call the `run_webhook` tool to trigger the workflow, passing just the workflow name.

Example:

```javascript
const result = await useRunWebhook({
  workflowName: "hello-world", // Will call <n8n-url>/webhook/hello-world
  data: {
    prompt: "Hello from AI assistant!"
  }
});
```

Webhook authentication is handled automatically using the `N8N_WEBHOOK_USERNAME` and `N8N_WEBHOOK_PASSWORD` environment variables.
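Under the hood, a webhook call of this kind amounts to an authenticated HTTP POST to the workflow's webhook URL. The sketch below illustrates how such a request can be assembled; `buildWebhookRequest` is a hypothetical helper for illustration, not the server's actual code:

```javascript
// Build the HTTP request a webhook trigger would send (illustrative sketch).
function buildWebhookRequest(baseUrl, workflowName, data, username, password) {
  // n8n exposes webhook-triggered workflows at <base-url>/webhook/<path>
  const url = `${baseUrl.replace(/\/$/, "")}/webhook/${workflowName}`;
  // Basic auth credentials come from N8N_WEBHOOK_USERNAME / N8N_WEBHOOK_PASSWORD
  const auth = Buffer.from(`${username}:${password}`).toString("base64");
  return {
    url,
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Basic ${auth}`
    },
    body: JSON.stringify(data)
  };
}

const req = buildWebhookRequest(
  "http://localhost:5678",
  "hello-world",
  { prompt: "Hello from AI assistant!" },
  "username",
  "password"
);
console.log(req.url); // http://localhost:5678/webhook/hello-world
```

The resulting object could be passed directly to `fetch(req.url, req)` in Node 18+.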
- `workflow_list`: List all workflows
- `workflow_get`: Get details of a specific workflow
- `workflow_create`: Create a new workflow
- `workflow_update`: Update an existing workflow
- `workflow_delete`: Delete a workflow
- `workflow_activate`: Activate a workflow
- `workflow_deactivate`: Deactivate a workflow
- `execution_run`: Execute a workflow via the API
- `run_webhook`: Execute a workflow via a webhook
- `execution_get`: Get details of a specific execution
- `execution_list`: List executions for a workflow
- `execution_stop`: Stop a running execution

## Resources

The server provides the following resources:
- `n8n://workflows/list`: List of all workflows
- `n8n://workflow/{id}`: Details of a specific workflow
- `n8n://executions/{workflowId}`: List of executions for a workflow
- `n8n://execution/{id}`: Details of a specific execution

## Roadmap

The n8n MCP Server is a community-driven project, and its future direction will be shaped by your feedback and contributions!
Currently, our roadmap is flexible and under continuous development. We believe in evolving the server based on the needs and ideas of our users.
We encourage you to get involved in shaping the future of this tool:
Please share your thoughts, feature requests, and ideas by opening an issue on our GitHub Issues page. Let’s build a powerful tool for AI assistants together!
## Development

```bash
# Build the project
npm run build

# Run in development mode
npm run dev

# Run tests
npm test

# Lint the code
npm run lint
```
We welcome contributions from the community and are excited to see how you can help improve the n8n MCP Server! Whether you’re fixing a bug, proposing a new feature, or improving documentation, your help is valued.
If you encounter a bug, please report it by opening an issue on our GitHub Issues page.
When submitting a bug report, please include the following:
We’re always looking for ways to make the server better. If you have an idea for an enhancement or a new feature, please open an issue on our GitHub Issues page.
Please provide:
If you’d like to contribute code, please follow these steps:
1. Fork the repository and create a new branch (e.g., `git checkout -b feature/your-feature-name` or `bugfix/issue-number`).
2. Make your changes, and ensure the code passes linting (`npm run lint`).
3. Verify that all tests pass with `npm test`.
4. Open a pull request against the `main` branch of the official `n8n-mcp-server` repository.
We’ll review your PR as soon as possible and provide feedback. Thank you for your contribution!
This project is a vibrant, community-driven tool actively used by AI enthusiasts and developers. Currently, it’s maintained on a part-time basis by a passionate individual who isn’t a seasoned engineer but is dedicated to bridging AI with workflow automation. To help this project flourish, ensure its long-term health, and keep up with its growing user base, we’re looking for enthusiastic co-maintainers to join the team!
We welcome contributions in many forms! Here are some areas where you could make a big difference:
If you’re excited about the intersection of AI and workflow automation, and you’re looking for a rewarding open-source opportunity, we’d love to hear from you!
Ready to contribute?
Let’s build the future of AI-powered workflow automation together! 🙌