A task management system designed for AI development
Transform requirements into actionable tasks, generate implementation plans, track progress, and accelerate development – all powered by AI, directly within your workflow.
Conductor Tasks is an intelligent assistant designed for developers. It integrates seamlessly into your editor (via MCP) or works as a standalone CLI tool, leveraging multiple LLMs to streamline your development process from planning to execution.
While many AI-powered task management tools offer valuable assistance, Conductor Tasks is engineered to provide a more comprehensive, flexible, and deeply integrated AI development assistant. Here’s how Conductor Tasks stands out:
True Multi-LLM Architecture for Optimal Results & Cost: Conductor Tasks is built with a foundational multi-LLM strategy, not just as an add-on. It seamlessly integrates with a broad spectrum of providers (OpenAI, Anthropic, Groq, Mistral, Google Gemini, Perplexity, xAI, Azure OpenAI, OpenRouter) and local/custom endpoints (e.g., Ollama, LM Studio). This empowers you to pick the right model for each job, keep costs under control, and avoid locking yourself into a single vendor.
Unlike systems that may rely heavily on a single primary LLM or offer limited provider choices, Conductor Tasks offers genuine flexibility and strategic LLM utilization at its core.
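As a rough illustration of that flexibility, a mixed setup might keep a hosted provider as the default while routing the OpenAI-compatible provider to a local Ollama server. The variable names below come from the Configuration section later in this document; the keys, endpoint, and model name are placeholders:

```bash
# Keys for the hosted providers you intend to use (placeholder values)
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="ant-..."

# Default provider when several keys are present
export DEFAULT_LLM_PROVIDER="anthropic"

# Optionally point the OpenAI-compatible provider at a local endpoint (e.g., Ollama)
export OPENAI_BASE_URL="http://localhost:11434/v1"
export OPENAI_MODEL="llama3"   # placeholder model name
```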
Advanced AI-Driven Development & Task Lifecycle Management: Beyond basic PRD parsing, Conductor Tasks offers a richer suite of AI tools that assist throughout the development lifecycle:
- `generate-implementation-steps` and `expand-task` provide detailed, actionable plans.
- `suggest-task-improvements` to iteratively refine task definitions and scope.
- The `research-topic` command allows the AI to gather information directly related to a task, embedding knowledge gathering into your workflow.
- `generate-diff` helps in visualizing and creating code changes.

This provides a more in-depth AI partnership from planning through to aspects of implementation, exceeding the scope of simpler task generation tools (see the CLI sketch below).
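As a concrete sketch, the same lifecycle can be driven from the CLI using only commands that appear in the Quick Start later in this README; the task ID is a placeholder:

```bash
# Pick the next suggested task, plan it in detail, then mark it as started
conductor-tasks next
conductor-tasks generate-steps --id <TASK_ID>
conductor-tasks update --id <TASK_ID> --status in_progress
```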
Built-in Visual Project Oversight: Gain clearer insights into your project’s status and structure with:
- `visualize-tasks-kanban` for a familiar agile overview.
- `visualize-tasks-dependency-tree` to understand task relationships.
- `visualize-tasks-dashboard` for a high-level statistical view.

Many task systems require external tools for such visualizations; Conductor Tasks integrates them.
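From the CLI, the kanban and dependency-tree views are exposed through the `visualize` command (shown again in the Quick Start). Whether the dashboard view takes an analogous flag is an assumption here, so verify it against the command's help:

```bash
# Built-in views (documented in the Quick Start below)
conductor-tasks visualize --kanban
conductor-tasks visualize --dependency-tree

# Assumed flag for the dashboard view; verify with `conductor-tasks visualize --help`
conductor-tasks visualize --dashboard
```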
Versatile Task Templating Engine: Standardize common project setups and repetitive task structures with:
- `list-task-templates`, `get-task-template`, and `create-task-from-template`.

This feature promotes reusability and efficiency, often not found in less comprehensive task systems.
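These template tools are exposed to the AI assistant over MCP. A hypothetical CLI-style session is sketched below; the subcommand names mirror the documented tool names, but the flags, template name, and task title are assumptions (check `conductor-tasks --help` for the real interface):

```bash
# Hypothetical invocations: tool names come from the docs, flags and values are assumed
conductor-tasks list-task-templates
conductor-tasks get-task-template --name "feature-spike"
conductor-tasks create-task-from-template --name "feature-spike" --title "Evaluate caching layer"
```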
In essence, Conductor Tasks aims to be a more powerful, adaptable, and economically sensible AI co-pilot for the entire development process.
To use Conductor Tasks via MCP, add it to your editor's MCP server configuration file (e.g., `mcp.json`, `settings.json`):

```jsonc
{
  "mcpServers": {
    "conductor-tasks": {
      "command": "npx",
      // Ensure conductor-tasks is installed or use the correct path
      "args": ["conductor-tasks", "--serve-mcp"],
      // Set API keys and preferences via environment variables
      "env": {
        "OPENAI_API_KEY": "YOUR_OPENAI_KEY_HERE",
        "ANTHROPIC_API_KEY": "YOUR_ANTHROPIC_KEY_HERE",
        "GOOGLE_API_KEY": "YOUR_GOOGLE_KEY_HERE",
        // Add other keys (MISTRAL, GROQ, PERPLEXITY, OPENROUTER, XAI, AZURE) as needed
        "DEFAULT_LLM_PROVIDER": "openai" // Or your preferred default
      }
    }
  }
}
```
"Initialize conductor-tasks for my project."
"Parse the PRD at 'docs/requirements.md' into tasks."
"What's the next task I should work on?"
"Help me implement task <ID>."
"Generate implementation steps for task <ID>."
"Show me the tasks as a kanban board."
To use Conductor Tasks as a standalone CLI tool:

```bash
# Install globally (recommended for CLI use)
npm install -g conductor-tasks

# Or use npx without installing globally
# npx conductor-tasks <command>
```
Set your LLM provider API keys before running commands: create a `.env` file in your project or export variables (e.g., `export OPENAI_API_KEY="sk-..."`). See Configuration below.

```bash
# Initialize Conductor Tasks in a new or existing project
conductor-tasks init --projectName "My Awesome App" --projectDescription "Building the future"

# Parse a PRD file and create/update TASKS.md
conductor-tasks parse-prd ./path/to/your/prd.md --createTasksFile

# List all tasks
conductor-tasks list

# Get the next suggested task
conductor-tasks next

# Get details for a specific task
conductor-tasks get --id <TASK_ID>

# Update a task (e.g., set status to 'in_progress')
conductor-tasks update --id <TASK_ID> --status in_progress

# Generate detailed implementation steps for a task
conductor-tasks generate-steps --id <TASK_ID>

# Visualize tasks
conductor-tasks visualize --kanban
conductor-tasks visualize --dependency-tree
```
For more detailed information, check out the documentation in the `docs` directory or explore the CLI help (`conductor-tasks --help` or `conductor-tasks <command> --help`).
Conductor Tasks uses environment variables for configuration, typically loaded from a `.env` file in your project root or set via MCP. For a detailed guide on MCP-specific environment variables and editor integration, see the [MCP Configuration Guide](docs/mcp-setup.md).
Required:

- An API key for at least one LLM provider (e.g., `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `GOOGLE_API_KEY`, `MISTRAL_API_KEY`, `GROQ_API_KEY`, `PERPLEXITY_API_KEY`, `OPENROUTER_API_KEY`, `XAI_API_KEY`, `AZURE_OPENAI_API_KEY`).

Optional:

- `DEFAULT_LLM_PROVIDER`: (e.g., `openai`, `anthropic`, `google`) sets the default provider if multiple keys are present.
- `OPENAI_MODEL`, `ANTHROPIC_MODEL`, etc.: specify default models for each provider.
- `OPENAI_BASE_URL`: use a custom OpenAI-compatible endpoint (e.g., for Ollama, LM Studio).
- `LOG_LEVEL`: (e.g., `info`, `debug`) controls logging verbosity.

Example `.env` file:
```env
# Required Keys (add all you intend to use)
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=ant-...
GOOGLE_API_KEY=AIza...

# Optional Defaults & Customization
DEFAULT_LLM_PROVIDER=openai
OPENAI_MODEL=gpt-4o
ANTHROPIC_MODEL=claude-3-opus-20240229
# OPENAI_BASE_URL=http://localhost:11434/v1 # Example for local Ollama
LOG_LEVEL=info
```
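Because configuration is read from environment variables, you can also override a setting for a single run. This sketch assumes variables already set in the shell take precedence over the `.env` file, and the task ID is a placeholder:

```bash
# One-off run against a different default provider (requires that provider's API key to be set)
DEFAULT_LLM_PROVIDER=anthropic conductor-tasks generate-steps --id <TASK_ID>
```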
Contributions, issues, and feature requests are welcome!
This project is licensed under the MIT License. See the LICENSE file for details.