Local-first AI orchestrator and CLI that spins up multi-agent LLMs, FastAPI services, and a message bus with Docker. Offline by default, cloud optional.
Updated Sep 7, 2025 - Python
🔬 Parallel AI research orchestration tool - Deploy multiple cursor-agent workers simultaneously to conduct comprehensive research with specialized focus areas and intelligent synthesis.
🧠 Orchestrate parallel AI research with Cursor CLI Heavy to synthesize diverse insights into comprehensive reports on any topic.
🧠 Conduct parallel research with multiple AI assistants for comprehensive analysis of any topic, synthesizing findings into unified reports.
A lightweight, Spring Boot–friendly plugin framework to orchestrate complex AI workflows — integrating multiple LLMs, tools, and [MCP (Model Context Protocol)](https://modelcontextprotocol.io/) services with zero boilerplate.