Easily generate a Model Context Protocol (MCP) server from any OpenAPI specification!
This tool helps you bootstrap new MCP servers for any API with an OpenAPI spec.
- ⚡ Automatic MCP server generation from OpenAPI specs
- 📝 Supports JSON and YAML formats
- 🔍 Auto-detects spec file type (`.json`, `.yaml`, `.yml`)
- 🛠️ Tool modules for each API endpoint
- 🤖 API client code generation
- 📋 Logging & error handling setup
- ⚙️ Configuration files (`pyproject.toml`, `.env.example`)
- 📚 Comprehensive documentation generation
- 🤖 Enhanced docstrings via LLM integration
- 🚀 `--generate-agent` flag – additionally produces a LangGraph React agent (with A2A server, Makefile, README, and `.env.example`) alongside the generated MCP server
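The spec file-type auto-detection above can be sketched roughly as follows (a minimal illustration based on the file extension, not the tool's actual implementation):

```python
from pathlib import Path

def detect_spec_format(spec_file: str) -> str:
    """Guess the OpenAPI spec format from the file extension."""
    suffix = Path(spec_file).suffix.lower()
    if suffix == ".json":
        return "json"
    if suffix in (".yaml", ".yml"):
        return "yaml"
    raise ValueError(f"Unsupported spec file type: {suffix}")

print(detect_spec_format("openapi_petstore.json"))  # json
print(detect_spec_format("openapi_petstore.yaml"))  # yaml
```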
Run the latest version from the `main` branch:

```shell
pipx run --spec git+https://github.com/cnoe-io/openapi-mcp-codegen.git@main openapi_mcp_codegen \
  --spec-file examples/petstore/openapi_petstore.json \
  --output-dir examples/petstore/mcp_server \
  --generate-agent
```

Or pin a tagged release:

```shell
pipx run --spec git+https://github.com/cnoe-io/openapi-mcp-codegen.git@v0.1.0 openapi_mcp_codegen \
  --spec-file examples/petstore/openapi_petstore.json \
  --output-dir examples/petstore/mcp_server \
  --generate-agent
```
Use the `--enhance-docstring-with-llm` flag to improve generated docstrings with an LLM. This option uses your LLM provider's configuration from environment variables. To set up your LLM provider, refer to this guide.
```shell
pipx run --spec git+https://github.com/cnoe-io/openapi-mcp-codegen.git@main openapi_mcp_codegen \
  --spec-file examples/petstore/openapi_petstore.json \
  --output-dir examples/petstore/mcp_server \
  --generate-agent \
  --enhance-docstring-with-llm  # Optional: enhances docstrings using LLM (see guide)
```
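As a hypothetical illustration of the environment-variable setup, assuming an OpenAI-style provider (the actual variable names come from your LLM provider configuration, per the guide referenced above):

```shell
# Hypothetical provider configuration — variable names are assumptions,
# not verified against the tool; consult your provider setup guide.
export LLM_PROVIDER=openai       # which LLM backend to use (assumed name)
export OPENAI_API_KEY=sk-...     # your provider API key (placeholder value)
```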
Prefer to run locally or contribute?

```shell
git clone https://github.com/cnoe-io/openapi-mcp-codegen
cd openapi-mcp-codegen
poetry install
```
- The generator will create code in a new directory called `mcp_<server-name>` in the directory you specify.
- Follow the setup instructions printed by the generator.
Option 1:

```shell
make generate -- --spec-file examples/petstore/openapi_petstore.json --output-dir examples/petstore/mcp_server
```

Option 2:

```shell
poetry run openapi_mcp_codegen --spec-file examples/petstore/openapi_petstore.json --output-dir examples/petstore/mcp_server
```
```
mcp_petstore/
├── mcp_petstore/
│   ├── __init__.py
│   ├── api/
│   │   ├── __init__.py
│   │   └── client.py
│   ├── models/
│   │   ├── __init__.py
│   │   └── [models].py
│   ├── server.py
│   ├── tools/
│   │   ├── __init__.py
│   │   └── [tools].py
│   └── utils/
│       └── __init__.py
├── poetry.lock
├── pyproject.toml
└── README.md
```
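Since each API endpoint becomes a tool module, the mapping from spec to tool names can be illustrated with a small sketch (the spec snippet and fallback naming here are assumptions for illustration, not the generator's actual logic):

```python
import json

# A tiny OpenAPI fragment, in the petstore style, used only for this sketch.
SPEC = json.loads("""
{
  "paths": {
    "/pets": {
      "get": {"operationId": "listPets"},
      "post": {"operationId": "createPet"}
    },
    "/pets/{petId}": {
      "get": {"operationId": "getPetById"}
    }
  }
}
""")

def tool_names(spec: dict) -> list[str]:
    """Collect one tool name per (path, method) operation in the spec."""
    names = []
    for path, operations in spec.get("paths", {}).items():
        for method, op in operations.items():
            # Prefer the spec's operationId; fall back to a method_path name.
            fallback = f"{method}_{path.strip('/').replace('/', '_')}"
            names.append(op.get("operationId") or fallback)
    return names

print(tool_names(SPEC))  # ['listPets', 'createPet', 'getPetById']
```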
When you run the generator with `--generate-agent`, the output directory also contains:

- `agent.py` – LangGraph wrapper
- `protocol_bindings/a2a_server/` – runnable A2A server
- `Makefile`, `README.md`, `.env.example`
Example:

```shell
cd examples/petstore/mcp_server
cp .env.example .env   # add {{ MCP_NAME | upper }}_API_URL & TOKEN
make run-a2a           # start the A2A server
# In another terminal:
make run-a2a-client    # docker chat client
```
- ✏️ Modify generated tool modules in `tools/`
- 🧩 Add custom models in `models/`
- 🔌 Extend the API client in `api/client.py`
- 🛠️ Add utility functions in `utils/`
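As one way to extend the generated client without editing generated code, you can subclass it; this sketch is illustrative only — `APIClient` and its `request` method are hypothetical stand-ins for whatever `api/client.py` actually generates:

```python
class APIClient:
    """Hypothetical stand-in for the generated client in api/client.py."""
    def request(self, method: str, path: str) -> dict:
        return {"method": method, "path": path}

class RetryingClient(APIClient):
    """Custom extension: retry transient connection failures a few times."""
    def __init__(self, retries: int = 3):
        self.retries = retries

    def request(self, method: str, path: str) -> dict:
        last_error = None
        for _ in range(self.retries):
            try:
                return super().request(method, path)
            except ConnectionError as err:  # retry only transient errors
                last_error = err
        raise last_error

client = RetryingClient()
print(client.request("GET", "/pets"))  # {'method': 'GET', 'path': '/pets'}
```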
See `MAINTAINERS.md`.
See `CONTRIBUTING.md`.
If you discover a security vulnerability, please see our `SECURITY.md` for responsible disclosure guidelines.
This project follows the Contributor Covenant Code of Conduct.
- MCP on PyPI, the official Model Context Protocol (MCP) Python package
- Thanks to the MCP community for their support
- Built with Poetry, uv, and open source love 💜