Generate an MCP server from any OpenAPI specification. Let AI assistants interact with your REST APIs instantly.
npm install -g @bahridev/mcpify
# Start an MCP server from any OpenAPI spec
mcpify ./openapi.yaml

# With authentication
mcpify ./api.yaml --bearer-token $TOKEN

# Filter specific operations
mcpify ./api.yaml --include "get*,list*" --exclude "delete*"

mcpify v1.0.0 — serving 24 tools from "My API"
Zero config required. Just point it at your spec.
Point it at any OpenAPI 3.x spec and get a fully functional MCP server. No setup, no boilerplate.
Include or exclude operations with glob patterns. Filter by tags. Expose only what you need.
Bearer tokens, API keys, environment variables. Auto-detects auth type from your spec.
Use as a CLI tool or import as a library. Full TypeScript types included.
Automatically marks GET as read-only, DELETE as destructive. AI knows what's safe.
Load specs from local files or remote URLs. Supports YAML and JSON.
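The include/exclude filtering described above can be sketched as follows. This is an illustrative implementation with assumed semantics (`*` matches any run of characters, names are operation IDs), not mcpify's actual source:

```typescript
// Sketch: compile a simple glob pattern (only "*" is special) to a RegExp.
function globToRegExp(pattern: string): RegExp {
  const escaped = pattern.replace(/[.+?^${}()|[\]\\]/g, "\\$&");
  return new RegExp("^" + escaped.replace(/\*/g, ".*") + "$");
}

// Keep an operation if it matches any include pattern and no exclude pattern.
function filterOperations(
  ids: string[],
  include: string[] = ["*"],
  exclude: string[] = [],
): string[] {
  const inc = include.map(globToRegExp);
  const exc = exclude.map(globToRegExp);
  return ids.filter(
    (id) => inc.some((r) => r.test(id)) && !exc.some((r) => r.test(id)),
  );
}
```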
From spec to server in four steps.
1. Reads and dereferences your OpenAPI spec, resolving all $ref pointers
2. Converts each operation into an MCP tool with JSON Schema validation
3. Starts an MCP server over stdio that AI assistants can connect to
4. When a tool is called, builds and sends the HTTP request, returns the response
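The operation-to-tool conversion can be sketched roughly like this. The interfaces below are simplified assumptions for illustration, not mcpify's actual types; the annotation field names `readOnlyHint` and `destructiveHint` come from the MCP tool annotations spec:

```typescript
// Illustrative sketch: one OpenAPI operation becomes one MCP tool.
interface Param { name: string; required?: boolean; schema: { type: string } }
interface Operation {
  operationId: string;
  method: string;
  summary?: string;
  parameters?: Param[];
}

function operationToTool(op: Operation) {
  const params = op.parameters ?? [];
  const method = op.method.toUpperCase();
  return {
    name: op.operationId,
    description: op.summary ?? op.operationId,
    // Parameters become a JSON Schema object used for input validation.
    inputSchema: {
      type: "object" as const,
      properties: Object.fromEntries(params.map((p) => [p.name, p.schema])),
      required: params.filter((p) => p.required).map((p) => p.name),
    },
    // Safety hints: GET is read-only, DELETE is destructive.
    annotations: {
      readOnlyHint: method === "GET",
      destructiveHint: method === "DELETE",
    },
  };
}
```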
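Step 4, building the outgoing request, might look like the sketch below: substitute path parameters into the OpenAPI route template, attach auth, and send. This is assumed behavior for illustration, not mcpify's actual code:

```typescript
// Hypothetical sketch of request building from tool-call arguments.
interface CallInput { args: Record<string, string>; token?: string }

function buildRequest(
  baseUrl: string,
  path: string, // OpenAPI path template, e.g. "/users/{id}"
  input: CallInput,
): { url: string; headers: Record<string, string> } {
  // Replace each {param} placeholder with its URL-encoded argument value.
  const url =
    baseUrl +
    path.replace(/\{(\w+)\}/g, (_, name) =>
      encodeURIComponent(input.args[name] ?? ""),
    );
  const headers: Record<string, string> = { Accept: "application/json" };
  if (input.token) headers.Authorization = `Bearer ${input.token}`;
  return { url, headers };
}
```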
Add to your AI assistant config and start using your API in seconds.
{
  "mcpServers": {
    "my-api": {
      "command": "mcpify",
      "args": ["./openapi.yaml", "--bearer-token", "sk-..."]
    }
  }
}
import { parseSpec, generateTools, startServer } from '@bahridev/mcpify'

const spec = await parseSpec('./api.yaml')
const tools = generateTools(spec.operations)

await startServer({
  spec,
  tools,
  operations: spec.operations,
  baseUrl: spec.defaultServerUrl,
  auth: { type: 'none' },
  transport: 'stdio',
  port: 3100,
  maxResponseSize: 50 * 1024,
})