Install Command
npx -y @cognigy/mcp-server

Claude Desktop Config
{
"mcpServers": {
"cognigy-mcp-server": {
"command": "npx",
"args": [
"-y",
"@cognigy/mcp-server"
]
}
}
}

@cognigy/mcp-server is a community MCP server that connects AI assistants like Claude to the Cognigy.AI REST API via the Model Context Protocol. It runs locally on your machine, keeping your data private and giving you full control over the connection. AI engineers can use it to chain models and pipelines into more powerful workflows.
About @cognigy/mcp-server
Overview
Model Context Protocol server for Cognigy.AI REST API
Topics
mcp, cognigy, ai, model-context-protocol, mcp-server, conversational-ai
Who Should Use @cognigy/mcp-server?
- Chain AI models and pipelines through a unified MCP interface
- Let Claude orchestrate other AI tools and models
- Integrate embeddings, image generation, or speech APIs into your workflow
- Build multi-model workflows without writing custom integration code
How to Install @cognigy/mcp-server
Before you start
You will need Node.js (v18 or later) installed on your machine — download it from nodejs.org if you haven't already.
1. Open a terminal (Terminal on Mac, Command Prompt or PowerShell on Windows).
2. Paste the install command above and press Enter — Node.js will download and run the server automatically.
3. Add the server to your Claude Desktop config file (see the JSON snippet above) and restart Claude.
The Claude Desktop config snippet above can be copied and pasted directly into your claude_desktop_config.json file — no editing required.
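If you prefer to script the config change instead of editing the file by hand, the sketch below merges the server entry into an existing claude_desktop_config.json without overwriting other servers. The default config paths shown for macOS and Windows are the standard Claude Desktop locations; adjust `CONFIG_PATH` if your installation differs.

```python
import json
import platform
from pathlib import Path

# Default Claude Desktop config locations (assumes a standard install;
# adjust CONFIG_PATH for a non-default setup).
if platform.system() == "Darwin":
    CONFIG_PATH = Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"
else:  # Windows default
    CONFIG_PATH = Path.home() / "AppData/Roaming/Claude/claude_desktop_config.json"

# The server entry from the snippet above.
SERVER_ENTRY = {
    "cognigy-mcp-server": {
        "command": "npx",
        "args": ["-y", "@cognigy/mcp-server"],
    }
}

def add_server(config_path: Path) -> dict:
    """Merge the server entry into the config, preserving existing servers."""
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    config.setdefault("mcpServers", {}).update(SERVER_ENTRY)
    config_path.parent.mkdir(parents=True, exist_ok=True)
    config_path.write_text(json.dumps(config, indent=2))
    return config
```

Run `add_server(CONFIG_PATH)` and restart Claude Desktop for the change to take effect.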
How @cognigy/mcp-server Compares
✦It runs entirely on your local machine, so no data leaves your environment — important for teams with privacy or compliance requirements.
✦It is distributed as an npm package, so version management is straightforward: `npx -y` resolves a published version on demand, and you can pin or upgrade explicitly by adding a version specifier (such as `@latest`) to the package name.
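For reproducible setups, you can pin a specific release in the Claude Desktop config by appending a version specifier to the package name. The version number below is illustrative, not a known release:

```json
{
  "mcpServers": {
    "cognigy-mcp-server": {
      "command": "npx",
      "args": ["-y", "@cognigy/mcp-server@1.0.0"]
    }
  }
}
```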
Tags
cognigy, ai, conversational-ai