Connect your data sources once, use them from any AI client.
Works with Claude Desktop, Claude API, and Claude.ai. Add the MCP server URL to your Claude configuration and start querying your data in natural language.
```json
{
  "mcpServers": {
    "conexor": {
      "url": "https://app.conexor.io/mcp/{org}/{endpoint}",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}
```

Integrate via ChatGPT Actions and Tools using the OpenAI plugin specification. Conexor exposes a compatible endpoint that ChatGPT can call to query your connected data sources.
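As a sketch, a ChatGPT Action is described by an OpenAPI schema pointing at your endpoint. The `/api/query` path, `runQuery` operation, and request shape below are illustrative assumptions, not Conexor's documented API — use the schema Conexor publishes:

```yaml
openapi: 3.1.0
info:
  title: Conexor
  version: "1.0"
servers:
  - url: https://app.conexor.io   # Conexor API host
paths:
  /api/query:                     # hypothetical path; check Conexor's published schema
    post:
      operationId: runQuery       # hypothetical operation name
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              properties:
                question:
                  type: string    # natural-language question to run against your data
      responses:
        "200":
          description: Query results
```

ChatGPT reads this schema to learn which operations it may call; authentication (the Bearer API key) is configured separately in the Action's auth settings.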
Cursor has native MCP support. Add your Conexor MCP server URL in Cursor settings and your AI coding assistant gains access to your live database schema and data.
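Cursor reads MCP servers from `.cursor/mcp.json` in your project (or `~/.cursor/mcp.json` globally). A minimal sketch, assuming the same URL-and-key shape as the Claude configuration above:

```json
{
  "mcpServers": {
    "conexor": {
      "url": "https://app.conexor.io/mcp/{org}/{endpoint}",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}
```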
Windsurf and Codeium support MCP servers natively. Point them at your Conexor endpoint to give your AI coding assistant real-time database context.
MCP is an open protocol. Any client that implements the Model Context Protocol can connect to Conexor — n8n, Continue, custom agents, and more. The configuration is always the same: your MCP server URL and API key.
Don’t need an MCP client? Use the Conexor REST API directly. Execute queries, list data sources, fetch schema metadata, and read audit logs via standard HTTP endpoints with Bearer token authentication.
If your AI client supports the Model Context Protocol, it works with Conexor. Check our protocol documentation to get started.
Read the MCP Protocol docs →