INTEGRATIONS

Works with every AI assistant that supports MCP

Connect your data sources once, use them from any AI client.

🟣

Claude

Anthropic

Works with Claude Desktop, the Claude API, and Claude.ai. Add the Conexor MCP server URL to your Claude configuration and start querying your data in natural language.

claude_desktop_config.json
{
  "mcpServers": {
    "conexor": {
      "url": "https://app.conexor.io/mcp/{org}/{endpoint}",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}
🟢

ChatGPT

OpenAI

Integrate via ChatGPT Actions, which are described with an OpenAPI specification. Conexor exposes a compatible endpoint that ChatGPT can call to query your connected data sources.

Cursor

Cursor IDE

Cursor has native MCP support. Add your Conexor MCP server URL in Cursor settings and your AI coding assistant gains access to your live database schema and data.
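Cursor reads MCP servers from an `mcp.json` file (project-level `.cursor/mcp.json` or global `~/.cursor/mcp.json`). A minimal sketch, mirroring the Claude configuration above — exact field support for remote servers may vary with your Cursor version:

```json
{
  "mcpServers": {
    "conexor": {
      "url": "https://app.conexor.io/mcp/{org}/{endpoint}",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}
```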

🌊

Windsurf / Codeium

Codeium

Windsurf and Codeium support MCP servers natively. Point them at your Conexor endpoint to give your AI coding assistant real-time database context.

🔌

Any MCP-compatible client

Open standard

MCP is an open protocol. Any client that implements the Model Context Protocol can connect to Conexor — n8n, Continue, custom agents, and more. The configuration is always the same: your MCP server URL and API key.

🔗

REST API

Direct integration

Don’t need an MCP client? Use the Conexor REST API directly. Execute queries, list data sources, fetch schema metadata, and read audit logs via standard HTTP endpoints with Bearer token authentication.
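As a sketch of the direct HTTP path, the snippet below builds an authenticated request with Python's standard library. The route `/api/v1/query` and the request body shape are illustrative assumptions, not documented Conexor endpoints; only the Bearer-token header pattern comes from the description above.

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"
# Hypothetical route — check the Conexor API reference for real paths.
url = "https://app.conexor.io/api/v1/query"

# Illustrative body: which data source to hit and what to run against it.
payload = json.dumps({"source": "postgres-main", "sql": "SELECT 1"}).encode()

req = urllib.request.Request(
    url,
    data=payload,
    headers={
        "Authorization": f"Bearer {API_KEY}",  # Bearer auth, as noted above
        "Content-Type": "application/json",
    },
    method="POST",
)

# urllib.request.urlopen(req) would send the request; it is omitted here
# so the snippet stays runnable without network access.
print(req.get_header("Authorization"))  # → Bearer YOUR_API_KEY
```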

Your tool not listed?

If your AI client supports the Model Context Protocol, it works with Conexor. Check our protocol documentation to get started.

Read the MCP Protocol docs →