MCP Server

Connect Cograph to AI agents via the Model Context Protocol.

What is MCP?

The Model Context Protocol (MCP) lets AI agents call tools on external systems. Cograph exposes an MCP server so agents like Claude, Cursor, Windsurf, and custom agents can query your knowledge graphs directly.

Setup

Add this to your MCP client configuration (e.g., Claude Code settings, Cursor config, or ~/.claude/mcp.json):

{
  "mcpServers": {
    "cograph": {
      "command": "python",
      "args": ["-m", "omnix.mcp_server"],
      "env": {
        "OMNIX_API_URL": "http://localhost:8000",
        "OMNIX_API_KEY": "dev-key-001"
      }
    }
  }
}
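With this configuration, the MCP client spawns the configured command and exchanges JSON-RPC 2.0 messages with it over stdin/stdout, starting with an initialize handshake. A minimal sketch of that first message, assuming the 2024-11-05 protocol revision; the helper name is illustrative, not part of Cograph or any SDK:

```python
import json

# Illustrative sketch: the first JSON-RPC message an MCP client sends to
# the spawned server process over stdin. Field names follow the MCP spec;
# the protocol version string is an assumption.
def make_initialize_request(request_id=1):
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",  # assumed protocol revision
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1"},
        },
    })

msg = json.loads(make_initialize_request())
print(msg["method"])  # → initialize
```

In practice your MCP client (Claude Code, Cursor, etc.) handles this handshake for you; the sketch only shows what travels over the wire.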

Available Tools

ask

Ask a natural language question against a knowledge graph.

Parameters
question (string, required): The question to ask
kg_name (string, optional): Target knowledge graph name
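As a rough illustration of how an agent invokes this tool, here is a sketch of the JSON-RPC tools/call request it produces; make_ask_call is a hypothetical helper, not part of Cograph:

```python
# Illustrative sketch of the tools/call request for the ask tool.
# kg_name is optional, so it is omitted from the arguments when unset.
def make_ask_call(question, kg_name=None, request_id=2):
    arguments = {"question": question}
    if kg_name is not None:
        arguments["kg_name"] = kg_name
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": "ask", "arguments": arguments},
    }

req = make_ask_call("How many events are free?", kg_name="events-sf")
```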

list_knowledge_graphs

List all available knowledge graphs and their descriptions. No parameters.

ingest_csv

Ingest a CSV file into a knowledge graph. Schema is automatically inferred.

Parameters
file_path (string, required): Absolute path to the CSV file
kg_name (string, required): Name for the knowledge graph
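Because the server reads the CSV from its own filesystem, a relative path may resolve against the server's working directory rather than yours. A small client-side guard can catch this early; validate_ingest_args is illustrative only, not part of Cograph:

```python
import os

# Illustrative guard (not part of Cograph): ingest_csv requires an
# absolute path because the MCP server resolves it on its own machine.
def validate_ingest_args(file_path, kg_name):
    if not os.path.isabs(file_path):
        raise ValueError(f"file_path must be absolute, got {file_path!r}")
    if not kg_name:
        raise ValueError("kg_name is required")
    return {"file_path": file_path, "kg_name": kg_name}
```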

view_ontology

View the ontology (types, attributes, relationships) across all knowledge graphs. No parameters.

Example Usage

Once configured, you can interact with your knowledge graphs from any MCP-compatible agent:

> "What knowledge graphs do I have?"
  → calls list_knowledge_graphs()

> "How many events in San Francisco are free?"
  → calls ask(question="...", kg_name="events-sf")

> "Ingest this sales data"
  → calls ingest_csv(file_path="/path/to/sales.csv", kg_name="sales-2026")