Axiom MCP Server is a Model Context Protocol (MCP) server implementation that enables AI agents to query your data using Axiom Processing Language (APL). Deploy Axiom MCP Server either in a hosted environment or locally on your machine.

Current capabilities

Axiom MCP Server supports the following MCP tools:
  • getDatasetSchema: Get dataset schema
  • getMonitors: List monitoring configurations
  • getMonitorsHistory: Get monitor execution history
  • getSavedQueries: Retrieve saved APL queries
  • listDatasets: List available Axiom datasets
  • queryApl: Execute APL queries against Axiom datasets
Axiom MCP Server supports the following MCP prompts:
  • correlate-events-across-datasets: Find patterns and correlations between events across multiple datasets
  • data-quality-investigation: Investigate data quality issues including missing data, inconsistencies, and collection problems
  • detect-anomalies-in-events: Generic anomaly detection using statistical analysis and pattern recognition across any dataset
  • establish-performance-baseline: Establish performance baselines for a dataset to enable effective monitoring and anomaly detection
  • explore-unknown-dataset: Exploration of an unknown dataset to understand its structure, content, and potential use cases
  • monitor-health-analysis: Comprehensive analysis of monitor health, alert patterns, and effectiveness
Axiom plans to support MCP resources in the future. Axiom MCP Server works with all AI agents that support MCP.
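As a sketch of how an agent invokes one of these tools, an MCP `tools/call` request for queryApl might look like the following. The request envelope follows the MCP specification; the argument name `query` and the example dataset `logs` are assumptions for illustration, not confirmed by this document:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "queryApl",
    "arguments": {
      "query": "['logs'] | summarize count() by status"
    }
  }
}
```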

Prerequisites

Hosted setup

In your AI agent, add a remote MCP connection with the following details:
  • Name: Axiom
  • Server URL: https://mcp.axiom.co/mcp
For AI agents that require server-sent events (SSE), use the server URL https://mcp.axiom.co/sse. For more information, see the documentation of your AI agent. When your AI agent first attempts to interact with Axiom MCP Server, authenticate the request in your browser. You can later revoke access on the Profile page of the Axiom web UI.
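For agents that configure MCP servers through a JSON file, a remote connection entry might look like the following sketch. Key names vary by agent; the `url` field shown here is an assumption based on common agent configuration formats, so check your agent's documentation for the exact schema:

```json
{
  "mcpServers": {
    "axiom": {
      "url": "https://mcp.axiom.co/mcp"
    }
  }
}
```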

Local setup

Install

Run the following to install the latest built binary from GitHub:
go install github.com/axiomhq/axiom-mcp@latest
Axiom MCP Server is an open-source project and welcomes your contributions. For more information, see the GitHub repository.

Create API token

Create an API token in Axiom with permissions to query the datasets you want your AI agent to access.

Configure

Configure your AI agent to use the MCP server in one of the following ways:

Config file

  1. Create a config file where you specify the authentication and configuration details. For example:
    config.txt
    token API_TOKEN
    url https://AXIOM_DOMAIN
    query-rate 1
    query-burst 1
    datasets-rate 1
    datasets-burst 1
    monitors-rate 1
    monitors-burst 1
    
    Replace API_TOKEN with the Axiom API token you have generated. For added security, store the API token in an environment variable. Replace AXIOM_DOMAIN with api.axiom.co if your organization uses the US region, and with api.eu.axiom.co if your organization uses the EU region. For more information, see Regions.
    Optionally, configure the rate and the burst limits at the query, dataset, and monitor levels. The rate limit is the maximum number of requests that your AI agent can make per second. The burst limit is the maximum number of requests that your AI agent can make in a short period (burst) before the rate limit applies to further requests.
  2. In the settings of your AI agent, reference the binary file of Axiom MCP Server and the config file you have previously created. For example, if you use the Claude desktop app, add axiom to the mcpServers section of the Claude configuration file. If you use a Mac, the default path is ~/Library/Application Support/Claude/claude_desktop_config.json.
    claude_desktop_config.json
    {
      "mcpServers": {
        "axiom": {
          "command": "PATH_AXIOM_MCP_BINARY",
          "args" : ["--config", "PATH_AXIOM_MCP_CONFIG"]
        }
      }
    }
    
    Replace PATH_AXIOM_MCP_BINARY with the path to the binary file of Axiom MCP Server. By default, it's ~/go/bin/axiom-mcp. Replace PATH_AXIOM_MCP_CONFIG with the path to the config file you have previously created.

Environment variables

In the settings of your AI agent, use environment variables to specify the authentication details and the binary file of Axiom MCP Server. For example, if you use the Claude desktop app, add axiom to the mcpServers section of the Claude configuration file. If you use a Mac, the default path is ~/Library/Application Support/Claude/claude_desktop_config.json. For example:
claude_desktop_config.json
{
  "mcpServers": {
    "axiom": {
      "command": "PATH_AXIOM_MCP_BINARY",
      "env": {
        "AXIOM_TOKEN": "API_TOKEN",
        "AXIOM_URL": "https://AXIOM_DOMAIN",
        "AXIOM_QUERY_RATE": "1",
        "AXIOM_QUERY_BURST": "1",
        "AXIOM_DATASETS_RATE": "1",
        "AXIOM_DATASETS_BURST": "1",
        "AXIOM_MONITORS_RATE": "1",
        "AXIOM_MONITORS_BURST": "1"
      }
    }
  }
}
Replace PATH_AXIOM_MCP_BINARY with the path to the binary file of Axiom MCP Server. By default, it's ~/go/bin/axiom-mcp. Replace API_TOKEN with the Axiom API token you have generated. For added security, store the API token in an environment variable. Replace AXIOM_DOMAIN with api.axiom.co if your organization uses the US region, and with api.eu.axiom.co if your organization uses the EU region. For more information, see Regions.
Optionally, configure the rate and the burst limits at the query, dataset, and monitor levels. The rate limit is the maximum number of requests that your AI agent can make per second. The burst limit is the maximum number of requests that your AI agent can make in a short period (burst) before the rate limit applies to further requests.

Ask your AI agent about your data

Ask your AI agent questions about your Axiom data. For example:
  • “List datasets.”
  • “Get the data schema for the dataset logs.”
  • “Get the most common status codes in the last 30 minutes in the dataset logs.”
For more information about the types of questions Axiom MCP Server can help you answer, see Current capabilities.
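As an example of what the queryApl tool runs under the hood, the third question above corresponds to an APL query along these lines. The field names _time and status are assumptions about the schema of the logs dataset:

```apl
['logs']
| where _time > ago(30m)
| summarize count() by status
| sort by count_ desc
| limit 10
```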