Author: Mano Toth, Senior Technical Writer
Back in December 2024, the Axiom MCP server started as a simple way to ask natural language questions and get powerful answers from your event data. Now, it’s becoming a lot more than that.
We’ve added four new tools to expand what’s possible when connecting Axiom to your favorite LLM clients:
- getDatasetSchema helps you explore unfamiliar datasets by revealing their structure up front.
- getSavedQueries taps into your team’s existing knowledge, giving LLMs access to your most-used patterns.
- getMonitors lets models understand what’s critical to your operations by pulling in your current monitors.
- getMonitorsHistory provides visibility into how those monitors have behaved over time, making trends and anomalies easier to spot.
These additions turn the Axiom MCP server from a basic querying interface into a well-rounded observability companion. Whether you’re debugging issues, summarizing incidents, or guiding an AI assistant through a dataset, you now have the tools to bring in the right context, right when it’s needed.
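If you want to exercise these tools directly, outside a chat client, here’s a minimal sketch using the MCP TypeScript SDK over stdio. The server command name, environment variable, dataset name, and tool argument shapes are assumptions for illustration; the GitHub repository documents the real invocation and configuration.

```typescript
// Minimal sketch: connect to the Axiom MCP server over stdio and call one of
// the new tools. "mcp-server-axiom", AXIOM_TOKEN, and the getDatasetSchema
// argument shape are assumptions for illustration only.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the MCP server as a subprocess and connect over stdio.
  const transport = new StdioClientTransport({
    command: "mcp-server-axiom", // assumed binary name
    env: { AXIOM_TOKEN: process.env.AXIOM_TOKEN ?? "" },
  });

  const client = new Client({ name: "axiom-example", version: "1.0.0" }, { capabilities: {} });
  await client.connect(transport);

  // Discover every tool the server exposes, including the four new ones.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Pull a dataset's schema before asking an LLM to query it.
  const schema = await client.callTool({
    name: "getDatasetSchema",
    arguments: { dataset: "my-dataset" }, // hypothetical argument name
  });
  console.log(schema);

  await client.close();
}

main().catch(console.error);
```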
Start building with the Axiom MCP server today. Check out the open-source implementation on GitHub.
Update default version of OpenTelemetry semantic conventions
We have updated the default version of the OpenTelemetry semantic conventions from 1.32 to 1.33. If the data you send to Axiom doesn’t specify a semantic conventions version, the shape of your data can change as a result. For more information, see Semantic conventions.
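To keep your data shape stable across default bumps like this one, declare the semantic conventions version explicitly. As an illustration of the general OpenTelemetry mechanism rather than Axiom-specific guidance (see Semantic conventions for exactly how Axiom resolves the version), the sketch below attaches a schema URL to a tracer using the OpenTelemetry JavaScript API.

```typescript
// Sketch: pin a semantic conventions version by declaring a schema URL on the
// tracer via the standard OpenTelemetry JavaScript API. This shows the general
// OTel mechanism; how Axiom reads the version is covered in the docs.
import { trace } from "@opentelemetry/api";

const tracer = trace.getTracer("checkout-service", "1.0.0", {
  // Explicitly state which semantic conventions version the data follows,
  // instead of relying on the default (now 1.33).
  schemaUrl: "https://opentelemetry.io/schemas/1.33.0",
});

const span = tracer.startSpan("process-order");
span.end();
```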