Axiom AI SDK is an open-source project and welcomes your contributions. For more information, see the GitHub repository.
Prerequisite
Follow the procedure in Quickstart to set up Axiom AI SDK in your TypeScript project.

Instrument AI SDK calls
Axiom AI SDK provides helper functions for Vercel's AI SDK to wrap your existing AI model client. The wrapAISDKModel function takes an existing AI model object and returns an instrumented version that automatically generates trace data for every call.
Add context
The withSpan function allows you to add crucial business context to your traces. It creates a parent span around your LLM call and attaches metadata about the capability and step being executed.
/src/app/page.tsx
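The original sample for this file isn't reproduced here, so what follows is a minimal sketch of the two helpers together. The import path (axiom/ai), the OpenAI provider, the model name, and the capability/step values are all illustrative assumptions; adjust them to match your project.

```typescript
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';
// Assumed import path for Axiom AI SDK helpers; check your installed package.
import { wrapAISDKModel, withSpan } from 'axiom/ai';

// Wrap the model once; every call through it now emits trace data.
const model = wrapAISDKModel(openai('gpt-4o-mini'));

export async function answerQuestion(prompt: string) {
  // withSpan creates a parent span and attaches capability/step metadata.
  // 'support_assistant' and 'generate_answer' are hypothetical names.
  return withSpan({ capability: 'support_assistant', step: 'generate_answer' }, () =>
    generateText({ model, prompt }),
  );
}
```

Wrapping the model at module scope means you instrument it once and reuse it everywhere, while withSpan is applied per call site so each capability carries its own metadata.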
Instrument tool calls
For many AI capabilities, the LLM call is only part of the story. If your capability uses tools to interact with external data or services, observing the performance and outcome of those tools is critical. Axiom AI SDK provides the wrapTool and wrapTools functions to automatically instrument your Vercel AI SDK tool definitions.
The wrapTool helper takes your tool's name and its definition and returns an instrumented version. This wrapper creates a dedicated child span for every tool execution, capturing its arguments, output, and any errors.
/src/app/generate-text/page.tsx
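The original code for this file isn't included above; here is a hedged sketch of wrapping a single tool definition. The axiom/ai import path, the tool name, and the weather example are assumptions, and note that newer AI SDK versions use inputSchema where older ones use parameters.

```typescript
import { z } from 'zod';
import { tool } from 'ai';
// Assumed import path for the Axiom helper; check your installed package.
import { wrapTool } from 'axiom/ai';

// wrapTool takes the tool's name and its definition and returns an
// instrumented version that records arguments, output, and errors
// in a dedicated child span per execution.
export const getWeather = wrapTool(
  'get_weather',
  tool({
    description: 'Get the current weather for a city',
    parameters: z.object({ city: z.string() }),
    execute: async ({ city }) => {
      // Placeholder implementation; a real tool would call a weather API.
      return { city, temperatureC: 21 };
    },
  }),
);
```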
Complete example
The following example shows how all three instrumentation functions work together in a single, real-world capability:

/src/app/page.tsx
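The full example isn't reproduced here, so this is a minimal sketch of the three helpers combined. The axiom/ai import path, provider, model name, tool, and capability/step values are illustrative assumptions, not the documented sample.

```typescript
import { openai } from '@ai-sdk/openai';
import { generateText, tool } from 'ai';
import { z } from 'zod';
// Assumed import path for Axiom AI SDK helpers; check your installed package.
import { wrapAISDKModel, wrapTool, withSpan } from 'axiom/ai';

// 1. wrapAISDKModel: telemetry for every LLM provider call.
const model = wrapAISDKModel(openai('gpt-4o-mini'));

// 2. wrapTool: a child span per tool execution, with args and output.
const findFlights = wrapTool(
  'find_flights',
  tool({
    description: 'Search for flights between two airports',
    parameters: z.object({ from: z.string(), to: z.string() }),
    execute: async ({ from, to }) => {
      // Placeholder result; a real tool would query a flight API.
      return { flights: [`${from} -> ${to}`] };
    },
  }),
);

// 3. withSpan: a parent span tying the whole capability together.
export async function planTrip(prompt: string) {
  return withSpan({ capability: 'trip_planner', step: 'plan_trip' }, () =>
    generateText({
      model,
      tools: { find_flights: findFlights },
      prompt,
    }),
  );
}
```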
- wrapAISDKModel: Automatically captures telemetry for the LLM provider call
- wrapTool: Instruments the tool execution with detailed spans
- withSpan: Creates a parent span that ties everything together under a business capability
What’s next?
After sending traces to Axiom:

- View your traces in Console
- Set up monitors and alerts based on your AI telemetry data
- Learn about developing AI features with confidence using Axiom