Quickly start capturing telemetry data from your generative AI apps. After installation and configuration, follow the Axiom AI engineering workflow to create, measure, observe, and iterate on your capabilities.
This page explains how to set up instrumentation with Axiom AI SDK and choose the right instrumentation approach for your needs.

Prerequisites

Install

Install Axiom AI SDK into your TypeScript project:
pnpm i axiom
The axiom package includes the axiom command-line interface (CLI) for managing your AI assets, which will be used in later stages of the Axiom AI engineering workflow.

Configure tracer

To send data to Axiom, configure a tracer. For example, use a dedicated instrumentation file and load it before the rest of your app. An example configuration for a Node.js environment:
  1. Install dependencies:
    pnpm i \
      dotenv \
      @opentelemetry/exporter-trace-otlp-http \
      @opentelemetry/resources \
      @opentelemetry/sdk-node \
      @opentelemetry/sdk-trace-node \
      @opentelemetry/semantic-conventions \
      @opentelemetry/api
    
  2. Create instrumentation file:
    /src/instrumentation.ts
    
    import 'dotenv/config'; // Make sure to load environment variables
    import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http';
    import { resourceFromAttributes } from '@opentelemetry/resources';
    import { NodeSDK } from '@opentelemetry/sdk-node';
    import { SimpleSpanProcessor } from '@opentelemetry/sdk-trace-node';
    import { ATTR_SERVICE_NAME } from '@opentelemetry/semantic-conventions';
    import { trace } from '@opentelemetry/api';
    import { initAxiomAI, RedactionPolicy } from 'axiom';
    
    const tracer = trace.getTracer('my-tracer');
    
    // Configure the NodeSDK to export traces to your Axiom dataset
    const sdk = new NodeSDK({
      resource: resourceFromAttributes({
        [ATTR_SERVICE_NAME]: 'my-ai-app', // Replace with your service name
      }),
      spanProcessor: new SimpleSpanProcessor(
        new OTLPTraceExporter({
          url: `https://api.axiom.co/v1/traces`,
          headers: {
            Authorization: `Bearer ${process.env.AXIOM_TOKEN}`,
            'X-Axiom-Dataset': process.env.AXIOM_DATASET!,
          },
        }),
      ),
    });
    
    // Start the SDK
    sdk.start();
    
    // Initialize Axiom AI SDK with the configured tracer
    initAxiomAI({ tracer, redactionPolicy: RedactionPolicy.AxiomDefault });
    
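The instrumentation file must run before any other application code so that the tracer is registered first. One common approach (assuming a Node.js 20.6+ runtime and that your TypeScript compiles to dist/) is to preload the compiled file with Node's --import flag:

```shell
# Preload the compiled instrumentation file before the app entry point
# (paths are examples; adjust to your build output)
node --import ./dist/instrumentation.js ./dist/index.js
```

Alternatively, make the first line of your entry point `import './instrumentation';` so the tracer is configured before any other imports run.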

Store environment variables

Store environment variables in a .env file in the root of your project:
.env
AXIOM_TOKEN="API_TOKEN"
AXIOM_DATASET="DATASET_NAME"
OPENAI_API_KEY="OPENAI_API_KEY"
GEMINI_API_KEY="GEMINI_API_KEY"
Replace API_TOKEN with the Axiom API token you have generated, and DATASET_NAME with the name of the Axiom dataset where you send your data. For added security, keep the .env file out of version control.
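If a variable is missing, the exporter silently sends no telemetry. A minimal sketch of a fail-fast check (requireEnv is a hypothetical helper, not part of Axiom AI SDK) that you could run at startup:

```typescript
// Hypothetical helper: throw at startup when a required environment
// variable is missing, instead of failing silently at export time.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Example usage, e.g. near the top of instrumentation.ts after dotenv loads:
// const token = requireEnv('AXIOM_TOKEN');
// const dataset = requireEnv('AXIOM_DATASET');
```

Calling requireEnv once per variable also lets the rest of the file treat the values as plain strings, avoiding non-null assertions like process.env.AXIOM_DATASET!.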

What’s next?

  • Explore the AI engineering workflow: Start building systematic AI capabilities beginning with Create.
  • Continue with Axiom AI SDK: Learn about instrumenting your AI model and tool calls in Observe.