This guide provides the steps to install and configure the @axiomhq/ai SDK. Once configured, you can follow the Rudder workflow to create, measure, observe, and iterate on your AI capabilities.

Prerequisites

Before you begin, ensure you have the following:

  • An Axiom account. Create one here.
  • An Axiom dataset. Create one here.
  • An Axiom API token. Create one here.

Installation

Install the Axiom AI SDK into your TypeScript project using your preferred package manager, for example with pnpm:

pnpm i @axiomhq/ai

The SDK is open source. You can view the source code and examples in the axiomhq/ai GitHub repository.

The @axiomhq/ai package also includes the axiom command-line interface (CLI) for managing your AI assets, which you'll use in later stages of the Rudder workflow.

Configuration

The Axiom AI SDK is built on the OpenTelemetry standard and requires a configured tracer to send data to Axiom. This is typically done in a dedicated instrumentation file that is loaded before the rest of your application.

Here is a standard configuration for a Node.js environment:

// src/instrumentation.ts

import 'dotenv/config'; // Make sure to load environment variables
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http';
import { Resource } from '@opentelemetry/resources';
import { NodeSDK } from '@opentelemetry/sdk-node';
import { SimpleSpanProcessor } from '@opentelemetry/sdk-trace-node';
import { ATTR_SERVICE_NAME } from '@opentelemetry/semantic-conventions';
import { initAxiomAI, tracer } from '@axiomhq/ai';

// Configure the NodeSDK to export traces to your Axiom dataset
const sdk = new NodeSDK({
  resource: new Resource({
    [ATTR_SERVICE_NAME]: 'my-ai-app', // Replace with your service name
  }),
  spanProcessor: new SimpleSpanProcessor(
    new OTLPTraceExporter({
      url: 'https://api.axiom.co/v1/traces',
      headers: {
        Authorization: `Bearer ${process.env.AXIOM_TOKEN}`,
        'X-Axiom-Dataset': process.env.AXIOM_DATASET!,
      },
    }),
  ),
});

// Start the SDK
sdk.start();

// Initialize the Axiom AI SDK with the configured tracer
initAxiomAI({ tracer });
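
To guarantee the tracer is registered before any instrumented code runs, load this file before everything else. One way to do that, assuming a hypothetical src/index.ts entry point, is a side-effect import at the very top:

```typescript
// src/index.ts
// Importing instrumentation for its side effects starts the
// OpenTelemetry SDK before any other application module is evaluated.
import './instrumentation';

// Application code follows, for example:
// import { runApp } from './app';
// runApp();
```

Alternatively, on Node.js 20.6+ you can typically preload the compiled file with `node --import ./dist/instrumentation.js` so application code doesn't need to change.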

Environment variables

Your Axiom credentials and any frontier model API keys should be stored as environment variables. Create a .env file in the root of your project:

# Axiom Credentials
AXIOM_TOKEN="<YOUR_AXIOM_API_TOKEN>"
AXIOM_DATASET="<YOUR_AXIOM_DATASET_NAME>"

# Frontier Model API Keys
OPENAI_API_KEY="<YOUR_OPENAI_API_KEY>"
GEMINI_API_KEY="<YOUR_GEMINI_API_KEY>"
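
To catch misconfiguration early, you can validate these variables when the process starts rather than when the first request fails. A minimal sketch, using a hypothetical requireEnv helper (not part of @axiomhq/ai):

```typescript
// A small fail-fast check for required environment variables.
// requireEnv is a hypothetical helper, not part of the SDK.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (value === undefined || value === '') {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Example: resolve credentials once at startup instead of at the call site.
// const axiomToken = requireEnv('AXIOM_TOKEN');
// const axiomDataset = requireEnv('AXIOM_DATASET');
```

Calling this once at startup surfaces a missing AXIOM_TOKEN or AXIOM_DATASET immediately, with a clear error instead of a silent failure to export traces.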

What’s next?

Now that your application is configured to send telemetry to Axiom, the next step is to start instrumenting your AI model calls.

Learn more on the Observe page of the Rudder workflow.