Send (ingest), transport, and fetch data from different sources, such as relational databases, web logs, batch data, real-time data, app logs, and streaming data, for later use with the Axiom API.

You can also collect, load, group, and move data from one or more sources to Axiom, where it can be stored and further analyzed.

Before ingesting data, you need to generate an API token from the Settings > Tokens page on the Axiom Dashboard. See the API tokens documentation for more details.
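
The examples below reference the token and dataset name through shell variables. You could export them once up front; the values here are placeholders, not real credentials:

export API_TOKEN='xxxxxxxx-xxxx-xxxx'   # placeholder; use your own API token
export DATASET_NAME='my-dataset'        # placeholder; use your own dataset name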

Once you have an API token, there are different ways to get your data into Axiom:

  1. Using the Ingest API
  2. Using a data shipper (Logstash, Filebeat, Metricbeat, Fluentd, etc.)
  3. Using the Elasticsearch Bulk API that Axiom supports natively
  4. Using the apps we support
  5. Using endpoints

Ingest method

Select the method to ingest your data. Each ingest method follows a particular path:

  • Client libraries
  • Library extensions
  • Apps
  • Other

Ingest API

Axiom exposes a simple REST API that accepts any of the following formats:

Ingest using JSON

  • application/json - single event or JSON array of events

Example

curl -X 'POST' "https://api.axiom.co/v1/datasets/$DATASET_NAME/ingest" \
  -H "Authorization: Bearer $API_TOKEN" \
  -H 'Content-Type: application/json' \
  -d '[
        {
          "_time":"2021-02-04T03:11:23.222Z",
          "data":{"key1":"value1","key2":"value2"}
        },
        {
          "data":{"key3":"value3"},
          "attributes":{"key4":"value4"}
        },
        {
          "tags": {
            "server": "aws",
            "source": "wordpress"
          }
        }
      ]'
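
A successful request returns a short summary of the ingest. The exact shape of the response can differ between versions, but a sketch looks like:

{
  "ingested": 3,
  "failed": 0,
  "failures": []
}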

Ingest using NDJSON

  • application/x-ndjson - ingests multiple JSON objects, each represented as a separate line

Example

curl -X 'POST' "https://api.axiom.co/v1/datasets/$DATASET_NAME/ingest" \
  -H "Authorization: Bearer $API_TOKEN" \
  -H 'Content-Type: application/x-ndjson' \
  -d '{"id":1,"name":"machala"}
  {"id":2,"name":"axiom"}
  {"id":3,"name":"apl"}
  {"index": {"_index": "products"}}
  {"timestamp": "2016-06-06T12:00:00+02:00", "attributes": {"key1": "value1","key2": "value2"}}
  {"queryString": "count()"}'

Ingest using CSV

  • text/csv - the payload must include a header line with field names separated by commas

Example

curl -X 'POST' "https://api.axiom.co/v1/datasets/$DATASET_NAME/ingest" \
      -H "Authorization: Bearer $API_TOKEN" \
      -H 'Content-Type: text/csv' \
      -d 'user,name
foo,bar'
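
For larger payloads, you can compress the request body before sending it. This sketch assumes your Axiom deployment accepts gzip-encoded ingest requests and that a local events.ndjson file exists:

# Assumes gzip-encoded payloads are accepted; events.ndjson is a local file.
gzip -c events.ndjson | curl -X 'POST' "https://api.axiom.co/v1/datasets/$DATASET_NAME/ingest" \
  -H "Authorization: Bearer $API_TOKEN" \
  -H 'Content-Type: application/x-ndjson' \
  -H 'Content-Encoding: gzip' \
  --data-binary @-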

Data shippers

Configure, read, collect, and send logs to your Axiom deployment using a variety of data shippers. Data shippers are lightweight agents that acquire logs and metrics, enabling you to ship data directly into Axiom.

Apps

Send logs and metrics from Vercel, Netlify, and other supported apps.

Get started with apps here.

Endpoints

Endpoints enable you to easily integrate Axiom into your existing data flow by letting you use tools and libraries you are already familiar with.

You can create an endpoint for your favorite services, like Honeycomb, Jaeger, Grafana Loki, or Splunk, and send the logs from these services directly into Axiom.

Limits and requirements

Axiom applies certain limits and requirements to ingested data to guarantee good service across the platform. Some of these limits depend on your pricing plan, and some apply system-wide. For more information, see Limits and requirements.

The most important field requirement concerns the timestamp.

All events stored in Axiom must have a _time timestamp field. If the data you ingest doesn’t have a _time field, Axiom assigns the time of the data ingest to the events. To specify the timestamp yourself, include a _time field in the ingested data.

If you include the _time field in the ingested data, ensure it contains timestamps in a valid format. Axiom accepts many date strings and timestamps without knowing the format in advance, including Unix epoch, RFC3339, and ISO 8601.
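
For example, the following request sets the event time explicitly, once as an RFC3339 string and once as a Unix epoch timestamp (the status field is only an illustrative payload):

curl -X 'POST' "https://api.axiom.co/v1/datasets/$DATASET_NAME/ingest" \
  -H "Authorization: Bearer $API_TOKEN" \
  -H 'Content-Type: application/json' \
  -d '[
        {"_time":"2021-02-04T03:11:23.222Z","status":"ok"},
        {"_time":1612408283,"status":"ok"}
      ]'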