Send Data
Ingest, transport, and fetch data from sources such as relational databases, web logs, batch data, real-time streams, and application logs for later use with the Axiom API.
You can also collect, load, group, and move data from one or more sources into Axiom, where it can be stored and further analyzed.
Before ingesting data, you need to generate an API token from the Settings > Tokens page on the Axiom Dashboard. See the API Tokens documentation for more details.
Once you have an ingest token, there are three ways to get your data into Axiom:
- Using the Ingest API
- Using a data shipper (Logstash, Filebeat, Metricbeat, Fluentd, etc.)
- Using the Elasticsearch Bulk API, which Axiom supports natively
Ingest API
Axiom exposes a simple REST API that accepts any of the following formats:
Ingest using JSON
application/json
- a single event or a JSON array of events
Example
curl -X 'POST' 'https://cloud.axiom.co/api/v1/datasets/$DATASET_NAME/ingest' \
-H 'Authorization: Bearer $API_TOKEN' \
-H 'Content-Type: application/json' \
-d '[
  {
    "_time":"2021-04-23T02:11:23.222Z",
    "data":{"key1":"value1","key2":"value2"}
  },
  {
    "data":{"key3":"value3"},
    "attributes":{"key4":"value4"}
  },
  {
    "tags": {
      "server": "aws",
      "source": "wordpress"
    }
  }
]'
Ingest using NDJSON
application/x-ndjson
- structured events, one JSON object per line, processed one at a time
Example
curl -X 'POST' 'https://cloud.axiom.co/api/v1/datasets/$DATASET_NAME/ingest' \
-H 'Authorization: Bearer $API_TOKEN' \
-H 'Content-Type: application/x-ndjson' \
-d '{"id":1,"name":"machala"}
{"id":2,"name":"axiom"}
{"id":3,"name":"apl"}
{"index": {"_index": "products"}}
{"timestamp": "2016-06-06T12:00:00+02:00", "attributes": {"key1": "value1","key2": "value2"}}
{"queryString": "count()"}'
Ingest using CSV
text/csv
- the first line must be a header listing the field names, separated by commas
Example
curl -X 'POST' 'https://cloud.axiom.co/api/v1/datasets/$DATASET_NAME/ingest' \
-H 'Authorization: Bearer $API_TOKEN' \
-H 'Content-Type: text/csv' \
-d 'user,name
foo,bar'
Supported Libraries
If you would prefer to use a language binding, the following client libraries are currently available:
Limits
Kind | Limit
--- | ---
Maximum Event Batch Size | 1000
Maximum Event Fields | 250
Maximum Array Field Items | 100
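A payload larger than the 1,000-event batch limit can be split client-side and posted in chunks. A minimal shell sketch, assuming an `events.ndjson` file with one event per line (generated here for illustration) and `$DATASET_NAME`/`$API_TOKEN` set as in the examples above:

```shell
# Generate a sample NDJSON file with 2500 events, one per line.
for i in $(seq 1 2500); do echo "{\"id\":$i}"; done > events.ndjson

# Split into chunks of at most 1000 events each (batch_aa, batch_ab, batch_ac).
split -l 1000 events.ndjson batch_

# Post each chunk as a separate ingest request; guarded so the request
# only runs when credentials are actually set.
for chunk in batch_*; do
  if [ -n "$API_TOKEN" ]; then
    curl -X 'POST' "https://cloud.axiom.co/api/v1/datasets/$DATASET_NAME/ingest" \
      -H "Authorization: Bearer $API_TOKEN" \
      -H 'Content-Type: application/x-ndjson' \
      --data-binary "@$chunk"
  fi
done
```

`--data-binary` is used instead of `-d` so that the newlines separating NDJSON events are preserved in the request body.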
Field Restrictions
There are some restrictions on field names in Axiom:
- Field names must not be longer than 200 bytes.
- Field names must be valid UTF-8.
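These constraints can be checked client-side before ingesting. A small sketch, using a hypothetical field name in a shell variable (`wc -c` counts bytes rather than characters, and `iconv` exits non-zero on invalid UTF-8):

```shell
field="user_agent"   # hypothetical field name to validate

# Byte length (not character count) must not exceed 200.
bytes=$(printf '%s' "$field" | wc -c | tr -d ' ')
if [ "$bytes" -le 200 ]; then
  echo "length ok: $bytes bytes"
fi

# iconv fails when the input is not valid UTF-8.
if printf '%s' "$field" | iconv -f UTF-8 -t UTF-8 >/dev/null 2>&1; then
  echo "valid UTF-8"
fi
```

Counting bytes matters for multi-byte characters: a name made of non-ASCII characters can exceed 200 bytes with far fewer than 200 characters.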
_time Condition
_time must conform to a valid timestamp or be omitted entirely. If _time is not present, the server will assign a timestamp.
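For instance, an RFC 3339 timestamp can be generated client-side with `date` and attached as `_time`; when the field is omitted, the server stamps the event on receipt. A sketch, with the request guarded so it only fires when `$API_TOKEN` is set:

```shell
# UTC timestamp in RFC 3339 format, e.g. 2021-04-23T02:11:23Z
now=$(date -u +%Y-%m-%dT%H:%M:%SZ)

# One payload with an explicit _time, one without (server assigns the time).
printf '[{"_time":"%s","data":{"key1":"value1"}}]' "$now" > with_time.json
printf '[{"data":{"key1":"value1"}}]' > without_time.json

if [ -n "$API_TOKEN" ]; then
  curl -X 'POST' "https://cloud.axiom.co/api/v1/datasets/$DATASET_NAME/ingest" \
    -H "Authorization: Bearer $API_TOKEN" \
    -H 'Content-Type: application/json' \
    --data-binary @with_time.json
fi
```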
Data Shippers
Configure, read, collect, and send logs to your Axiom deployment using a variety of data shippers. Data shippers are lightweight agents that acquire logs and metrics, enabling you to ship data directly into Axiom.
AWS CloudFront
AWS CloudWatch Logs
Elastic Beats
Fluent Bit
FluentD
Heroku Log Drains
Kubernetes
Logstash
Loki Multiplexer
Syslog Proxy
Vector