Send Logstash logs to Axiom
Logstash
Logstash is an open-source, server-side data processing pipeline for log aggregation and transformation that ingests data from a multitude of sources simultaneously. With Logstash, you can collect, parse, and send logs to Axiom for storage and later use.
Logstash acts as a data pipeline in front of Axiom: your servers and systems feed data in at one end, and Axiom ingests it at the other, turning it into useful information.
It can read data from various input sources, filter it according to your configuration, and eventually store it. Logstash sits between your data and where you want to store it.
Installation
Visit the Logstash download page to install Logstash on your system.
If you use a Personal Token, you need to specify the org-id header. It's best to use an API Token, which avoids the need for the org-id header.
Learn more about API and Personal Tokens.
Configuration
To configure the logstash.conf file, define the source, set the rules to format your data, and set Axiom as the destination to which the data is forwarded.
The Logstash configuration works with OpenSearch, so you can use the OpenSearch output syntax to define the destination.
The Logstash pipeline has three stages:
- Input stage: generates events and ingests data of all volumes, sizes, forms, and sources.
- Filter stage: modifies the events as you specify in the filter component.
- Output stage: ships the events to Axiom.
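As a sketch, a minimal pipeline exercising all three stages might look like the following. The `mutate` filter and the `environment` field are only illustrative assumptions; the `stdout` output prints events locally, which is useful while testing before switching the destination to Axiom.

```
input {
  exec {
    command => "date"   # emit the system date as an event
    interval => "60"    # once per minute
  }
}

filter {
  mutate {
    # example filter: tag each event with a hypothetical field
    add_field => { "environment" => "production" }
  }
}

output {
  stdout { codec => rubydebug }  # print events to the console while testing
}
```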
OpenSearch Output
For installation instructions for the plugin, see the OpenSearch documentation.
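On most installations, the OpenSearch output plugin can be added with Logstash's bundled plugin manager (the path assumes a standard Logstash install directory):

```
bin/logstash-plugin install logstash-output-opensearch
```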
In logstash.conf, configure your Logstash pipeline to collect and send log data to Axiom.
The example below shows a Logstash configuration that sends data to Axiom:
input {
  exec {
    command => "date"
    interval => "1"
  }
}

output {
  opensearch {
    hosts => ["https://api.axiom.co:443/v1/datasets/$DATASET_NAME/elastic"]
    user => "axiom"
    # password should be your Axiom API token
    password => "$TOKEN"
  }
}
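Once the configuration is saved, you can start the pipeline by pointing Logstash at the file (assuming a standard install directory, and that you replace $DATASET_NAME and $TOKEN with your dataset name and API token first):

```
bin/logstash -f logstash.conf
```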