Splunk and Axiom are powerful tools for log analysis and data exploration. Axiom's query interface uses the Axiom Processing Language (APL). Because the query languages of Splunk and Axiom differ, transitioning from Splunk to APL means learning how to convert your Splunk SPL queries into APL.

This guide provides a high-level mapping from Splunk to APL.

Basic Searching

Splunk uses the search command for basic searching. In APL, specify the dataset name followed by a filter.

Splunk:

search index="myIndex" error

APL:

['myDataset']
| where fieldName contains "error"


Filtering

In Splunk, perform filtering using the search command, usually specifying field names and their desired values. In APL, perform filtering by using the where operator.

Splunk:

search index="myIndex" error
| stats count

APL:

['myDataset']
| where fieldName contains "error"
| count


Aggregation

In Splunk, the stats command is used for aggregation. In APL, perform aggregation using the summarize operator.

Splunk:

search index="myIndex" 
| stats count by status

APL:

['myDataset'] 
| summarize count() by status


Time Frames

In Splunk, select a time range for a search in the time picker on the search page. In APL, filter by a time range using the where operator on the _time field of the dataset.

Splunk:

search index="myIndex" earliest=-1d@d latest=now

APL:

['myDataset']
| where _time >= ago(1d) and _time <= now()

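If you need a fixed window rather than a relative one, APL can also filter between two explicit timestamps. The following is a minimal sketch with hypothetical bounds, assuming the KQL-style between operator is available in APL:

['myDataset']
// hypothetical time bounds for illustration
| where _time between (datetime(2024-01-01) .. datetime(2024-01-02))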

Sorting

In Splunk, the sort command is used to order the results of a search. In APL, perform sorting by using the sort by operator.

Splunk:

search index="myIndex" 
| sort - content_type

APL:

['myDataset'] 
| sort by content_type desc


Selecting Fields

In Splunk, use the fields command to specify which fields to include or exclude in the search results. In APL, use the project or project-keep operators to specify which fields to include in the query results, and the project-away operator to specify which fields to exclude.

Splunk:

index=main sourcetype=mySourceType
| fields status, responseTime

APL:

['myDataset']
| project status, responseTime

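As mentioned above, project-keep and project-away are available as well: project-keep keeps only the listed fields, while project-away drops the listed fields and keeps everything else. A minimal sketch, using the same hypothetical field names:

['myDataset']
| project-keep status, responseTime   // keep only these fields

['myDataset']
| project-away responseTime           // keep everything except this field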

Renaming Fields

In Splunk, rename fields using the rename command. In APL, rename fields using the project operator, or by combining the extend and project-away operators. Here is the general syntax:

Splunk:

index="myIndex" sourcetype="mySourceType"
| rename oldFieldName AS newFieldName

APL:

['myDataset']
| where method == "GET"
| extend new_field_name = content_type
| project-away content_type

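A more compact form uses the project operator to rename directly, as shown in the Rename section later in this guide. Note that project keeps only the fields you list. A minimal sketch on the same hypothetical dataset:

['myDataset']
| where method == "GET"
| project new_field_name = content_type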

Calculated Fields

In Splunk, use the eval command to create calculated fields based on the values of other fields. In APL, use the extend operator for the same purpose.

Splunk:

search index="myIndex" 
| eval newField=field1+field2

APL:

['myDataset'] 
| extend newField = field1 + field2

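Calculated fields aren't limited to arithmetic. As a sketch, the query below derives a numeric field and a conditional field with iff(); it assumes the hypothetical dataset has a numeric req_duration_ms field, as in the extend example later in this guide:

['myDataset']
| extend duration_s = req_duration_ms / 1000            // hypothetical numeric field
| extend grade = iff(req_duration_ms >= 80, "A", "B")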

Structure and Concepts

The following table compares concepts and data structures between Splunk and APL logs.

| Concept | Splunk | APL | Comment |
|---|---|---|---|
| Data caches | buckets | caching and retention policies | Controls the period and caching level for the data. This setting directly affects query performance. |
| Logical partition of data | index | dataset | Allows logical separation of the data. |
| Structured event metadata | N/A | dataset | Splunk doesn't expose the concept of metadata to the search language. APL logs have the concept of a dataset, which has fields and columns. Each event instance is mapped to a row. |
| Data record | event | row | Terminology change only. |
| Types | datatype | datatype | APL data types are more explicit because they are set on the fields. Both can work dynamically with data types and have roughly equivalent sets of data types. |
| Query and search | search | query | The concepts are essentially the same between APL and Splunk. |

Functions

The following table lists APL functions that are equivalent to Splunk functions.

| Splunk | APL |
|---|---|
| strcat | strcat() |
| split | split() |
| if | iff() |
| tonumber | todouble(), tolong(), toint() |
| upper, lower | toupper(), tolower() |
| replace | replace_string() or replace_regex() |
| substr | substring() |
| tolower | tolower() |
| toupper | toupper() |
| match | matches regex |
| regex | matches regex (In Splunk, regex is an operator. In APL, it's a relational operator.) |
| searchmatch | == (In Splunk, searchmatch allows searching for the exact string.) |
| random | rand(), rand(n) (Splunk's function returns a number between zero and 2^31-1. APL's returns a number between 0.0 and 1.0, or, if a parameter is provided, an integer between 0 and n-1.) |
| now | now() |

In Splunk, these functions are invoked with the eval command. In APL, use them as part of the extend or project operators, or with the where operator, depending on the function.
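As a rough sketch of how a few of these functions look in an APL query, the dataset and field names below are hypothetical:

['myDataset']
| extend upper_method = toupper(method)                  // string function in extend
| extend outcome = iff(status == "200", "ok", "error")   // conditional in extend
| where uri matches regex "^/api/"                       // regex match in where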

Filter

APL log queries start from a tabular result set in which a filter is applied. In Splunk, filtering is the default operation on the current index. You may also use the where operator in Splunk, but we don’t recommend it.

Splunk (search):

Sample.Logs="330009.2" method="GET" _indextime>-24h

APL (where):

['sample-http-logs']
| where method == "GET" and _time > ago(24h)
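Multiple where operators can also be chained, with each one further narrowing the result set. A minimal sketch, assuming req_duration_ms is a numeric field on the sample dataset:

['sample-http-logs']
| where method == "GET"
| where req_duration_ms > 100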

Get n events or rows for inspection

APL log queries also support take as an alias to limit. In Splunk, if the results are ordered, head returns the first n results. In APL, limit isn’t ordered, but it returns the first n rows that are found.

Splunk (head):

Sample.Logs=330009.2
| head 100

APL (limit):

['sample-http-logs']
| limit 100
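As noted above, take is an alias for limit, so the following is equivalent:

['sample-http-logs']
| take 100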

Get the first n events or rows ordered by a field or column

For the bottom results, in Splunk, use tail. In APL, specify ordering direction by using asc.

Splunk (head):

Sample.Logs="33009.2"
| sort Event.Sequence
| head 20

APL (top):

['sample-http-logs']
| top 20 by method
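For the bottom results, as mentioned above, specify ascending order with asc. A minimal sketch, assuming req_duration_ms is a numeric field on the sample dataset:

['sample-http-logs']
| sort by req_duration_ms asc
| limit 20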

Extend the result set with new fields or columns

Splunk's eval command has its closest APL counterpart in the extend operator. Both the eval command in Splunk and the extend operator in APL support only scalar functions and arithmetic operators.

Splunk (eval):

Sample.Logs=330009.2
| eval state = if(Data.Exception == "0", "success", "error")

APL (extend):

['sample-http-logs']
| extend Grade = iff(req_duration_ms >= 80, "A", "B")

Rename

APL uses the project operator to rename a field. When using project, a query can take advantage of any indexes that are prebuilt for a field. Splunk has a rename command that does the same.

Splunk (rename):

Sample.Logs=330009.2
| rename Data.Exception as exception

APL (project):

['sample-http-logs']
| project updated_status = status

Format results and projection

Splunk uses the table command to select which columns to include in the results. APL has a project operator that does the same and more.

Splunk (table):

Event.Rule=330009.2
| table rule, state

APL (project):

['sample-http-logs']
| project status, method

Splunk uses the fields - command to select which columns to exclude from the results. APL has a project-away operator that does the same.

Splunk (fields -):

Sample.Logs=330009.2
| fields - quota, highest_seller

APL (project-away):

['sample-http-logs']
| project-away method, status

Aggregation

See the list of aggregation functions available for use with summarize.

Splunk (stats):

search (Rule=120502.*)
| stats count by OSEnv, Audience

APL (summarize):

['sample-http-logs']
| summarize count() by content_type, status
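summarize isn't limited to count(). As a sketch, the query below computes a count and an average per status, assuming the avg() aggregation is available and req_duration_ms is numeric:

['sample-http-logs']
| summarize count(), avg(req_duration_ms) by status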

Sort

In Splunk, to sort in ascending order, you must use the reverse operator. APL also supports defining where to put nulls, either at the beginning or at the end.

Splunk (sort):

Sample.logs=120103
| sort Data.Hresult
| reverse

APL (order by):

['sample-http-logs']
| order by status desc
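As mentioned above, APL also lets you define where null values appear in the sorted output. A minimal sketch, assuming the KQL-style nulls first / nulls last syntax is supported:

['sample-http-logs']
| order by status desc nulls last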

Whether you’re just starting your transition or you’re in the thick of it, this guide can serve as a helpful roadmap to assist you in your journey from Splunk to Axiom Processing Language.

Dive into the Axiom Processing Language, start converting your Splunk queries to APL, and explore the rich capabilities of the Query tab. Embrace the learning curve, and remember, every complex query you master is another step forward in your data analytics journey.