December 16, 2024

#product, #company

From burden to asset: reimagining logs at scale


Author: Christina Noren, Advisor

Several colleagues recently shared Omer Singer’s astute analysis of Splunk’s innovator's dilemma. While his insights are spot-on, we’re watching something even more fundamental unfold in the event data space – a story that goes beyond classic disruption theory.

The narrative begins with a simple truth: what started as a bargain has become a burden. Early adopters of Splunk, where I was head of product for its first seven years, remember our initial value proposition. Splunk’s search approach dramatically simplified making sense of machine data, its pricing was reasonable, and there was no serious alternative.

A textbook case of innovator’s dilemma

But as data volumes exploded and use cases multiplied, something shifted. What was once a cost-effective option has become, for many organizations, their most expensive operational line item. This isn’t just about inflation or scaling costs. It’s about a fundamental misalignment between legacy architectures and modern data realities.

When you’re a market leader with an established revenue model, optimizing for customer cost savings isn’t just difficult — it can be existentially threatening. Your investors expect consistent growth. Your top salespeople rely on those large deals. Your entire organization is built around maintaining and growing that revenue structure. Splunk’s aging bones will bring new owner Cisco several billion dollars in inertial license renewals this coming year. Cisco won’t risk refactoring that foundation.

Divided tools, diminished returns

Market consolidation also whittles down the scope and ambitions of vendors. In my years at Splunk, we adamantly opposed pigeonholing it solely as a security tool, because the ability to search all your logs has so many other valuable uses. But the day Splunk was acquired by Cisco in late 2023, it was recast as a “cybersecurity and observability” brand.

Why? Because for your org’s other goals, Cisco has another acquired product to sell you. Want business context for your logs? You need an AppDynamics integration. Need real-time network visibility? That’s ThousandEyes territory. Each need is met by a specialized tool, rather than a unified approach to event data.

Too many challengers also market their products as single-purpose tools. They say that they’re all about observability, or security, or product analytics, and so on, as if each use case needed its own redundant data path.

We’re not competing. We’re obsoleting.

But here’s the thing: organizations are already abandoning this siloed thinking. They don’t want a hodgepodge of different engines for different teams, with redundant fees and incompatible formats. They want to phase these out and consolidate on a unified foundation that will handle all current and future use cases, from traditional monitoring and security to emerging needs like AI training and product analytics. And they want vendors that will continuously optimize cloud compute and storage to deliver more value at lower cost.

Axiom isn’t about competing with legacy platforms or their shortsighted challengers. It’s about making them obsolete.

That’s why I joined Axiom as an advisor. Axiom’s approach transcends incremental improvements to reimagine what’s possible. They don’t hide disconnected architectures behind a happy-face UI; they build everything on a unified core to solve today’s problems without limiting future growth.

Rather than cobbling together legacy code and new acquisitions, Axiom redesigned the entire logging pipeline from scratch as a cohesive and extensible whole. Its storage compression is variable, adapting in real time to the content of incoming events to deliver 95% or better compression for most data, while the format remains optimized for query efficiency and supports applying schema at read time to interpret events as needed.
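To make “schema at read time” concrete, here is a rough sketch of the general idea (not Axiom’s actual code or API): events are stored exactly as they arrive, and a field mapping supplied at query time decides how they are interpreted. The names below (RawEvent, queryWithSchema, the sample fields) are hypothetical.

```typescript
// Illustrative sketch of schema-on-read, not Axiom's implementation or API.
// Events are stored untouched; a schema is applied only when a query runs.

type RawEvent = Record<string, unknown>;

// The "schema" is just a read-time projection supplied by the caller.
type ReadSchema<T> = { [K in keyof T]: (event: RawEvent) => T[K] };

// Events land in storage exactly as they arrived, with no upfront parsing contract.
const store: RawEvent[] = [
  { ts: "2024-12-16T10:00:00Z", msg: "GET /health 200", svc: "api" },
  { timestamp: "2024-12-16T10:00:01Z", message: "payment failed", service: "billing", code: 502 },
];

// At query time, heterogeneous events are interpreted through one schema.
function queryWithSchema<T>(events: RawEvent[], schema: ReadSchema<T>): T[] {
  return events.map((e) =>
    Object.fromEntries(
      Object.entries(schema).map(([field, extract]) => [field, extract(e)])
    ) as T
  );
}

// A read-time schema reconciling two field-naming conventions in the same dataset.
const rows = queryWithSchema(store, {
  time: (e) => String(e.ts ?? e.timestamp ?? ""),
  service: (e) => String(e.svc ?? e.service ?? "unknown"),
  status: (e) => Number(e.code ?? 200),
});

console.log(rows); // both events come back in one consistent shape
```

The point is that nothing about the stored events has to be decided in advance; the interpretation lives entirely in the query.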

Axiom is designed to serve as the sole long-term repository of 100% of an org’s event data, reshaping and routing some to specialized tools where there’s value in doing so. And as cloud tech evolves, Axiom’s engineers continuously optimize every component to keep up. Their goal is to enable organizations to wring value from their growing log volumes in every way possible.
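As a loose illustration of that “single repository, selective routing” idea, here is a hypothetical sketch (the names LogEvent, Route, ingest, and the SIEM destination are made up, not Axiom’s actual pipeline): every event is retained centrally, and only matching slices are reshaped and forwarded.

```typescript
// Hypothetical sketch of "one repository, selective routing".
// Every event is retained in the central store; only the slices that benefit
// from a specialized tool are reshaped and forwarded to it.

interface LogEvent {
  timestamp: string;
  service: string;
  level: "debug" | "info" | "warn" | "error";
  body: Record<string, unknown>;
}

interface Route {
  name: string;                      // e.g. a SIEM or APM destination
  match: (e: LogEvent) => boolean;   // which events are worth forwarding
  reshape: (e: LogEvent) => unknown; // trim and rename fields for that tool
  forward: (payload: unknown) => Promise<void>;
}

async function ingest(event: LogEvent, store: LogEvent[], routes: Route[]): Promise<void> {
  // 1. The long-term repository keeps 100% of events, always.
  store.push(event);

  // 2. Specialized tools receive only the reshaped subset they need.
  await Promise.all(
    routes
      .filter((route) => route.match(event))
      .map((route) => route.forward(route.reshape(event)))
  );
}

// Example: forward only auth errors to a (made-up) SIEM destination.
const routes: Route[] = [
  {
    name: "siem",
    match: (e) => e.service === "auth" && e.level === "error",
    reshape: (e) => ({ at: e.timestamp, detail: e.body }),
    forward: async (payload) => console.log("-> siem", payload),
  },
];

const repository: LogEvent[] = [];
void ingest(
  { timestamp: "2024-12-16T10:00:02Z", service: "auth", level: "error", body: { msg: "login failed" } },
  repository,
  routes
);
```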

The future of event data won’t be determined by who has what market share today, but by who best transforms customers’ data from burden to advantage.
