July 31, 2024

#product, #engineering

Introducing Flow: Redefining event data processing


Author
Dominic Chapman

Head of Product

Today we are releasing a preview of Axiom Flow, which adds onward event processing, including filtering, shaping, and routing. Unlike other solutions, which filter and route lossily, Flow operates after data has been persisted in a highly efficient, queryable store, and it uses the same language, the Axiom Processing Language (APL), to define both stream and at-rest processing. No data is lost, all data can be flowed and queried as needed, and there are no performance tradeoffs.

The power of Flow

Flow introduces a streamlined approach to event data processing:

  1. Collect: Ingest 100% of your events into Axiom’s hyper-optimized store.
  2. Transform: Apply advanced filtering, transformation, and enrichment using our Axiom Processing Language (APL).
  3. Process: Run one-time, scheduled, or continuous processing jobs.
  4. Dispatch: Send processed data to your chosen destinations.
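The Transform step above can be sketched in APL. This is an illustrative example only; the dataset name `['http-logs']` and the fields `status`, `service`, and `duration_ms` are hypothetical, not taken from this post:

```apl
['http-logs']
| where status >= 500
| extend is_timeout = duration_ms > 30000
| project _time, service, status, duration_ms, is_timeout
```

The same query that filters and shapes events here could equally be run against data at rest, which is the unified experience Flow is built around.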

What sets Flow apart is its seamless integration with Axiom’s existing capabilities. You can now use the same powerful language to query data at rest and shape data in motion, providing a unified experience that streamlines your data workflows.

Neil Jagdish Patel

Co-founder and CEO, Axiom

└ Example of flow creation workflow.

Transforming data management

We designed Flow to address several key challenges faced by organizations. Here are three primary use cases where our design partners have seen significant value:

Cost optimization without compromise

Imagine you’re a Platform Engineer at a fast-growing company. Your data volumes have exploded over the past few years, and with them, the license costs for your log management vendor. You’ve since added specialized tools for observability, cloud monitoring, and more. You’re tasked with providing greater visibility for your engineering teams while simultaneously reducing costs. It feels like an impossible challenge.

This is where Flow comes in. Here’s how you can implement a cost optimization strategy:

  1. Begin ingesting 100% of your events into Axiom’s cost-effective store.
  2. Use Flow to create selective data streams for your existing tools.
  3. Gradually reduce retention periods in downstream tools as teams start analyzing events directly in Axiom.
  4. Monitor usage patterns and costs, adjusting your flows as needed to optimize spend.
  5. Over time, evaluate which specialized tools can be entirely replaced by Axiom, further reducing tool sprawl.
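Step 2 above, a selective data stream, might look like the following APL sketch. It forwards only high-severity events to a legacy tool while dropping a bulky field; the dataset name `['app-logs']` and the fields `level` and `raw_payload` are assumed for illustration:

```apl
['app-logs']
| where level in ('error', 'fatal')
| project-away raw_payload
```

Because 100% of events remain queryable in Axiom, narrowing what you forward downstream reduces license costs without losing visibility.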

Democratizing data processing

Consider a scenario where your engineering team constantly grapples with raw, unstructured event data. Your existing tools lack robust processing capabilities, forcing your team to rely on custom scripts and error-prone workflows. It’s slowing down time-to-insight and frustrating your team, all while risking data breaches due to overly broad permissions or creating bottlenecks from overly restrictive access controls.

Flow changes this entirely. Here’s how you can implement a more democratic data processing approach:

  1. Centralize your raw event data in Axiom.
  2. Use APL to create standardized data processing flows.
  3. Set up a library of reusable APL snippets for common operations.
  4. Train your team on APL basics, emphasizing its similarity to familiar query languages.
  5. Encourage experimentation by setting up sandbox environments where team members can test new flows without affecting production data.
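A reusable APL snippet for the shared library in step 3 might standardize a common operation such as masking email addresses. This is a hedged sketch; the dataset `['events']`, the fields shown, and the exact regex are illustrative assumptions:

```apl
['events']
| extend email = replace_regex(email, @'(^[^@]{2})[^@]*(@.*$)', @'\1***\2')
| project-keep _time, user_id, email, action
```

Snippets like this let team members apply vetted transformations without writing custom scripts or requesting broad access to raw data.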

Unifying your data estate

For many organizations, the data landscape resembles a sprawling, disconnected estate. Different teams use different tools, creating silos and inconsistencies. You’re left wondering how to bring it all together without disrupting workflows or losing valuable data.

Flow provides a clear answer. Here’s a step-by-step approach to unifying your data estate:

  1. Audit your current data landscape, identifying all data sources and destinations.
  2. Set up Axiom as your central data repository, beginning with your most critical or problematic data sources.
  3. Use Flow to recreate existing data pipelines.
  4. Implement role-based access controls in Axiom to maintain proper data governance.
  5. Introduce teams to Axiom’s querying capabilities, showing how they can reach answers faster than before.
  6. As adoption grows, consolidate redundant tools, using Flow to ensure no critical data is lost in the process.
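Recreating an existing pipeline (step 3) typically means reshaping events to match a downstream schema. A hypothetical APL sketch, where the dataset `['legacy-syslog']` and the fields `severity`, `host`, and `message` are assumed names:

```apl
['legacy-syslog']
| where severity <= 3
| extend environment = case(host startswith 'prod-', 'production', 'staging')
| project _time, host, environment, severity, message
```

Running the recreated flow in parallel with the legacy pipeline lets you verify parity before cutting over.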

└ Reviewing an active flow.

Key benefits of Flow

  • Lower Total Cost of Ownership compared to alternatives like Kafka-based pipelines
  • Single language for querying and processing data
  • Comprehensive metrics on all processing runs for pipeline observability
  • Flexible role-based access control for complete governance
  • Zero maintenance as part of our cloud SaaS offering
  • Risk-free experimentation with all data safely recorded
  • Freedom from vendor lock-in

The future of Flow

While Flow already offers significant value, we’re just beginning to explore its potential. We envision Flow becoming the core processing engine for all event data across organizations. Future possibilities include:

  • Integration with monitoring and alerting systems: Imagine real-time anomaly detection, dynamic alert thresholds, and contextually rich notifications that evolve with your data patterns.

  • Feeding data into AI agents: Envision seamless data preparation for AI models, enabling continuous learning pipelines and multi-model orchestration for advanced decision-making processes.

  • Triggering service-specific functions: Picture automated incident responses, dynamic resource allocation, and personalized user experiences - all driven by real-time data flows.

Frequently asked questions

As we launch the preview of Flow, we anticipate you may have some questions. Here are answers to some of the most common queries:

  • Is there a cost for using Flow during the preview period? No, Flow is completely free during the early access preview. We want you to explore its full potential without any cost constraints.

  • What about data security and privacy? Your security is our priority. Flow adheres to the same rigorous security standards as the rest of Axiom, including SOC 2 and GDPR compliance. See our security documentation for details.

  • Which destinations are supported? During the preview, Flow supports other Axiom datasets, HTTP endpoints, and S3 buckets as destinations. We’ll expand destination options based on feedback from preview customers.

  • Which flow types will be available during the preview? We’re launching with support for one-time flows. Scheduled and continuous flows will follow shortly after the initial preview launch, adding support for recurring and real-time data processing.

  • How long will the preview period last? We expect the preview period to last no more than 3 months, though this may change based on user feedback and feature development. Early access users will receive advance notice before general availability.

Experience Flow today

To join the preview, visit the Flows tab in your Axiom organization and express your interest.

Flow represents a significant step forward in event data processing, and we’re eager to see how you’ll use it to unlock new insights for your organization. As we continue to evolve and expand Flow’s capabilities, we’ll keep you updated on new developments. We’re also excited to share a deep dive on the technology behind Flow soon. Stay tuned!

Get started with Axiom

Learn how to start ingesting, streaming, and
querying data with Axiom in less than 10 minutes.