Cribl allows you to take any data collected from anywhere and reliably deliver it to where it will provide maximum value to your business.
Given the cost and complexity of managing log analytics at scale, no single store can affordably hold all the data. Some data is destined straight for an archive, where it can be laid to rest affordably and will likely never be accessed. Other data is best put into a batch analytics system. Still more is destined for log search.
All Logs Are Not Created Equal
Many logs are of dubious quality. Bad logs contain sensitive information, are overly verbose and voluminous, are hard to parse, and lack key context for operations, security, and analytics.
Log analytics admins have been left with only two choices: ingest the data as-is, or go back to developers and vendors, ask for fixes, and wait weeks or months for new releases.
More Depth. Less Noise.
To deliver maximum value, the next generation of log analytics will require users to think differently about machine data. What’s valuable? What’s noise? What needs to be enriched? What’s in scope? Which systems should receive this data?
Cribl Puts Admins in Control
Cribl LogStream gives you the right data, with the right context, delivered to the right systems to enable operations, security, and analytics, without pushing every requirement back to the source systems.
Sample high-volume data like NetFlow logs, keeping only a representative subset. Redact PII without software changes. Trim overly verbose messages down to only what you need. Look up an IP address at ingestion time instead of days later. Leave debug logging on all the time and ingest it only when troubleshooting. Alert and notify on problems in true real time.
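To make the redaction idea concrete, here is a minimal, illustrative sketch in Python. It is not Cribl's actual pipeline syntax or API; the patterns and mask strings are assumptions chosen for the example, showing only the general technique of masking sensitive fields in events before they reach storage.

```python
import re

# Illustrative only: a generic redaction pass, not Cribl's pipeline
# configuration. Masks US-style SSNs and 16-digit card numbers.
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
CARD_RE = re.compile(r"\b(?:\d{4}[- ]?){3}\d{4}\b")

def redact(event: str) -> str:
    """Return the event with SSN- and card-shaped tokens masked."""
    event = SSN_RE.sub("XXX-XX-XXXX", event)
    event = CARD_RE.sub("####-####-####-####", event)
    return event

print(redact("user=jdoe ssn=123-45-6789 card=4111 1111 1111 1111"))
# → user=jdoe ssn=XXX-XX-XXXX card=####-####-####-####
```

Because the redaction happens in the stream, downstream systems never see the sensitive values, and no source-application code has to change.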
Cribl LogStream is purpose-built for real-time log management. Cribl enables enterprises to collect 100% of potentially interesting data, determine at ingestion time what actually is interesting, and then secure, enrich, and route that data to maximize the value of machine data for their business.
Cribl LogStream is a real-time log processor that enables customers to transform machine data in motion before routing it to any system, including Splunk, Elasticsearch, S3, or Kafka. LogStream is first of its kind: purpose-built for logs, it helps customers reuse their existing investments in proprietary log pipelines to send data to their multitude of tools, while securing the contents of their data and controlling costs. For more information, check out our website at https://www.cribl.io/.
Cribl LogStream is a real-time log processor that enables customers to control costs, secure the contents of their log data, and send the right data to and from any system, including Splunk, Elasticsearch, S3, or Kafka. Here are some example use cases:
- Data enrichment: Look up events against threat lists or even DNS to bring in richer logs
- Smart sampling: Keep all errors or other interesting events and sample the rest
- Data routing: Put full fidelity data in S3 while putting only interesting data in an index
- Filter noise: Drop noisy events using out-of-the-box content for known noisy sources
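The smart-sampling use case above can be sketched in a few lines. This is an illustrative Python sketch, not Cribl's configuration language; the `"ERROR"` match and the keep rate are assumptions standing in for whatever filter a real pipeline would use. The idea is simply to keep every interesting event while passing only a fraction of the rest.

```python
import random

# Illustrative sketch of smart sampling (not Cribl's configuration
# language): keep every error event, pass roughly 1 in N of the rest.
def smart_sample(events, keep_rate=10, rng=random.random):
    for event in events:
        if "ERROR" in event:
            yield event                # always keep interesting events
        elif rng() < 1.0 / keep_rate:
            yield event                # sample ~1/keep_rate of the rest

events = ["INFO heartbeat", "ERROR disk full", "INFO heartbeat"]
kept = list(smart_sample(events, keep_rate=10))
```

Injecting the random source (`rng`) keeps the sketch testable; a deterministic function can force the sampler to keep everything or only the errors.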
For more info, including additional use case ideas and deployment options, check out the website at https://www.cribl.io/. The product is free below 100 GB of daily ingestion, and it's downloadable on our website.
Longer Form Mini-Case Study
Cribl is a machine data engineering tool, purpose-built to process log and metric data at high scale, in real time, before forwarding it on to your existing analysis tools such as Splunk or Elasticsearch, your event streaming platforms such as Kafka or Kinesis, or your data lakes such as Snowflake, Hadoop, or S3. Cribl lets you keep your investment in your existing deployed footprint of log agents. By processing the data in real time before storage, we help our customers route data to and from any system, redact potentially sensitive information, enrich data from sources such as threat lists and service discovery, and control costs by aggregating, sampling, and filtering data before storage.
Cribl was engaged by a Fortune 50 manufacturing conglomerate that was processing more than 7 terabytes a day of CrowdStrike log data into Splunk. CrowdStrike data presents two problems common to many of our use cases: the consumer receives overly verbose logs containing data they are not interested in, and changing the contents of the log messages is either impossible or a very slow enhancement request to developers. Using Cribl, our customer dropped verbose fields that weren't of interest and reduced ingestion from 7 terabytes a day to less than 3, cutting their monthly AWS spend by over $40,000. All told, Cribl offset more than $1,000,000 a year in software and infrastructure spend on this single use case.