Levels of engagement
Add streaming to legacy
Many of our customers begin with legacy systems for data collection, sometimes using a store-then-query model, and are now upgrading to a streaming environment. With Kafka as a connector, they can easily feed both legacy data and new data streams into Blaze for processing.
Some clients are already doing stream processing but need performance improvements. Kafka makes it easy to plug our capabilities into a streaming architecture and bring processing speed up to 1 million records per second per core.
If you're assembling a new stream computing system from scratch and want a best-of-breed solution from the start, Blaze is the only architecture built from the ground up for end-to-end streaming data processing.
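The common thread across all three engagement levels is using Kafka to carry both replayed legacy data and live events into a single processing path. The sketch below illustrates that flow in plain Python; it is only an illustration, using in-memory queues as stand-ins for Kafka topics so it runs without a broker (a real deployment would use a Kafka client library such as kafka-python's `KafkaConsumer`).

```python
# Illustrative sketch only: in-memory queues stand in for Kafka topics
# so the merge-and-process flow is runnable without a live broker.
from queue import Queue, Empty

def make_topic(records):
    """Stand-in for a Kafka topic: a queue pre-loaded with records."""
    q = Queue()
    for r in records:
        q.put(r)
    return q

def consume(*topics):
    """Merge records from several topics into one stream (round-robin)."""
    pending = list(topics)
    while pending:
        for t in list(pending):
            try:
                yield t.get_nowait()
            except Empty:
                pending.remove(t)

# Legacy store-then-query data replayed onto a topic, alongside live events.
legacy = make_topic([{"src": "legacy", "value": 10},
                     {"src": "legacy", "value": 20}])
live = make_topic([{"src": "live", "value": 5}])

# Both feeds flow through the same downstream processing path.
total = sum(rec["value"] for rec in consume(legacy, live))
print(total)  # 35
```

The point of the sketch is the architecture, not the arithmetic: once legacy and live feeds land on topics, downstream processing is identical for both.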
SQLstream Blaze on Kafka
Since SQLstream Blaze processes data at scale and Kafka processes messages at scale, the two technologies naturally complement each other. The resulting distributed data management architecture enables real-time processing for Fast & Big Data.
Streaming SQL applications can be distributed across multiple processing cores in a server, multiple servers in a data center, and multiple data centers in the cloud.
Seamless integration with existing machine data collection and enterprise systems.
High-availability, fault-tolerant operation: applications run continuously on data in motion, without interrupting execution.
High-performance ingestion and load of data, at rates of millions of records per second.
A complete, centralized, resilient, and SQL standards-compliant pipeline.
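The core idea behind the capabilities above is that a streaming SQL query runs continuously over data in motion, aggregating as records arrive rather than storing them first and querying later. The following is a minimal sketch of that idea in plain Python, counting records per tumbling time window; it is an illustration of the concept, not SQLstream Blaze's actual execution engine, and the timestamped sample records are invented for the example.

```python
# A minimal sketch of a continuous windowed aggregate: records are
# bucketed into tumbling time windows as they arrive, so results are
# available while data is in motion. Illustration only, not Blaze itself.
from collections import defaultdict

def tumbling_window_counts(records, window_secs=1):
    """Count records per tumbling window of width window_secs."""
    counts = defaultdict(int)
    for ts, _payload in records:
        window = int(ts // window_secs)  # which window this record falls in
        counts[window] += 1
    return dict(counts)

# Timestamped records as they might arrive from a Kafka topic.
stream = [(0.1, "a"), (0.4, "b"), (1.2, "c"), (2.7, "d"), (2.9, "e")]
print(tumbling_window_counts(stream))  # {0: 2, 1: 1, 2: 2}
```

In a streaming SQL system the equivalent query would be expressed declaratively (a windowed `GROUP BY` over a stream) and the engine would handle distribution and fault tolerance.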