Latest and greatest

What’s new

AWS presents...

After AWS licensed it to power Amazon Kinesis Analytics, Blaze is now available (and in good company) on AWS Marketplace.


Thursday demos

The streaming ingestion, analytics, and technical demos are back by popular demand and will air every Thursday until 3/23.



Amazon Web Services (AWS) has licensed and implemented a subset of core technology from SQLstream Blaze to power its Kinesis service.



A collection of past and future webinars, live shows, and industry conferences where you can find us.

  • Thursday demos | Live webinars with SQLstream

    Since our Real time for the bottom line webinar series was such a success, we decided to bring it back in smaller, recurring installments. Starting Thursday, 1/26, we’ll alternate between streaming ingestion, streaming analytics, and more technical tutorials, so you can pick the timing that works best for you.


    Streaming ingestion and continuous ETL

    January 26th | February 16th | March 9th

    Let’s say you DO produce a lot of streaming data, so you know intelligence is to be had. But if you can’t capture it all, and direct it to where it’s needed in real time, how can the picture be 100% complete?

    We maintain that Big Data processing and traditional ETL solutions are too slow, too complicated, and too conditional to support real-time results. And simply put, without streaming ingestion, there’s no streaming analytics, no action, and no results.
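    To make the idea of continuous, streaming ETL concrete, here is an illustrative sketch in streaming SQL. The stream names, pump name, and columns are hypothetical, invented for this example; the general shape (a pump that continuously inserts filtered rows from a source stream into a target stream) is what a streaming-ingestion pipeline looks like:

    ```sql
    -- Illustrative only: stream, pump, and column names are hypothetical.
    -- A pump continuously moves records from a raw source stream into a
    -- cleaned target stream, filtering as it goes -- ETL with no batches.
    CREATE PUMP LogEtlPump AS
    INSERT INTO CleanLogStream (ts, host, status)
    SELECT STREAM s.ROWTIME, s.host, s.status
    FROM RawLogStream AS s
    WHERE s.status >= 400;
    ```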


    No streaming analytics? Sorry, no action for you

    February 2nd | February 23rd | March 16th

    Let’s say you CAN capture all the streaming data you produce, and you CAN integrate it with your stored data in one architecture. But if you can’t analyze it continuously and in real time, how can the results be 100% reliable?

    The batch-oriented, collect-store-contemplate model employed by Big Data analytics technologies is incomplete because it does not make use of live data in real time. At the same time, most Fast Data technologies don’t integrate with stored data, so they’re missing the historical context for their insights.


    A SQL architecture for streaming

    February 9th | March 2nd | March 23rd

    We’re often asked why we chose to build a standards-compliant SQL platform (over 2M lines of code, and growing). Here’s our answer:
    1. SQL performs beautifully for both scale-up and scale-out implementations.
    2. SQL is the only language that can seamlessly integrate streaming and stored data for streaming analytics.
    3. Streaming technologies, DBMSs, and Hadoop are friends, not foes.

    In this webinar, we’ll demo a wide range of operations run on our 100% SQL-compliant streaming analytics architecture.
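    A streaming query of the kind demoed here might look like the following illustrative sketch. The stream and table names are hypothetical, not taken from the webinar; the point is that a single SQL statement can window a live stream and join it against stored reference data:

    ```sql
    -- Illustrative only: stream/table names are hypothetical.
    -- Continuously compute a one-minute rolling average of readings
    -- from a live stream, enriched with stored reference data via
    -- an ordinary relational join.
    SELECT STREAM
        s.ROWTIME,
        d.device_name,
        AVG(s.reading) OVER (RANGE INTERVAL '1' MINUTE PRECEDING) AS avg_reading
    FROM SensorStream AS s
    JOIN DeviceTable AS d
        ON s.device_id = d.device_id;
    ```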


  • Webinar | Streaming Action means Transaction

    While streaming ingestion can help you capture the right data and streaming analytics can make sure you understand it, being able to say “I got this” does NOT improve the business process. You need more.

    Recording coming soon

  • Webinar recording | Streaming ingestion with Teradata and ECaTS

    The 10/25 installment of the Real time for the bottom line webinar series, hosted by SQLstream.


  • Webinar recording | Streaming analytics with Forrester

    The 9/13 installment of the Real time for the bottom line webinar series, hosted by SQLstream.


Latest posts

A collection of our own thoughts, commentary, and official SQLstream news.

March 20, 2017

Strata 2017 Wrap-up: 7 Trends We Can’t Afford to Ignore

March 3, 2017

5 Ways to Measure if Fast Analytics on Hadoop Are Fast Enough

February 16, 2017

Why we need SQL for data stream processing and real-time streaming analytics

January 25, 2017

SQLstream brings back Thursday demos



The most recent editions of our bulletin board, with news on new partnerships, our customers, and the market at large.