Real-time business context (and the money-making decisions it dictates) depends entirely on exploiting massive volumes of fast-moving data within milliseconds. The projected ROI of real-time data is tied directly to discovering, wrangling, and delivering all available live data into the applications and services that need it. While Big Data processing and traditional ETL solutions are too slow, too complicated, and too conditional to support real-time results, streaming ingestion is on the rise. Without streaming ingestion, there's no streaming analytics, no action, and no results.

In this webinar, SQLstream and our guests explore, explain, and exemplify:

  • the creation and management of data pipelines through automatic discovery and transformation between data formats, interfacing with a wide array of sources and destinations including Amazon Kinesis and Firehose, Hadoop, data warehouses, message buses (including Kafka), files, and devices
  • the delivery of accurate, complete, and consistent data flows through continuous, real-time operations: loading, data wrangling, parsing, and filtering
  • the transformation of both live data streams and historical data streams brought live through streaming ingestion
  • the integration of multiple disparate data streams, concurrently, continuously, and at rates of millions of records per second per CPU core.
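The ingest-wrangle-filter-deliver loop described above can be sketched in outline. This is a minimal, generic Python illustration of the concept; the function names (`parse_record`, `wrangle`, `keep`, `ingest`) are hypothetical and do not represent SQLstream Blaze's actual interface, which is streaming-SQL based:

```python
import json
from typing import Iterable, Iterator


def parse_record(raw: str) -> dict:
    """Parse one incoming record; JSON is assumed for this sketch."""
    return json.loads(raw)


def wrangle(rec: dict) -> dict:
    """Wrangling step: normalize keys so downstream consumers see one schema."""
    return {k.lower(): v for k, v in rec.items()}


def keep(rec: dict) -> bool:
    """Filtering step: drop incomplete records before delivery."""
    return rec.get("value") is not None


def ingest(stream: Iterable[str]) -> Iterator[dict]:
    """Continuous pipeline: parse -> wrangle -> filter, one record at a time.

    Because it is a generator, each record is delivered downstream as soon
    as it arrives, rather than after a batch completes.
    """
    for raw in stream:
        rec = wrangle(parse_record(raw))
        if keep(rec):
            yield rec


# Example: three incoming records; the incomplete one is filtered out.
raw_stream = ['{"Value": 1}', '{"Value": null}', '{"Value": 3}']
print(list(ingest(raw_stream)))  # [{'value': 1}, {'value': 3}]
```

The generator structure is what distinguishes streaming ingestion from batch ETL: records flow through continuously, so there is no wait for a load window to close before analytics can act on the data.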

Live demos and case studies

  • Damian Black demos the painless ingestion of data into and out of Amazon Web Services using SQLstream Blaze: data is moved continuously and in real time from data sources to the cloud, and back out of the cloud, through a high-performance bi-directional SQLstream Blaze Kinesis Adapter.
  • Chris Duxler showcases the ingestion of continuous, real-time 911 data: ECaTS uses continuous, real-time data ingestion for the live analysis and dashboard visualization of data that helps save lives.
  • Altan Khendup discusses real-time big data ingestion, Teradata Unified Data Architecture, and the evolution of ETL.

Watch the demo

