EVENTS | SQLstream webinar series continues with Live Action event

Real-time for the bottom line

Business success depends on a company's ability to make the right decision, at the right time, all the time. But with Fast Data on the rise and an economic environment that changes by the second, the ability to take action relies heavily on data management technologies, and on the company's judgment in choosing the best ones.
In the Real-time for the bottom line webinar series, SQLstream and our esteemed guests look into how true real-time technologies can drive financial success by empowering people and machines to make the right decision, at the right time, all the time.

Upcoming

EPISODE III: Streaming Action Means Transaction or How to operationalize real-time insight

December 20, 2016 | 10AM PST | REGISTER

While streaming ingestion can help you capture the right data and streaming analytics can help you understand it, being able to say "I've got this" does not by itself improve the business process. State-of-the-art real-time reporting and aggregation are great, but it is not until streaming processes are integrated with transactions that value reaches applications, continuously and in real time.

In this final installment of the Real-time for the bottom line series, SQLstream will discuss and demonstrate how streaming technologies can push insights into business operations (see the sketch after this list), so that:

– information is timely, contextually relevant, and actionable every moment;

– action is real-time, intelligent, and push-based;

– there’s continuous load to storage or other systems (e.g., visualizations, Hadoop);

– there’s live, continuous, and instant testing and development;

– operations, infrastructure, and services are adaptive, predict needs, and improve automatically.
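
To make the push-based pattern above concrete, here is a minimal, illustrative sketch in Python: a generic stream-to-action loop, not SQLstream's actual API. The sensor stream, window size, and threshold are all hypothetical; the point is that a condition detected in-stream directly triggers a transaction-like action.

```python
# Illustrative only: a generic stream-to-action loop, not SQLstream's API.
import random
from collections import deque
from itertools import islice

WINDOW_SIZE = 20        # readings per rolling window (hypothetical)
ALERT_THRESHOLD = 80.0  # trigger level for taking action (hypothetical)

def sensor_stream():
    """Simulate an endless stream of sensor readings."""
    while True:
        yield random.uniform(50.0, 100.0)

def take_action(avg):
    """Stand-in for a real transaction (DB write, API call, alert)."""
    print(f"ACTION: rolling average {avg:.1f} exceeded {ALERT_THRESHOLD}")

window = deque(maxlen=WINDOW_SIZE)
for reading in islice(sensor_stream(), 10_000):  # bounded for the demo
    window.append(reading)
    if len(window) == WINDOW_SIZE:
        avg = sum(window) / WINDOW_SIZE
        if avg > ALERT_THRESHOLD:
            take_action(avg)  # the insight itself drives the transaction
            window.clear()    # reset to avoid re-firing on the same episode
```

The design choice worth noting is that the action is pushed the moment the condition holds, rather than discovered later by polling a report or a dashboard.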


Past events

EPISODE I: Streaming analytics or How to stop wasting money on unactionable analytics

September 13, 2016 | 10AM PST | RECORDING

GUEST SPEAKER | Mike Gualtieri, VP, Principal Analyst @ Forrester

SQLstream and Forrester explored how streaming analytics applications can be built in minutes (see the sketch after this list) to:

– Aggregate, enrich, and analyze a high throughput of data from multiple, disparate live data sources, in any format, to identify patterns, detect opportunities, automate actions, and dynamically adapt
– Easily ingest streaming data from multiple, disparate sources to multiple destinations, within and between cloud and on-premises environments
– Analyze and act on data as it arrives, without needing to store it first, eliminating unnecessary security risks and storage costs
– Enable real-time analytics with existing business intelligence and data assets.
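
As one way to picture the "analyze without storing" point, here is a minimal Python sketch (our illustration, not code from the webinar): events are enriched against reference data and aggregated over tumbling windows entirely in flight. The device IDs, regions, and timestamps are invented for the example.

```python
# Illustrative only: enrich and aggregate events in flight, without
# landing them in storage first.
from collections import defaultdict

# Hypothetical reference data used to enrich each event as it arrives.
REGION_BY_DEVICE = {"dev-1": "us-west", "dev-2": "us-east", "dev-3": "us-west"}

def tumbling_counts(events, window_seconds=60):
    """Yield per-region event counts for each completed time window."""
    current_window, counts = None, defaultdict(int)
    for ts, device_id in events:
        window = ts // window_seconds
        if current_window is not None and window != current_window:
            yield current_window * window_seconds, dict(counts)  # emit window
            counts = defaultdict(int)                            # then reset
        current_window = window
        counts[REGION_BY_DEVICE.get(device_id, "unknown")] += 1  # enrichment

# Usage: events arrive as (timestamp, device_id) pairs; nothing is stored.
events = [(0, "dev-1"), (10, "dev-2"), (65, "dev-3"), (70, "dev-1"), (130, "dev-2")]
for window_start, per_region in tumbling_counts(events):
    print(window_start, per_region)
```

Because each window's counts are emitted and discarded as soon as the window closes, the raw events never need to be persisted, which is the source of the security and storage savings claimed above.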

EPISODE II: Way Beyond ETL and Micro-batch: Continuous and Real-time Data Ingestion or Ingestion into Amazon Kinesis, data pipelines, flows, and the integration of fast and big data now

October 25, 2016 | 10AM PST | RECORDING

GUEST SPEAKERS | Altan Khendup, Global Practice Leader, Unified Data Architecture @ Teradata, and Chris Duxler, Operations Director @ ECaTS

SQLstream, Teradata, and ECaTS explored, explained, and exemplified (see the sketch after this list):

– the creation and management of data pipelines through automatic discovery and transformation of any data format to any other, interfacing with a wide array of sources and destinations including Amazon Kinesis and Firehose, Hadoop, data warehouses, message buses (including Kafka), files, and devices
– the delivery of accurate, complete, and consistent data flows through continuous, real-time operations such as load, data wrangling, parsing, and filtering
– the transformation of both live data streams and historical data brought live through streaming ingestion
– the integration of multiple, disparate data streams, concurrently, continuously, and at rates of millions of records per second per CPU core.
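
For a flavor of what a continuous load into Amazon Kinesis can look like, here is a short, generic boto3 sketch in Python (our illustration, not SQLstream's pipeline engine). It assumes AWS credentials are configured and a Kinesis stream named "example-stream" exists; the CSV layout is invented for the example.

```python
# Illustrative only: a generic CSV-to-Kinesis pipeline using plain boto3,
# not SQLstream's pipeline engine. Stream name and schema are hypothetical.
import json
import boto3

kinesis = boto3.client("kinesis")

def csv_to_json(line):
    """Transform one simple CSV record into a JSON-ready dict (toy format)."""
    device_id, ts, value = line.strip().split(",")
    return {"device_id": device_id, "ts": int(ts), "value": float(value)}

def pipeline(lines):
    """Continuously read CSV lines, transform them, and load into Kinesis."""
    for line in lines:
        record = csv_to_json(line)
        kinesis.put_record(
            StreamName="example-stream",           # hypothetical stream name
            Data=json.dumps(record).encode("utf-8"),
            PartitionKey=record["device_id"],      # distributes across shards
        )

# Usage: pipeline(open("readings.csv")) would stream the file into Kinesis,
# record by record, as lines become available.
```

The same loop shape applies whether the source is a file, a socket, or a message bus: parse, transform the format, and push each record downstream as it arrives rather than in periodic batches.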