Writing to Apache Kafka

The Kafka ECDA adapter writes batches of data to a Kafka server. Before you can write to Kafka, you must define a server object for the Kafka server. This topic describes setting up a foreign stream that references that server object and performing an INSERT into the stream in order to write data to Kafka.

To write data, you first define a server object with connection information, such as the Kafka broker list and the topic to write to. Once you define this server object, you can write to Kafka by referencing it. See the topic CREATE SERVER in the s-Server Streaming SQL Reference Guide for more details.
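
A minimal server definition for Kafka might look like the following sketch, which assumes the ECDA foreign data wrapper and the type 'KAFKA'; the server name "KafkaServer" is reused in the foreign stream below:

CREATE OR REPLACE SERVER "KafkaServer" TYPE 'KAFKA'
FOREIGN DATA WRAPPER ECDA;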

For adapters, you configure and launch the adapter in SQL, using either server or foreign stream/table options. For agents, you configure such options using a properties file and launch the agent at the command line. Many of the options for the ECD adapter and agent are common to all I/O systems. The CREATE FOREIGN STREAM topic in the Streaming SQL Reference Guide has a complete list of options for the ECD adapter.
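
As an illustrative sketch only: the keys below mirror the foreign stream options used later in this topic, while SCHEMA_NAME and TABLE_NAME are hypothetical placeholders for whatever keys the agent actually uses to identify the target stream. Consult the Guide for the exact property names:

# Hypothetical ECD agent properties file; key names are assumptions, not confirmed syntax
SCHEMA_NAME=KafkaWriterSchema
TABLE_NAME=KafkaWriterStream
topic=AggregatedData
metadata.broker.list=localhost:9092
FORMATTER=CSV
CHARACTER_ENCODING=UTF-8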

Note: Because of the nature of streaming data, you need to set up a pump in order to move rows continually from an s-Server stream to another stream, file, Kafka topic, RDBMS table, or other location. Pumps are INSERT macros that continually pass data from one point in a streaming pipeline to another. A model for setting up a pump is provided below. See the topic CREATE PUMP in the s-Server Streaming SQL Reference Guide for more details.

Sample Code

Like all streams (but unlike server objects or data wrappers), foreign streams must be defined within a schema. The following code first creates a schema called "KafkaWriterSchema", then creates a foreign stream called "KafkaWriterStream" that references the predefined server object "KafkaServer". To transfer data into Kafka from this stream, you INSERT into it; this step simply sets up the stream, with named columns and Kafka-specific options. (These options are discussed below.)

Here is an example of the SQL used to define a foreign stream for the Kafka adapter:

CREATE OR REPLACE SCHEMA "KafkaWriterSchema";
SET SCHEMA '"KafkaWriterSchema"';

CREATE OR REPLACE FOREIGN STREAM "KafkaWriterStream"
(
    "ts" TIMESTAMP NOT NULL,
    "partition" INT NOT NULL,
    "zipcode" CHAR(5) NOT NULL,
    "transactionTotal" DOUBLE NOT NULL,
    "transactionCount" INT NOT NULL
)
SERVER "KafkaServer"
OPTIONS (
    topic 'AggregatedData',
    "metadata.broker.list" 'localhost:9092',
    formatter 'CSV',
    -- each row is sent as its own Kafka message, so no row separator is needed
    row_separator '',
    character_encoding 'UTF-8'
);

 

CREATE OR REPLACE SCHEMA "Pumps";
SET SCHEMA '"Pumps"';

CREATE OR REPLACE PUMP "writerPump" STOPPED AS
--We recommend creating pumps as stopped,
--then using ALTER PUMP "Pumps"."writerPump" START to start them
INSERT INTO "KafkaWriterSchema"."KafkaWriterStream"
SELECT STREAM * FROM "MyStream";
--where "MyStream" is a currently existing stream whose columns match "KafkaWriterStream"

 

To start writing data, use the following code:

ALTER PUMP "Pumps"."writerPump" START;
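
To stop the pump later, and with it the flow of rows into Kafka, use the corresponding STOP statement:

ALTER PUMP "Pumps"."writerPump" STOP;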

 

Format Type Options

Other foreign stream options are specific to the format type chosen with the formatter option ('CSV' in the example above).
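
For instance, to emit each row as JSON rather than CSV, you would change the formatter option. A minimal sketch, assuming the other options carry over unchanged and that any format-specific options are added as described in the sections below:

CREATE OR REPLACE FOREIGN STREAM "KafkaWriterJSONStream"
(
    "ts" TIMESTAMP NOT NULL,
    "partition" INT NOT NULL,
    "zipcode" CHAR(5) NOT NULL,
    "transactionTotal" DOUBLE NOT NULL,
    "transactionCount" INT NOT NULL
)
SERVER "KafkaServer"
OPTIONS (
    topic 'AggregatedData',
    "metadata.broker.list" 'localhost:9092',
    formatter 'JSON',
    character_encoding 'UTF-8'
);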

Formatting as CSV

Formatting as XML

Formatting as JSON