Writing to Snowflake



Version 5.2 Feature

The Snowflake ECDA adapter writes batches of data to a Snowflake warehouse. Currently, the adapter supports CSV only. To write to a Snowflake warehouse, you first define a server object for the Snowflake server, then perform an INSERT into a foreign stream, as described in this topic. The minimum credentials required to write to a Snowflake warehouse are the warehouse name, user name and password, account, database, schema, and table.

For adapters, you configure and launch the adapter in SQL, using either server or foreign stream/table options. For agents, you configure these options in a properties file and launch the agent at the command line. Many of the options for the ECD adapter and agent are common to all I/O systems. For a complete list of options for the ECD adapter, see the CREATE FOREIGN STREAM topic in the Streaming SQL Reference Guide.

Note: Because of the nature of streaming data, you will need to set up a pump in order to move rows continually from an s-Server stream to another stream, file, Kafka topic, RDBMS table, or other location. Pumps are INSERT macros that continually pass data from one point in a streaming pipeline to another. A model for setting up a pump is provided below. See the topic CREATE PUMP in the s-Server Streaming SQL Reference Guide for more details.

Sample Code

CREATE OR REPLACE SERVER "Snowflake_Server" TYPE 'snowflake'
FOREIGN DATA WRAPPER ECDA;

CREATE FOREIGN STREAM "SnowflakeWriterSchema"."SnowflakeWriterStream" (
   "id" INTEGER,
   "shift_no" DOUBLE,
   "reported_at" TIMESTAMP NOT NULL,
   "trip_no" VARCHAR(10),
   "latitude" DOUBLE,
   "longitude" DOUBLE,
   "speed" INTEGER,
   "bearing" INTEGER,
   "active" BOOLEAN)
   SERVER "Snowflake_Server"
   OPTIONS (
   --credentials for the Snowflake table
   "USER" 'myname',
   "PASSWORD" 'password',
   "ACCOUNT" 'sqlstream',
   "WAREHOUSE" 'DEMO_WH',
   "DB" 'TEST_DB',
   "SCHEMA" 'PUBLIC',
   "DTABLE" 'demo',
   --formatting for CSV data
   "FORMATTER" 'CSV',
   "CHARACTER_ENCODING" 'UTF-8',
   "QUOTE_CHARACTER" '',
   "SEPARATOR" ',',
   "WRITE_HEADER" 'false',
   "DIRECTORY" '/tmp',
   "FILENAME_PREFIX" 'output-',
   "FILENAME_SUFFIX" '.log',
   "FILENAME_DATE_FORMAT" 'yyyy-MM-dd-HH:mm:ss',
   "MAX_BYTES_PER_FILE" '1073741',
   "FORMATTER_INCLUDE_ROWTIME" 'false');
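The formatter options above control how each row is rendered as CSV before it is loaded into Snowflake. As a rough illustration only (using Python's standard csv module, not the adapter itself), a row written with SEPARATOR ',', an empty QUOTE_CHARACTER, and WRITE_HEADER 'false' would look roughly like this; the sample values are hypothetical:

```python
import csv
import io

# Hypothetical sample row, in the column order of the foreign stream
# definition above: id, shift_no, reported_at, trip_no, latitude,
# longitude, speed, bearing, active.
row = [1, 7.0, "2020-01-01 12:00:00", "TRIP-42",
       51.5074, -0.1278, 30, 270, True]

# QUOTE_NONE approximates an empty QUOTE_CHARACTER; no header row is
# written, matching WRITE_HEADER 'false'.
buf = io.StringIO()
writer = csv.writer(buf, delimiter=",", quoting=csv.QUOTE_NONE,
                    lineterminator="\n")
writer.writerow(row)
print(buf.getvalue(), end="")
# → 1,7.0,2020-01-01 12:00:00,TRIP-42,51.5074,-0.1278,30,270,True
```

Note that with no quote character, field values must not themselves contain the separator.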

 

CREATE OR REPLACE SCHEMA "Pumps";
SET SCHEMA '"Pumps"';

CREATE OR REPLACE PUMP "writerPump" STOPPED AS
--We recommend creating pumps as stopped,
--then using ALTER PUMP "Pumps"."writerPump" START to start them
INSERT INTO "SnowflakeWriterSchema"."SnowflakeWriterStream"
SELECT STREAM * FROM "MyStream";
--where "MyStream" is an existing stream

 

To start writing data, use the following code:

ALTER PUMP "Pumps"."writerPump" START;
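To pause writing, for example before modifying the pipeline, stop the pump:

ALTER PUMP "Pumps"."writerPump" STOP;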