Tutorial: Streaming applications: Geospatial Visualization – Part 3
The Tutorial blog series helps SQLstream developers build streaming SQL applications. This blog is the third and final part of the Geospatial Visualization tutorial. The first post in the series set out the streaming use case for connecting SQLstream to a Google Earth visualization, and described the initial steps required to capture the data and create a display list using Rails. The second part presented the core of the application – how to render the display list. In this concluding part, we cover the final key element – how to get the data flowing.
Getting the Data Flowing
The last step is to tie SQLstream and Postgres together. First we need to give SQLstream the credentials to access Postgres. Create a new file:
$QUAKES/quake.postgres.properties
1 URI=jdbc:postgresql://localhost/quakekml_development
2 DRIVER=org.postgresql.Driver
3 CONNPARAMPREFIX=dbConn_
4 dbConn_databaseName=quakekml_development
5 dbConn_user=USER
6 dbConn_password=PASSWORD
7 dbConn_applicationName=SQLstream TableReader Adapter
On lines 5 and 6, insert the Postgres user and password you set up (the same credentials as in $QUAKES/quakekml/config/database.yml). Create a directory called jndi under $SQLSTREAM_HOME/plugin (if it's not there already), copy your quake.postgres.properties file there, and restart the SQLstream server:
linux> cd $SQLSTREAM_HOME/plugin
linux> mkdir jndi
linux> cp $QUAKES/quake.postgres.properties jndi/
linux> cd $SQLSTREAM_HOME
linux> bin/sqlstreamd
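Before moving on, it's worth double-checking that lines 5 and 6 of the properties file match the credentials Rails uses. Assuming the defaults from Part 1, the development stanza of $QUAKES/quakekml/config/database.yml looks roughly like this sketch (your host and authentication settings may differ):
development:
  adapter: postgresql
  database: quakekml_development
  username: USER
  password: PASSWORD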
If you haven't loaded webfeed.sql and usgs.sql as described earlier in this series, you should do so now. Next we need to write the SQL that bridges the two databases:
$QUAKES/viz.sql
1 SET SCHEMA '"WebFeed"';
2
3 CREATE OR REPLACE FOREIGN DATA WRAPPER "TableUpdate"
4 LIBRARY 'class com.sqlstream.plugin.tableupdate.TableUpdateStreamControlPlugin'
5 LANGUAGE java
6 DESCRIPTION 'adapter for doing insert/update/merge/delete to an external database';
7
8 --
9 -- Create SQL/MED foreign server and foreign stream
10 --
11 CREATE OR REPLACE SERVER "Postgres_TableUpdate"
12 FOREIGN DATA WRAPPER "TableUpdate"
13 OPTIONS (
14 connParams 'quake.postgres',
15 sqlDialect 'Postgres 8.x',
16 pollingMillis '5000',
17 commitCount '1000',
18 commitMillis '2000')
19 DESCRIPTION 'Postgres database with visualization';
20
21 --
22 -- Display list for quake events
23 --
24 CREATE OR REPLACE FOREIGN STREAM "QuakeEventsDB" (
25 "SQLS_opcode" CHAR(2) NOT NULL,
26 "SQLS_chg" VARBINARY(32),
27 "id" INTEGER options("insert" 'skip', "update" 'skip'),
28 "when" TIMESTAMP,
29 "lat" DOUBLE,
30 "lon" DOUBLE,
31 "mag" DOUBLE,
32 "created_at" TIMESTAMP,
33 "updated_at" TIMESTAMP
34 )
35 SERVER "Postgres_TableUpdate"
36 OPTIONS (
37 TYPE 'tableUpdates',
38 MASTER 'true',
39 updatesTable 'quake_events')
40 DESCRIPTION 'table updated with quake events';
41
42 --
43 -- Pump quake events into display list
44 --
45 CREATE OR REPLACE PUMP "1000-QuakeEventsPump" STOPPED
46 DESCRIPTION 'pump from "SmallQuakesDay" view to "QuakeEventsDB" foreign stream' AS
47 INSERT INTO "QuakeEventsDB" (
48 "SQLS_opcode", "when", "lat", "lon", "mag",
49 "created_at", "updated_at")
50 SELECT STREAM 'IN',
51 q.ROWTIME,
52 CAST(SUBSTRING("point", 1, POSITION(' ' IN "point") - 1) AS DOUBLE),
53 CAST(SUBSTRING("point", POSITION(' ' IN "point") + 1) AS DOUBLE),
54 "mag",
55 CURRENT_TIMESTAMP, CURRENT_ROW_TIMESTAMP
56 FROM "SmallQuakesDay" as q;
The first 20 lines set up the table updater (note the reference to our properties file on line 14). We then create a foreign stream describing the Postgres table quake_events. When Rails created the table, it added an auto-incrementing id field and two timestamps, created_at and updated_at. The options specified on line 27 cause SQLstream to ignore the id column and let Postgres maintain it. At line 45 we define the pump that reads from SmallQuakesDay and inserts into the foreign stream we defined above. In the select clause we use the 'IN' opcode to indicate that we're inserting, and the ROWTIME of the quake record to set the 'when' column in the display list. On lines 52 and 53 we parse the 'point' column the USGS provides, which is in the format "lat<space>lon". To set the Rails timestamps correctly, we use CURRENT_TIMESTAMP to get the creation time and CURRENT_ROW_TIMESTAMP to get the time of this update. Load this file into SQLstream with:
linux> sqllineClient < viz.sql
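As a quick sanity check of the point parsing on lines 52 and 53, you can evaluate the same expressions against a literal value. This is just an illustrative sketch (it assumes your SQL client accepts standalone VALUES queries), and the sample coordinates are made up:
VALUES (
  -- lat: everything before the space, here 34.02
  CAST(SUBSTRING('34.02 -117.57', 1, POSITION(' ' IN '34.02 -117.57') - 1) AS DOUBLE),
  -- lon: everything after the space, here -117.57
  CAST(SUBSTRING('34.02 -117.57', POSITION(' ' IN '34.02 -117.57') + 1) AS DOUBLE));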
Now all that's left to do is start the pump. Create this one-line file:
$QUAKES/start-pump.sql
1 ALTER PUMP "1000-QuakeEventsPump" START;
and start the pump:
linux> sqllineClient < start-pump.sql
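Given the pollingMillis and commitMillis settings in viz.sql, rows should start landing in Postgres within a few seconds. Here is one way to spot-check from the command line; this is a sketch rather than part of the original setup, and you may need -U or -h flags depending on your Postgres authentication. Note that the "when" column must be double-quoted, since WHEN is a reserved word in Postgres:
linux> psql -d quakekml_development -c 'SELECT COUNT(*), MAX("when") FROM quake_events;'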
Data should now be flowing from the USGS web service, through SQLstream, into Postgres, and from Postgres out to Google Earth via Rails. To verify that data is flowing, you can use the 'Edit quake events' page in your web app. When you follow the Google Earth link, you should see pins scattered across the globe marking the past day's earthquakes, refreshed every 60 seconds. You'll probably have to zoom out a bit to see them, unless San Francisco is having another bad day. This is what I saw when I ran it:
And finally
That concludes the streaming data visualization tutorial. In case you've missed the earlier posts, the complete series can be found using these links:
Part 1: Capture the data and create the display list
Part 2: Rendering the display list
Part 3: Making it work with flowing data
Please contact us if you have any questions.