datahub-transportation-gov/advanced-messaging-concept-development-basic-wqch-9e2e

Query the Data Delivery Network

The easiest way to query any data on Splitgraph is via the "Data Delivery Network" (DDN). The DDN is a single endpoint that speaks the PostgreSQL wire protocol. Any Splitgraph user can connect to it at data.splitgraph.com:5432 and query any version of over 40,000 datasets that are hosted or proxied by Splitgraph.
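Because the DDN speaks the PostgreSQL wire protocol, any Postgres driver can talk to it. Here is a minimal sketch in Python that builds a libpq connection URI for the DDN endpoint; the credential placeholders are assumptions (you would substitute the API key and secret generated for your Splitgraph account), and the database name and SSL setting may need adjusting for your setup.

```python
# Sketch: building a connection URI for the Splitgraph DDN.
# The username/password values here are placeholders, not real credentials.
def ddn_dsn(username: str, password: str, database: str = "ddn") -> str:
    """Build a libpq-style connection URI for data.splitgraph.com:5432."""
    return (
        f"postgresql://{username}:{password}"
        f"@data.splitgraph.com:5432/{database}?sslmode=require"
    )

dsn = ddn_dsn("my-api-key", "my-api-secret")
print(dsn)
```

With a driver such as psycopg2 installed, you would pass this URI straight to `psycopg2.connect(dsn)` and issue SQL as with any other Postgres database.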

For example, you can query the advanced_messaging_concept_development_basic table in this repository by referencing it like:

"datahub-transportation-gov/advanced-messaging-concept-development-basic-wqch-9e2e:latest"."advanced_messaging_concept_development_basic"

or in a full query, like:

SELECT
    ":id", -- Socrata column ID
    "event_triggering", -- The requested event trigger type. 	Bit0 = Unused Bit1 = Unused Bit2 = ABS Activated Bit3 = Traction Control Loss Bit4 = Stability Control Activated Bit5 = Unused Bit6 = Unused Bit7 = Hard Braking Bit8 = Lights Changed Bit9 = Wipers Changed Bit10 = Unused Bit11 = Unused Bit12 = Unused Bit13 = Unused Bit14 = Unused Bit15 = Unused  
    "time_received", -- The time at which the message was received by the OBU in milliseconds UTC time. Empty
    "test_no", -- The AMCD test number the data was collected during
    "triggering_status", -- The status of the requested triggering control parameter. 0 = None 1 = Start Only 2 = Stop Only 3 = Start and Stop  
    "triggering_latitude", -- The latitude of the center of the triggering geo area in degrees. 
    "periodic_triggering", -- The requested rate of trigger evaluation frequency. 	0 = 0 sec (off) 1 = 300 sec 2 = 120 sec 3 = 90 sec 4 = 60 sec	 5 = 30 sec 6 = 15 sec 10 = 1 sec (1Hz) 11 = 0.5 sec (2Hz) 12 = 0.2 sec (5Hz) 13 = 0.1 sec (10Hz) 14 = 0.01 sec (100Hz)  
    "message_id", -- : BMMs are sent from the OBU packed into a container message that can contain up to four separate data snapshots. This is the ID of the container message that contained the BMM as it was transmitted to the server.
    "time_sent", -- The time at which the message was sent from the VCC Cloud server to the OBU in milliseconds UTC time. 
    "obu_id", -- The ID of the OBU that received the message for the current communication sequence.
    "requested_bmm_data", -- The status of all possible BMM data element request flags. This data item is a 24-bit field where, if a bit is set equal to one, the data element will be returned in BMMs generated by the receiving OBE. 	Bit0 = Lights Status 	Bit1 = Unused 	Bit2 = Wiper Status 	Bit3 = Braking System, Traction Control System, and Stability Control System Status 	Bit4 = Unused 	Bit5 = Unused 	Bit6 = Unused 	Bit7 = Precipitation Sensor Status 	Bit8 = Ambient Air Temperature 	Bit9 = Atmospheric Pressure 	Bit10 = Unused 	Bit11 = Unused 	Bit12 = Unused 	Bit13 = Unused 	Bit14 = Unused 	Bit15 = Unused 	Bit16 = Unused 	Bit17 = Unused 	Bit18 = Unused 	Bit19 = Unused 	Bit20 = Unused 	Bit21 = Unused 	Bit22 = Unused 	Bit23 = Unused 
    "bmcm_timeout", -- The number of seconds the BMCM request shall remain active. 
    "triggering_longitude", -- The longitude of the center of the triggering geo area in degrees. 
    "requested_transmission_mode", -- The type of data transmission mode requested. 	0 = None 	1 = DSRC 	2 = Cellular 	3 = DSRC and Cellular  
    "bmm_pack", -- The number of BMMs per packet (valid values are 1–4).
    "triggering_range", -- The range from the center (radius) of the trigger geo area in which the trigger will fire in centimeters. 
    "mode_of_transmission" -- An indication of which mode of transmission was used to transmit the message (DSRC or Cellular). If received via DSRC: 	1 through 114 = The ID of the RSU used to forward the message to the server  If received via cellular:  	999999 = Cellular  
FROM
    "datahub-transportation-gov/advanced-messaging-concept-development-basic-wqch-9e2e:latest"."advanced_messaging_concept_development_basic"
LIMIT 100;
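Several of these columns, such as event_triggering and requested_bmm_data, are bit fields. A minimal Python sketch for decoding event_triggering, using only the flag positions documented in the column comment above (the function name is illustrative):

```python
# Defined flags of the 16-bit event_triggering field; all other bits are unused.
EVENT_FLAGS = {
    2: "ABS Activated",
    3: "Traction Control Loss",
    4: "Stability Control Activated",
    7: "Hard Braking",
    8: "Lights Changed",
    9: "Wipers Changed",
}

def decode_event_triggering(value: int) -> list:
    """Return the names of all defined flags set in the bit field."""
    return [name for bit, name in EVENT_FLAGS.items() if value & (1 << bit)]

# Bit2 (ABS) + Bit7 (Hard Braking) => 0b10000100 == 132
print(decode_event_triggering(132))  # ['ABS Activated', 'Hard Braking']
```

The same pattern applies to the 24-bit requested_bmm_data field, with its own flag table.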

Connecting to the DDN is easy. All you need is an existing SQL client that can connect to Postgres. As long as you have a SQL client ready, you'll be able to query datahub-transportation-gov/advanced-messaging-concept-development-basic-wqch-9e2e with SQL in under 60 seconds.

This repository is an "external" repository. That means it's hosted elsewhere, in this case at datahub.transportation.gov. When you query datahub-transportation-gov/advanced-messaging-concept-development-basic-wqch-9e2e:latest on the DDN, we "mount" the repository using the socrata mount handler. The mount handler proxies your SQL query to the upstream data source, translating it from SQL to the relevant language (in this case SoQL).

We also cache query responses on the DDN, but we run the DDN on multiple nodes so a CACHE_HIT is only guaranteed for subsequent queries that land on the same node.

Query Your Local Engine

Install Splitgraph Locally
bash -c "$(curl -sL https://github.com/splitgraph/splitgraph/releases/latest/download/install.sh)"

Read the installation docs.

Splitgraph Cloud is built around Splitgraph Core (GitHub), which includes a local Splitgraph Engine packaged as a Docker image. Splitgraph Cloud is basically a scaled-up version of that local Engine. When you query the Data Delivery Network or the REST API, we mount the relevant datasets in an Engine on our servers and execute your query on it.

It's possible to run this engine locally. You'll need a Mac, Windows or Linux system to install sgr, and a Docker installation to run the engine. You don't need to know how to actually use Docker; sgr can manage the image, container and volume for you.

There are a few ways to ingest data into the local engine.

For external repositories (like this repository), the Splitgraph Engine can "mount" upstream data sources by using sgr mount. This feature is built around Postgres Foreign Data Wrappers (FDW). You can write custom "mount handlers" for any upstream data source. For an example, we blogged about making a custom mount handler for HackerNews stories.

For hosted datasets, where the author has pushed Splitgraph Images to the repository, you can "clone" and/or "checkout" the data using sgr clone and sgr checkout.

Mounting Data

This repository is an external repository. It's not hosted by Splitgraph. It is hosted by datahub.transportation.gov, and Splitgraph indexes it. This means it is not an actual Splitgraph image, so you cannot use sgr clone to get the data. Instead, you can use the socrata adapter with the sgr mount command. Then, if you want, you can import the data and turn it into a Splitgraph image that others can clone.

First, install Splitgraph if you haven't already.

Mount the table with sgr mount

sgr mount socrata \
  "datahub-transportation-gov/advanced-messaging-concept-development-basic-wqch-9e2e" \
  --handler-options '{
    "domain": "datahub.transportation.gov",
    "tables": {
        "advanced_messaging_concept_development_basic": "wqch-9e2e"
    }
}'

That's it! Now you can query the data in the mounted table like any other Postgres table.
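For example, from Python you could count the rows in the mounted table. This is a sketch: the connection details in the comment (localhost port, database name, user) are assumptions about a typical local engine setup and should be adjusted to match your installation.

```python
# The mounted schema is named after the repository; the table keeps its
# Socrata-derived name.
SCHEMA = "datahub-transportation-gov/advanced-messaging-concept-development-basic-wqch-9e2e"
TABLE = "advanced_messaging_concept_development_basic"

# Quote both identifiers, since the schema name contains "/" and "-".
query = 'SELECT COUNT(*) FROM "{}"."{}";'.format(SCHEMA, TABLE)
print(query)

# With psycopg2 installed and the engine running, you would do something like:
#   import psycopg2
#   conn = psycopg2.connect("postgresql://sgr:password@localhost:5432/splitgraph")
#   with conn.cursor() as cur:
#       cur.execute(query)
#       print(cur.fetchone())
```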

Query the data with your existing tools

Once you've loaded the data into your local Splitgraph engine, you can query it with any of your existing tools. As far as they're concerned, datahub-transportation-gov/advanced-messaging-concept-development-basic-wqch-9e2e is just another Postgres schema.