GCP BigQuery
BigQuery is one of the most popular data warehouses in the world. Webhook Relay Functions can insert data directly into it.
Webhook Relay to BigQuery
Prerequisites:
Webhook Relay provides a helper package, bigquery, that can stream writes into Google Cloud BigQuery. To start ingesting data from webhooks straight into your BigQuery table, create a new Function and import the 'bigquery' package:
-- Import BigQuery helper package
local bigquery = require('bigquery')
A new tab should appear asking you to set up credentials:
Configure GCP credentials
Go to that tab, where you will need to:
  1. Create a new service account with the BigQuery Editor role.
  2. Download the service account JSON key file.
  3. Copy & paste its contents into the form and click Save.
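The same setup can also be done from the command line. A sketch using the gcloud CLI; the service account and project names here (webhook-relay-bq, my-project) are placeholders you should replace with your own:

```shell
# Create a service account for Webhook Relay (placeholder names)
gcloud iam service-accounts create webhook-relay-bq --project my-project

# Grant it the BigQuery Editor role on the project
gcloud projects add-iam-policy-binding my-project \
  --member "serviceAccount:webhook-relay-bq@my-project.iam.gserviceaccount.com" \
  --role "roles/bigquery.dataEditor"

# Download the JSON key file to paste into the credentials form
gcloud iam service-accounts keys create key.json \
  --iam-account webhook-relay-bq@my-project.iam.gserviceaccount.com
```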

Streaming data to GCP BigQuery

-- Import BigQuery helper package
local bigquery = require('bigquery')
local json = require("json")

-- Parsing payload
local rowData, err = json.decode(r.RequestBody)
if err then error(err) end

-- Initializing BigQuery client
err = bigquery.initialize('your-project-id', 'dataset-id', 'table-id')
if err then error(err) end

-- Receiving payload:
-- {
--   "hub_user_id": "user-id-here",
--   "category": "signup",
--   "action": "click",
--   "label": "github auth"
-- }

-- Insert row:
err = bigquery.insert(rowData)
if err then error(err) end
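Once the function is attached to an input, you can exercise it by sending a test webhook. A hypothetical example; replace the URL with your own Webhook Relay input endpoint as shown in your relay configuration:

```shell
# Placeholder endpoint - substitute your actual input URL
curl -X POST https://my.webhookrelay.com/v1/webhooks/your-input-id \
  -H "Content-Type: application/json" \
  -d '{"hub_user_id": "user-id-here", "category": "signup", "action": "click", "label": "github auth"}'
```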

BigQuery package API reference

At the moment, the bigquery package exposes a single client method:

| Method name | Parameter type | Description |
| --- | --- | --- |
| insert(rowData) | Table | A key/value table representing a single row. |

Limitations

Currently our package doesn't support nested objects. That means that a table with a JSON structure such as:
{
  "hub_user_id": "user-id-here",
  "category": "signup",
  "action": "click",
  "label": "github auth",
  "nested_data": {
    "location": "GB",
    "date": "2020-05-10"
  }
}
will not be successfully inserted. Therefore, flatten the structure in the function before inserting it.
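One way to flatten such payloads before inserting is a small recursive helper. A sketch in Lua; the flatten function below is our own illustration, not part of the bigquery package, and it continues from the rowData variable decoded in the function above:

```lua
-- Hypothetical helper: flattens nested tables by joining keys with '_',
-- so nested_data.location becomes nested_data_location.
local function flatten(tbl, prefix, out)
  out = out or {}
  prefix = prefix or ''
  for k, v in pairs(tbl) do
    if type(v) == 'table' then
      flatten(v, prefix .. k .. '_', out)
    else
      out[prefix .. k] = v
    end
  end
  return out
end

-- Flatten the decoded payload, then insert as usual
local row = flatten(rowData)
err = bigquery.insert(row)
if err then error(err) end
```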

Troubleshooting

A few things to note:
  • Ensure that the project ID, dataset ID and table ID are correct.
  • The BigQuery table schema is defined by the user. You don't have to write all the fields (most of them can be nullable), but if you try to write a field that doesn't exist in the schema, BigQuery will reject the insert.
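For reference, a table schema matching the example payload could look like this in BigQuery's JSON schema format (field names come from the payload above; the STRING/NULLABLE choices are assumptions):

```json
[
  {"name": "hub_user_id", "type": "STRING", "mode": "NULLABLE"},
  {"name": "category",    "type": "STRING", "mode": "NULLABLE"},
  {"name": "action",      "type": "STRING", "mode": "NULLABLE"},
  {"name": "label",       "type": "STRING", "mode": "NULLABLE"}
]
```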