GCP BigQuery

BigQuery is one of the most popular data warehouses in the world. Webhook Relay Functions allow you to insert data directly into it.

Webhook Relay to BigQuery

Prerequisites:

  • Google Cloud Platform account (free trial available)

  • Google Cloud project with BigQuery enabled (there’s a generous free tier available for BigQuery)

  • Dataset and table in BigQuery - https://cloud.google.com/bigquery/docs/tables

Webhook Relay provides a helper package bigquery that can stream writes into Google Cloud BigQuery. To start ingesting data from webhooks straight into your BigQuery table, create a new Function and just import the bigquery package:

-- Import BigQuery helper package
local bigquery = require('bigquery')

Configure GCP credentials

A new tab should appear that will ask you to set up credentials. Go to that tab and it will ask you to:

  1. Create a new service account with BigQuery Editor permissions.

  2. Download the JSON file.

  3. Once you have the JSON file, copy & paste its contents into the form and click save.

Streaming data to GCP BigQuery

-- Import BigQuery helper package
local bigquery = require('bigquery')
local json = require("json")

-- Parsing payload
local rowData, err = json.decode(r.RequestBody)
if err then error(err) end

-- Initializing BigQuery client
err = bigquery.initialize('your-project-id', 'dataset-id', 'table-id')
if err then error(err) end

-- Receiving payload:
-- {
--     "hub_user_id": "user-id-here",
--     "category": "signup",
--     "action": "click",
--     "label": "github auth"
-- }

-- Insert row:
err = bigquery.insert(rowData)
if err then error(err) end

Check if record exists

A simple query to check whether a row exists by matching a column with a value:

local bigquery = require('bigquery')
err = bigquery.initialize('your-project-id', 'dataset-id', 'table-id')
if err then error(err) end

local exists, err = bigquery.record_exists('name', 'john')
if err then error(err) end

if exists then
  -- OK
else
  error('Record not found')
end

Execute any command

To execute any SQL command on your table:

local bigquery = require('bigquery')
err = bigquery.initialize('your-project-id', 'dataset-id', 'table-id')
if err then error(err) end

-- Delete old records of the matching category. Method 'exec' can take an arbitrary 
-- number of arguments, depending on how many ? you have in your query.
err = bigquery.exec('DELETE dataset-id.table-id WHERE category = ? AND country = ?', 'movies', 'US')
if err then error(err) end

BigQuery package API reference

In addition to initialize and exec shown above, the bigquery package exposes the following client methods:

Method name                  | Parameter Type | Description
insert(rowData)              | Table          | A [key] = value table that represents row data.
record_exists(column, value) | String, String | Checks whether a row with a matching column value exists.
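For example, the two methods can be combined to avoid inserting duplicate rows. Below is a minimal sketch, assuming the incoming webhook payload contains a name field and reusing the placeholder project, dataset and table IDs from the examples above:

-- Minimal sketch: only insert a row when no record with the same name exists
local bigquery = require('bigquery')
local json = require("json")

-- Parse the incoming webhook payload
local payload, err = json.decode(r.RequestBody)
if err then error(err) end

err = bigquery.initialize('your-project-id', 'dataset-id', 'table-id')
if err then error(err) end

-- Check for an existing row before inserting
local exists, err = bigquery.record_exists('name', payload.name)
if err then error(err) end

if not exists then
  err = bigquery.insert(payload)
  if err then error(err) end
end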

Limitations

Currently our package doesn’t support nested objects. That means that a payload with a nested JSON structure such as:

{
    "hub_user_id": "user-id-here",
    "category": "signup",
    "action": "click",
    "label": "github auth",
    "nested_data": {
      "location": "GB",
      "date": "2020-05-10"
    }
}

will not be successfully inserted. Therefore, flatten the structure in the function before inserting it.
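A minimal sketch of such flattening, assuming the nested_data payload shown above (the flattened column names location and date are illustrative and must exist in your table schema):

local bigquery = require('bigquery')
local json = require("json")

-- Parse the incoming webhook payload
local payload, err = json.decode(r.RequestBody)
if err then error(err) end

-- Copy nested fields onto the top level and drop the nested object
-- before inserting. Column names are illustrative; adjust them to
-- match your own table schema.
if payload.nested_data then
  payload.location = payload.nested_data.location
  payload.date = payload.nested_data.date
  payload.nested_data = nil
end

err = bigquery.initialize('your-project-id', 'dataset-id', 'table-id')
if err then error(err) end

err = bigquery.insert(payload)
if err then error(err) end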

Troubleshooting

A few things to note:

  • Ensure that the project ID, dataset ID and table ID passed to initialize are correct.

  • The BigQuery table schema is defined by the user. You don’t have to write all the fields (most of them can be nullable), but if you try to write a field that doesn’t exist in the schema, BigQuery will reject the write (see the sketch below).
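If your webhooks may carry extra fields that are not part of the table schema, one option is to copy only the known columns into the row before inserting it. Below is a minimal sketch, assuming an illustrative list of column names taken from the payload example earlier on this page:

local bigquery = require('bigquery')
local json = require("json")

-- Parse the incoming webhook payload
local payload, err = json.decode(r.RequestBody)
if err then error(err) end

-- Illustrative list of columns that actually exist in the table schema
local columns = { 'hub_user_id', 'category', 'action', 'label' }

-- Build a row that only contains known columns, so unexpected webhook
-- fields don't cause BigQuery to reject the write
local row = {}
for _, column in ipairs(columns) do
  row[column] = payload[column]
end

err = bigquery.initialize('your-project-id', 'dataset-id', 'table-id')
if err then error(err) end

err = bigquery.insert(row)
if err then error(err) end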
