Google BigQuery Datasource Integration

Knowi enables data discovery, query, aggregation, visualization and reporting automation from Google BigQuery along with other unstructured and structured datasources.

Overview

  1. Connect to, extract, and transform data from your Google BigQuery database, using one of the following options:

    a. Through our UI to connect directly.

    b. Using our Cloud9Agent. This can securely pull data inside your network. See agent configuration for more details.

  2. Visualize and Automate your Reporting instantly.

UI Based Approach

Connecting

In a UI-based approach, Knowi provides two methods to establish a connection:

I. Connecting by OAuth
II. Connecting by Credentials File

Method I: Connecting by OAuth

  1. Log in to Knowi and select Queries from the left sidebar.

  2. Click on New Datasource + button and select Google BigQuery from the list of datasources.

  3. After navigating to the New Datasource page, either use the pre-configured settings pointing to Cloud9 Charts' own demo Google BigQuery database, or follow the prompts and configure the following details to set up connectivity to your own Google BigQuery database:

    a. Datasource Name: Enter a name for your datasource
    b. Authentication Type: Select the authorization type OAuth from the dropdown menu
    c. Google BigQuery Project Name: Select the project name associated with this account (appears when OAuth is selected)
    d. Refresh Token: Used to connect to and pull data from your Google BigQuery project (appears when OAuth is selected)

  4. Click on the Save button and start querying.


Method II: Connecting by Credentials File

The credentials file, also referred to as a service account key or JSON key file, contains the credentials and authentication information required to set up a connection with GCP services and APIs.

Before you can connect to a BigQuery project, you need to create a service account in the GCP console with the appropriate access permissions and download the credentials file by following the steps below:

Create Service Account

To create a service account in the GCP console, please refer to this Create Service Accounts guide.  

While creating a service account, you will need to grant it specific access permissions called Roles.

Have a look at the three basic roles to choose from below:

Role | Description
Editor | View, create, update, and delete most Google Cloud resources
Owner | Full access to most Google Cloud resources
Viewer | View most Google Cloud resources
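If you prefer the command line, the service account can also be created with the gcloud CLI. This is a sketch under assumptions: my-project and knowi-bigquery are placeholders for your own project ID and account name, and the Viewer basic role is shown as one option from the table above:

```shell
# Placeholders: replace my-project and knowi-bigquery with your own values.
PROJECT_ID="my-project"
SA_NAME="knowi-bigquery"

# Create the service account
gcloud iam service-accounts create "$SA_NAME" \
    --project="$PROJECT_ID" \
    --display-name="Knowi BigQuery access"

# Grant a basic role (Viewer shown here; choose per the table above)
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="serviceAccount:${SA_NAME}@${PROJECT_ID}.iam.gserviceaccount.com" \
    --role="roles/viewer"

# Create and download a JSON key (the credentials file used by Knowi)
gcloud iam service-accounts keys create credentials.json \
    --iam-account="${SA_NAME}@${PROJECT_ID}.iam.gserviceaccount.com"
```

The downloaded credentials.json is the file uploaded to Knowi in the Connect Datasource steps below.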


Download Credentials File

Step 1: Launch your web browser and navigate to the Google Cloud Console (https://console.cloud.google.com). Make sure you are logged in with the appropriate Google account that has access to the desired GCP project.


Step 2: Select the project for which you want to download the credentials file from the top navigation.


Step 3: Click on the Navigation Menu and select IAM under the IAM & Admin section.


You will be redirected to the Project page.


Step 4: Click on the Service Accounts option from the left sidebar.


Step 5: Locate the service account for which you want to download the credentials file and click on it.


Step 6: On the Service account details page, click on the KEYS tab in the top navigation.


Step 7: Click on the ADD KEY button and select Create new key from the dropdown menu.


A Create Private Key dialog will appear.


Step 8: Select the key type as JSON and click on the CREATE button to download the credentials file to your local system.

The file will have a .json extension and contains the credentials needed to authenticate your applications with GCP services.
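A downloaded service account key follows a well-known structure. The values below are placeholders (REDACTED, your-project-id), not real credentials:

```json
{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "REDACTED",
  "private_key": "-----BEGIN PRIVATE KEY-----\nREDACTED\n-----END PRIVATE KEY-----\n",
  "client_email": "your-service-account@your-project-id.iam.gserviceaccount.com",
  "client_id": "REDACTED",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "REDACTED"
}
```

Keep this file private; anyone holding it can authenticate as the service account.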


Connect Datasource

  1. Log in to Knowi and select Queries from the left sidebar.

  2. Click on New Datasource + button and select Google BigQuery from the list of datasources.

  3. After navigating to the New Datasource page, either use the pre-configured settings pointing to Cloud9 Charts' own demo Google BigQuery database, or follow the prompts and configure the following details to set up connectivity to your own Google BigQuery database:

    a. Datasource Name: Enter a name for your datasource
    b. Authentication Type: Select the authorization type Credential File from the dropdown menu
    c. Credential File: Name of your uploaded credentials JSON file (appears when Credential File is selected)
    d. Upload file: Click on the Upload file option to upload the credentials JSON file

  4. Click on the Save button to start querying.


Query

Step 1: Query using a visual builder or query editor

Visual Builder: After connecting to the Google BigQuery datasource, Knowi pulls in a list of tables along with field samples. Using these tables, you can automatically generate queries through our visual builder in a no-code environment, either by dragging and dropping fields or by making your selections through the drop-downs.


Tip: You can also write queries directly in the Query Editor, a versatile text editor that offers more advanced editing functionality, support for multiple language modes such as BigQuery SQL and Cloud9QL, and more.

Furthermore, you can also use the Format button in the query editor to auto-format the query text for indentation, spaces, and more.
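As a hypothetical example of what you might type into the Query Editor, the following standard-SQL query runs against BigQuery's public usa_names dataset (the dataset is real and public; the alias names are arbitrary):

```sql
-- Top 10 most common names in the public USA names dataset
SELECT name, SUM(number) AS total
FROM `bigquery-public-data.usa_names.usa_1910_2013`
GROUP BY name
ORDER BY total DESC
LIMIT 10;
```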


Step 2: Define data execution strategy by using any of the following two options:

  • Direct Execution: Directly execute the Query on the original Datasource, without any storage in between. In this case, when a widget is displayed, it will fetch the data in real time from the underlying Datasource.

  • Non-Direct Execution: For non-direct queries, results are stored in Knowi's Elastic Store. Benefits include support for long-running queries, reduced load on your database, and more.

Non-direct execution applies when you choose to run the query once or at scheduled intervals. For more information, see this documentation: Defining Data Execution Strategy


Step 3: Click on the Preview button to analyze the results of your Query and fine-tune the desired output, if required.


The result of your query is called a Dataset. After reviewing the results, name your dataset and then hit the Create & Run button.



Cloud9Agent Configuration

As an alternative to the UI-based connectivity above, you can use Cloud9Agent inside your network to pull data from Google BigQuery securely. See Cloud9Agent to download your agent along with instructions to run it.

Highlights:

  • Pull data using SQL.
  • Execute queries on a schedule, or one time.

The agent contains a datasource_example_bigquery.json and query_example_bigquery.json under the examples folder of the agent installation to get you started.

  • Edit those to point to your database and modify the queries to pull your data.
  • Move them into the config directory (datasource_XXX.json files first if the Agent is running).

Datasource Configuration:

Parameter | Comments
name | Unique datasource name.
datasource | Set value to bigquery.
authRefreshToken | OAuth offline token generated by Google BigQuery.
projectId | The project ID is a unique Google identifier for the BigQuery-enabled project. To determine the project ID, log in to your Google Cloud Console and navigate to your Project page (https://console.cloud.google.com/project).

Query Configuration:

Query Config Params | Comments
entityName | Dataset name identifier.
identifier | A unique identifier for the dataset. Either identifier or entityName must be specified.
dsName | Name of the datasource configured in the datasource_XXX.json file to execute the query against. Required.
queryStr | BigQuery query to execute. Required.
frequencyType | One of minutes, hours, days, weeks, months. If not specified, the query is treated as a one-time query, executed upon Cloud9Agent startup (or when the query is first saved).
frequency | Indicates the frequency, if frequencyType is defined. For example, if this value is 10 and the frequencyType is minutes, the query will be executed every 10 minutes.
startTime | Optional; can be used to specify when the query should run for the first time. If set, the frequency will be determined from that time onwards. For example, if a weekly run is scheduled to start at 07/01/2014 13:30, the first run will occur on 07/01 at 13:30, with the next run at the same time on 07/08/2014. The time is based on the local time of the machine running the Agent. Supported date formats: MM/dd/yyyy HH:mm, MM/dd/yy HH:mm, MM/dd/yyyy, MM/dd/yy, HH:mm:ss, HH:mm, mm.
c9QLFilter | Optional post-processing of the results using Cloud9QL. Typically uncommon against SQL-based datastores.
overrideVals | Enables data storage strategies to be specified. If not defined, the results of the query are added to the existing dataset. To replace all data for this dataset within Knowi, specify {"replaceAll":true}. To upsert data, specify "replaceValuesForKey":["fieldA","fieldB"]. This will replace all existing records in Knowi that have the same fieldA and fieldB with the current data, and insert records where they are not present.

Examples

Datasource Example:

[
    {
        "name":"demoBigQuery",
        "datasource":"bigquery",
        "authRefreshToken":"1/zLmlvsB3phRobEL39hTMlwxQzlF0kGsHNwE_-Czg4nM",
        "projectId":"cloud9chartsproject"
    }
]

Query Examples:

[
    {
        "entityName" : "demoBigQuery",
        "dsName" : "demoBigQuery",
        "queryStr" : "select sum(Sent) as Sent, sum(Opened) as Opened, Week\nfrom demo.data\ngroup by Week\nlimit 1000",
        "c9QLFilter" : "",
        "overrideVals" : {
            "replaceAll" : true
        }
    }
]
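As a hypothetical variation on the example above, the same query could upsert on the Week field instead of replacing all data, running hourly. The field name is taken from the query above; the frequency values are illustrative:

```json
[
    {
        "entityName" : "demoBigQuery",
        "dsName" : "demoBigQuery",
        "queryStr" : "select sum(Sent) as Sent, sum(Opened) as Opened, Week\nfrom demo.data\ngroup by Week\nlimit 1000",
        "frequencyType" : "hours",
        "frequency" : 1,
        "overrideVals" : {
            "replaceValuesForKey" : ["Week"]
        }
    }
]
```

With this configuration, each run replaces existing records that share a Week value with the fresh results and inserts rows for any new weeks.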