PostgreSQL Datasource Integration

Knowi enables visualization, warehousing, and reporting automation from PostgreSQL, along with other structured and unstructured datasources.

Overview

  1. Connect, extract and transform data from your PostgreSQL database, using one of the following options:

    a. Through our UI to connect directly, if the database is accessible from the cloud.

    b. Using our Cloud9Agent, which can securely pull data from inside your network. See the agent configuration section below for more details.

  2. Visualize and Automate your Reporting instantly.

UI Based Approach

Connecting

  1. Log in to Knowi and select Queries from the left sidebar.

  2. Click on New Datasource + button and select PostgreSQL from the list of datasources.

  3. After navigating to the New Datasource page, either use the pre-configured settings for Cloud9 Charts' own demo PostgreSQL database, or follow the prompts and configure the following details to set up connectivity to your own PostgreSQL database:

    a. Datasource Name: Enter a name for your datasource
    b. Host Name: Enter the host name to connect to
    c. Port: Enter the database port
    d. Database Name: Enter the name of the database
    e. Schema Name: Enter the schema name
    f. User ID: Enter the User ID to connect
    g. Password: Enter the password to connect to the database
    h. Database Properties: Additional database connection properties/URL parameters, for example, sslmode=require&anotherProp=anotherVal. If you use date or time fields without a time zone in PostgreSQL, UTC is used by default. To override this, set serverTimeZone, for example: serverTimeZone=US/Central

  4. Establish network connectivity and click on the Test Connection button.

    Note 1: The validity of the connection can be tested only if network connectivity has been established via direct connectivity or an SSH tunnel. For more information, please refer to the documentation on Connectivity & Datasources.

    Note 2: If you are connecting through an agent, check Internal Datasource to assign it to your agent. The agent (running inside your network) will synchronize with it automatically. Alternatively, configure the datasource and queries directly through the agent.

  5. Click on Save and start Querying.
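
Once the datasource is saved, any simple query against it can confirm end-to-end connectivity; for example, this standard PostgreSQL statement returns the server version:

  select version();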

Query

Set up Query using a visual builder or query editor

Visual Builder

After connecting to the PostgreSQL datasource, Knowi pulls in a list of tables along with field samples.

Step 1: Generate queries through our visual builder in a no-code environment by either dragging and dropping fields or making your selections through the drop-down.

Query PostgreDB

Tip: You can also write queries directly in the Query Editor, a versatile text editor that offers more advanced editing functionality, such as PostgreSQL query support, multiple language modes, Cloud9QL, and more.

Furthermore, you can use the Format button in the Query Editor to auto-format the query text for indentation, spacing, and more.

Query PostgreDB
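
For example, a query written in the Query Editor against the demo database might look like the following (the customer table and state column are used here purely for illustration):

  select state, count(*) as customer_count
  from customer
  group by state
  order by customer_count desc;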

Step 2: Define data execution strategy by using any of the following two options:

  • Direct Execution: Directly execute the Query on the original Datasource, without any storage in between. In this case, when a widget is displayed, it will fetch the data in real time from the underlying Datasource.

  • Non-Direct Execution: For non-direct queries, results are stored in Knowi's Elastic Store. Benefits include support for long-running queries, reduced load on your database, and more.

Non-direct execution can be used if you choose to run the Query once or at scheduled intervals. For more information, see the documentation on Defining Data Execution Strategy.

Data Strategy PostgreDB

Step 3: Click on the Preview button to analyze the results of your Query and fine-tune the desired output, if required.

Preview Results

The result of your Query is called a Dataset. After reviewing the results, name your dataset and then hit the Create & Run button.

Create and Run

Query Editor

The Query Editor is a versatile text editor designed for editing code. It comes with a number of language modes, including PostgreSQL, along with add-ons like Cloud9QL and the AI Assistant, which provide powerful transformation and analysis capabilities such as prediction modeling and cohort analysis.

Create and Run
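
For instance, a Cloud9QL expression can post-process the SQL results inside Knowi without another round trip to the database. The sketch below assumes the SQL query returns country, state, and currency fields:

  select country, state, currency
  where currency = 'USD'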

AI Assistant

The AI Assistant query generator automatically creates queries from plain-English statements for searching the connected databases and retrieving information. The goal is to simplify and speed up the search process by generating relevant, specific queries automatically, reducing the need for manual input, and improving the likelihood of finding relevant information.

Step 1: Select Generate Query from AI Assistant dropdown and enter the details of the query you'd like to generate in plain English. Details can include table or collection names, fields, filters, etc.

Example: Show me the country, state and currency of all customer

Note: The AI Assistant uses OpenAI to generate queries; only the question is sent to the OpenAI APIs, not your data.

Create and Run
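
Based on the explanation shown in the Explain Query section below, the query generated for the example above would look roughly like the following (the assistant's exact output may vary):

  select c.country, c.state, c.currency
  from customer c;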

Step 2: Define data execution strategy by using any of the following two options:

  • Direct Execution: Directly execute the Query on the original Datasource, without any storage in between. In this case, when a widget is displayed, it will fetch the data in real time from the underlying Datasource.

  • Non-Direct Execution: For non-direct queries, results are stored in Knowi's Elastic Store. Benefits include support for long-running queries, reduced load on your database, and more.

Non-direct execution can be used if you choose to run the Query once or at scheduled intervals. For more information, see the documentation on Defining Data Execution Strategy.

Data Strategy PostgreDB

Step 3: Click on the Preview button to analyze the results of your Query and fine-tune the desired output, if required.

Preview Results

Note 1: OpenAI integration must be enabled by an admin before using the AI Query Generator.

{Account Settings > Customer Settings > OpenAI Integration}

Note 2: You can either copy the API key from your personal OpenAI account and use it, or use the default key provided by Knowi.

Furthermore, the AI Assistant offers additional features that can be applied on top of the generated query, as listed below:

  • Explain Query
  • Find Issues
  • Syntax Help

Explain Query

Provides an explanation of your existing query. For example, requesting an explanation for the query generated by the AI Assistant above returns this description:

This PostgreSQL query is selecting the country, state, and currency columns from the customer table (c). The query is returning all of the data from these three columns.

Find Issues

Helps in debugging and troubleshooting the query. For example, running Find Issues on a query with a misspelled column name returns this error: The country name is misspelled (should be "country")
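
A query that would trigger the issue above might look like this hypothetical variant, with the country column misspelled:

  select c.contry, c.state, c.currency
  from customer c;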

Syntax Help

Ask questions about query syntax for this datasource. For example, asking for the syntax of the requested query returns the response: SELECT * from customers

Cloud9Agent Configuration

As an alternative to the UI-based connectivity above, you can use Cloud9Agent inside your network to pull data from PostgreSQL securely. See Cloud9Agent to download your agent along with instructions to run it.

Highlights:

  • Pull data using SQL.
  • Execute queries on a schedule or one time.

The agent contains a datasource_example_postgres.json and query_example_postgres.json under the examples folder of the agent installation to get you started.

  • Edit those files to point to your database and modify the queries to pull your data.
  • Move them into the config directory (datasource_XXX.json files first, if the Agent is running).

Datasource Configuration:

  • name: Unique datasource name.
  • datasource: Set the value to postgres.
  • url: URL with host, port, and database name to connect to. Example for PostgreSQL: localhost:5432/cloud9demo
  • userId: User ID to connect, where applicable.
  • password: Password, where applicable.

Query Configuration:

  • entityName: Dataset name identifier.
  • identifier: A unique identifier for the dataset. Either identifier or entityName must be specified.
  • dsName: Name of the datasource configured in the datasource_XXX.json file to execute the query against. Required.
  • queryStr: PostgreSQL query to execute. Required.
  • frequencyType: One of minutes, hours, days, weeks, months. If this is not specified, the query is treated as a one-time query, executed upon Cloud9Agent startup (or when the query is first saved).
  • frequency: Indicates the frequency, if frequencyType is defined. For example, if this value is 10 and frequencyType is minutes, the query will be executed every 10 minutes.
  • startTime: Optional; specifies when the query should be run for the first time. If set, the frequency is determined from that time onwards. For example, if a weekly run is scheduled to start at 07/01/2014 13:30, the first run will be on 07/01 at 13:30, with the next run at the same time on 07/08/2014. The time is based on the local time of the machine running the Agent. Supported date formats: MM/dd/yyyy HH:mm, MM/dd/yy HH:mm, MM/dd/yyyy, MM/dd/yy, HH:mm:ss, HH:mm, mm.
  • c9QLFilter: Optional post-processing of the results using Cloud9QL. Typically uncommon against SQL-based datastores.
  • overrideVals: Enables data storage strategies to be specified. If this is not defined, the results of the query are added to the existing dataset. To replace all data for this dataset within Knowi, specify {"replaceAll":true}. To upsert data, specify {"replaceValuesForKey":["fieldA","fieldB"]}; this replaces all existing records in Knowi that have the same fieldA and fieldB with the current data, and inserts records where they are not present.

Datasource Example:

[
  {
     "name":"demoPostgres",
     "url":"localhost:5432/cloud9demo",
     "datasource":"postgresql",
     "userId":"cloud9demo",
     "password":"cloud92014"
  }
]

Query Examples:

[
  {
    "entityName":"Errors",
    "dsName":"demoPostgres",
    "queryStr":"select error_condition as 'Error', count 'Count' from errors",
    "frequencyType":"minute",
    "frequency":10,
    "overrideVals":{
      "replaceAll":true
    }
  },
  {
    "entityName":"Queues",
    "dsName":"demoPostgreSQL",
    "queryStr":"select Name, size as 'Queue Size', Type from queue",
    "overrideVals":{
      "replaceValuesForKey":["Type"]
    },
    "startTime":"07:20",
    "frequencyType":"daily",
    "frequency":1
  }
]

The first query runs every 10 minutes, starting at the top of the hour, and replaces all data for that dataset in Knowi. The second runs once a day at 07:20 AM and updates existing data with the same Type field, or inserts new records otherwise.