Modern data workflows are often riddled with complexity—requiring multiple tools for transporting, transforming, and storing data before it becomes usable for business users. With DBWrite, Knowi introduces a powerful write-back capability that simplifies this entire process by allowing you to write curated datasets directly back into your database.
Let’s break down how DBWrite transforms your analytics stack and how to get started.
What is DBWrite?
DBWrite is Knowi’s native write-back functionality that allows you to persist Knowi Datasets into your own data destinations—whether it’s Amazon Redshift, Snowflake, MySQL, BigQuery, or PostgreSQL (with more coming soon).
This means you can:
- Write results of transformed queries back into your database
- Make curated datasets accessible to both Knowi users and downstream systems
- Reduce your dependency on external tools like Fivetran or dbt
Why It Matters
Traditional data engineering workflows involve:
- Extracting raw data into a warehouse (using tools like Fivetran)
- Transforming data using tools like dbt
- Making the cleaned data available for business use
While powerful, this setup introduces multiple tools, higher costs, and delays.
Knowi simplifies this with an end-to-end approach:
- Query your raw datastores
- Join across multiple sources
- Apply transformations
- Store the final dataset directly in your destination using DBWrite
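To make the contrast concrete, here is a rough sketch of what that pipeline looks like when stitched together by hand with pandas and SQLAlchemy. The connection strings, tables, and transformation below are illustrative assumptions only; with DBWrite, these steps are configured inside Knowi instead of written as code.

```python
# Hand-rolled version of the query -> join -> transform -> store pipeline.
# Every host, credential, table, and column here is a placeholder.
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("postgresql+psycopg2://user:pass@source-host/appdb")      # raw datastore
warehouse = create_engine("postgresql+psycopg2://user:pass@warehouse-host/dw")   # destination

# Query the raw datastores
orders = pd.read_sql("SELECT * FROM orders", source)
customers = pd.read_sql("SELECT * FROM customers", source)

# Join across sources and apply a transformation
curated = orders.merge(customers, on="customer_id")
curated["revenue"] = curated["quantity"] * curated["unit_price"]

# Store the final dataset in the destination (the step DBWrite automates)
curated.to_sql("curated_sales", warehouse, schema="public",
               if_exists="replace", index=False)
```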
Knowi’s Three Data Strategy Options
When working with datasets in Knowi, you can choose from three flexible strategies:
- Direct: Run live queries with optional runtime parameters, and enable caching with a TTL for faster performance.
- ElasticStore: Store query results in Knowi’s native ElasticStore. Great for run-once or scheduled queries with overwrite strategies (append, upsert, TTL).
- Write Back (DBWrite): Store datasets in your own data warehouse instead of Knowi’s ElasticStore.
How to Set Up DBWrite
1. Define Your Destination
- Go to Datasource Listings
- Edit the datasource you want to write to (or create a new one)
- Check the Writable Destination option and click Save
2. Enable Custom Store in Query Settings
- Create a new query on your preferred datasource
- Set Data Execution Strategy to one of:
  - Run Once
  - Scheduled Intervals
  - Triggered Query
- Check Use Custom Store
- Select the database set as your DBWrite destination
Based on the datasource (e.g., Amazon Redshift), configure the schema and table name. Then preview the result and click Create and Run to write data back to the database.
Visualize Your Written Data
Once your dataset is written back:
- Use it to create a visualization in Knowi
- Add the widget to your dashboard
- Navigate to the dashboard to view and interact with the data
Knowi will automatically add a column named knowits, which records the timestamp of when each row was written.
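For example, if the data landed in a hypothetical public.curated_sales table in Redshift, a sketch like the following (using psycopg2; the connection details and table name are placeholder assumptions) pulls back only the rows from the most recent write:

```python
# Sketch: filter on the knowits timestamp Knowi adds to each written row.
# Host, credentials, and table name are assumptions for illustration.
import psycopg2

conn = psycopg2.connect(host="my-redshift-host", port=5439,
                        dbname="analytics", user="readonly", password="...")

with conn, conn.cursor() as cur:
    cur.execute("""
        SELECT *
        FROM public.curated_sales                -- hypothetical DBWrite table
        WHERE knowits = (SELECT MAX(knowits)     -- timestamp column added by Knowi
                         FROM public.curated_sales);
    """)
    latest_batch = cur.fetchall()
    print(len(latest_batch), "rows in the latest write")
```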
Support for Redshift SUPER Type
Knowi supports Redshift’s SUPER data type for storing semi-structured or document-like data in your tables, which is ideal for advanced use cases that require schema-less data storage.
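As a rough illustration, if DBWrite lands a SUPER column (here called payload, an assumed name) in your table, Redshift’s PartiQL-style navigation lets you reach into the nested values directly:

```python
# Sketch: navigating a SUPER column with dot/bracket notation in Redshift.
# Table and column names are illustrative, not Knowi defaults.
import psycopg2

conn = psycopg2.connect(host="my-redshift-host", port=5439,
                        dbname="analytics", user="readonly", password="...")

with conn, conn.cursor() as cur:
    cur.execute("""
        SELECT payload.customer.id AS customer_id,   -- payload is a SUPER column
               payload.items[0].sku AS first_sku     -- array element access
        FROM public.curated_orders;
    """)
    for customer_id, first_sku in cur.fetchall():
        print(customer_id, first_sku)
```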
Querying the Written Data
You have two options to query the data written back into your destination:
Method 1 (Recommended): Query from Redshift/Snowflake
- Create a new query directly on your destination database (e.g., Redshift)
- Select the table written via Custom Store
- Apply filters at the database level for faster performance
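A minimal sketch of this approach, assuming the same hypothetical table and a Redshift connection via psycopg2: the WHERE clause and aggregation run inside the warehouse, so only the matching rows come back.

```python
# Sketch of Method 1: push the filter and aggregation down to the destination.
# Connection details, table, and columns are placeholder assumptions.
import psycopg2

conn = psycopg2.connect(host="my-redshift-host", port=5439,
                        dbname="analytics", user="readonly", password="...")

with conn, conn.cursor() as cur:
    cur.execute(
        """
        SELECT region, SUM(revenue) AS total_revenue
        FROM public.curated_sales          -- table written via Custom Store
        WHERE order_date >= %s             -- filtered at the database level
        GROUP BY region;
        """,
        ("2024-01-01",),
    )
    print(cur.fetchall())
```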
Method 2: Linked Dataset via ElasticStore
- Add the dataset as a Linked Dataset from Knowi ElasticStore
- Filtering happens after the full dataset loads into memory, which may impact performance for large datasets
With DBWrite, Knowi redefines the analytics workflow by making it simpler, faster, and smarter. You no longer need multiple tools to manage your data pipeline. From querying and transformation to visualization and storage—Knowi brings it all together in one seamless platform.
Ready to streamline your data stack? Set up a demo call with our team today to learn more about DBWrite.