This page provides you with instructions on how to extract data from PostgreSQL and analyze it in Superset. (If the mechanics of extracting data from PostgreSQL seem too complex or difficult to maintain, check out Stitch, which can do all the heavy lifting for you in just a few clicks.)
What is PostgreSQL?
PostgreSQL, also called Postgres, is an open source object-relational database management system that runs on all major operating systems. It's known for its stability and its ability to handle high volumes of transactions.
What is Superset?
Apache Superset is a cloud-native data exploration and visualization platform that businesses can use to create business intelligence reports and dashboards. It includes a state-of-the-art SQL IDE, and it's open source software, free of cost. The platform was originally developed at Airbnb and donated to the Apache Software Foundation.
Getting data out of PostgreSQL
Most people retrieve data from relational databases by writing SQL queries. If you're just looking to export data in bulk, however, you have a couple of options: PostgreSQL's COPY command (or psql's \copy) can write the contents of a table or query to a CSV file, and the command-line tool pg_dump can export a database as a SQL script or archive that you can use to restore it on any PostgreSQL server.
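As a minimal sketch of the CSV route, here's how a bulk export might look in Python using the psycopg2 driver. The connection details and the orders table are placeholders you'd replace with your own.

    import psycopg2

    # Connection details are placeholders; substitute your own host, database, and credentials.
    conn = psycopg2.connect(
        host="localhost",
        port=5432,
        dbname="mydb",
        user="analytics",
        password="secret",
    )

    # COPY ... TO STDOUT streams the query result as CSV without requiring file access on the server.
    query = "COPY (SELECT * FROM orders) TO STDOUT WITH CSV HEADER"

    with conn, conn.cursor() as cur, open("orders.csv", "w") as f:
        cur.copy_expert(query, f)

    conn.close()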
Loading data into Superset
You must replicate data from your SaaS applications to a data warehouse before you can report on it using Superset. Superset can connect to almost 30 databases and data warehouses. Once you choose a data source you want to connect to, you must specify a host name and port, database name, and username and password to get access to the data. You then specify the database schema or tables you want to work with.
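Under the hood, Superset stores each connection as a SQLAlchemy URI. As a rough sketch (with placeholder credentials), the URI for a PostgreSQL source looks like the one below, and you can sanity-check it with SQLAlchemy before entering it in Superset's database configuration form.

    from sqlalchemy import create_engine, text

    # Placeholder values; Superset accepts the same URI format in its
    # SQLAlchemy URI field when you add a PostgreSQL database.
    uri = "postgresql://analytics:secret@localhost:5432/mydb"

    engine = create_engine(uri)
    with engine.connect() as conn:
        # A trivial query just to confirm the credentials and network path work.
        print(conn.execute(text("SELECT version()")).scalar())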
Keeping PostgreSQL data up to date
The script you have now should satisfy all your data needs for PostgreSQL – right? Not yet. How do you load new or updated data? It's not a good idea to replicate all of your data each time records are updated. That process would be painfully slow, and if latency is important to you, it's not a viable option.
Instead, you can identify some key fields that your script can use to bookmark its progression through the data, so it can pick up where it left off as it looks for updated data. Timestamp columns such as updated_at or created_at, or an auto-incrementing primary key, work best for this. When you've built in this functionality, you can set up your script as a cron job or continuous loop to get new data as it appears in PostgreSQL.
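Here's a hedged sketch of what that bookmarking logic might look like in Python with psycopg2. The orders table, its columns, the connection details, and the bookmark file are all hypothetical stand-ins for your own schema and state store.

    import psycopg2

    BOOKMARK_FILE = "last_updated_at.txt"  # hypothetical place to persist the bookmark

    def read_bookmark():
        # Fall back to a date before any of your data if no bookmark exists yet.
        try:
            with open(BOOKMARK_FILE) as f:
                return f.read().strip()
        except FileNotFoundError:
            return "1970-01-01 00:00:00"

    def write_bookmark(value):
        with open(BOOKMARK_FILE, "w") as f:
            f.write(str(value))

    def extract_new_rows():
        # Placeholder connection details and column names.
        conn = psycopg2.connect(host="localhost", dbname="mydb",
                                user="analytics", password="secret")
        bookmark = read_bookmark()
        with conn, conn.cursor() as cur:
            # Only pull rows changed since the last run.
            cur.execute(
                "SELECT id, customer_id, total, updated_at "
                "FROM orders WHERE updated_at > %s ORDER BY updated_at",
                (bookmark,),
            )
            rows = cur.fetchall()
            if rows:
                # updated_at is the last selected column; load the rows into
                # your warehouse here, then advance the bookmark.
                write_bookmark(rows[-1][-1])
        conn.close()
        return rows

    if __name__ == "__main__":
        # Run this from cron (or a loop) to pick up new and updated records.
        print(f"Fetched {len(extract_new_rows())} new or updated rows")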
From PostgreSQL to your data warehouse: An easier solution
As mentioned earlier, the best practice for analyzing PostgreSQL data in Superset is to store that data inside a data warehousing platform alongside data from your other databases and third-party sources. You can find instructions for doing these extractions for leading warehouses on our sister sites PostgreSQL to Redshift, PostgreSQL to BigQuery, PostgreSQL to Azure Synapse Analytics, PostgreSQL to PostgreSQL, PostgreSQL to Panoply, and PostgreSQL to Snowflake.
Easier yet, however, is using a solution that does all that work for you. Products like Stitch were built to move data automatically, making it easy to integrate PostgreSQL with Superset. With just a few clicks, Stitch starts extracting your PostgreSQL data, structuring it in a way that's optimized for analysis, and inserting that data into a data warehouse that can be easily accessed and analyzed by Superset.