How do you sync a pre-aggregated table in PostgreSQL?

EliiotM

I have a problem with aggregating data in one table. Without caching, the aggregation takes around a minute, so I've decided to build a pre-aggregated table instead.
So everything seems to be going great, but I have a small question. Our backend is inserting data into the main table, let's call it "A". My pre-aggregated table "A1" is created by a cron job and is populated at midnight. But what about the data that arrives after that? Do I just insert the data twice, into both tables?
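To make it concrete, the nightly job does roughly the following; the column names here are only placeholders, not my real schema:

-- Simplified, illustrative schema: the real column names are different.
CREATE TABLE a (
    id       bigserial   PRIMARY KEY,
    item_id  bigint      NOT NULL,
    amount   numeric     NOT NULL,
    created  timestamptz NOT NULL DEFAULT now()
);

CREATE TABLE a1 (
    item_id      bigint  PRIMARY KEY,
    total_amount numeric NOT NULL,
    row_count    bigint  NOT NULL
);

-- What the midnight cron job runs, more or less: rebuild A1 from scratch.
TRUNCATE a1;
INSERT INTO a1 (item_id, total_amount, row_count)
SELECT item_id, SUM(amount), COUNT(*)
FROM a
GROUP BY item_id;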
 
Hello, EliiotM ... I can't answer your question, but by making this post I will move it back up to the top for someone to take a shot at answering for you. I'm going to guess that we don't have a LOT of PostgreSQL folks here.
 
Do I just insert the data twice, into both tables?
If the data in table A1 is aggregated, just inserting a new record will probably not produce the desired results.

There are two basic options.
1. Make it clear to the users that the aggregated data may be up to 24hrs out-of-date.
2. Whenever a record is modified in the main table A, rebuild the aggregated A1 table (a rough trigger sketch follows below). Of course, if this happens frequently, it will invalidate the purpose of the aggregated table.
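If you go the option 2 route, the usual way to wire it up is a trigger on A. A rough sketch only, using placeholder table and column names and assuming PostgreSQL 11 or later for the EXECUTE FUNCTION syntax:

-- Sketch: rebuild A1 whenever A changes. Placeholder names throughout.
CREATE OR REPLACE FUNCTION rebuild_a1() RETURNS trigger AS $$
BEGIN
    DELETE FROM a1;
    INSERT INTO a1 (item_id, total_amount, row_count)
    SELECT item_id, SUM(amount), COUNT(*)
    FROM a
    GROUP BY item_id;
    RETURN NULL;  -- ignored for a statement-level AFTER trigger
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER a_rebuild_a1
AFTER INSERT OR UPDATE OR DELETE ON a
FOR EACH STATEMENT
EXECUTE FUNCTION rebuild_a1();

As noted above, a full rebuild on every write only makes sense if writes are infrequent.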

There may be additional options specific to PostgreSQL, which I do not know. A simple Google search shows several results, which may be of interest.
 
When you have files that need to be downloaded every day, the quickest method is to use TRUNCATE to get rid of the old data and then an append query to load the downloaded data into the table.

Do not try to update rows. Just replace the entire table.
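In PostgreSQL terms that would look roughly like this (placeholder table and column names); wrapping it in one transaction means readers wait on the lock during the reload instead of ever seeing an empty table:

-- Sketch of the nightly truncate-and-reload as a single transaction.
-- TRUNCATE takes an exclusive lock, so concurrent readers block until COMMIT.
BEGIN;
TRUNCATE a1;
INSERT INTO a1 (item_id, total_amount, row_count)
SELECT item_id, SUM(amount), COUNT(*)
FROM a
GROUP BY item_id;
COMMIT;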
 
In SQL Server, there are cases where an "upsert" is more efficient, but it all depends on the AMOUNT of data. Frankly, I've never had a good case for it, but there are people who deal with large data sets in SQL Server who swear by the usefulness of the MERGE statement.

Maybe your RDBMS has something like that.
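For what it's worth, PostgreSQL does have an upsert, INSERT ... ON CONFLICT (since 9.5), and it gained a MERGE statement in version 15. A rough sketch of folding each new row into A1 as the backend inserts it into A, using placeholder column names and assuming A1 has a primary key on item_id:

-- Sketch only: maintain A1 incrementally, one upsert per insert into A.
-- :item_id and :amount stand for the values being inserted into A.
INSERT INTO a1 (item_id, total_amount, row_count)
VALUES (:item_id, :amount, 1)
ON CONFLICT (item_id) DO UPDATE
SET total_amount = a1.total_amount + EXCLUDED.total_amount,
    row_count    = a1.row_count    + EXCLUDED.row_count;

That keeps A1 current without a full rebuild, at the cost of one extra write per insert.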
 
I have found SQL Merge to be very useful in maintaining data lists that change frequently, but as you say it's quite an unusual requirement.
 
EliiotM,

As Doc said, there aren't too many PostgreSQL people here. Others have provided some advice, but I'd like to hear a little more about the requirement, the timing of the data, and the "refresh/update" of the database for users.

We had a setup with:
-a corporate database
-various point-in-time databases (I think these relate to your aggregated data)
-some local, custom "personal" databases.

The point-in-time databases were "aggregations" for month-end/quarter-end/year-end type things.
These were derived from the corporate database and structured/frozen as read-only.
These offered consistent data for analysis/reporting.

Good luck with your project.
 
