Postgresql – Using pg_dump to create 144 backups per day

Tags: backup, pg-dump, postgresql

Imagine the following scenario (this may not be a viable or sensible idea but please imagine that it is).

Using Ubuntu, a cron job calls pg_dump and backs up a Postgres database every 10 minutes. In other words: 144 times per day.
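A cron job like the one described would look roughly like this (paths, filenames, and the database name `mydb` are placeholders, not from the original post). Each invocation writes its own dump file, so a timestamped filename produces 144 separate files per day:

```
# crontab entry: dump mydb every 10 minutes in custom format.
# Note that % must be escaped as \% inside a crontab line.
*/10 * * * *  pg_dump -Fc -f /var/backups/db/mydb_$(date +\%Y\%m\%d_\%H\%M).dump mydb
```

The custom format (`-Fc`) is compressed and lets pg_restore later extract individual tables, which matters for the per-table question below.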

The goal is to ensure that data deleted from the original "live" database is preserved in the backups. Since data is deleted regularly, a backup is needed every 10 minutes.

Would the cron job calling pg_dump create 144 individual backups?

Can these 144 backups be merged together after every 24-hour period, and the information extracted with SELECT?

Another issue is that one might only want to check the data in a specific table, yet the whole database is backed up.
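For the per-table concern: if the dumps are taken in pg_dump's custom format (`-Fc`), a single table can be pulled out of a whole-database dump without restoring everything. A sketch, where the dump filename and the table name `orders` are hypothetical:

```
# List the dump's contents to find the table of interest
pg_restore --list /var/backups/db/mydb_20240101_0010.dump

# Emit only the named table (schema + data) as plain SQL;
# this can be loaded into a scratch database and queried with SELECT
pg_restore --table=orders --file=orders_only.sql /var/backups/db/mydb_20240101_0010.dump
```

This avoids restoring the full database just to inspect one table, though it still requires keeping the full dumps around.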

What would be the best way to achieve this?

Thank you.

Best Answer

That is silly. You should instead take one pg_basebackup per day and archive the transaction log with archive_command. That way, you can use point-in-time recovery (PITR) to restore the database to any given point in time.
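A minimal sketch of that setup, assuming the archive directories `/var/backups/wal` and `/var/backups/base` (both are placeholders, not from the answer):

```
# postgresql.conf — enable continuous WAL archiving
wal_level = replica
archive_mode = on
archive_command = 'test ! -f /var/backups/wal/%f && cp %p /var/backups/wal/%f'

# crontab entry for the daily base backup (run as the postgres user)
0 2 * * *  pg_basebackup -D /var/backups/base/$(date +\%Y\%m\%d) -Ft -z
```

With the base backup plus the archived WAL segments, recovery can replay to any timestamp between backups, which makes the 10-minute dump schedule unnecessary: every committed change, including soon-to-be-deleted rows, is already captured in the WAL.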

See the documentation on continuous archiving and point-in-time recovery for more.