PostgreSQL – How to Divide Log Files for Specific DB Queries

logging, plpgsql, plpython, postgresql, unix

Currently my DB has a couple of triggers on some (not all) tables that fire when a specific number of columns has been updated. I keep track of whether the triggers have run for those rows in a boolean column on the tables.

I'd like to log UPDATE and INSERT queries, and the errors/warnings coming from them, to different files on my PostgreSQL server, since a single log file can get quite big.

I've thought of a couple of approaches to this problem and am looking for a better approach than these:

  • have the (plpython3) triggers create separate log files when they are called (this doesn't happen often, but they insert millions of rows into different tables);

  • write additional triggers that insert the logging information into another table (either as explicit columns or as a single JSON column).
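The second option could be sketched as a plpgsql audit trigger along these lines; all table, column, and function names here are hypothetical, and `EXECUTE FUNCTION` requires PostgreSQL 11 or later (use `EXECUTE PROCEDURE` on older versions):

```sql
-- Hypothetical audit table: one JSONB column holds the changed row.
CREATE TABLE audit_log (
    logged_at   timestamptz NOT NULL DEFAULT now(),
    table_name  text        NOT NULL,
    operation   text        NOT NULL,
    row_data    jsonb
);

CREATE OR REPLACE FUNCTION log_change() RETURNS trigger AS $$
BEGIN
    -- TG_TABLE_NAME and TG_OP are set automatically by PostgreSQL
    INSERT INTO audit_log (table_name, operation, row_data)
    VALUES (TG_TABLE_NAME, TG_OP, to_jsonb(NEW));
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

-- Attach to any table you want audited, e.g. a table named my_table:
CREATE TRIGGER my_table_audit
    AFTER INSERT OR UPDATE ON my_table
    FOR EACH ROW EXECUTE FUNCTION log_change();
```

One function can serve every audited table, since `TG_TABLE_NAME` records where the row came from.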

Thank you for your time and answers!

Best Answer

There is only a single PostgreSQL log file per database cluster, so you cannot have that out of the box.

Adding triggers that log to a file is an option, but a trigger (and the logging itself) can add significant performance overhead, certainly more than the PostgreSQL logging itself.

The route I would explore is this:

  • Try to perform the actions you want to log with a different database user. If that is not easy, be creative: perhaps you can execute SET ROLE/RESET ROLE commands at strategic points in debug mode to become a different user.

  • Enable logging for that user only:

    ALTER ROLE myuser SET log_min_duration_statement = 0;
    

Then it will be easier to evaluate the log, because it will be smaller.
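To make the per-user log entries easy to pull apart afterwards, you can also include the user name in every log line via `log_line_prefix`; the exact prefix below is just an example:

```sql
-- Prefix each log line with timestamp, PID, and user@database,
-- so entries can later be filtered into per-user files:
ALTER SYSTEM SET log_line_prefix = '%m [%p] %u@%d ';
SELECT pg_reload_conf();
```

With that prefix in place, something like `grep ' myuser@' postgresql.log > myuser.log` on the server will collect that user's statements into a file of their own.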