PostgreSQL – Importing a large quantity of data into a PostgreSQL database

import postgresql

I am importing a large amount of data into a PostgreSQL server (across multiple databases on the server).

When I last attempted to import the data, it effectively "crashed" my machine (Ubuntu 10.04): /var/log/postgres contained over 360 GB of files, and I was only about halfway through the import.

I am not sure what the log files are for (I am guessing they must be for ACID compliance, etc.), but I would like to know if there is a way to reduce the size of the log files generated while importing a large amount of data.

Best Answer

Log files in /var/log/postgres are only for your information (in a standard installation); they do not serve the system itself, and certainly not "ACID compliance". All the server needs is for the log destination to be writable once configured.

There are a number of settings in your postgresql.conf that govern what is logged. Most of them can also be set at the command line to override the setting in the config file. Note that we are not talking about WAL files here; those are not placed in /var/log/postgres.

I would drastically reduce the verbosity in your case. Among others, I would set (and reload the server):
log_statement = none
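On a Debian/Ubuntu-style installation, the change might look like this; the cluster version, cluster name, and data-directory path below are assumptions, so adjust them to your setup:

```
# postgresql.conf — stop logging every statement during the bulk import
log_statement = 'none'

# Reload the configuration so the change takes effect (no restart needed):
#   sudo pg_ctlcluster 8.4 main reload
# or, on other layouts:
#   pg_ctl reload -D /path/to/data/directory
```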

Depending on how you import the data, you probably need to do more than that. Look at:
log_min_messages
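For a quieter import, the message thresholds could be tightened like this; treat it as a sketch, since the exact levels ('warning', 'error') are a common choice rather than something this answer prescribes:

```
# postgresql.conf — raise the threshold for what gets written to the log
log_min_messages = warning        # skip DEBUG / INFO / NOTICE noise
log_min_error_statement = error   # log the SQL text only for failing statements
```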

To skip writing log files altogether you can set:
log_destination = 'stderr' ...
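If the server is started via pg_ctl, one way to discard log output entirely is to keep everything on stderr and point stderr at /dev/null. This is a sketch under the assumption that you truly do not need the logs during the import; the data-directory path is a placeholder:

```
# postgresql.conf:
log_destination = 'stderr'   # write log output only to stderr
logging_collector = off      # do not capture stderr into log files

# then start the server with stderr discarded, e.g.:
#   pg_ctl start -D /path/to/data -l /dev/null
```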

More in the manual.