Our project needs to load 100k+ rows from a huge CSV file into a Postgres table. We need to load this data efficiently and quickly, ideally in batches, so that any batch that fails to load can be revisited, corrected, and loaded again.
Our project is a Java application, and we initially thought of using JDBC, but reading each row and committing it to the database individually slows the application down.
Please suggest an approach to get this job done.
Best Answer
You can use PostgreSQL's COPY command for this. COPY streams the entire file to the server in a single statement instead of issuing one INSERT and commit per row, which is typically far faster for bulk loads.
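Since you are on Java, the PostgreSQL JDBC driver exposes COPY through its CopyManager API, so you don't need to shell out to psql. Here is a minimal sketch; the connection URL, credentials, table name (my_table), and file path (data.csv) are placeholders you would replace with your own:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;

import org.postgresql.PGConnection;
import org.postgresql.copy.CopyManager;

public class CsvCopyLoader {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password");
             BufferedReader csv = new BufferedReader(new FileReader("data.csv"))) {

            // Unwrap the driver-specific connection to get at the COPY API.
            CopyManager copyManager = conn.unwrap(PGConnection.class).getCopyAPI();

            // Stream the whole file through COPY FROM STDIN in one round trip.
            // HEADER true tells the server to skip the first line of the file.
            long rowsLoaded = copyManager.copyIn(
                    "COPY my_table FROM STDIN WITH (FORMAT csv, HEADER true)", csv);

            System.out.println("Loaded " + rowsLoaded + " rows");
        }
    }
}
```

Note that COPY is all-or-nothing: a single bad row aborts the whole load, so clean the file first or split it yourself if you need per-batch retry.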
See:
http://www.postgresql.org/docs/9.3/static/sql-copy.html
https://stackoverflow.com/questions/2987433/how-to-import-csv-file-data-into-a-postgres-table
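If you specifically need the batch semantics you describe (commit every N rows so a failed chunk can be corrected and reloaded), plain JDBC batching with autocommit off is a reasonable middle ground. The sketch below makes the same assumptions (my_table with two text columns, data.csv) and uses a naive comma split; a real CSV parser is advisable for quoted fields. The reWriteBatchedInserts=true URL parameter is a real pgJDBC option that collapses a batch into multi-row INSERTs:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class CsvBatchLoader {
    private static final int BATCH_SIZE = 1000;

    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb?reWriteBatchedInserts=true",
                "user", "password");
             BufferedReader csv = new BufferedReader(new FileReader("data.csv"))) {

            conn.setAutoCommit(false);
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO my_table (col_a, col_b) VALUES (?, ?)")) {
                String line;
                int count = 0;
                while ((line = csv.readLine()) != null) {
                    // Naive split; use a CSV library if fields can contain commas.
                    String[] fields = line.split(",");
                    ps.setString(1, fields[0]);
                    ps.setString(2, fields[1]);
                    ps.addBatch();
                    if (++count % BATCH_SIZE == 0) {
                        ps.executeBatch();
                        conn.commit(); // each committed batch can be retried independently
                    }
                }
                ps.executeBatch(); // flush the final partial batch
                conn.commit();
            }
        }
    }
}
```

This is slower than COPY but lets you log which batch failed, fix those rows, and rerun just that chunk.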