PostgreSQL – How to load 100k+ records into a Postgres database from a CSV file

Tags: bulkload, postgresql

Our project needs to load 100k+ rows into a Postgres table. We have a large .csv file containing all of these rows, and we need to load the data efficiently and quickly — ideally in batches, so that any batch that fails to load can be revisited, corrected, and loaded again.

Our project is a Java application, and we initially considered JDBC, but reading each row and committing it to the database individually slows the application down considerably.
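To make the batch idea above concrete, here is a minimal sketch of how we imagine splitting the CSV rows into fixed-size batches before loading, so a failed batch can be corrected and retried on its own (the class and method names are hypothetical, not from our actual code):

```java
import java.util.ArrayList;
import java.util.List;

public class CsvBatcher {

    // Split the parsed CSV rows into fixed-size batches. Each batch can then
    // be inserted in one transaction; if a batch fails, only that batch needs
    // to be corrected and reloaded, not the whole file.
    public static List<List<String>> partitionIntoBatches(List<String> rows, int batchSize) {
        List<List<String>> batches = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += batchSize) {
            int end = Math.min(i + batchSize, rows.size());
            // Copy the sublist so each batch is independent of the source list.
            batches.add(new ArrayList<>(rows.subList(i, end)));
        }
        return batches;
    }
}
```

Each batch would then be written with a single commit (for example via JDBC's `PreparedStatement.addBatch()`/`executeBatch()`), but we are open to entirely different approaches as well.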

If you know an effective approach for getting this done, please suggest it.