PostgreSQL – Can multiple cores be used by PostgreSQL to run code against sections (cursors) of a table, effecting parallelism?

parallelism postgresql

Let's say I have a table with 1 million records, each uniquely identifiable by id. If the table were divided into 10 cursors, a web client made 10 requests, one per cursor, each changing the last-name field to upper case, and 10 cores were available, would the work get done 10x faster, or any faster at all, compared to processing the entire table in a single straight for .. loop?

I know PostgreSQL can't run parallel queries that make changes. I'm trying to get around that with application code instead. Note that these records are independent: every change is strictly confined to its own record.
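A rough sketch of the kind of per-section processing I mean, with hypothetical table and column names (people, id, last_name); each of the 10 clients would run one of these over its own id range:

    -- One of the 10 "sections": a PL/pgSQL loop over an id range,
    -- uppercasing last_name row by row (names are illustrative only).
    DO $$
    DECLARE
        rec record;
    BEGIN
        FOR rec IN
            SELECT id
            FROM   people
            WHERE  id >= 1 AND id < 100001   -- this client's slice of the table
        LOOP
            UPDATE people
            SET    last_name = upper(last_name)
            WHERE  id = rec.id;
        END LOOP;
    END
    $$;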

Best Answer

Actually, PostgreSQL 9.6 can run parallel (read) queries. Parallel writes aren't supported yet.
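For example, on 9.6 a read-only aggregate over a large table can use a parallel sequential scan; the people table below is a placeholder, but the setting is a real 9.6 parameter:

    -- Allow up to 4 parallel workers per Gather node.
    SET max_parallel_workers_per_gather = 4;

    EXPLAIN (ANALYZE)
    SELECT count(*)
    FROM   people
    WHERE  last_name IS NOT NULL;
    -- If the planner judges the table large enough, the plan shows a
    -- "Gather" node with "Workers Planned/Launched".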

The approach you suggest would work, but some care would be required to keep the results consistent, since each cursor would run in a different session with a different snapshot of the database.
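If you do want every session to work from the same snapshot, one option (a sketch, not strictly needed when the rows are fully independent) is to export a snapshot from one session and import it in the others, the same mechanism pg_dump --jobs relies on:

    -- Session 1: open a transaction and export its snapshot.
    BEGIN TRANSACTION ISOLATION LEVEL REPEATABLE READ;
    SELECT pg_export_snapshot();   -- returns an identifier, e.g. '00000003-0000001B-1'

    -- Sessions 2..10: adopt that snapshot before running any query,
    -- while session 1's transaction is still open.
    BEGIN TRANSACTION ISOLATION LEVEL REPEATABLE READ;
    SET TRANSACTION SNAPSHOT '00000003-0000001B-1';  -- value copied from session 1

Note that the imported snapshot only governs what the sessions read; their writes still happen normally.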

You could instead just run 10 UPDATEs from 10 different clients. Same effect, simpler, and probably faster. Just make sure the targeted row ranges don't overlap.
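As a sketch, again with a hypothetical people table of 1,000,000 rows, each client would run one statement over its own disjoint id range:

    -- Client 1 (rows 1 .. 100000):
    UPDATE people
    SET    last_name = upper(last_name)
    WHERE  id >= 1 AND id < 100001;

    -- Client 2 (rows 100001 .. 200000):
    UPDATE people
    SET    last_name = upper(last_name)
    WHERE  id >= 100001 AND id < 200001;

    -- ... and so on up to client 10, so no two sessions ever touch the same rows.

Splitting on contiguous id ranges lets each UPDATE use an index range scan on id; splitting with something like id % 10 = n would also avoid overlap, but would force every client to scan the whole table.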