How to Improve System Liveness with Long-Running Transactions

concurrency, database-design, isolation-level, locking, transaction

Is there a recommended approach for improving application liveness when multiple reading transactions and one long-running updating transaction operate on the same database object?

For example, a transaction Ti updates several records of the [CUSTOMER] table (which can take hours), while multiple Tj transactions try to read the number of orders of each customer. I'm assuming that if Ti acquires an exclusive lock, all Tj transactions will be suspended, resulting in poor performance (low liveness).

My application is still at the design stage, and I haven't chosen a concrete DBMS yet.

My goal is that each Tj is served the latest version of [CUSTOMER] committed before Ti began, and that Ti doesn't block the Tj transactions.

Best Answer

You could switch to optimistic concurrency control (OCC). Typically, writers do not block readers, so throughput will definitely improve in the scenario you describe. Depending on how you define "liveness", though, it may be worse than at present.
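At the application level, OCC is commonly implemented with a version column: read without holding locks, then commit only if the version is unchanged. A minimal sketch, assuming a hypothetical customer table and an invented optimistic_update helper (neither comes from any particular DBMS):

```python
import sqlite3

# Hypothetical sketch of application-level optimistic concurrency
# control using a version column. The "customer" table and the
# "optimistic_update" helper are invented for illustration.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute(
    "CREATE TABLE customer (id INTEGER PRIMARY KEY, orders INTEGER, version INTEGER)"
)
conn.execute("INSERT INTO customer VALUES (1, 10, 1)")

def optimistic_update(conn, cust_id, new_orders):
    # Read the row's current version; no lock is held after this returns.
    orders, version = conn.execute(
        "SELECT orders, version FROM customer WHERE id = ?", (cust_id,)
    ).fetchone()
    # ... a long-running computation could happen here, blocking no one ...
    # Write back only if no concurrent writer advanced the version.
    cur = conn.execute(
        "UPDATE customer SET orders = ?, version = version + 1 "
        "WHERE id = ? AND version = ?",
        (new_orders, cust_id, version),
    )
    return cur.rowcount == 1  # False signals a conflict: retry or abort

ok = optimistic_update(conn, 1, 11)
print(ok)  # True: no conflicting write intervened

# A writer still holding the stale version 1 now fails to update,
# because the row has already moved on to version 2.
stale = conn.execute("UPDATE customer SET orders = 99 WHERE id = 1 AND version = 1")
print(stale.rowcount)  # 0
```

The "worse liveness" caveat shows up here: a long-running Ti that loses the version check must redo hours of work, even though no reader was ever blocked.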

OCC gives each transaction a snapshot of the data. Typically this snapshot reflects the values committed at the point the transaction started, though the details vary between systems. Consequently, a transaction may not read the most recently committed value, since that value may have been written after the transaction started. This is a trade-off between latency and correctness; choose whichever is preferable for your scenario.