I'm about to design a database for a financial institution, and the final normalization pass shows that all of the bank's transactions need to go into a single table.
Based on our estimates of several thousand transactions per day, this table would reach a few million entries within the next few months.
1) Will querying and writing to this table become very slow?
2) What is the best way to handle this situation?
Best Answer
I have seen a billion-row table hum along fine.
I have seen a thousand-row table cause the bones of the server to creak and groan.
It all depends on the queries.
"several thousand transactions per day" -- No Problem. "several thousand transactions per second" -- This gets somewhat challenging, but probably doable.
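To illustrate the "it all depends on the queries" point, here is a small sketch using SQLite (the table and column names are my own assumptions, not the asker's schema): with a composite index that matches the access pattern, a lookup probes only the matching rows, so the cost barely grows with total table size.

```python
import sqlite3

# Hypothetical transactions table; names and types are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE transactions (
        id INTEGER PRIMARY KEY,
        account_id INTEGER NOT NULL,
        tx_date TEXT NOT NULL,
        amount REAL NOT NULL
    )
""")
conn.executemany(
    "INSERT INTO transactions (account_id, tx_date, amount) VALUES (?, ?, ?)",
    [(i % 1000, f"2024-01-{i % 28 + 1:02d}", i * 0.01) for i in range(10000)],
)
# Composite index matching the common "per-account history" lookup.
conn.execute("CREATE INDEX idx_acct_date ON transactions (account_id, tx_date)")

# EXPLAIN QUERY PLAN shows the planner using the index: it searches a
# narrow slice of rows instead of scanning the whole table.
plan = conn.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT SUM(amount) FROM transactions "
    "WHERE account_id = ? AND tx_date >= ?",
    (42, "2024-01-15"),
).fetchall()
print(plan)
```

The same reasoning carries over to MySQL/InnoDB: an unindexed query forces a full scan of millions of rows, while an index keyed to the WHERE clause keeps the work proportional to the rows actually returned.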
What to do? Share your SHOW CREATE TABLE and the main queries you will run, so the indexes can be checked against the actual access patterns. And avoid full-table scans, such as an unfiltered COUNT(*), on a table this size.
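For reference, here is a sketch of the kind of SHOW CREATE TABLE output that makes such a review possible. All names, types, and indexes below are assumptions for illustration, not the asker's actual schema:

```sql
-- Illustrative only: names, types, and indexes are assumptions.
CREATE TABLE transactions (
    id BIGINT UNSIGNED NOT NULL AUTO_INCREMENT,
    account_id BIGINT UNSIGNED NOT NULL,
    tx_date DATE NOT NULL,
    tx_type ENUM('debit','credit') NOT NULL,
    amount DECIMAL(13,2) NOT NULL,
    PRIMARY KEY (id),
    -- Composite index for the common "per-account history" lookup
    INDEX idx_acct_date (account_id, tx_date)
) ENGINE=InnoDB;
```

With the DDL and the queries in hand, one can judge whether each frequent query is covered by an index, which is what actually decides performance here, not the raw row count.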
If this is really a financial institution, be sure to hire a security consultant. You do not want the hassle you will get if the data is lost/stolen.