A lot of data in the table is only accessed for 4 days or so, how to handle it afterwards

oracle, table

We have a table that accumulates messages from local air traffic. It gets approximately 5k-20k rows a day, and those rows are only actively accessed for a week at most. Afterwards the data is just a log kept in case of some emergency (like an inspection) and is basically a very dead chunk of data. Like, archaeologically dead.

What is the optimal way to deal with it? Our "DB guy" argues that "it's all indexed" and there is no harm in keeping millions upon millions of accumulated unused records, but I am not really convinced. Is it really a good idea to let the table grow without limit?

UPD: We are using Oracle 11. The access patterns are like this:

1) A couple of applications keep track of newly added rows, querying on id > lastId plus an indexed TM (timestamp) interval covering the last day, each roughly 2-3 times a minute

2) A couple of rows per minute are added to this table and then actively parsed into other tables

3) Every time a user selects a message or runs a search (I'd guess 10-20 times a minute per running app), a query hits this table with a custom WHERE clause on roughly 10 indexed fields.
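For concreteness, the polling pattern in (1) might look something like the sketch below; the table and column names (messages, id, tm, payload) are assumptions, not taken from the actual schema:

```sql
-- Hypothetical shape of the polling query from access pattern (1).
-- Names are illustrative; :lastId is a bind variable the app remembers
-- between polls.
SELECT id, tm, payload
  FROM messages
 WHERE id > :lastId              -- only rows added since the last poll
   AND tm >= SYSDATE - 1        -- indexed one-day TM window
 ORDER BY id;
```

Run a few times a minute per application, this query only ever touches the newest sliver of the table, which is exactly the kind of workload that partition pruning (below) helps with.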

Best Answer

Yes, there is a cost to having lots of cold data in actively-queried tables. One obvious case is that bigger tables have deeper indexes (more levels between the root and the leaf blocks), so every lookup needs more I/O to read and every insert more I/O to write. An index that fits in three levels at a few million rows may need four at hundreds of millions, adding an extra block read to every probe.

I'd suggest you use partitioning. Keep the active partitions in the current table and swap the older ones out. Ideally a scheduled maintenance task would then move the historical data to a separate archive database. Make it clear to your external users where the time cut-off is and let them query either store accordingly.
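As a sketch of what this could look like on Oracle 11g, interval partitioning on the timestamp column creates new partitions automatically, and EXCHANGE PARTITION lets a maintenance job swap a cold partition out as an ordinary table for archiving. All names here (messages, tm, messages_archive_stage) are assumptions, and the exact maintenance workflow would need to fit your archival process:

```sql
-- Sketch only: interval-partition the message table by day's worth
-- of TM values, one partition per month. Oracle creates partitions
-- on demand as rows arrive.
CREATE TABLE messages (
  id      NUMBER          NOT NULL,
  tm      DATE            NOT NULL,
  payload VARCHAR2(4000)
)
PARTITION BY RANGE (tm)
INTERVAL (NUMTOYMINTERVAL(1, 'MONTH'))
(
  PARTITION p_initial VALUES LESS THAN (DATE '2013-01-01')
);

-- Create supporting indexes LOCAL so each partition carries its own
-- small index tree instead of one ever-deepening global index.
CREATE INDEX messages_tm_ix ON messages (tm) LOCAL;

-- Periodic maintenance (hypothetical): swap a cold partition out into
-- a staging table with the same structure, archive it elsewhere, then
-- drop it from the live table.
ALTER TABLE messages
  EXCHANGE PARTITION FOR (DATE '2013-01-15')
  WITH TABLE messages_archive_stage;
```

With this layout, queries constrained to the recent TM window are pruned to the newest partition, so the hot working set stays small no matter how much history accumulates.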