I'm working with a Rails app backed by a MySQL database. One of the more frequently used models sits on a table with over 200,000,000 rows (and growing). We've known for a while that the way it's set up isn't scalable.
The feature I'm working on at the moment requires me to add a couple of new columns to that table, as well as a compound index. My coworker did a test run on her local dev machine, and it took about an hour per column (she hadn't even tested adding the index yet).
We're considering a different approach: building a new table with the needed columns, dumping the data from the old table into the new one, then pointing the Rails model at the new table. It feels like a short-term fix for a large looming problem, but I'm curious whether anyone has insight on how to navigate this.
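For context, here's a rough sketch of that swap in plain SQL. All names are hypothetical (table1 for the big table, col1/col2 for the new columns, col1 + col5 for the index, id and col5 standing in for existing columns), and note that any rows written to the old table during the copy would be lost unless the app is paused or those rows are backfilled afterward:

CREATE TABLE table1_new LIKE table1;

-- Add the new columns and the compound index while the table is still
-- empty; altering an empty table is effectively instant.
ALTER TABLE table1_new
  ADD col1 INT,
  ADD col2 CHAR(2),
  ADD INDEX (col1, col5);

-- Copy the existing rows. Listing the original columns explicitly lets
-- the two new columns simply take their defaults (NULL here).
INSERT INTO table1_new (id, col5 /* , ...the rest of the original columns */)
  SELECT id, col5 /* , ... */ FROM table1;

-- Atomic swap; table1_old stays around until the new table is verified.
RENAME TABLE table1 TO table1_old, table1_new TO table1;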
We've also talked about archiving the table's older records. It's a table that gets written to and read from a lot, but only for recent rows (the app probably never needs to do anything with records more than a month old).
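One rough way to do that archiving in SQL, assuming a created_at timestamp column and the same hypothetical table1 name; on a live table you'd want to move rows in bounded batches (DELETE ... LIMIT) rather than in one giant transaction:

CREATE TABLE table1_archive LIKE table1;

-- Fix the cutoff once, so the INSERT and the DELETE see the same rows.
SET @cutoff = NOW() - INTERVAL 30 DAY;

INSERT INTO table1_archive
  SELECT * FROM table1 WHERE created_at < @cutoff;

DELETE FROM table1 WHERE created_at < @cutoff;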
Best Answer
Do all of the changes in a single statement, so MySQL rebuilds the 200M-row table once instead of once per change:

ALTER TABLE table1
  ADD col1 INT,
  ADD col2 CHAR(2),
  ADD INDEX (col1, col5);

More about ALTER TABLE in the MySQL manual.