I'm backing up production data locally to perform experiments and investigate production bugs.
I need only a handful of tables from the production DB, and I'm using a generic mysqldump,
but the backup-and-restore cycle now takes around 20 minutes.
My question is: is there a way to incrementally update the data I already have with whatever has appeared on production since the last time I dumped the selected tables?
Best Answer
You have a few options to consider, but first: I hope you're dumping from a replica rather than from the production server itself, since long dumps can hurt production.
You can dump only the rows modified since the last backup, assuming the tables have a modified-date column:

mysqldump --where='modified_date_column >= "last_backup_datetime"' XYZ table1 table2 table3 > tables.sql
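Wrapped in a small script, that incremental-dump idea might look like this. This is only a sketch: the `modified_at` column, the state file, and the table names are assumptions, and the dump command is echoed as a dry run rather than executed.

```shell
#!/bin/sh
# Incremental dump sketch. Assumes each table has an indexed
# `modified_at` column; database and table names are placeholders.
STATE_FILE=last_dump.ts

# Cut-off: timestamp of the previous run, or the epoch on the first run.
SINCE=$(cat "$STATE_FILE" 2>/dev/null || echo '1970-01-01 00:00:00')

# Build the dump command (printed as a dry run, not executed).
CMD="mysqldump --single-transaction --where=\"modified_at >= '$SINCE'\" XYZ table1 table2 table3"
echo "$CMD > tables.sql"

# Remember the cut-off for the next incremental run.
date '+%Y-%m-%d %H:%M:%S' > "$STATE_FILE"
```

On each run, only rows changed since the recorded timestamp are selected, which is what keeps the dump small.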
How about setting up a replica that replicates only the tables of your choice?
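On the replica side, that filtering can be done with replicate-do-table rules in my.cnf. A sketch, with placeholder database and table names:

```ini
# Replica-side filter: only changes to the listed tables are applied.
[mysqld]
server-id          = 2
replicate-do-table = mydb.table1
replicate-do-table = mydb.table2
replicate-do-table = mydb.table3
```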
Did you know about mydumper/myloader? It can take dumps faster than mysqldump.
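A mydumper/myloader run could look roughly like this. Hosts, users, paths, and table names are all placeholders, and the commands are printed as a dry run rather than executed:

```shell
#!/bin/sh
# Parallel dump with mydumper and parallel restore with myloader.
# Dry run: the commands are printed, not executed; names are placeholders.
DUMP_CMD="mydumper --host prod-replica --user backup \
--database mydb --tables-list mydb.table1,mydb.table2,mydb.table3 \
--threads 4 --outputdir /backups/mydb"

LOAD_CMD="myloader --host dev-box --user dev \
--directory /backups/mydb --threads 4 --overwrite-tables"

printf '%s\n' "$DUMP_CMD" "$LOAD_CMD"
```

The speedup comes from dumping and loading tables in parallel threads instead of mysqldump's single stream.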
Are you backing up your data daily? If so, why not set up an automatic nightly restore process that refreshes your dev environment with the latest backup?
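A nightly refresh could be a cron job along these lines. The schedule, paths, and database name are assumptions, and the restore command is echoed as a dry run:

```shell
#!/bin/sh
# Nightly dev-refresh sketch. Example crontab entry (placeholder path):
#   30 3 * * * /usr/local/bin/refresh_dev.sh

# Pick the newest dump in the backup directory (placeholder path).
LATEST=$(ls -t /backups/mydb/*.sql 2>/dev/null | head -n 1)

# Print the restore command as a dry run; fall back to a placeholder
# file name when no dump is found.
RESTORE_CMD="mysql devdb < ${LATEST:-/backups/mydb/latest.sql}"
echo "$RESTORE_CMD"
```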
You may also choose to replay the binary logs from production, keeping only the database you care about:
Generate SQL from each binlog:
mysqlbinlog --database=dbofyourchoice binlog.000xxx > binlog.000xxx.sql
Replay all of those SQL files with --force so that errors (for example, from statements against tables you didn't restore) are ignored:
mysql dbofyourchoice --force < binlog.000xxx.sql
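The two binlog steps can be combined into a loop over the log files. A sketch: the binlog file names are placeholders, and the commands are collected and printed as a dry run rather than executed.

```shell
#!/bin/sh
# Replay each binlog for one database, ignoring errors from statements
# that touch tables you didn't restore. Dry run: commands are printed.
CMDS=""
for LOG in binlog.000101 binlog.000102; do   # placeholder file names
  CMDS="$CMDS
mysqlbinlog --database=dbofyourchoice $LOG > $LOG.sql
mysql --force dbofyourchoice < $LOG.sql"
done
printf '%s\n' "$CMDS"
```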