MySQL – (Partially) clone a huge DB for DEV purposes

backup, MySQL, mysqldump

I want to set up a dev and test environment for a website that has a huge MySQL database. Since this DB is so big, it is quite inconvenient to clone it 1-to-1, because it simply takes too long and uses too much storage.

My idea was to only dump / copy certain records matching some SQL. For example, only the last 1000 users and the last 1000 products in the DB, assuming that much data is enough for testing and development.

My questions are:

  1. How do I best go about this, keeping in mind that the DB is live at the time of the dump (referential integrity) and must not go offline? (See the sketch after this list.)
  2. The data is complex and has many joins, which makes the insert into … select … statements somewhat involved, but still possible.
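
For question 1, assuming the tables are InnoDB, mysqldump can take its snapshot inside a single transaction so the live site keeps running during the dump. A minimal sketch; the host, user, and schema names are placeholders:

```bash
# Consistent, non-blocking dump of a live InnoDB database.
# --single-transaction reads everything from one REPEATABLE READ snapshot
# instead of locking the tables, so the site stays online while it runs;
# --quick streams rows instead of buffering whole tables in memory.
mysqldump --single-transaction --quick \
    -h prod-db.example.com -u dump_user -p \
    myapp > myapp-snapshot.sql
```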

My idea was to use mysqldump to set up only the structure (no data), then use a long list of insert into … select … statements to fill the tables one by one.
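
A minimal sketch of that plan. It assumes the partial copy lives as a second schema (here myapp_dev) on a server that can also see the live schema, since INSERT … SELECT cannot cross servers on its own; all names and the 1000-row cut-off are placeholders:

```bash
# 1) Dump the structure only (no rows) and load it into the dev schema.
mysqldump --no-data --routines --triggers -u dump_user -p myapp > schema.sql
mysql -u dump_user -p myapp_dev < schema.sql

# 2) Fill the tables one by one with a recent slice of the live data.
mysql -u dump_user -p <<'SQL'
INSERT INTO myapp_dev.users
SELECT * FROM myapp.users ORDER BY id DESC LIMIT 1000;

INSERT INTO myapp_dev.products
SELECT * FROM myapp.products ORDER BY id DESC LIMIT 1000;
SQL
```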

The only issues I am uncertain about are referential integrity and processing the tables in the right order. Or maybe there is a better way altogether?
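
One way to sidestep the ordering problem on the dev copy is to disable foreign key checks for the session doing the load, and to pull in exactly the parent rows the copied child rows refer to. A sketch, reusing the placeholder names from above (the orders table and its user_id column are assumed for illustration):

```bash
# Loading child tables before their parents normally fails FK checks;
# either load in topological (parents-first) order, or switch the checks
# off for this session only.
mysql -u dump_user -p <<'SQL'
SET FOREIGN_KEY_CHECKS = 0;

INSERT INTO myapp_dev.orders
SELECT * FROM myapp.orders ORDER BY id DESC LIMIT 1000;

-- Copy exactly the users the copied orders reference, so the slice
-- stays internally consistent once the checks come back on.
INSERT INTO myapp_dev.users
SELECT u.* FROM myapp.users u
WHERE u.id IN (SELECT user_id FROM myapp_dev.orders);

SET FOREIGN_KEY_CHECKS = 1;
SQL
```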

The DEV environment is my personal laptop, so not a server with huge disks.

Best Answer

Answers originally left in comments

A test environment should probably have a full copy and could be populated from the backups (this also double-checks that the backups are working correctly). – danblack
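
As a sketch of that suggestion, restoring the most recent regular backup into the test instance doubles as a restore test; the file path and schema name are placeholders:

```bash
# Restoring last night's backup into the test schema also proves the
# backup is actually usable. Path and schema name are placeholders.
mysql -u root -p myapp_test < /backups/myapp-nightly.sql
```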

My idea was to use mysqldump to set up only the structure (no data), then use a long list of insert into … select … statements to fill the tables one by one.

Use SELECT … INTO OUTFILE and LOAD DATA INFILE instead of building INSERT … SELECT statements. Plus processing the tables in the right order, of course. – Akina
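
A sketch of that approach. Note that SELECT … INTO OUTFILE writes to the server's filesystem (it needs the FILE privilege and is restricted by the secure_file_priv setting), so the file has to be copied to the laptop before loading; all names and paths are placeholders:

```bash
# On the live server: export a slice of one table as a tab-separated file.
# The file lands on the *server* host, inside the secure_file_priv
# directory; the FILE privilege is required.
mysql -u dump_user -p myapp -e "
  SELECT * INTO OUTFILE '/var/lib/mysql-files/users.tsv'
  FROM users ORDER BY id DESC LIMIT 1000;"

# Copy users.tsv to the laptop, then load it into the dev table.
# LOCAL makes the client read the file; the server must allow it
# (local_infile=1).
mysql --local-infile=1 -u root -p myapp_dev -e "
  LOAD DATA LOCAL INFILE '/tmp/users.tsv' INTO TABLE users;"
```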