MySQL – MySQLDump problems with large table

Tags: backup, MySQL, mysqldump

I have a MySQL database that is 10.6GB – that has both MyISAM & InnoDB tables.
One particular table (called 'responses') is 8GB – this table is InnoDB.

The live database is replicated to a different server & I take backups using MySQLDump from the replication server… the live server runs Windows, but the replication server runs CentOS.

The MySQLDump command is:

mysqldump --verbose --lock-tables=true --max-allowed-packet=1024M \
  --host=192.168.1.182 --user=myusername --password=mypassword \
  --opt --databases databasename > databasename.sql

The command runs until about half-way through the 'responses' table, when we get:
Error 2013 – Lost connection to MySQL server during query when dumping table responses at row 12891212

After some Googling, I tried adding the following options to the mysqldump command (independently & both at the same time):

--net-buffer-length=32704

and

--skip-extended-insert
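
For reference, with both options added the command looked roughly like this:

mysqldump --verbose --lock-tables=true --max-allowed-packet=1024M \
  --net-buffer-length=32704 --skip-extended-insert \
  --host=192.168.1.182 --user=myusername --password=mypassword \
  --opt --databases databasename > databasename.sql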

The error still occurred at the same place.

For completeness, I've tried dumping the database in NaviCat and MySQL Workbench – both return the same error.

So, I wondered if it was an issue with corruption. I tried to run the following
(on the replication server):

CHECK TABLE responses;

and

CHECK TABLE responses EXTENDED;

Both returned the following:
Error 2013 – Lost connection to MySQL server during query.

The live database is VERY busy, so I'm a bit worried… I started CHECK TABLE on the live table, but unsurprisingly it brought the application to a halt, so I had to cancel it.

Any advice would be welcome here… the ultimate goal is to get the backups using MySQLDump working.

Thanks

Update: I tried copying the table (on the replication server) to a new database on another server using NaviCat – it failed with the same message (transferred records: 12,889,214):
Error 2013 – Lost connection to MySQL server during query

Best Answer

Your problem stems from the network. In this instance, it is not the size of the buffer used for data transmission over the network; it is the length of time the connection is kept alive while that transmission happens.

Please note these options:

  • net_read_timeout (Default 30) : The number of seconds to wait for more data from a connection before aborting the read. When the server is reading from the client, net_read_timeout is the timeout value controlling when to abort. When the server is writing to the client, net_write_timeout is the timeout value controlling when to abort. See also slave_net_timeout.
  • net_write_timeout (Default 60) : The number of seconds to wait for a block to be written to a connection before aborting the write. See also net_read_timeout.
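
Before changing anything, it is worth checking what both servers currently have for these two settings (run this on the source and the target):

mysql> SHOW GLOBAL VARIABLES LIKE 'net_%timeout';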

You have a busy server and, at some point, the read that mysqldump needs in order to continue times out.

What I have done in the past is run the following on both source and target servers:

mysql> SET GLOBAL net_read_timeout = 31536000;
mysql> SET GLOBAL net_write_timeout = 31536000; 

Then, go run the mysqldump.

Setting these values very high forces the mysqldump DB connection to wait a long time for data to be shipped over. In case you are wondering, the number 31536000 is the number of seconds in a year (365 * 24 * 60 * 60). The mysqldump DB connection will just have to stay around and wait, even on your busy server, until the data comes (eventually). Please see my post from Jan 12, 2012: Client times out, while MySQL query remains running?
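
If you also want the new values to survive a restart of the replication server, the same variables can be set in my.cnf (a sketch; the exact option file location depends on your install):

[mysqld]
net_read_timeout  = 31536000
net_write_timeout = 31536000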

GIVE IT A TRY !!!

UPDATE 2018-01-04 09:53 EST

I noticed your comment just now

Please note the error message you got from the mysqldump:

Error 2013 - Lost connection to MySQL server during query when dumping table responses at row 12891212

Please take note of the row number 12891212. If that row number comes back in the error message every single time your mysqldump fails, then maybe you can suspect data corruption. However, if you get a different row number with every mysqldump attempt, this suggests the mysqldump connection is able to move along the large table and just times out at some random point.
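
A quick way to test this, a sketch reusing the host and credentials from your question, is to dump just the responses table a few times and watch whether the reported row number changes between attempts:

mysqldump --host=192.168.1.182 --user=myusername --password=mypassword \
  --opt databasename responses > responses_only.sql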

Please go set those values and try the mysqldump again. Then, come back and tell us what happened.