I want to update a large number of rows (more than 80k) in an existing MySQL table from an external data source (a file). The table uses InnoDB.
Database structure:
CREATE TABLE IF NOT EXISTS `jobs` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`job_date` date DEFAULT NULL,
`client_code` varchar(100) DEFAULT NULL,
`department` varchar(100) DEFAULT NULL,
`job_order` float(10,2) DEFAULT NULL,
`completion_date` date DEFAULT NULL,
`achivement_date` date DEFAULT NULL,
`achivement_value` float(10,2) DEFAULT NULL,
`current_status` varchar(100) DEFAULT NULL,
`supervisor` varchar(100) DEFAULT NULL,
`job_no` varchar(50) DEFAULT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1 AUTO_INCREMENT=1
Update query:
UPDATE jobs
SET job_date='$job_date',
    client_code='$value[1]',
    department='$value[2]',
    job_order='$job_order',
    completion_date='$completion_date',
    achivement_date='$achivement_date',
    achivement_value='$achivement_value',
    current_status='$value[7]',
    supervisor='$value[8]'
WHERE job_no='$value[9]'
How can I optimize this query?
Best Answer
If job_no is unique, this will be much faster (add a UNIQUE index to job_no). Even if it is not unique, the query can still be optimized by adding a plain index on job_no:
ALTER TABLE jobs ADD INDEX ind_job_no (job_no) USING BTREE;
With this index, MySQL can locate the matching rows by job_no instead of scanning the entire table for every statement, so each UPDATE runs much faster.
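Beyond the index, two further steps usually help with a bulk load like this: verifying that the optimizer actually uses the new index, and committing once per batch instead of once per row (InnoDB's default autocommit flushes the log on every statement). A sketch, where the job_no value 'J-0001' is a placeholder, not data from the question:

```sql
-- Check that the index is used: the "key" column of the plan
-- should show ind_job_no rather than NULL.
EXPLAIN SELECT id FROM jobs WHERE job_no = 'J-0001';

-- Group many UPDATEs into one transaction so InnoDB commits
-- (and flushes the redo log) once per batch, not per row.
START TRANSACTION;
UPDATE jobs SET current_status = 'done' WHERE job_no = 'J-0001';
UPDATE jobs SET current_status = 'done' WHERE job_no = 'J-0002';
-- ... remaining rows from the file ...
COMMIT;
```

Separately, note that the original query interpolates PHP variables directly into the SQL string; using a prepared statement with bound parameters would avoid both SQL-injection and quoting problems, and lets the server reuse the parsed statement across all 80k executions.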