MySQL – How to optimise read/write performance of a MySQL table with many rows

mysql, mysql-5.1, performance, query-performance

I am developing a Django application that uses a MySQL database. The application has one table for domain information, which has only two columns:

id
domain_name

The problem started after I imported almost 14,150,000 rows into that table. I have not done any optimisation yet (I am using the default settings), and I may have to import another 14,123,500 records.

Right now I am not able to read that data in my application.

Which steps should I follow to make read and search operations faster?

Best Answer

Since you need faster read and search operations, you can use MyISAM as the table engine, provided the table will not see heavy writes in the future.
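
If the table already exists under another engine such as InnoDB, it can be converted in place. A minimal sketch (note that the ALTER rebuilds the whole table and locks it while it runs, which can take a long time at 14 million rows):

mysql> ALTER TABLE domain ENGINE=MyISAM;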

When you use the MyISAM engine for this table, you need to set key_buffer_size to an appropriate value, depending on the index size of your MyISAM tables and the amount of available RAM.
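
To judge whether key_buffer_size is large enough once the server has been running for a while, you can compare index read requests with the reads that actually hit disk. These are standard MySQL status counters, so this is just a sketch of the check:

mysql> SHOW VARIABLES LIKE 'key_buffer_size';
mysql> SHOW GLOBAL STATUS LIKE 'Key_read%';

If Key_reads is more than a small fraction (roughly 1%) of Key_read_requests, the key buffer is too small for your indexes.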

Since the table has only two columns, id and domain_name, you can add a FULLTEXT index on domain_name to make searches on that column fast.

Here is the structure I created:

mysql> show create table domain\G
*************************** 1. row ***************************
       Table: domain
Create Table: CREATE TABLE `domain` (
  `id` int(10) NOT NULL AUTO_INCREMENT,
  `domain_name` varchar(255) DEFAULT NULL,
  PRIMARY KEY (`id`),
  FULLTEXT KEY `domain_name` (`domain_name`)
) ENGINE=MyISAM DEFAULT CHARSET=latin1
1 row in set (0.00 sec)
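
Keep in mind that a FULLTEXT index is only used through MATCH ... AGAINST; a plain LIKE '%...%' will not use it. A sketch of a search against this structure ('example' is just a sample search term):

mysql> SELECT id, domain_name
    ->   FROM domain
    ->  WHERE MATCH(domain_name) AGAINST('example');

Also note that, by default, MyISAM full-text indexing ignores words shorter than ft_min_word_len (4 characters), which can matter for short domain labels.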

Restore the data into the table, find the index_length of the MyISAM table, and set key_buffer_size accordingly.
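
A sketch of both steps; the 256 MB figure below is only a placeholder, to be replaced with a value that fits your actual index_length and available RAM:

-- Index_length (in bytes) appears in the table status output:
mysql> SHOW TABLE STATUS LIKE 'domain'\G

-- Resize the buffer at runtime (268435456 bytes = 256 MB):
mysql> SET GLOBAL key_buffer_size = 268435456;

To make the setting survive a restart, also add key_buffer_size = 256M under the [mysqld] section of my.cnf.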

See How to set Key Buffer Size.