Mysql – How would you optimize this 2 million records table or queries

memory-optimized-tables, MySQL, optimization, performance, query-performance

I have a relatively large InnoDB table that contains domain names; I want to track when they expire, their traffic rank, the owner's email, and many attributes like that.

Currently I have around 2 million records, I intend to keep it under 5 million.

I wonder if there is anything else I can do to optimize it, currently a query like this takes around 4-8 seconds:

SELECT * 
FROM domains 
WHERE domain_whois_date > 0 
ORDER BY domain_whois_status asc 
LIMIT 500
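One quick way to see why this query is slow is to ask MySQL for its execution plan (table and column names below are taken from the query above). Without a useful index, the plan typically shows a full table scan plus "Using filesort":

```sql
-- Inspect the execution plan for the slow query from the question
EXPLAIN
SELECT *
FROM domains
WHERE domain_whois_date > 0
ORDER BY domain_whois_status ASC
LIMIT 500;
```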

I assume the memory- and caching-related options I set in my.cnf work decently, because if I quickly run the same query again it loads in a few ms. But the table is updated very often, so most queries are not served from the cache and take longer (as I understand it).

So far my best solution is to create a memory-based table, a mirror of domains called domains_cache, and run searches against that; but it seems to crash often, I think because this memory table takes almost 100% of the RAM.
Is this the right approach?
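For reference, the memory-table approach described above usually looks something like this (a sketch; the actual table definition is not in the question). It also explains the size blow-up: the MEMORY engine uses fixed-length rows, so VARCHAR columns are padded to their maximum width, and the in-memory copy can be several times larger than the InnoDB original.

```sql
-- Sketch of the domains_cache mirror (hypothetical; MEMORY tables
-- also cannot hold BLOB/TEXT columns, so those would need trimming)
CREATE TABLE domains_cache ENGINE=MEMORY
  SELECT * FROM domains;
```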

The main InnoDB table takes around 500 MB; the memory table with the same data uses around 2 GB.
Running on a CentOS virtual machine with 2 cores and 4 GB of RAM.
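With a 500 MB table on a 4 GB host, one common my.cnf adjustment is to make sure the InnoDB buffer pool can hold the whole table plus its indexes, so repeated queries hit memory rather than disk. The value below is illustrative only, not taken from the question:

```
# my.cnf — illustrative sizing for a 4 GB host; leave headroom for the OS
[mysqld]
innodb_buffer_pool_size = 1G
```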


Best Answer

In any SQL query, the logical order of evaluation is the following:

first, select the target records:

WHERE domain_whois_date > 0 

then sort the result:

ORDER BY domain_whois_status asc 

and only after that, take some number of records to return to the client:

LIMIT 500

So, in a table with 2M records, the big majority of them will have a Unix date > 0 (not many domains were registered before 1/1/1970), and you are sorting all of these records without an index.

You can create either

an index on (domain_whois_date, domain_whois_status)

or at least an index on

domain_whois_status

If most of your domain_whois_date values are positive, the WHERE clause filters out almost nothing, so an index on domain_whois_status alone lets MySQL read rows in sorted order and stop after 500.
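The two options above could be created like this (index names here are made up for illustration):

```sql
-- Option 1: composite index on the filter and sort columns
ALTER TABLE domains
  ADD INDEX idx_whois_date_status (domain_whois_date, domain_whois_status);

-- Option 2: index on the sort column alone, which can satisfy
-- ORDER BY domain_whois_status ASC ... LIMIT 500 directly when the
-- WHERE clause matches nearly every row
ALTER TABLE domains
  ADD INDEX idx_whois_status (domain_whois_status);
```

Note that on a 2M-row table, adding an index takes time and disk space, so it is worth trying on a copy of the table first.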