I have a relatively large InnoDB table that contains domain names; I want to track when they expire, their traffic rank, owner email, and other parameters like that.
Currently I have around 2 million records, and I intend to keep it under 5 million.
I wonder if there is anything else I can do to optimize it; currently a query like this takes around 4-8 seconds:
SELECT *
FROM domains
WHERE domain_whois_date > 0
ORDER BY domain_whois_status asc
LIMIT 500
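Running EXPLAIN on that query should show whether an index is being used at all (a diagnostic sketch):

```sql
-- Diagnostic: inspect the query plan. "type: ALL" together with
-- "Using filesort" would indicate a full table scan followed by a
-- sort of every matching row.
EXPLAIN
SELECT *
FROM domains
WHERE domain_whois_date > 0
ORDER BY domain_whois_status ASC
LIMIT 500;
```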
I assume the memory- and caching-related options I set in my.cnf are working decently, because if I quickly run the same query again it loads in a few ms. But the table updates very often, so most queries are not cached and take longer (as I understand it).
So far my best solution is to create a memory-based mirror of domains called domains_cache and run searches against that; but it seems to crash often, I think because this memory table takes almost 100% of RAM.
Is this the right approach?
The main InnoDB table takes around 500MB; the memory table with the same data uses around 2GB.
Running on a CentOS virtual machine, 2 cores, 4GB RAM.
Best Answer
In any SQL query, the operations happen in this sequence:
first, the target records are selected (the WHERE clause);
then the result is sorted (ORDER BY);
and only after that is some number of records returned to the client (LIMIT).
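The three steps above map directly onto the query from the question:

```sql
SELECT *
FROM domains
WHERE domain_whois_date > 0        -- 1. select the target records
ORDER BY domain_whois_status ASC   -- 2. sort the entire result
LIMIT 500;                         -- 3. only then return 500 rows to the client
```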
So in a table with 2M records, a big part of them will have a Unix date > 0 (not many domains were created before 1/1/1970), and you are sorting all of those records without an index.
You can create a composite index on (domain_whois_date, domain_whois_status), or at least an index on domain_whois_status.
If most of your domain_whois_date values are positive, the index on domain_whois_status alone will do most of the work: MySQL can read rows in index order and stop after the first 500 that pass the filter, instead of sorting the whole result.
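A sketch of the suggested indexes (the index names are my own; the column names come from the query above):

```sql
-- Composite index covering both the filter and the sort column:
ALTER TABLE domains
  ADD INDEX idx_whois_date_status (domain_whois_date, domain_whois_status);

-- Or, at minimum, an index on the ORDER BY column alone. Since almost
-- all rows pass the domain_whois_date > 0 filter, MySQL can walk this
-- index in sorted order and stop after the first 500 matching rows:
ALTER TABLE domains
  ADD INDEX idx_whois_status (domain_whois_status);
```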