MySQL/MariaDB – Best Place to Store Very Large Text Fields

compression json mariadb

I have an app where a queue is processed. Each time an entry is processed, the result is stored in a separate table. The queue table gets inserts and updates; the result table gets inserts only. The result table has a text column containing the complete context of how the queue entry was handled, encoded as JSON, with an average length of 28,000 characters.

The table is now about 800k rows and 37 GB in size, and really slow to deal with when running aggregates. But this particular field with the JSON data is rarely used; it is kept there just in case there is some important data not already parsed into its own field.

Moving that field to another table is not out of the question if that is the right way to go, but it will require multiple application changes. I've moved the data from MySQL 5.1 into MariaDB 10.0.14 and put it in TokuDB, but found no benefit in speed; its compression did bring the size down by about 10 GB, though.

I've made no other significant changes to any settings. If there is any tip on how to handle this without making structural changes that would be great.

Best Answer

I'd suggest you split this large column off into its own table. While you're at it, take out any other columns that are only used once in a blue moon. (The smaller your rows, the faster your queries will run.)
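A minimal sketch of the split, assuming a surrogate primary key `id` and a JSON column named `context_json` (both hypothetical names; substitute your own). `MEDIUMTEXT` is used because a 28,000-character average will occasionally exceed the 64 KB limit of plain `TEXT`:

```sql
-- New side table holding only the rarely-used JSON blob,
-- keyed 1:1 against the main table's primary key.
CREATE TABLE MyData_Sometimes (
    id           BIGINT UNSIGNED NOT NULL PRIMARY KEY,
    context_json MEDIUMTEXT NOT NULL,
    CONSTRAINT fk_sometimes_crucial
        FOREIGN KEY (id) REFERENCES MyData_Crucial (id)
) ENGINE=TokuDB;  -- or InnoDB with ROW_FORMAT=COMPRESSED

-- Copy the blob across (after renaming the original table
-- to MyData_Crucial, as described below):
INSERT INTO MyData_Sometimes (id, context_json)
SELECT id, context_json
FROM MyData_Crucial;

-- Once the copy is verified, drop the wide column:
-- ALTER TABLE MyData_Crucial DROP COLUMN context_json;
```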

I'll assume your current table is MyData. Rename it to, say, MyData_Crucial, and call the new table MyData_Sometimes. Then create this view:

create view MyData as
select
    <all the columns>
from MyData_Crucial
inner join MyData_Sometimes
    on <primary key> = <primary key>;

Now, assuming your optimizer is at least a little bit clever, you will not need to change your application code at all.
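One caveat on how clever MariaDB's optimizer can be here: its table elimination optimization, which skips a joined table entirely when a query references none of its columns, applies only to outer joins on a unique key, not to inner joins. Since every crucial row has at most one matching row in MyData_Sometimes, a LEFT JOIN variant of the same view (same placeholders as above) is semantically equivalent and becomes eligible for elimination:

```sql
-- Queries through this view that never touch the JSON column
-- let MariaDB drop MyData_Sometimes from the plan entirely.
create or replace view MyData as
select
    <all the columns>
from MyData_Crucial
left join MyData_Sometimes
    on <primary key> = <primary key>;
```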