MySQL – Very Slow Query

mysql, mysql-5.6, mysql-5.7, performance, query-performance

Problem

Below is a query I put together in MySQL Workbench that I thought would be relatively simple. But when I run it on a larger data set (7,000+ rows), it takes over 200 seconds, uses 1 GB of RAM, and downloads data from the MySQL server at 10+ MB/s the entire time. There isn't 100 KB worth of data in the entire table, so clearly I'm creating some kind of god-awful loop here somehow. But I'm not seeing it.

What I'm doing

Effectively, this query takes the specified columns from each row, escapes single and double quotes, then concatenates them together into a JSON format exactly as needed by a third-party API. After this part of the query is done, I select the last row and return it. (Not shown here: it then inserts this data into a "cache" table for quick access by the API.)

My Query

set @RowNum := 0;
set @JSON := '';
set @RowTotal := (SELECT count(*) FROM(SELECT * FROM `Job1111`.CivilTrackerDetails GROUP BY BidID) tb);


SELECT (@RowNum := @RowNum + 1) AS ID,   
(@JSON := CONCAT(@JSON, CONCAT(
IF(@RowNum = 1, '[', ''),
'["',
ctd.BidID,
'","', 
REPLACE(REPLACE(ctd.Description, '"', '\\"'),"'", "\\'"), 
'","', 
REPLACE(REPLACE(ctd.FoundationDescription, '"', '\\"'),"'", "\\'"), 
'","', 
REPLACE(REPLACE(ctd.EngDrawingNumber, '"', '\\"'),"'", "\\'"), 
'","', 
REPLACE(REPLACE(ctd.DetailDrawingNumber, '"', '\\"'),"'", "\\'"),  
'","', 
ctd.TakeOffQuantity, 
'"]',
IF(@RowNum = @RowTotal, ']', ',')))) AS JsonData

FROM `Job1111`.CivilTrackerDetails  ctd

Question

Can you see why this is looping so badly? Do you know a better way to accomplish this task? Preferably one that keeps the data arrangement inside MySQL rather than doing it outside of MySQL.

EXPLAIN shows nothing much: just a derived ctd from a full table scan (to be expected).

I'm using version 5.6.3. I have often wondered whether some of the JSON functionality that is supposed to be available in 5.7 might help with this scenario. I have not yet looked into what is available in 5.7, though, so I would be interested in input on that as well.

Ultimately the solution to this issue will be implemented in a stored procedure.

Best Answer

First, the reason your query "loops": @JSON grows with every row, and every row of the result set returns the entire accumulated string, so with n rows the server ships roughly n²/2 row-fragments back to the client. That accounts for the 200 seconds, the memory use, and the sustained 10+ MB/s transfer. It seems to me you just need the GROUP_CONCAT aggregate function, which builds the string server-side and returns it once:

SET @JSON :=
  CONCAT(
    '[',
    (
      SELECT
        GROUP_CONCAT(
          CONCAT(
            '["', ctd.BidID,
            '","', REPLACE(REPLACE(ctd.Description,           '"', '\\"'),"'", "\\'"), 
            '","', REPLACE(REPLACE(ctd.FoundationDescription, '"', '\\"'),"'", "\\'"), 
            '","', REPLACE(REPLACE(ctd.EngDrawingNumber,      '"', '\\"'),"'", "\\'"), 
            '","', REPLACE(REPLACE(ctd.DetailDrawingNumber,   '"', '\\"'),"'", "\\'"),  
            '","', ctd.TakeOffQuantity, 
            '"]'
          )
          ORDER BY <some-column>
        )
      FROM
        `Job1111`.CivilTrackerDetails AS ctd
    ),
    ']'
  )
;

As pointed out by Rick James, applying GROUP_CONCAT to an entire table like that might exceed the default group_concat_max_len value, which is 1024 bytes in MySQL 5.7. So make sure you have set it large enough to accommodate the resulting string, for instance at the session level with a SET statement just before the string assignment, like this:

SET group_concat_max_len = 20480;  -- for example
SET @JSON := ...
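On the question's 5.7 point: from MySQL 5.7.22 the JSON aggregate functions can build this string without any manual quote escaping, since JSON_ARRAY and JSON_ARRAYAGG perform standards-compliant escaping themselves. (Incidentally, \' is not a legal escape in strict JSON, so the hand-rolled version may trip some parsers.) A sketch against the same table, not tested against the poster's schema:

```sql
-- Requires MySQL 5.7.22+ for JSON_ARRAYAGG (JSON_ARRAY alone exists from 5.7.8).
-- JSON_ARRAY builds one row's array; JSON_ARRAYAGG aggregates those arrays
-- into the outer array, handling all quoting/escaping per the JSON spec.
SET @JSON := (
  SELECT JSON_ARRAYAGG(
           JSON_ARRAY(ctd.BidID,
                      ctd.Description,
                      ctd.FoundationDescription,
                      ctd.EngDrawingNumber,
                      ctd.DetailDrawingNumber,
                      ctd.TakeOffQuantity))
  FROM
    `Job1111`.CivilTrackerDetails AS ctd
);
```

Two caveats with this variant: JSON_ARRAY preserves column types (numbers stay unquoted), so if the API insists on all-string values, wrap each column in CAST(... AS CHAR); and MySQL 5.7's JSON_ARRAYAGG accepts no ORDER BY clause, so if element order matters, GROUP_CONCAT with ORDER BY remains the safer tool. The JSON result is bounded by max_allowed_packet rather than group_concat_max_len.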