MySQL giving error "The total number of locks exceeds the lock table size" while inserting 120 million rows

MySQL

I am inserting approximately 120 million rows into a table. It's really slow and fails with the error "The total number of locks exceeds the lock table size". Can anyone guide me to the quickest way to insert 120 million rows, or to get rid of that error? The destination table is MyISAM.

Here is the query:

INSERT INTO brokergenius.market_data (
  currentPrice,
  SN_ID,
  quantity,
  seatNumbers,
  ZN_ID,
  listingAttributeList,
  listingAttributeCategoryList,
  deliveryTypeList,
  ticketClass,
  dirtyTicketInd,
  splitOption,
  ticketSplit,
  splitVector,
  sellerOwnInd,
  faceValue,
  serviceFee,
  deliveryFee,
  totalCost,
  score,
  seller_price,
  Event_id,
  RN_ID,
  SSN_ID,
  LI_ID,
  TS_ID
)
SELECT DISTINCT
  currentPrice,
  sectionId,
  quantity,
  seatNumbers,
  zoneId,
  listingAttributeList,
  listingAttributeCategoryList,
  deliveryTypeList,
  ticketClass,
  dirtyTicketInd,
  splitOption,
  ticketSplit,
  splitVector,
  sellerOwnInd,
  faceValue,
  serviceFee,
  deliveryFee,
  totalCost,
  score,
  seller_price,
  event_id,
  0,
  0,
  0,
  0
FROM
    market_snapshot.marketsnapshot;

Best Answer

The quickest way to get around such an error is to split the insert up into multiple queries.

Option 1:

Take the original query and execute it repeatedly, changing the LIMIT bounds each time:
... LIMIT 0, 10000000;
... LIMIT 10000000, 10000000;
... LIMIT 20000000, 10000000;
... LIMIT 30000000, 10000000;
and so on...

Without an ORDER BY, the LIMIT windows are not 100% guaranteed to be consistent across runs; but as long as the query is not overly complicated, this is usually safe.
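
As a concrete sketch, the first chunk would be the original statement with a LIMIT appended (the 10-million-row chunk size is just an assumption; tune it to whatever your server tolerates without hitting the lock limit):

INSERT INTO brokergenius.market_data (
  currentPrice, SN_ID, quantity, seatNumbers, ZN_ID,
  listingAttributeList, listingAttributeCategoryList, deliveryTypeList,
  ticketClass, dirtyTicketInd, splitOption, ticketSplit, splitVector,
  sellerOwnInd, faceValue, serviceFee, deliveryFee, totalCost,
  score, seller_price, Event_id, RN_ID, SSN_ID, LI_ID, TS_ID
)
SELECT DISTINCT
  currentPrice, sectionId, quantity, seatNumbers, zoneId,
  listingAttributeList, listingAttributeCategoryList, deliveryTypeList,
  ticketClass, dirtyTicketInd, splitOption, ticketSplit, splitVector,
  sellerOwnInd, faceValue, serviceFee, deliveryFee, totalCost,
  score, seller_price, event_id, 0, 0, 0, 0
FROM market_snapshot.marketsnapshot
LIMIT 0, 10000000;  -- next run: LIMIT 10000000, 10000000; then 20000000, ...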

Option 2:

Take the original query and execute it repeatedly, but instead of shifting a window with varying LIMIT clauses, devise a set of WHERE conditions that partition the data into disjoint chunks. For example: if you know the source is divided roughly evenly across several sectionId values, you could base the WHERE clauses on those values (or on non-overlapping ranges of them), as sketched below.
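
In the same style as Option 1, the chunks could look like this (the boundary values here are purely hypothetical; pick ranges that match how sectionId is actually distributed in market_snapshot.marketsnapshot):

... WHERE sectionId BETWEEN 1 AND 1000;
... WHERE sectionId BETWEEN 1001 AND 2000;
... WHERE sectionId BETWEEN 2001 AND 3000;
and so on...

Since sectionId is one of the selected columns, two rows in different ranges can never be duplicates of each other, so splitting this way does not change what SELECT DISTINCT returns.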