MySQL - Fetching millions of rows in chunks from a table whose data changes constantly
Below is the schema of my table A.
CREATE TABLE A (
    NAME       VARCHAR(255),
    CREATED_AT DATETIME DEFAULT CURRENT_TIMESTAMP,
    UPDATED_AT DATETIME DEFAULT CURRENT_TIMESTAMP,
    PRIMARY KEY (NAME)
);
CREATE INDEX I1 ON A (UPDATED_AT, CREATED_AT);
We now have around 50 million records in this table. I can't fetch all the data at once, as that would cause memory issues in the backend. I was planning to use an OFFSET+LIMIT combination to fetch the data in chunks, but the problem is that the data is updated every 5 minutes, which can change a record's position in the scan order because we read through the index on UPDATED_AT.
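For illustration, this is roughly the chunked query I had in mind (the chunk size of 10,000 is just an example):

-- Fetch one chunk of the table, ordered by the columns of index I1
SELECT NAME, CREATED_AT, UPDATED_AT
FROM A
ORDER BY UPDATED_AT, CREATED_AT
LIMIT 10000 OFFSET 0;  -- subsequent chunks would use OFFSET 10000, 20000, and so on

-- If a row's UPDATED_AT changes between two chunk reads, the row moves to a
-- different position in this ordering, so a later chunk may return it again
-- or skip other rows entirely.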
We can't afford to read a record twice. Is there any other way to read a large amount of data in chunks when the data is updated this frequently?
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow