I have a scenario in which I have to fetch 60-80K records from one schema, process them, and then insert them into 5 different tables in another schema.
My question is: how can I do this in "batches", i.e. sets of 1000 or so?
Two alternatives I was thinking about:
1. Read all 80K objects and store them in a collection. Then iterate through the collection and perform batch inserts in chunks of 1000/5000 if possible. The disadvantage is that all 80K objects stay in memory even though the processing happens in batches.
2. If possible, read from the database itself in batches of 1000/5000, so that only 5000 objects are being worked on at any one time. I am thinking that in this case only 5K records would be in memory at a time.
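For option 1, slicing the already-loaded collection is easy enough. Here is a minimal sketch of the chunking I have in mind (each chunk would then go through one `PreparedStatement` `addBatch()`/`executeBatch()` round; the class and method names are just placeholders):

```java
import java.util.ArrayList;
import java.util.List;

public class Batches {
    /** Split an already-loaded list into sublist views of at most `size` elements. */
    static <T> List<List<T>> partition(List<T> all, int size) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < all.size(); i += size) {
            // subList is a view, so no records are copied
            chunks.add(all.subList(i, Math.min(i + size, all.size())));
        }
        return chunks;
    }
}
```

Each chunk would be inserted with one JDBC batch, but the full 80K list still has to be in memory first.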
But I don't know how to tell Oracle to give me these 80K records in batches of 1000/5000.
Could you please share your thoughts?
Is option 1 or option 2 better?
If option 2, please provide ideas on how to fetch the records in batches.
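For option 2, something like this plain-JDBC sketch is what I have in mind. The table and column names are placeholders for my real schemas; the key parts I understand are `setFetchSize()`, which controls how many rows the driver pulls per round trip (the Oracle driver's default is only 10), and `addBatch()`/`executeBatch()` on the insert side. The `drainInBatches` helper just isolates the chunking control flow:

```java
import java.sql.*;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.function.Consumer;

public class BatchFetch {
    /** Drain any iterator in fixed-size batches; returns the number of batches flushed. */
    static <T> int drainInBatches(Iterator<T> rows, int batchSize, Consumer<List<T>> sink) {
        List<T> batch = new ArrayList<>(batchSize);
        int batches = 0;
        while (rows.hasNext()) {
            batch.add(rows.next());
            if (batch.size() == batchSize) {
                sink.accept(batch);                 // flush one full batch
                batch = new ArrayList<>(batchSize);
                batches++;
            }
        }
        if (!batch.isEmpty()) { sink.accept(batch); batches++; }  // final partial batch
        return batches;
    }

    /** JDBC wiring (hypothetical table/column names): only ~1000 rows live in memory at once. */
    static void copy(Connection src, Connection dest) throws SQLException {
        dest.setAutoCommit(false);                  // commit once per batch, not per row
        try (Statement st = src.createStatement()) {
            st.setFetchSize(1000);                  // rows fetched per round trip
            try (ResultSet rs = st.executeQuery(
                     "SELECT id, payload FROM src_schema.source_table");
                 PreparedStatement ins = dest.prepareStatement(
                     "INSERT INTO dest_schema.target_table (id, payload) VALUES (?, ?)")) {
                int pending = 0;
                while (rs.next()) {
                    ins.setLong(1, rs.getLong(1));
                    ins.setString(2, rs.getString(2));
                    ins.addBatch();
                    if (++pending == 1000) {        // send 1000 inserts in one round trip
                        ins.executeBatch();
                        dest.commit();
                        pending = 0;
                    }
                }
                if (pending > 0) { ins.executeBatch(); dest.commit(); }
            }
        }
    }
}
```

Is this the right direction, or is there a better way to page through the source table?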