I'm reading a ResultSet that returns 1.5 million rows. I put the data into an ArrayList of objects and then do some processing on it. Before the processing, I close the ResultSet. I then open a connection to a different DB server and write the data in the ArrayList out to that DB. I then need to repeat this whole process, reading and processing a different ResultSet. The problem is that when I start the second pass, I get an OutOfMemoryError. I have set the heap size on the VM to -Xmx1024m, and I do close my ResultSet. I also tried giving the ArrayList's memory back by iterating it in reverse and calling remove() on each element, but I'm not sure whether that actually helps or whether I'm just depending on garbage collection anyway. Any ideas on how I can manage the memory better?
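In outline, the code looks like this (a stripped-down sketch; the connection URLs, table names, and column names are placeholders):

```java
import java.sql.*;
import java.util.ArrayList;
import java.util.List;

public class CopyJob {
    public static void main(String[] args) throws SQLException {
        List<String> rows = new ArrayList<String>();

        // Pass 1: read ~1.5 million rows from the source DB into memory
        try (Connection src = DriverManager.getConnection("jdbc:sourcedb://...");
             Statement st = src.createStatement();
             ResultSet rs = st.executeQuery("SELECT payload FROM source_table")) {
            while (rs.next()) {
                rows.add(rs.getString(1));
            }
        } // ResultSet, Statement, and Connection all closed here, before processing

        // ... process rows in memory ...

        // Write the processed rows out to a different DB server
        try (Connection dst = DriverManager.getConnection("jdbc:destdb://...");
             PreparedStatement ps = dst.prepareStatement(
                     "INSERT INTO dest_table (payload) VALUES (?)")) {
            for (String row : rows) {
                ps.setString(1, row);
                ps.addBatch();
            }
            ps.executeBatch();
        }

        // Repeating all of this with a second, different ResultSet
        // is where the OutOfMemoryError occurs.
    }
}
```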
You ought not to need to remove anything from your List. Simply allowing the List to go out of scope will permit it to be reclaimed by the garbage collector. You should also consider whether it is really necessary to hold all 1,500,000 rows in memory at the same time.
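If each row can be processed on its own, you can stream from one connection straight into the other instead of materialising the whole table. A rough sketch, with made-up URLs and table names and a hypothetical per-row transform() step; the exact fetch-size behaviour depends on your driver:

```java
import java.sql.*;

public class StreamingCopy {
    // Hypothetical per-row processing step
    static String transform(String payload) {
        return payload.trim();
    }

    public static void main(String[] args) throws SQLException {
        try (Connection src = DriverManager.getConnection("jdbc:sourcedb://...");
             Connection dst = DriverManager.getConnection("jdbc:destdb://...")) {

            src.setAutoCommit(false); // some drivers (e.g. PostgreSQL) only stream with auto-commit off

            try (Statement st = src.createStatement()) {
                st.setFetchSize(1000); // hint: pull 1000 rows at a time rather than the whole table
                try (ResultSet rs = st.executeQuery("SELECT id, payload FROM source_table");
                     PreparedStatement ps = dst.prepareStatement(
                             "INSERT INTO dest_table (id, payload) VALUES (?, ?)")) {
                    int pending = 0;
                    while (rs.next()) {
                        ps.setLong(1, rs.getLong("id"));
                        ps.setString(2, transform(rs.getString("payload")));
                        ps.addBatch();
                        if (++pending == 1000) { // flush inserts in batches
                            ps.executeBatch();
                            pending = 0;
                        }
                    }
                    if (pending > 0) {
                        ps.executeBatch(); // flush the final partial batch
                    }
                }
            }
        }
    }
}
```

This keeps heap usage bounded by the batch size rather than the total row count, and the batched inserts also cut down on round trips to the destination server.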
That's a huge amount of data to keep in memory. As long as there are no other references to the members of the ArrayList, you should just be able to set the ArrayList reference to null, but you never know what the JDBC driver is doing under the covers. I agree with Campbell: is it *really* necessary to process things like this? How big is each entry?
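If you do keep the all-in-memory approach, dropping the only reference between passes is all that's needed; there is no point in remove()-ing elements one by one. A minimal self-contained illustration (the string rows stand in for your real record objects):

```java
import java.util.ArrayList;
import java.util.List;

public class MemoryDemo {
    public static void main(String[] args) {
        List<String> rows = new ArrayList<String>();
        for (int i = 0; i < 1_500_000; i++) {
            rows.add("row " + i); // stand-in for real records loaded from the first ResultSet
        }

        // ... process rows and write them to the other DB ...

        rows = null; // drop the only reference; the whole list and its entries
                     // become unreachable and are eligible for collection

        // Pre-sizing the second list avoids repeated internal array resizing:
        List<String> secondPass = new ArrayList<String>(1_500_000);
        // ... load the second ResultSet into secondPass ...
    }
}
```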