I'm working with an Oracle 11 database and need to process 53 million entries. This can take some time, and my usual approach is to read the data into a Vector of records and then process them, but this time I unfortunately run out of memory before all the records are read.
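Roughly what my usual approach looks like, trimmed down to the essentials (table, column and connection details are made up):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.Vector;

public class LoadAllThenProcess {
    public static void main(String[] args) throws Exception {
        Vector<String> records = new Vector<String>();
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT payload FROM entries");
             ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                records.add(rs.getString("payload")); // heap grows with every row read
            }
        }
        for (String row : records) {
            // processing only starts once everything is in memory,
            // which is where the 53 million rows blow the heap
        }
    }
}
```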
You are right.
Unless I'm grossly mistaken, everything that returns rows in Oracle is a cursor.
Opening an SQL query with a prepared statement creates a cursor in the Oracle database that is no different from, say, a ref cursor returned from a procedure.
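Just to illustrate what I mean: the ResultSet below is backed by exactly that kind of cursor, and you can walk it row by row without building any collection. Table, column names and the fetch size are only placeholders:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class CursorWalk {
    // conn is an already open connection to the Oracle database
    static void walk(Connection conn) throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(
                "SELECT id, payload FROM entries")) {    // placeholder table
            ps.setFetchSize(500);                        // rows arrive from the cursor in batches
            try (ResultSet rs = ps.executeQuery()) {     // executeQuery opens the cursor
                while (rs.next()) {
                    // process rs.getLong("id") / rs.getString("payload") one row at a time
                }
            }                                            // closing the ResultSet releases the cursor
        }
    }
}
```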
As I understand it, the issue was not with the query but with the processing. Even if the data is used directly (not stored in a vector), the processing speed of a cursor in a DB procedure will be far superior to a Java program executing elsewhere. I am saying this from personal experience, as I had to remove my Java code and use procedures for processing large volumes of records (of course I used Java to call those procedures). DB stored procedures will almost always perform better than implementing the same logic in Java through JDBC: you are essentially eliminating a large number of IO calls and initiating just one IO call to start processing entirely in the DB.
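Something along these lines is what I ended up with; the procedure name and its parameter are just placeholders for whatever your PL/SQL does:

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;

public class RunInDatabase {
    // One IO call from Java: the PL/SQL procedure loops over the rows inside the
    // database, so no large result set ever crosses the network to the JVM.
    static void processAll(Connection conn) throws SQLException {
        try (CallableStatement cs = conn.prepareCall("{call process_entries(?)}")) {
            cs.setInt(1, 10000);   // hypothetical commit-interval parameter
            cs.execute();
        }
    }
}
```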
So if there are problems with the cursor returned from the SQL query described in the initial post, I assume the same problems would appear with any other method that creates the same or a similar cursor.
You can do this with a single connection. As you said, you can loop over the records of the first result set and execute another statement for each record. If implemented properly, you will not get the "too many open cursors" error. You might have got that error because you were creating the Statement/PreparedStatement inside the loop.
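A rough sketch of what I mean, with made-up table and column names; the important part is that both statements are prepared once, before the loop, and only the per-row ResultSets are opened and closed inside it:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class NestedQueries {
    static void run(Connection conn) throws SQLException {
        try (PreparedStatement outer = conn.prepareStatement(
                     "SELECT id FROM entries");
             // prepared ONCE, outside the loop, and reused for every row --
             // creating a new statement on each iteration is what leaks cursors
             PreparedStatement inner = conn.prepareStatement(
                     "SELECT detail FROM entry_details WHERE entry_id = ?")) {

            try (ResultSet rows = outer.executeQuery()) {
                while (rows.next()) {
                    inner.setLong(1, rows.getLong("id"));
                    try (ResultSet details = inner.executeQuery()) {
                        while (details.next()) {
                            // process details.getString("detail") here
                        }
                    } // each inner ResultSet is closed here, releasing its cursor
                }
            }
        }
    }
}
```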
I mean I get a result set from the first query, loop over the records, and for each of them eventually make another query using a different connection. Since it has already happened to me to get an error about having too many open cursors (Oracle), due to using the same connection to execute multiple statements while looping over a large result set, I tend to use a new connection for each query I send to the DB. I wanted to know what the best strategy is in this case: would it be to retrieve the large result set outside the transaction, build some Java collection, and then do the updates/inserts inside the transaction?