I am transferring data from a SQL database to an Oracle database using XML as the transfer format, as required by the client. All rows in the tables are represented as blocks in the XML. I capture this data from the XML, create a Java object to store each row, and keep these objects in a Vector. The objects in the Vector are then read and the data is inserted into the database. When the number of rows is on the order of 800,000 (i.e. I create 800,000 objects), an OutOfMemoryError is thrown; below that row count the program works efficiently. Alternatively, if I insert each row into the database immediately after reading its XML block, the application performs very slowly. Can anyone suggest a solution to this problem? I use a SAX parser.
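One middle ground between "hold all 800,000 rows in a Vector" and "one INSERT per row" is to flush rows in fixed-size batches as the SAX events arrive, so memory stays bounded and the database sees large batched inserts. Below is a minimal sketch of that idea; the element names `<row>`/`<col>` and the batch size are assumptions (your XML layout will differ), and the `flush()` body is where `PreparedStatement.addBatch()`/`executeBatch()` would run against the real Oracle connection. Here it just counts flushes so the batching logic can be seen in isolation.

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.helpers.DefaultHandler;

// SAX handler that accumulates rows only up to batchSize, then flushes,
// instead of materializing every row in memory at once.
public class BatchingRowHandler extends DefaultHandler {
    private final int batchSize;
    private final List<String[]> batch = new ArrayList<>();
    private final List<String> currentRow = new ArrayList<>();
    private final StringBuilder text = new StringBuilder();
    private int flushes = 0; // number of times a batch was sent to the DB

    public BatchingRowHandler(int batchSize) { this.batchSize = batchSize; }

    @Override public void characters(char[] ch, int start, int len) {
        text.append(ch, start, len);
    }

    @Override public void startElement(String uri, String local, String qName,
                                       Attributes attrs) {
        text.setLength(0); // reset character buffer for each element
    }

    @Override public void endElement(String uri, String local, String qName) {
        if (qName.equals("col")) {            // assumed column element name
            currentRow.add(text.toString());
        } else if (qName.equals("row")) {     // assumed row element name
            batch.add(currentRow.toArray(new String[0]));
            currentRow.clear();
            if (batch.size() >= batchSize) flush();
        }
    }

    @Override public void endDocument() {
        if (!batch.isEmpty()) flush(); // insert the final partial batch
    }

    private void flush() {
        // Real code: for each row in 'batch', set PreparedStatement
        // parameters, call addBatch(); then executeBatch() and commit.
        flushes++;
        batch.clear();
    }

    public int getFlushes() { return flushes; }

    public static int parse(String xml, int batchSize) throws Exception {
        BatchingRowHandler h = new BatchingRowHandler(batchSize);
        SAXParserFactory.newInstance().newSAXParser()
            .parse(new InputSource(new StringReader(xml)), h);
        return h.getFlushes();
    }

    public static void main(String[] args) throws Exception {
        StringBuilder sb = new StringBuilder("<table>");
        for (int i = 0; i < 5; i++)
            sb.append("<row><col>a</col><col>b</col></row>");
        sb.append("</table>");
        // 5 rows with a batch size of 2 -> flushes after rows 2 and 4,
        // plus one final flush at end of document: 3 flushes.
        System.out.println(parse(sb.toString(), 2));
    }
}
```

With a real connection you would also typically call `setAutoCommit(false)` and commit once per batch; a batch size in the 500 to 5,000 range is a common starting point to tune from.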
I am writing a database access component and hit the same problem when I tried to check performance on 1,000,000 records (x 10 columns) with Vectors. I switched to String arrays, and they held up without a problem even on a 128 MB NT machine for a million x 10 records. I can send you the code if you want; for further info, mail me at firstname.lastname@example.org and I'll be glad to help.
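For what it's worth, the flat-array approach can be sketched in a few lines: one preallocated String[][] in place of a Vector of per-row wrapper objects, which avoids both the wrapper object per row and Vector's synchronization and growth overhead. The row/column counts below are just illustrative, not the poster's actual numbers.

```java
// Minimal sketch: store rows as a single preallocated 2-D String array
// instead of a Vector of row objects. One array-of-arrays replaces
// one wrapper object + one Vector slot per row.
public class RowStore {
    // Allocate storage for a fixed number of rows and columns up front.
    public static String[][] allocate(int rows, int cols) {
        return new String[rows][cols];
    }

    public static void main(String[] args) {
        String[][] data = allocate(1000, 10); // illustrative sizes
        data[0][0] = "first cell";
        System.out.println(data.length + " rows x " + data[0].length + " cols");
    }
}
```

Note this only helps if the data genuinely fits in the heap; for the original poster's 800,000-row case, streaming inserts in batches is still the safer route.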