I need to make an enhancement to my existing Java application. It's a core Java application with two buttons, "local" and "central", for search. In the existing application, when the user clicks "central", a request is sent to the mainframe for data. The mainframe returns an XML response, which is parsed, and the local database is populated.
The problem with the existing application is that if the mainframe returns many records, say 4000-5000, the application might die with out-of-memory errors. I need to fix this by reading the response from the mainframe in batches instead of fetching it all in one go.
Can anyone please suggest the best design for this?
I don't usually think of XML documents as having "records", but I guess your XML is just an XML version of a database table? Anyway, 5000 "records" doesn't seem like much at all to me. I would suggest you don't try to optimize the process until you know what there is to optimize.
My application uses a DOM parser. The scenario is: the MF sends me an XML response containing some 'n' number of records. My code parses these records and populates the local database (MS SQL 2005). Suppose there are 5000 records; the MF will send them in batches of 1000 each. After processing one batch, I have to send a request to the MF for the next batch, and so on.
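Since the out-of-memory error comes from DOM building the whole document tree in memory, one option is to switch the per-batch parsing to a streaming parser (StAX, included in the JDK since Java 6), which holds only one element at a time. This is a minimal sketch, assuming each record arrives as a `<record id="...">` element; the real element and attribute names in the mainframe's XML will differ:

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class StreamingRecordParser {

    // Walks the XML event by event instead of building a DOM tree,
    // so memory use stays flat no matter how many records arrive.
    public static List<String> parseRecordIds(InputStream xml) throws Exception {
        XMLInputFactory factory = XMLInputFactory.newInstance();
        XMLStreamReader reader = factory.createXMLStreamReader(xml);
        List<String> ids = new ArrayList<>();
        while (reader.hasNext()) {
            if (reader.next() == XMLStreamConstants.START_ELEMENT
                    && "record".equals(reader.getLocalName())) {
                // In the real application you would map the element's
                // fields and insert a row here, not collect into a list.
                ids.add(reader.getAttributeValue(null, "id"));
            }
        }
        reader.close();
        return ids;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<response><record id=\"1\"/><record id=\"2\"/></response>";
        List<String> ids = parseRecordIds(
                new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        System.out.println(ids); // prints [1, 2]
    }
}
```

With this approach each record can be written to the database as soon as it is parsed, so even a single large batch never needs to fit in memory as a whole tree.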
The problem with this is that the entire process should be seamless to the user and has to be done in one transaction. I cannot use pagination in this feature, since that would not serve my business purpose.
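The one-transaction requirement does not conflict with batched fetching: the loop that requests batches from the MF can sit inside a single JDBC transaction, committing only after the last batch. This is a sketch under assumptions, not the real application's code: `MainframeClient`, `fetchBatch`, and the `records` table are all hypothetical placeholders for whatever the actual request API and schema look like.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;
import java.util.function.Consumer;

public class BatchLoader {

    // Hypothetical mainframe client: fetchBatch(n) requests batch n,
    // and an empty list signals that all batches have been sent.
    public interface MainframeClient {
        List<String> fetchBatch(int batchNumber);
    }

    // Pulls batches one at a time and hands each to a sink, so only
    // one batch is in memory at any moment. Returns the record count.
    public static int drainBatches(MainframeClient mf, Consumer<List<String>> sink) {
        int total = 0;
        int batchNumber = 0;
        List<String> batch;
        while (!(batch = mf.fetchBatch(batchNumber++)).isEmpty()) {
            sink.accept(batch);
            total += batch.size();
        }
        return total;
    }

    // Sketch of the all-or-nothing load: autocommit off, one commit at
    // the end, rollback on any failure. Table and column names are
    // placeholders for the real schema.
    public static void loadInOneTransaction(Connection conn, MainframeClient mf)
            throws SQLException {
        conn.setAutoCommit(false);
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO records (value) VALUES (?)")) {
            drainBatches(mf, batch -> {
                try {
                    for (String value : batch) {
                        ps.setString(1, value);
                        ps.addBatch();
                    }
                    ps.executeBatch(); // flush per batch keeps memory bounded
                } catch (SQLException e) {
                    throw new RuntimeException(e);
                }
            });
            conn.commit(); // user sees either all records or none
        } catch (RuntimeException | SQLException e) {
            conn.rollback();
            throw e;
        } finally {
            conn.setAutoCommit(true);
        }
    }
}
```

One caveat to weigh: holding a transaction open across several round-trips to the MF means locks are held on the target table for the whole load, so the batch size and MF response time directly affect how long other users are blocked.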