Hi, I have a requirement in which I retrieve data from a Java bean. The returned result set is huge, say 10000 records, so for display I need pagination. Keeping such huge data in the session makes it slow, so we have decided to store it in the request. When paginating, the page is refreshed and the new records are shown, but I am losing the data because clicking the links creates a new request. Alternatively, I go back to the controller servlet, set the data on the request object again, and forward it back to the same page; it works that way. But can anyone suggest a simpler method, say one by which I can show the next 100 records without going back to the servlet? Is there any way to keep the request object alive even when the page is refreshed or submitted, the action being the same page?
As Jeoren has already said, you cannot use the request object across requests. It is valid only for a single request, and the container may reuse the request object for a new request. HttpSession is the object where you store data across multiple requests.
Keeping such huge data in session makes it slow, so we have decided to store it in request.
If you have decided to store the data in the request, it will not be available in the next request. If you are thinking of saving a reference to the request object and retrieving the data from that saved reference in the next request (PS: this does not work, as I have already mentioned), why not store it in the session, when your requirement can only be satisfied by HttpSession?
Use an ArrayList to store the data. First retrieve the first 100 records from the DB and send the response to the client, then immediately launch a thread that loads the remaining records (or all the records) into the ArrayList in the session object. For that you will have to pass the session object to the thread, which loads the data asynchronously.
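A minimal sketch of that idea, with the DB call faked by a generator and a static list standing in for the ArrayList you would keep in the HttpSession (in a real servlet you would do `session.setAttribute("records", records)` and hand the session to the thread). All class and method names here are invented for illustration:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Load the first page synchronously, the rest on a background thread.
class RecordLoader {
    static final int PAGE_SIZE = 100;

    // Thread-safe list standing in for the ArrayList kept in HttpSession.
    static final List<String> records = Collections.synchronizedList(new ArrayList<>());

    // Fake DB fetch: rows [from, to).
    static List<String> fetchFromDb(int from, int to) {
        List<String> rows = new ArrayList<>();
        for (int i = from; i < to; i++) rows.add("record-" + i);
        return rows;
    }

    // First 100 rows now, remaining rows asynchronously.
    static Thread loadAll(int totalRows) {
        records.clear();
        records.addAll(fetchFromDb(0, Math.min(PAGE_SIZE, totalRows)));
        Thread loader = new Thread(() -> records.addAll(fetchFromDb(PAGE_SIZE, totalRows)));
        loader.start();
        return loader;
    }

    // Page n (0-based) out of whatever has been loaded so far.
    static List<String> page(int n) {
        synchronized (records) {
            int from = n * PAGE_SIZE;
            int to = Math.min(from + PAGE_SIZE, records.size());
            if (from >= to) return Collections.emptyList();
            return new ArrayList<>(records.subList(from, to));
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Thread loader = loadAll(10000);
        System.out.println("first page available immediately: " + page(0).size() + " rows");
        loader.join(); // in real code the user browses while this runs
        System.out.println("page 99: " + page(99).size() + " rows");
    }
}
```

The user sees the first page right away, and by the time they click "next" the background thread has usually filled in the rest.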
Look up the LIMIT or ROWNUM keyword (depending on which database you're using) to limit the number of rows returned from the database.
This will prevent you from tying up lots of memory with rows that may never be read. Shonak's suggestion is an advanced twist on this idea that will prepare the next block of rows for you while you're reading the current block.
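A sketch of how a per-page query might be built with LIMIT/OFFSET (MySQL/PostgreSQL syntax; Oracle would use ROWNUM instead). The table and column names are invented, and the JDBC usage is shown only in a comment since there is no database here:

```java
// Build one SQL statement per page so only PAGE_SIZE rows are ever fetched.
class PageQueries {
    static final int PAGE_SIZE = 100;

    // SQL for page n (0-based). With JDBC you would then do:
    //   PreparedStatement ps = conn.prepareStatement(pageQuery(n));
    //   ResultSet rs = ps.executeQuery();
    static String pageQuery(int page) {
        int offset = page * PAGE_SIZE;
        return "SELECT id, name FROM customer ORDER BY id"
             + " LIMIT " + PAGE_SIZE + " OFFSET " + offset;
    }

    public static void main(String[] args) {
        System.out.println(pageQuery(0));
        System.out.println(pageQuery(2));
    }
}
```

Note the ORDER BY: without a stable ordering, LIMIT/OFFSET paging can return overlapping or skipped rows between pages.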
The following is not a direct response; the gentlemen who posted before me have already covered that, but I thought I'd throw in a related tip...
Make sure you use a prepared statement. Depending on how advanced the database is (I know this works with Oracle), this can improve your performance through caching and bind variables. That way, if the SQL is frequently submitted, the DBMS will often get the results from memory rather than going back to the physical disk.
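To illustrate why bind variables matter: with string concatenation, every id produces a different SQL text, so the DBMS treats each one as a new statement; with a `?` placeholder the text is identical every time, so the cached parse/plan can be reused. The table and column names are made up, and real usage (`conn.prepareStatement(...)`, `ps.setInt(1, id)`) is shown only in a comment:

```java
// Contrast: concatenated SQL changes per call, parameterized SQL does not.
// Real JDBC usage of the parameterized form:
//   PreparedStatement ps = conn.prepareStatement(BindVariables.PARAMETERIZED);
//   ps.setInt(1, id);
//   ResultSet rs = ps.executeQuery();
class BindVariables {
    // Same text on every call, so the DBMS can cache it.
    static final String PARAMETERIZED = "SELECT name FROM customer WHERE id = ?";

    // Anti-pattern: a new SQL text per id defeats the statement cache
    // (and invites SQL injection).
    static String concatenated(int id) {
        return "SELECT name FROM customer WHERE id = " + id;
    }

    public static void main(String[] args) {
        System.out.println(concatenated(1)); // differs for every id
        System.out.println(concatenated(2));
        System.out.println(PARAMETERIZED);   // identical every time
    }
}
```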