I called System.gc() to flush the JVM heap, but after about 117,000 records it still shows the above error. Is there any other way to free the heap space of the JVM? The problem can be solved if I increase the JVM heap size explicitly, but I need to free the heap space instead of increasing the size of the JVM.
Thanks for your response. I increased the JVM heap size, but I don't think that is a permanent solution. I am using Hibernate, and one solution I found was pagination. After using pagination I hit the same problem again, so now I open a session, process 1000 records at a time, close the session, and repeat this in a loop. But I do not know whether I am on the right path.
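For what it's worth, the batch-and-clear loop you describe could look something like the sketch below. The fetchPage() helper here simulates a paged Hibernate query (setFirstResult()/setMaxResults()) with an in-memory source, since a real Session needs a database; the page size and record count are just illustrative:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchProcessor {
    static final int PAGE_SIZE = 1000;

    // Stand-in for a paged Hibernate query: returns up to PAGE_SIZE
    // records starting at the given offset, empty when exhausted.
    static List<Integer> fetchPage(int offset, int total) {
        List<Integer> page = new ArrayList<>();
        for (int i = offset; i < Math.min(offset + PAGE_SIZE, total); i++) {
            page.add(i);
        }
        return page;
    }

    public static void main(String[] args) {
        int total = 117_000;   // illustrative record count
        int processed = 0;
        int offset = 0;
        while (true) {
            // In the real code: open a Session, run the paged query...
            List<Integer> page = fetchPage(offset, total);
            if (page.isEmpty()) {
                break;
            }
            processed += page.size();  // ...process the records...
            offset += PAGE_SIZE;
            // ...then close (or clear) the Session here, so the entities
            // from this page can be garbage-collected before the next one.
        }
        System.out.println(processed);
    }
}
```

The key point is that each iteration releases its page before loading the next, so peak memory stays at one page rather than the whole result set.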
I am not so sure about Hibernate, but I feel we can fix this problem easily in the case of plain JDBC by setting the fetch size with the setFetchSize() method of Statement or ResultSet. I am also awaiting the response to this query.
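A rough sketch of that JDBC approach is below. The connection URL, credentials, table, and column names are all placeholders; exact streaming behaviour depends on the driver (some treat the fetch size only as a hint), so this is illustrative rather than a guaranteed fix:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class FetchSizeDemo {
    public static void main(String[] args) throws SQLException {
        // Placeholder URL and credentials -- adjust for your database/driver.
        try (Connection con = DriverManager.getConnection("jdbc:yourdb://host/db", "user", "pass");
             Statement st = con.createStatement()) {
            st.setFetchSize(1000);  // ask the driver to fetch rows in chunks of 1000
            try (ResultSet rs = st.executeQuery("SELECT id FROM big_table")) {
                while (rs.next()) {
                    long id = rs.getLong("id");
                    // Process each row here and let it go; avoid collecting all
                    // rows into a List, which would defeat the chunked fetching.
                }
            }
        }
    }
}
```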
Originally posted by Khirod Patra: Thanks for your response. I increased the JVM heap size, but I don't think that is a permanent solution. I am using Hibernate, and one solution I found was pagination. After using pagination I hit the same problem again, so now I open a session, process 1000 records at a time, close the session, and repeat this in a loop.
If you are running out of memory you have two choices:
1) Increase the amount of memory available to your JVM. It could be that your program simply needs more memory, so this is a quick and easy solution, and it is very probably the case if you are doing stuff with large data sets. If you find your program keeps needing more and more memory, you may have a memory leak, in which case you need to profile your application and see what is happening.
2) Decrease the number of objects you maintain in memory. Since Hibernate's Session is a cache, pagination is a good technique for this. Also, since this is Hibernate, you can check what data you are bringing back. Have you mapped any associated objects to be lazy? Are you eagerly fetching stuff you don't need?
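To illustrate the lazy-mapping point, a minimal entity mapping sketch is shown below, using standard JPA annotations (the Order/OrderItem entities are invented for illustration, not from the original poster's model):

```java
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.Id;
import javax.persistence.OneToMany;
import java.util.Set;

@Entity
public class Order {
    @Id
    private Long id;

    // FetchType.LAZY means the items collection is only loaded when it is
    // actually accessed, so querying 1000 Orders does not also drag 1000
    // item collections into the Session.
    @OneToMany(mappedBy = "order", fetch = FetchType.LAZY)
    private Set<OrderItem> items;
}
```

With eager fetching here, every Order loaded would pull its whole items collection into memory whether or not the code ever touches it.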
I would also add that using an ORM to process 50,00,000 (five million) records is probably not a good choice of tools. Why such a large result set? Are you doing data migration work? If so, you might consider the import and export tools that come with your database; they may be better suited to the task. [ June 23, 2008: Message edited by: Paul Sturrock ]