Hi guys. I'm trying to run a query against a table with four million records. I've set everything to lazy, and I've even tried setting the CacheMode to IGNORE.
Neither effort is saving me. I'm still getting the following dreaded error: Exception in thread "main" java.lang.OutOfMemoryError: Java heap space.
This seems like it should be simple. Is there a configuration setting I'm missing?
Joined: Oct 17, 2001
It is not clear whether the protein table itself has 4 million rows, or the related tables do.
You should add filtering to your query. If you are interested in only a subset of the records, use JPQL to restrict which entities are loaded.
You could also try increasing the max heap size on your JVM.
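To illustrate the filtering suggestion, here is a minimal sketch of a restricted JPQL query. The entity name `Protein`, the property `organism`, and the cap of 1000 rows are all assumptions for illustration, not from the original post:

```java
import java.util.List;
import org.hibernate.Session;
import org.hibernate.query.Query;

// Assumes an open Session named "session" and a mapped entity "Protein".
Query<Protein> query = session.createQuery(
        "from Protein p where p.organism = :organism", Protein.class);
query.setParameter("organism", "H. sapiens"); // illustrative filter value
query.setMaxResults(1000);                    // hard cap on rows pulled into memory
List<Protein> proteins = query.list();
```

The point is that `setMaxResults` bounds memory use no matter how large the table grows.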
Joined: Aug 30, 2005
The table I am querying directly has 4 million rows. I don't want to just raise the max heap size, because I assume one day the table may have 10 million rows, etc. I would rather have a scalable solution which is lazy. Does Hibernate not provide a lazy iterator over a result set?
I need to iterate through all n records, regardless of how large n is, so I do not want to filter the query in any way.
I'd be shocked if Hibernate can't do this!
Please let me know.
Joined: Dec 13, 2007
Hi Jay Vas,
My suggestions for avoiding this scenario are as follows:
* Use Hibernate's pagination features to process the records page by page.
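A minimal sketch of the page-by-page approach, using Hibernate's standard `setFirstResult`/`setMaxResults` pagination. The entity name `Protein`, the page size, and the `process` helper are illustrative assumptions:

```java
import java.util.List;
import org.hibernate.Session;

// Assumes an open Session named "session" and a mapped entity "Protein".
int pageSize = 1000;                           // illustrative batch size
for (int page = 0; ; page++) {
    List<Protein> batch = session.createQuery("from Protein", Protein.class)
            .setFirstResult(page * pageSize)   // offset into the result set
            .setMaxResults(pageSize)           // rows fetched for this page
            .list();
    if (batch.isEmpty()) {
        break;                                 // no more rows
    }
    for (Protein p : batch) {
        process(p);                            // hypothetical per-row work
    }
    session.clear();                           // drop the page from the persistence context
}
```

Only one page of entities is ever resident in the session at a time, so memory stays flat as the table grows.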
If you are going to be iterating through tons of records, I highly recommend that you evict objects as you finish reading them. That clears them out of the persistence context maps instead of keeping every record in memory.
Since fetching is lazy and the fetch mode is neither eager nor subselect, the collection loads as you step through each record.
So as you loop through the results and hit the database, once you are done with a record, evict that object from the session.
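Hibernate also offers a cursor-style alternative to pagination via `Query.scroll()`, which returns a `ScrollableResults`. A sketch of the evict-as-you-go pattern described above; the entity name `Protein`, the batch interval of 100, and the `process` helper are assumptions for illustration:

```java
import org.hibernate.ScrollMode;
import org.hibernate.ScrollableResults;
import org.hibernate.Session;

// Assumes an open Session named "session" and a mapped entity "Protein".
ScrollableResults results = session.createQuery("from Protein")
        .setReadOnly(true)                 // skip dirty-checking snapshots
        .setFetchSize(100)                 // hint to the JDBC driver
        .scroll(ScrollMode.FORWARD_ONLY);  // one-way, cursor-backed iteration
int count = 0;
while (results.next()) {
    Protein p = (Protein) results.get(0);
    process(p);                            // hypothetical per-row work
    if (++count % 100 == 0) {
        session.flush();                   // push any pending changes
        session.clear();                   // evict everything read so far
    }
}
results.close();
```

Clearing the session every batch keeps the first-level cache from accumulating all 4 million entities.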