Large collections in Hibernate

 
Ranch Hand
Posts: 407
Hi guys. I am trying to run a query on a table with four million records.
I've set everything to "lazy" and have even tried setting the cache mode to IGNORE.

Neither effort seems to be saving me. I'm still getting the dreaded error: Exception in thread "main" java.lang.OutOfMemoryError: Java heap space.

This seems like it should be simple. Is there a configuration setting I'm missing?


 
Ranch Hand
Posts: 153
Hey Jay,

It is not clear whether the protein table itself has 4 million rows or the related tables do.

You should add filtering to your query. If you are interested in only a subset of the records, use JPQL to build your entities.

You could also try increasing the maximum heap size on your JVM.
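If filtering is an option, it can be done directly in the query rather than in memory. A minimal sketch, assuming a hypothetical Protein entity with an organism property (those names are not from the original post):

```java
// Hedged example: fetch only a filtered subset instead of the whole table.
// "Protein" and "organism" are assumed names for illustration.
org.hibernate.Query query = session.createQuery(
        "from Protein p where p.organism = :org");
query.setParameter("org", "human");
java.util.List<?> results = query.list();  // only the matching rows are loaded
```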
 
jay vas
Ranch Hand
Posts: 407
The table which I am querying directly has 4 million rows.
I don't want to raise the max heap size, because one day the table may have 10 million rows, and so on. I would rather have a scalable solution which is lazy. Doesn't Hibernate provide a lazy iterator over a result set?

I need to iterate through all "n" records, regardless of how large n is, so I can't filter the query in any way.

I'd be shocked if that's not possible!

Please let me know.
 
Greenhorn
Posts: 17
Hi Jay Vas,

My suggestions for avoiding this scenario are:

* Use Hibernate's pagination to fetch and show the records page by page.

* Introduce a DB view and a module for getting records from it using HQL.
(See https://coderanch.com/t/218145/ORM/java/hibernate-views)

Give your feedback...
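The pagination idea can be sketched with setFirstResult/setMaxResults, clearing the Session between pages so it never holds more than one page of entities. "Protein" is an assumed entity name, and the page size is arbitrary:

```java
// Hedged sketch: read the table page by page instead of all at once.
int pageSize = 1000;
for (int offset = 0; ; offset += pageSize) {
    java.util.List<?> page = session.createQuery("from Protein")
            .setFirstResult(offset)
            .setMaxResults(pageSize)
            .list();
    if (page.isEmpty()) {
        break;                 // past the last row
    }
    for (Object entity : page) {
        // ...process entity...
    }
    session.clear();           // detach the page so the Session doesn't grow
}
```

Note that paging by offset assumes a stable ordering; in practice an "order by p.id" clause makes the pages deterministic.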
 
Ranch Hand
Posts: 110
The link doesn't seem to be working.
 
Greenhorn
Posts: 15
Hi,

You should limit the size of what is being returned from the database. Try appending a WHERE clause to the SQL and see if that helps. You should not be reading 4 million records at once!

Regards,

John
 
ranger
Posts: 17347
If you are going to iterate through tons of records, I highly recommend that you evict objects as you finish reading them. This clears them out of the Persistence Context maps so that all the records are not held in memory at once.

Since your fetch mode isn't eager or subselect, the collection loads lazily as you go through each record.

So as you loop through the records, hitting the database as you go, evict each object from the Session once you are done with it.

Mark
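The evict-as-you-go approach can be combined with Hibernate's scrollable cursor, which is the closest thing to the "lazy iterator" asked about above. A minimal sketch using the Hibernate 3-era API, with "Protein" as an assumed entity name:

```java
// Hedged sketch: stream rows with a forward-only cursor, evicting each
// entity so the persistence context stays small.
org.hibernate.ScrollableResults scroll = session.createQuery("from Protein")
        .setCacheMode(org.hibernate.CacheMode.IGNORE)
        .scroll(org.hibernate.ScrollMode.FORWARD_ONLY);
int count = 0;
while (scroll.next()) {
    Object entity = scroll.get(0);
    // ...process entity...
    session.evict(entity);         // drop it from the persistence context
    if (++count % 1000 == 0) {
        session.flush();
        session.clear();           // periodically clear everything remaining
    }
}
scroll.close();
```

Whether the rows are truly streamed (rather than buffered client-side) also depends on the JDBC driver; some drivers need a fetch size hint before they stop materializing the whole result set.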
 