Entity objects not getting unloaded after Transaction completion

 
shikhaj jainy
Ranch Hand
Posts: 35
Hi all,

This thread is related to an OutOfMemory issue, but since I am using Spring for transactions I added it here. Please feel free to move it to a more appropriate forum if anybody feels it belongs elsewhere.

I am using WebLogic, JPA, JTA, and Spring (in some places) for the business logic in my project, and I am facing an out-of-memory issue very frequently. On analyzing two or three hprof files, I can see that a lot of entity objects (around 3Lac) are loaded. We are using JTA to manage transactions, so ideally the entities should get unloaded on completion of a transaction; why are they not getting unloaded? Or is there any way we can unload them programmatically, or something we can do at the transaction level?

Please advise.

Thanks.
 
Tim Holloway
Saloon Keeper
Posts: 27752
I'm presuming that "3Lac" means 3 lakhs (300,000). I'm afraid that lakhs are not understood by most people outside of India, so please use more common units of measurement.

I'm also not sure what you mean by "unloaded", but my guess is that your ORM transactions are retrieving very large working sets and it's causing you memory issues, compounded by not having the memory released (which is what I think you mean by "unloaded") after you use it.

First of all, pulling 300K records into memory is not something you'd generally want to do. If you're manipulating that many records in a single operation, you should investigate ways to do it on the database server and not in a remote (Java) application. And I say that as someone who doesn't like the trivial use of stored procedures and other server-side vendor-dependent operations; there is a time and a place for such things, and this sounds like one of them. If you're not actually doing anything with all those records, you may need to set up lazy fetching or some other means of fetch control. Pulling large amounts of data from a database and then not using it is a horrible waste of resources: not only memory, but CPU and network.
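One common way to keep the working set bounded is to page through the records and clear the persistence context between pages. Here's a minimal sketch using standard JPA pagination; the `Order` entity and the query string are hypothetical stand-ins for whatever you're actually processing:

```java
import javax.persistence.EntityManager;
import javax.persistence.TypedQuery;
import java.util.List;

public class BatchProcessor {

    // Process records one page at a time instead of loading all 300K at once.
    static void processInBatches(EntityManager em, int pageSize) {
        int first = 0;
        List<Order> page;
        do {
            TypedQuery<Order> q = em.createQuery(
                    "SELECT o FROM Order o ORDER BY o.id", Order.class);
            q.setFirstResult(first);
            q.setMaxResults(pageSize);
            page = q.getResultList();

            for (Order o : page) {
                // ... business logic on one entity at a time ...
            }

            // Push pending changes, then detach everything loaded so far
            // so the persistence context does not grow without bound.
            em.flush();
            em.clear();

            first += pageSize;
        } while (!page.isEmpty());
    }
}
```

Note that after `em.clear()` all previously loaded entities are detached, so any you still hold references to must be re-fetched (or merged) before further JPA operations.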

Beyond that, the ORM isn't responsible for releasing the records in the working set. The records are stored in Java objects and they will be garbage-collected according to the usual rules. First and foremost being that ONLY when there are no references remaining to the objects in question will they be eligible for garbage collection (and thus freeing up their memory). If you cannot find what references remain, use the Java diagnostic tools to find where the links are.
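The reference rule above is easy to see with plain JDK classes and a `WeakReference` probe, no ORM involved. An entity-sized byte array stands in for a loaded entity, and a list stands in for whatever collection (cache, session, persistence context) is holding it:

```java
import java.lang.ref.WeakReference;
import java.util.ArrayList;
import java.util.List;

public class GcDemo {
    public static void main(String[] args) {
        List<byte[]> cache = new ArrayList<>();
        byte[] entity = new byte[1024 * 1024];  // stand-in for a loaded entity
        cache.add(entity);                      // strong reference held by the "working set"

        WeakReference<byte[]> probe = new WeakReference<>(entity);
        entity = null;                          // drop the local reference

        System.gc();
        // Still reachable through the cache, so it cannot be collected:
        System.out.println("reachable via cache: " + (probe.get() != null));

        cache.clear();                          // drop the last strong reference
        System.gc();
        // Now eligible for collection; HotSpot typically clears the weak
        // reference promptly, though the JLS does not guarantee exactly when.
        System.out.println("reachable after clear: " + (probe.get() != null));
    }
}
```

The first check is always `true`: as long as any live collection refers to the object, no amount of "transaction completion" frees it. That's exactly why a heap dump full of entities means some structure still references them.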
 
shikhaj jainy
Ranch Hand
Posts: 35
Hi Tim,

Sorry about not using more common units of measurement; I will make a note of it for the future.

In our project we are doing auditing, and because of that operation the related entities are getting loaded even though we have configured lazy initialization for all of them. I have attached a snapshot of the leak-suspects report from MAT: AuditReaderImpl from Hibernate is taking 2.14% of the heap, and it is referenced by our project's custom audit entities.

So my point is: since all entities are lazily loaded, once the transaction finishes (the user is done with the auditing query), how can we destroy the loaded entity objects of the related entities (or mark them as no longer needed) so that the memory gets released?
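One option worth trying is to clear (or selectively detach from) the persistence context as soon as the audit query's results have been consumed, so the entity graphs Envers pulled in become unreachable. A hedged sketch, assuming a Hibernate Envers setup; the method shape and the idea of copying out only what the caller needs are my assumptions, not code from this project:

```java
import javax.persistence.EntityManager;
import org.hibernate.envers.AuditReader;
import org.hibernate.envers.AuditReaderFactory;
import org.hibernate.envers.query.AuditEntity;
import java.util.List;

public class AuditService {

    // Run an Envers revisions query, then clear the persistence context so
    // the loaded entities (and the AuditReader structures referencing them)
    // become eligible for garbage collection.
    @SuppressWarnings("unchecked")
    static List<Object[]> revisionsOf(EntityManager em, Class<?> entityClass, Object id) {
        AuditReader reader = AuditReaderFactory.get(em);
        List<Object[]> revisions = reader.createQuery()
                .forRevisionsOfEntity(entityClass, false, true)
                .add(AuditEntity.id().eq(id))
                .getResultList();

        // Detach everything the audit query pulled in; without this the
        // entities stay referenced by the persistence context until it closes.
        em.clear();
        return revisions;
    }
}
```

With a container-managed, transaction-scoped persistence context this should happen automatically at transaction end; if your heap dump shows the entities surviving, it suggests the EntityManager is extended/long-lived or something else (a cache, a session attribute) still references them.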

Thanks a lot.
[Attachment: Capture.PNG — MAT leak-suspects screenshot]
 
Tim Holloway
Saloon Keeper
Posts: 27752
Actually, I don't like that word "STUCK" so prominently displayed on the thread. I'd like to know what defines it as "STUCK" and what would be needed to unstick it.

I suspect that the ArrayList contains Entities which have Envers auditing constructs attached to them so that the auditing object count is actually secondary - if you released all the entities in the arrays, then likely the audit reader objects would get released as well.

The hard part is going to be in figuring out what those arrays are and who's holding them. That's hard to determine when you're dealing with a closed-source product.

One thing I do that helps keep stuff like this from being as likely to happen is that I detach my ORM objects except while actually working on them directly. I know it's very common for a lot of webapp developers to obtain entities in their processing servlet, and pass them (still attached) to a rendering JSP, but that has always made me nervous. It's a lot easier to damage something that way and a lot harder to figure out which side of the servlet/JSP fence the damage happened on. Plus, while objects are attached, they necessarily have live references to them, which means that they're taking up system resources and cannot be garbage-collected.
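The detach-early pattern described above is a one-liner in JPA. A minimal sketch, assuming a hypothetical `Customer` entity; after `detach()` the view layer gets a plain object with no lazy loading, no accidental dirty-checking, and no persistence-context reference pinning it in memory:

```java
import javax.persistence.EntityManager;

public class CustomerService {

    // Load an entity for rendering, then detach it before handing it to
    // the view so the persistence context no longer references it.
    static Customer loadForView(EntityManager em, Long id) {
        Customer c = em.find(Customer.class, id);
        if (c != null) {
            em.detach(c);  // remove from the persistence context
        }
        return c;
    }
}
```

One caveat: any lazy associations you need in the view must be initialized before the detach, since touching them afterwards throws a `LazyInitializationException` in Hibernate.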
 