I have been struggling with a problem for the last few days.
It is a web application in which we have options for uploading data from Excel, Word, and other files into a database.
An Excel file may contain more than 1000 records. I am using a good processor and 1GB of memory.
The problem is that after processing about 300 records, I can see memory usage almost reaching 1GB, after which the application fails with a heap error (OutOfMemoryError).
I think this is a memory leak problem. In critical parts of the program I call System.gc() to request garbage collection.
But it is not useful, and I am also planning to add multithreading to our program.
Please let me know of any solution for this issue.
That isn't memory "leakage". That's just plain old using up all your available memory. Like Minh Nam says, one solution is to give the server more memory. (Calling System.gc() is useless -- the system will already have cleaned up all potential garbage before it crashes.)
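For reference, a sketch of the "more memory" option (the -Xms/-Xmx flags are standard JVM options; the variable name and values below are illustrative and depend on your server):

```shell
# Raise the JVM maximum heap to 2 GB (initial heap 512 MB).
# For Tomcat this typically goes in setenv.sh or CATALINA_OPTS;
# check your own server's documentation for the right place.
export JAVA_OPTS="-Xms512m -Xmx2048m"
```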
It's also possible that your system is storing the entire file in memory unnecessarily. If you're just uploading a file and storing it into (say) a blob column in your database, it normally isn't necessary to put all of it into memory.
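A minimal sketch of that streaming idea (the class and method names and the buffer size are my own, not from this thread): copy the upload stream in small fixed-size chunks instead of reading the whole file into one byte array. With a real database, JDBC's PreparedStatement.setBinaryStream can consume the stream directly for a blob column, so the driver also reads it chunk by chunk.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamUpload {

    // Copy in 8 KB chunks: memory use stays constant no matter how big the file is.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[8 * 1024];
        long total = 0;
        int n;
        while ((n = in.read(buffer)) != -1) {
            out.write(buffer, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[100_000]; // stand-in for an uploaded file
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(data), sink);
        System.out.println("copied " + copied + " bytes");
        // With a real database you would instead hand the stream to JDBC:
        //   ps.setBinaryStream(1, uploadStream, fileSize);
        // so the whole file never sits in the heap at once.
    }
}
```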
sanuji pillai wrote:The excel file may contain more than 1000 records. I am using good processor and 1GB memory.
You need either a better algorithm, or more RAM.
These days, developers need about 4GB on a 32 bit system, and I use 8GB on all of my 64 bit systems.
RAM is cheap; get more.
But sometimes you are better off using a better algorithm, such as reading only portions of the file at a time rather than reading it all at once. Some files are too big to read all at once; video files are often a couple of gigabytes each.
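To make the "portions at a time" idea concrete, here is a sketch (the batch size, names, and the assumption that records arrive one at a time, e.g. from a streaming spreadsheet reader, are mine): accumulate records into a small batch, flush each batch to the database, and clear it, so only one batch is ever held in memory regardless of file size.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class BatchProcessor {
    static final int BATCH_SIZE = 100;

    // Process records in fixed-size batches; only one batch lives in memory.
    static int processInBatches(Iterable<String> records, Consumer<List<String>> flush) {
        List<String> batch = new ArrayList<>(BATCH_SIZE);
        int flushes = 0;
        for (String record : records) {
            batch.add(record);
            if (batch.size() == BATCH_SIZE) {
                flush.accept(batch);   // e.g. a JDBC addBatch()/executeBatch() call
                batch.clear();
                flushes++;
            }
        }
        if (!batch.isEmpty()) {        // flush the final partial batch
            flush.accept(batch);
            flushes++;
        }
        return flushes;
    }

    public static void main(String[] args) {
        List<String> records = new ArrayList<>();
        for (int i = 0; i < 1050; i++) records.add("row-" + i);
        int flushes = processInBatches(records, b -> { /* insert rows here */ });
        System.out.println(flushes + " batches flushed");
    }
}
```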
From this I found the following solutions:
1. Increase the VM maximum heap size for the web server, or add RAM.
2. Use an algorithm suited to this purpose, such as processing the file in portions.
3. Use a better processor.
Your suggestions are very precious and helpful to me.