
out of memory

 
Howie Jiang
Greenhorn
Posts: 26
I am processing very big files. I get an out of memory exception when I process 1000 records at a time, but processing 100 records at a time works. However, processing 100 records at a time takes longer than processing 10 records at a time. I have two questions:

1. How can I detect that memory is about to run out before the exception is thrown?

2. Why is processing 10 records at a time faster? How can I get the best performance?

Thanks very much!
 
David Harkness
Ranch Hand
Posts: 1646
Can you clarify what "processing every 10/100/1000 records" means? As for memory usage, java.lang.Runtime has total/max/freeMemory() methods that you could monitor as you process records.
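A minimal sketch of polling those Runtime methods between batches; the low-memory threshold, batch count, and the processBatch call are illustrative assumptions, not part of your actual code:

public class MemoryMonitor {

    // Heap the JVM could still give us, in bytes: memory not yet
    // allocated from the OS plus free space in the current heap.
    static long availableMemory() {
        Runtime rt = Runtime.getRuntime();
        return rt.maxMemory() - rt.totalMemory() + rt.freeMemory();
    }

    public static void main(String[] args) {
        final long LOW_WATER_MARK = 10 * 1024 * 1024; // 10 MB, illustrative threshold

        for (int batch = 0; batch < 100; batch++) {
            if (availableMemory() < LOW_WATER_MARK) {
                // Flush buffered results or clear collections here
                // instead of waiting for an OutOfMemoryError.
                System.out.println("Low memory before batch " + batch
                        + ": " + availableMemory() + " bytes left");
                System.gc(); // a hint only; the JVM may ignore it
            }
            // processBatch(batch); // hypothetical per-batch work
        }
    }
}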
 
Joe Ess
Bartender
Posts: 9258
Java Platform Performance will help you gain an understanding of how to measure and compare performance.
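In the meantime, a quick way to compare your batch sizes is simply to time each run. This sketch assumes a hypothetical processFile method and just measures wall-clock time with System.nanoTime():

public class BatchTimer {

    public static void main(String[] args) {
        int[] batchSizes = {10, 100, 1000};

        for (int size : batchSizes) {
            long start = System.nanoTime();
            // Process the same input file in batches of 'size' records.
            // processFile("bigfile.dat", size); // hypothetical
            long elapsedMillis = (System.nanoTime() - start) / 1000000;
            System.out.println("Batch size " + size + ": " + elapsedMillis + " ms");
        }
    }
}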
 