Millions of unique strings without eating all memory?

 
Greenhorn
Posts: 5
I'm rewriting an app that pulls a dozen data items for about six million rows in an Oracle database. About half of these are strings, and a majority are unique (physical addresses and 10-digit phone numbers).
I've been profiling my memory usage. The garbage collector gets me back down to a fairly constant size every time it runs, but it falls behind at the threshold: the JVM allocates a little extra heap before the collection clears memory, so the heap grows a bit every cycle.
I posted on the JDBC forum hoping there was a way to keep the strings out of the process entirely, but alternatively, is there a way to tune the GC so it collects before the heap has to be reallocated? Maybe by boosting the priority of the GC thread?
I'm running System.gc() as often as is rational (by my definition), but the spikes keep getting taller.
Has anyone run into a problem like this before? Am I barking up the wrong tree?
Thanks in advance,
Bill
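To put numbers on that sawtooth independently of any profiler, you can sample the heap figures from Runtime yourself. This is only a monitoring sketch; the loop count and sampling interval are arbitrary:

```java
// Monitoring sketch: sample allocated vs. used heap to watch the sawtooth.
public class HeapSampler {
    public static void main(String[] args) throws InterruptedException {
        Runtime rt = Runtime.getRuntime();
        for (int i = 0; i < 10; i++) {
            long allocated = rt.totalMemory();        // current heap size
            long used = allocated - rt.freeMemory();  // bytes held by objects (live + garbage)
            System.out.println("allocated=" + allocated + " used=" + used);
            Thread.sleep(500);                        // arbitrary sampling interval
        }
    }
}
```

If "used" saws up and down while "allocated" only ever climbs, that matches the behavior described above.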
 
Ranch Hand
Posts: 18944
Can you post the code you use to pull the strings?
That would help us figure out the problem.
Thx
 
Bill Gathen
Greenhorn
Posts: 5
// Simplified for brevity
String qry = "select orig_num from v_sur_universal where inv_id = ?";
String orig_num = null;
try (Connection conn =
         ConnectionFactory.getConnection("club", "DBproperties.txt");
     PreparedStatement stmt = conn.prepareStatement(qry))
{
    stmt.setInt(1, 427167);
    try (ResultSet rset = stmt.executeQuery())
    {
        if (rset.next())  // cursor must be advanced before reading
        {
            orig_num = rset.getString(1);
        }
    }
}
catch (SQLException sql)
{
    sql.printStackTrace();  // don't silently discard the exception
}
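On the memory question itself: if some of those six million values repeat (area codes, street names, cities), a small canonicalizing map lets identical strings share one instance instead of allocating one object per row. This is a sketch of the general technique, not something from the code above; `StringPool` and `canon` are names invented here:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: collapse duplicate string values to a single shared instance.
public class StringPool {
    private final Map<String, String> pool = new HashMap<>();

    // Returns the canonical instance for this value, storing it on first sight.
    public String canon(String s) {
        String existing = pool.putIfAbsent(s, s);
        return existing != null ? existing : s;
    }
}
```

Each row's value would pass through canon() right after rset.getString(1). Truly unique values gain nothing, but every repeated value is stored once.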
 
Bill Gathen
Greenhorn
Posts: 5
Update (sparked by comment on JDBC cross-post):
The bottom of the memory-usage sawtooth is constant. It's not a textbook memory leak, where a reference is accidentally retained and the object lives until the program dies.
It's the *top* of the sawtooth that I'm concerned with. If the initial allocated heap is 2 meg, it seems to let objects pile up until just below 2 meg, then runs gc. The used memory drops drastically, then starts building back up.
The core problem seems to be that between the time the gc decides to run and the time it actually starts freeing memory, the main thread has added a couple more objects (pushing past 2 meg), so the JVM has to allocate more heap space. Now the allocated heap is bigger, so it goes longer before gc'ing the next time, and the same lag increases the size yet again. Repeat x,000 times and the heap has gotten very large.
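What's growing here is the *committed* heap creeping toward the -Xmx ceiling, while "used" is the sawtooth. The standard management API reports both separately; a minimal sketch for checking which one is actually climbing:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;

// Sketch: distinguish 'used' (the sawtooth) from 'committed' (the growing heap).
public class HeapCeiling {
    public static void main(String[] args) {
        MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        System.out.println("used=" + heap.getUsed()
                + " committed=" + heap.getCommitted()
                + " max=" + heap.getMax());
    }
}
```

If committed keeps climbing, one conventional workaround is to set -Xms equal to -Xmx so the heap is fixed at startup and the "allocate more before collecting" cycle can't inflate it further.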
 