
Static HashMap variable holding large data

 
Karn Kumar
Ranch Hand
Posts: 153
Hi all,

In my project I need to access some data that will not change frequently. I first thought of keeping it in memcached (making a single call to load and store it, instead of fetching it every time), but my memcached is already full of other data, so I skipped that option. The data currently consists of about 14000 objects, each with 14 member variables, and the count will grow later. So I decided to write code that loads the data at servlet startup and assigns it to a static HashMap in the class.
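Roughly what I have in mind is the sketch below, using a ServletContextListener instead of a load-on-startup servlet (MyRecord and MyRecordDao are placeholders for my real data class and loader):

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

public class DataCacheLoader implements ServletContextListener {

    // Static cache shared by the whole application; a ConcurrentHashMap
    // keeps concurrent readers safe if the data is ever refreshed.
    public static final Map<String, MyRecord> CACHE = new ConcurrentHashMap<>();

    @Override
    public void contextInitialized(ServletContextEvent sce) {
        // One load at startup instead of a lookup on every request.
        for (MyRecord record : MyRecordDao.loadAll()) {  // hypothetical loader
            CACHE.put(record.getKey(), record);
        }
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {
        CACHE.clear();
    }
}

(The listener would be registered in web.xml, or with @WebListener on Servlet 3.0+.)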

My questions: will it be good to store the data this way? Is there any default size limit on such a variable? Will it cause any problem during class loading, or for the other methods in the class?


Kindly suggest some options if you have faced anything like this.


Thank you all in advance.

Regards,
Chetan
 
Nomaan Butt
Ranch Hand
Posts: 54
I don't see any issue; I suppose you are keeping references to the objects in the HashMap. With such a large number of objects (14000, as you said), you could run into heap memory unavailability. In that case you can keep the objects serialized on disk and deserialize them when necessary before putting them in the map. This will require another map to store the file names of the serialized objects.
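Something along these lines (only a sketch of the idea, not production code; the class and file-naming scheme are made up):

import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.HashMap;
import java.util.Map;

public class DiskBackedCache<K, V extends Serializable> {

    // In-memory index: key -> file holding the serialized object.
    private final Map<K, Path> index = new HashMap<>();
    private final Path dir;
    private long counter = 0;

    public DiskBackedCache(Path dir) throws IOException {
        this.dir = Files.createDirectories(dir);
    }

    public void put(K key, V value) throws IOException {
        Path file = dir.resolve("obj-" + (counter++) + ".ser");
        try (ObjectOutputStream out =
                new ObjectOutputStream(Files.newOutputStream(file))) {
            out.writeObject(value);            // serialize to disk
        }
        index.put(key, file);
    }

    @SuppressWarnings("unchecked")
    public V get(K key) throws IOException, ClassNotFoundException {
        Path file = index.get(key);
        if (file == null) {
            return null;
        }
        try (ObjectInputStream in =
                new ObjectInputStream(Files.newInputStream(file))) {
            return (V) in.readObject();        // deserialize on demand
        }
    }
}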
 
Karn Kumar
Ranch Hand
Posts: 153
Yes Nomaan, the heap space problem could occur, and what you have suggested is also a good idea. The thing is that the serialization/deserialization will add some time to each access.

Thanks for your suggestion.
 
Paul Clapham
Sheriff
Posts: 20742
And so you have the classic "Memory versus Access Time" decision. It isn't possible to make that decision without knowing what values you put on your memory and your time. You're the only one who knows that, and quite likely even you don't know it.

So at this point you already have all the information that outside parties can provide. Vague phrases like "large number of hashmap entries" aren't going to help any more. If you don't know the value of your memory and the cost of your access time -- and quite likely you don't -- then you're going to have to do some experiments. Try something and see what difference it makes to memory usage and response time. If you don't like the result, try something else.
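For instance, a first rough experiment might look like this (the map contents are stand-ins for your real objects, and heap figures taken from Runtime are only approximate; a profiler or a harness such as JMH gives better numbers):

import java.util.HashMap;
import java.util.Map;

public class CacheExperiment {

    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.gc();                           // a hint only, not guaranteed
        long before = rt.totalMemory() - rt.freeMemory();

        // Stand-in for the real data load: 14000 small objects.
        Map<Integer, String> cache = new HashMap<>();
        for (int i = 0; i < 14_000; i++) {
            cache.put(i, "value-" + i);
        }

        long after = rt.totalMemory() - rt.freeMemory();
        System.out.println("Approx. heap used: " + (after - before) + " bytes");

        long t0 = System.nanoTime();
        cache.get(7_000);                      // time one lookup
        long t1 = System.nanoTime();
        System.out.println("One lookup took about " + (t1 - t0) + " ns");
    }
}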
 
Campbell Ritchie
Sheriff
Posts: 48386
A large amount of data? 14000000 objects, perhaps. I don't think 14000 is a large amount at all on an up-to-date computer. Your Map will contain 14000 references, which at 4 bytes each occupy 56000 bytes in memory. That is not a lot at all.
 
Karn Kumar
Ranch Hand
Posts: 153
Hi Campbell,

14000 is the current figure, and it will increase over time.

Regards,
Chetan
 
Campbell Ritchie
Sheriff
Posts: 48386
I still think it will have to increase a great deal before you have any memory capacity problems.
 