divya chowdhary wrote:"The load factor is a measure of how full the hash table is allowed to get before its capacity is automatically increased. "
Does the above statement mean that, until I put 8 entries into the collection, the collection won't increase automatically, if we take an initial capacity of 10 and the default load factor of 0.75 into consideration?
Yup.
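A quick sketch of the arithmetic behind that rule, using the numbers from the question. The threshold is just capacity × load factor; a put that takes the entry count past it triggers a rehash. One caveat worth knowing: HashMap actually rounds a requested capacity up to the next power of two (10 becomes 16), so the figures in the question are best read as illustrative.

```java
// Sketch of the resize-threshold arithmetic described in the javadoc.
// The helper below is ours, not a JDK API; it just recomputes the
// threshold the way the documentation describes it.
public class ThresholdSketch {
    static int threshold(int capacity, float loadFactor) {
        return (int) (capacity * loadFactor);
    }

    public static void main(String[] args) {
        // 10 * 0.75 = 7.5, truncated to 7: the 8th entry exceeds it
        System.out.println("capacity 10: threshold = " + threshold(10, 0.75f));
        // with the real rounded-up capacity of 16: threshold = 12
        System.out.println("capacity 16: threshold = " + threshold(16, 0.75f));
    }
}
```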
"When the number of entries in the hash table exceeds the product of the load factor and the current capacity, the hash table is rehashed (that is, internal data structures are rebuilt) so that the hash table has approximately twice the number of buckets."
Does it mean that my hash table will get rehashed when I have 8 entries in it, if its initial capacity is 10 and its load factor is 0.75?
Yup. It's pretty much a re-statement of the first point.
"If the initial capacity is greater than the maximum number of entries divided by the load factor, no rehash operations will ever occur."
If I need to store a maximum of 50 entries in the collection, then initial capacity = 50 / 0.75 = 66.7 based on the preceding formula. So would it be a good idea to set the initial capacity of my collection to 70?
Or 67, if you know that you only want 50.
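That sizing idiom can be wrapped up in a few lines. A sketch, assuming the 50 entries and 0.75 load factor discussed above; the helper name `withExpectedSize` is ours, not a JDK method (Guava ships a similar `Maps.newHashMapWithExpectedSize`):

```java
import java.util.HashMap;
import java.util.Map;

// To hold `expected` entries with no rehash, the initial capacity must
// be strictly greater than expected / loadFactor, per the javadoc rule
// quoted above.
public class SizedMap {
    static <K, V> Map<K, V> withExpectedSize(int expected, float loadFactor) {
        // +1 covers the case where expected / loadFactor is an exact
        // integer, since the rule requires "greater than", not "at least"
        int capacity = (int) Math.ceil(expected / (double) loadFactor) + 1;
        return new HashMap<>(capacity, loadFactor);
    }

    public static void main(String[] args) {
        // 50 / 0.75 = 66.7, so 67 (or anything larger) avoids rehashing
        System.out.println((int) Math.ceil(50 / 0.75)); // prints 67

        Map<Integer, String> map = withExpectedSize(50, 0.75f);
        for (int i = 0; i < 50; i++) map.put(i, "v" + i);
        System.out.println(map.size());
    }
}
```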
If yes, do I need to change the load factor
No.
or keep it at the same value of 0.75?
Unless you have a very good reason not to (and to do that, you'd probably need to know something about what you're going to be storing).
Also under what circumstances should we need to increase the load factor value?
If space is at a premium, you might want to raise it. If speed is absolutely critical and you suspect that the things you are storing might cause collisions (though how you'd know that, I'm not quite sure), you could lower it; but I've never done it myself in 12 years of writing Java, and it will make the map bigger.
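For completeness, both knobs are set through the two-argument constructor. The values below are purely illustrative, not recommendations: a higher load factor trades lookup speed for space, a lower one does the reverse.

```java
import java.util.HashMap;
import java.util.Map;

public class LoadFactorDemo {
    public static void main(String[] args) {
        // Space-conscious: tolerate fuller buckets before resizing.
        Map<String, Integer> compact = new HashMap<>(16, 0.9f);
        // Speed-conscious: resize earlier, keeping bucket chains short.
        Map<String, Integer> sparse = new HashMap<>(16, 0.5f);

        compact.put("a", 1);
        sparse.put("a", 1);
        System.out.println(compact.get("a") + " " + sparse.get("a"));
    }
}
```

Either way the map behaves identically from the outside; only its space/time trade-off changes.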
Winston