
HashMap, Load Factor, Rehash Operation

Varuna Seneviratna
Ranch Hand

Joined: Jan 15, 2007
Posts: 167
The fourth paragraph of the HashMap documentation in Java reads as follows:


As a general rule, the default load factor (.75) offers a good tradeoff between time and space costs. Higher values decrease the space overhead but increase the lookup cost (reflected in most of the operations of the HashMap class, including get and put). The expected number of entries in the map and its load factor should be taken into account when setting its initial capacity, so as to minimize the number of rehash operations. If the initial capacity is greater than the maximum number of entries divided by the load factor, no rehash operations will ever occur.

What is meant by "decrease the space overhead but increase the lookup cost"?

Then

If the initial capacity is greater than the maximum number of entries divided by the load factor, no rehash operations will ever occur


What does the above sentence mean?
The initial capacity is the number of empty elements at the time of creation. If that is so, the maximum number of entries is the initial capacity. Is that correct? How is it assured that a rehash will never occur?

Regards Varuna


Varuna Seneviratna
Campbell Ritchie
Sheriff

Joined: Oct 13, 2005
Posts: 39869
    
What it means is that you start with an initial capacity of 16, and a load factor of 0.75f, so rehashing occurs when you reach 12. If you only intend to store 10 key-value pairs in this Map, then it will never re-hash. If you store 13 pairs, then it will re-hash once and if you store 25 pairs it will re-hash twice. So to store 25 pairs without re-hashing you would need an initial capacity of 64.
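Campbell's arithmetic can be checked with a short sketch. This assumes the OpenJDK sizing rule (resize happens when the size passes threshold = capacity × load factor, and a requested capacity is rounded up to the next power of two); the exact trigger point may vary between JDK versions.

```java
import java.util.HashMap;
import java.util.Map;

public class RehashDemo {
    public static void main(String[] args) {
        // Default sizing: capacity 16, load factor 0.75 -> threshold 12.
        int capacity = 16;
        float loadFactor = 0.75f;
        int threshold = (int) (capacity * loadFactor);
        System.out.println(threshold); // 12

        // To hold 25 pairs without rehashing, pick an initial capacity c
        // such that 25 <= c * 0.75, i.e. c >= 34; HashMap rounds the
        // requested capacity up to a power of two, so it becomes 64.
        Map<String, Integer> map = new HashMap<>(34, 0.75f);
        for (int i = 0; i < 25; i++) {
            map.put("key" + i, i);
        }
        System.out.println(map.size()); // 25, stored without any resize
    }
}
```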

I don't know whether re-hashing occurs after 11, or before 12, or after 12. You could work that out by going through the code: find a file called src.zip in your Java installation folder and unzip it.

If you set your load factor lower, e.g. 0.5f, you will get rehashing at 8, so there will be more empty space. If you set your load factor at 1.0f you won't get rehashing until 16, but there will be more chance of two pairs with the same hashCode % 16, so they will land in the same "bucket" and look-up will be slower.
Varuna Seneviratna
Ranch Hand

Joined: Jan 15, 2007
Posts: 167
So what I gather is that "a rehash will not ever occur" is not correct, is it? Whatever the initial capacity is, whenever the load factor is reached it will rehash. I am saying this with the assumption that the number of entries that will have to be made cannot be predetermined.

but there will be more chance of two pairs with the same hashCode % 16, so they will land in the same "bucket" and look-up will be slower


I am not able to understand this. What is a hash code? Is it the key? When entries are made up to 16 (the 16th one also added) with the load factor 1.0f, will another 16 empty elements be added, making the total number of elements 32?

In the HashMap method list I couldn't find a method to get the total number of entries possible. The size() method returns only the number of entries already made. Is there a way to get the total number of possible entries?
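(There is indeed no public accessor for the capacity; size() reports only the stored entries. A common workaround, sketched below, is to compute a safe initial capacity up front from the expected entry count. capacityFor is a hypothetical helper, not part of the JDK.)

```java
import java.util.HashMap;
import java.util.Map;

public class CapacityDemo {
    // Smallest requested capacity whose threshold (capacity * loadFactor)
    // can hold expectedEntries without triggering a resize.
    static int capacityFor(int expectedEntries, float loadFactor) {
        return (int) Math.ceil(expectedEntries / (double) loadFactor);
    }

    public static void main(String[] args) {
        System.out.println(capacityFor(10, 0.75f)); // 14
        System.out.println(capacityFor(25, 0.75f)); // 34

        Map<String, Integer> m = new HashMap<>(capacityFor(25, 0.75f));
        for (int i = 0; i < 25; i++) m.put("k" + i, i);
        System.out.println(m.size()); // 25
    }
}
```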

Regarding the point that less empty space makes lookup slower: does this mean that when a search has to be carried out using a given key, all the elements, even the empty ones, will be looked up?

Regards Varuna
Henry Wong
author
Sheriff

Joined: Sep 28, 2004
Posts: 18996
    

Hashing theory is actually something that you should have learned (or will learn) in your data structures class (or algorithms, depending on the school). So you should review the course work (or pick up the book in advance) from that class.

Trying to learn based on an implementation (even a highly used Java implementation) is not as good as learning the theory directly.

Henry


Books: Java Threads, 3rd Edition, Jini in a Nutshell, and Java Gems (contributor)
Campbell Ritchie
Sheriff

Joined: Oct 13, 2005
Posts: 39869
    
Varuna Seneviratna wrote:"a rehash will not ever occur" . . .
Please explain what you think that statement means. It looks correct to me, but I think I can only explain more if I know what you understand by it.
santhosh.R gowda
Ranch Hand

Joined: Apr 06, 2009
Posts: 296
If you set your load factor at 1.0f you won't get rehashing until 16, but there will be more chance of two pairs with the same hashCode % 16, so they will land in the same "bucket" and look-up will be slower.


Can you explain this quote more clearly?


Creativity is nothing but Breaking Rules
Raju Goke
Greenhorn

Joined: May 01, 2007
Posts: 7
Varuna Seneviratna wrote:So what I gather is that "a rehash will not ever occur" is not correct, is it? Whatever the initial capacity is, whenever the load factor is reached it will rehash. I am saying this with the assumption that the number of entries that will have to be made cannot be predetermined.

but there will be more chance of two pairs with the same hashCode % 16, so they will land in the same "bucket" and look-up will be slower


I am not able to understand this. What is a hash code? Is it the key? When entries are made up to 16 (the 16th one also added) with the load factor 1.0f, will another 16 empty elements be added, making the total number of elements 32?

In the HashMap method list I couldn't find a method to get the total number of entries possible. The size() method returns only the number of entries already made. Is there a way to get the total number of possible entries?

Regarding the point that less empty space makes lookup slower: does this mean that when a search has to be carried out using a given key, all the elements, even the empty ones, will be looked up?

Regards Varuna


A hash code is a random number generated by the JVM; it's unique to every object. It's not the address of the object. The JVM returns the same hash code in the following case:

String s1 = new String("hello");
String s2 = new String("hello");

In this case, s1.hashCode() and s2.hashCode() return the same value.
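(This can be verified directly; note the method name is hashCode(), with a capital C:)

```java
public class HashCodeDemo {
    public static void main(String[] args) {
        String s1 = new String("hello");
        String s2 = new String("hello");
        System.out.println(s1 == s2);                       // false: two distinct objects
        System.out.println(s1.equals(s2));                  // true: same characters
        System.out.println(s1.hashCode() == s2.hashCode()); // true: equal objects agree
    }
}
```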


Regards,
Raju
Campbell Ritchie
Sheriff

Joined: Oct 13, 2005
Posts: 39869
    
Not quite. A hash code is by no means random; it is calculated from features of the object. The Java™ specification requires that two objects returning true from their equals() method must also return the same value from their hashCode() method. I haven't got time to go through how to write a hash code method, but you want to get the greatest variability in the "low-order" bits. So a hash code of 1234abcd differs only slightly from 1234abce, but that tiny difference will almost guarantee those two objects go into different buckets.
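(A minimal sketch of that contract, using a hypothetical Point class: equal objects must produce equal hash codes; the reverse is not required.)

```java
import java.util.Objects;

public class Point {
    final int x, y;
    Point(int x, int y) { this.x = x; this.y = y; }

    @Override public boolean equals(Object o) {
        if (!(o instanceof Point)) return false;
        Point p = (Point) o;
        return x == p.x && y == p.y;
    }

    @Override public int hashCode() {
        return Objects.hash(x, y); // equal fields -> equal hash code
    }

    public static void main(String[] args) {
        Point a = new Point(1, 2);
        Point b = new Point(1, 2);
        System.out.println(a.equals(b));                  // true
        System.out.println(a.hashCode() == b.hashCode()); // true, as required
    }
}
```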

You have a Map with 16 "buckets" and a load factor of 75% (0.75f). When you have 12 entries, the Map will be enlarged (not sure whether that happens when you reach 12 or when you try to add the 13th). So you have at all times at least 4 empty buckets. That reduces the risk of putting two objects into the same bucket. When the Map is enlarged, it has capacity 32 and contains 12 pairs, so at least 20 buckets are empty. When you add more pairs, they have 20 spaces to go into.

If you increase the load factor to 150% (1.5f), then you won't enlarge the Map until you have 24 pairs. That guarantees that no fewer than 8 pairs will have to share buckets. When you increase it to 32 buckets, it will have to accommodate up to 48 objects.

Not sure, but I think hashing occurs whenever a pair is added (the put method). Also, when the Map is enlarged, every pair is re-hashed. And the Map interface does not have a method to return the capacity.
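(The pigeonhole argument behind "8 pairs must share buckets" can be sketched in code. The index computation below, hash & (capacity - 1), is the one the OpenJDK implementation uses for power-of-two capacities; whatever the actual hash values are, 24 keys in 16 buckets always leave at least 8 keys sharing.)

```java
public class BucketDemo {
    public static void main(String[] args) {
        int capacity = 16;
        int[] bucketCounts = new int[capacity];

        // Place 24 keys into 16 buckets the way a hash table would.
        for (int i = 0; i < 24; i++) {
            String key = "key" + i;
            int index = key.hashCode() & (capacity - 1); // bucket index
            bucketCounts[index]++;
        }

        // Count the keys that ended up sharing a bucket with another key.
        int shared = 0;
        for (int count : bucketCounts) {
            if (count > 1) shared += count;
        }
        System.out.println(shared >= 8); // true, by the pigeonhole principle
    }
}
```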
santhosh.R gowda
Ranch Hand

Joined: Apr 06, 2009
Posts: 296

That guarantees that no fewer than 8 pairs will have to share buckets. When you increase it to 32 buckets, it will have to accommodate up to 48 objects.


Please explain this statement more clearly, with an example.
Henry Wong
author
Sheriff

Joined: Sep 28, 2004
Posts: 18996
    

santhosh.R gowda wrote:

That guarantees that no fewer than 8 pairs will have to share buckets. When you increase it to 32 buckets, it will have to accommodate up to 48 objects.


Please explain this statement more clearly, with an example.


First of all, you kinda took the original statement out of context (meaning this is just a snippet of the explanation), so it doesn't make much sense by itself. Second, the original statement was an example, and it was pretty clear. What is it that you don't understand?

Basically, from the original example... if the load factor is 150%, it can accommodate 150% of the number of elements that it has buckets. So, with 32 buckets, it can accommodate 48 elements before it has to create more buckets and rehash.

Henry

Campbell Ritchie
Sheriff

Joined: Oct 13, 2005
Posts: 39869
    
santhosh.R gowda wrote:Please explain this statement more clearly, with an example.
24 entries.
16 buckets.
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24
Once the first 16 entries have taken one bucket each, the remaining 8 must share.
 