JavaRanch » Java Forums » Java » Java in General

Regarding traversal time in Map

saurabh kuma
Greenhorn

Joined: Aug 09, 2013
Posts: 14
Hi Friends,

I read somewhere

Traversal of a map is through one of its collection-views. For an underlying LinkedHashMap, the traversal time is proportional to the size of the map—regardless of its capacity. However, for an underlying HashMap, it is proportional to the capacity of the map.


I didn't get this line, Please help

Thanks
Saurabh
Richard Tookey
Ranch Hand

Joined: Aug 27, 2012
Posts: 1035
    

If you "read somewhere" and can quote it, surely you can elaborate on "somewhere" by posting a reference.
saurabh kuma
Greenhorn

Joined: Aug 09, 2013
Posts: 14
I read it in

A Programmer's Guide to Java SCJP Certification
Third Edition by Khalid A. Mughal
Chapter 15, p. 824, line 3

Thanks
Saurabh
Steve Luke
Bartender

Joined: Jan 28, 2003
Posts: 4165
    

Do you understand the difference between a collection's capacity and its size?

The capacity is (generally) the number of Objects it can hold, while its size is the number of Objects it does hold. What the quote says is the LinkedHashMap's speed depends on the number of Objects it actually holds, while the HashMap is dependent more on the number of Objects it can hold.
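
One rough way to see this (a hedged sketch; the numbers are noisy and depend on JIT warm-up, so treat them as illustrative only) is to give a HashMap a huge initial capacity but only a few entries, and compare iteration against a LinkedHashMap holding the same entries:

```java
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Map;

class TraversalDemo {
    static long timeIteration(Map<Integer, Integer> map) {
        long sum = 0;
        long start = System.nanoTime();
        for (int v : map.values()) sum += v;      // touch every entry
        long elapsed = System.nanoTime() - start;
        if (sum < 0) System.out.println(sum);     // keep sum live so the loop isn't optimized away
        return elapsed;
    }

    public static void main(String[] args) {
        // Same 100 entries, but the HashMap is asked for a huge capacity (2^20 slots).
        Map<Integer, Integer> sparse = new HashMap<>(1 << 20);
        Map<Integer, Integer> linked = new LinkedHashMap<>();
        for (int i = 0; i < 100; i++) {
            sparse.put(i, i);
            linked.put(i, i);
        }
        System.out.println("HashMap (capacity 2^20): " + timeIteration(sparse) + " ns");
        System.out.println("LinkedHashMap:           " + timeIteration(linked) + " ns");
    }
}
```

On a warmed-up JVM the sparse HashMap iteration tends to take noticeably longer, because the iterator must scan all the empty slots too.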


Steve
Ivan Jozsef Balazs
Rancher

Joined: May 22, 2012
Posts: 867
    
A vague guess:
A LinkedHashMap maintains a linked list running through its entries, so traversal follows that list and the time corresponds to the number of elements, that is, the size.
If you instead have to go through all the slots of the backing array, the capacity comes into the picture.

Does it matter much?

Well, let us take a map and stuff it with elements so that both the size and the capacity go high.
Then let us delete most of them, so the size goes down - does the capacity remain high? (As far as I know, the JDK's HashMap never shrinks its backing table, so the capacity would indeed stay high.)
If the capacity got adjusted, that is, if it also went down, then there would not be much difference.

saurabh kuma
Greenhorn

Joined: Aug 09, 2013
Posts: 14

Hi Friends

Thanks for the help

Steve, as you said:

The capacity is (generally) the number of Objects it can hold, while its size is the number of Objects it does hold. What the quote says is the LinkedHashMap's speed depends on the number of Objects it actually holds, while the HashMap is dependent more on the number of Objects it can hold.


I understood that in LinkedHashMap the traversal time depends on the number of elements. But I have a doubt in the case of HashMap. In HashMap, if we have say 50 buckets (which is the capacity) and 20 elements linked to these buckets, then will the traversal time be 50 + 20?

Thanks
Saurabh
Volodymyr Levytskyi
Ranch Hand

Joined: Mar 29, 2012
Posts: 505
    

In HashMap, if we have say 50 buckets (which is the capacity) and 20 elements linked to these buckets, then will the traversal time be 50 + 20?

You should first learn how HashMap stores its elements.
Then you will know that the mappings in a HashMap are spread around the backing array at god-knows-which indices, and some Entries even occupy the same index (a collision chain).
This code is from HashIterator, which the entrySet() iterator uses to return the next entry:
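
(The original snippet did not survive; below is a self-contained sketch of that pre-Java 8 iterator logic. Entry and table are simplified stand-ins for the JDK's internal structures, but the two null-checks match the conditions discussed next.)

```java
import java.util.NoSuchElementException;

// Simplified sketch of the pre-Java 8 HashMap.HashIterator logic.
class HashIteratorSketch {
    static class Entry {
        final String key;
        final Entry next;          // next entry in the same bucket (collision chain)
        Entry(String key, Entry next) { this.key = key; this.next = next; }
    }

    final Entry[] table;           // backing array; table.length is the capacity
    Entry next;                    // next entry to return
    int index;                     // current slot in table

    HashIteratorSketch(Entry[] table) {
        this.table = table;
        // advance to the first non-empty bucket
        while (index < table.length && (next = table[index++]) == null)
            ;
    }

    boolean hasNext() { return next != null; }

    Entry nextEntry() {
        Entry e = next;
        if (e == null) throw new NoSuchElementException();
        // first try the rest of the chain in the current bucket...
        if ((next = e.next) == null) {
            // ...otherwise scan the table until a non-empty slot or the end
            Entry[] t = table;
            while (index < t.length && (next = t[index++]) == null)
                ;
        }
        return e;
    }
}
```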

First we check whether there is one more Entry in the same bucket via (next = e.next) == null; if there is not, we move into the while loop. In the while loop we iterate over the backing array (called table) until the current index reaches its length or a new Entry at index++ is found. If a new Entry at a new bucket is found, the while loop stops due to the condition (next = t[index++]) == null.

To know Java collections profoundly and be confident with them, please read my tutorials.


True person is moral, false is right!
Steve Luke
Bartender

Joined: Jan 28, 2003
Posts: 4165
    

saurabh kuma wrote:In HashMap, if we have say 50 buckets (which is the capacity) and 20 elements linked to these buckets, then will the traversal time be 50 + 20?

In a worst case scenario, that is about right. To traverse the table you need to hit every bucket, so the traversal will take at minimum 50 steps. In the worst case, all 20 Entries are in the same bucket; after the bucket traversal touches the first one, you have 19 additional operations, giving 50 + 19. In the best case scenario (well dispersed hash codes) each of the 20 elements is in its own bucket, and the traversal is 50 + 0 additional operations.
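
To make that counting concrete, here is a small back-of-the-envelope model (a hypothetical helper, not JDK code): it charges one operation per slot visited plus one per extra entry followed within a chain.

```java
class TraversalCost {
    // bucketOf[i] is the bucket index that element i hashed to.
    static int traversalOps(int capacity, int[] bucketOf) {
        int[] chainLen = new int[capacity];
        for (int b : bucketOf) chainLen[b]++;
        int ops = capacity;                  // every slot is visited once
        for (int len : chainLen)
            if (len > 1) ops += len - 1;     // extra links followed within a chain
        return ops;
    }

    public static void main(String[] args) {
        int[] worst = new int[20];           // all 20 elements hash to bucket 0
        int[] best = new int[20];
        for (int i = 0; i < 20; i++) best[i] = i;   // one element per bucket

        System.out.println(traversalOps(50, worst)); // 50 + 19 = 69
        System.out.println(traversalOps(50, best));  // 50 + 0  = 50
    }
}
```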
saurabh kuma
Greenhorn

Joined: Aug 09, 2013
Posts: 14
Thanks Steve
 