For my program I have to read lines from a directory containing multiple text documents; each line in a document has a name and a number. My program is supposed to read the line, assign the name to the String variable key and the number to the double variable value, compute a hash code with a hash function, and store the data in a bucket array of linked lists, with collisions handled by separate chaining.
The directory contains text documents with the following data inside them: http://pastebin.com/6mAvpE6X. After running multiple tests, my program works fine when there aren't any duplicate names, but as soon as the first duplicate name comes up it stops doing anything. I suspect it is stuck in a while loop in the addEntry method, but I can't figure out how to fix it. What is supposed to happen is that if the program detects a duplicate name within the linked list at a certain index, it should add the new value to the sum variable of the node with that name that is already in the list. It processes the first file fine but gets stuck at the second text file and runs forever. The program compiles fine, and I tried to debug it, but without success. Any help is appreciated. Thank you.
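For reference, here is a minimal sketch of the kind of separate-chaining insert being described. The class and field names (SimpleChainedMap, Entry, addEntry, sum, count) are assumptions for illustration, not the asker's actual code. A common cause of the "runs forever" symptom is a manual while loop over the chain that never advances its cursor (a missing node = node.next on the duplicate branch); iterating with for-each sidesteps that:

```java
import java.util.LinkedList;

// Hypothetical sketch of a bucket array of linked lists with separate
// chaining, where duplicate keys accumulate into sum/count.
class SimpleChainedMap {
    static class Entry {
        String key;
        double sum;
        int count;
        Entry(String key, double value) { this.key = key; this.sum = value; this.count = 1; }
    }

    private final LinkedList<Entry>[] buckets;

    @SuppressWarnings("unchecked")
    SimpleChainedMap(int capacity) {
        buckets = new LinkedList[capacity];
        for (int i = 0; i < capacity; i++) buckets[i] = new LinkedList<>();
    }

    private int indexFor(String key) {
        // Math.abs guards against negative hash codes
        return Math.abs(key.hashCode() % buckets.length);
    }

    void addEntry(String key, double value) {
        LinkedList<Entry> chain = buckets[indexFor(key)];
        for (Entry e : chain) {
            if (e.key.equals(key)) {   // duplicate: accumulate, then STOP
                e.sum += value;
                e.count++;
                return;
            }
        }
        chain.add(new Entry(key, value)); // no duplicate: append a new node
    }

    double getSum(String key) {
        for (Entry e : buckets[indexFor(key)]) {
            if (e.key.equals(key)) return e.sum;
        }
        return Double.NaN; // key not present
    }
}
```

The important detail is the early return after updating the duplicate: without it (or without advancing the cursor in a hand-rolled while loop), the traversal can revisit the same node forever.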
This is the only place that avg is calculated, and it happens at initialization, when sum and count are both zero. Although your setSum() adds to sum and count, those values are never reapplied to compute a new avg. It seems like your getAvg() should perform the calculation on the fly.
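A minimal sketch of that fix, assuming a node with sum and count fields like the question describes (the Node class and method names here are illustrative, not the asker's exact code):

```java
// Sketch: compute avg on demand instead of storing it at construction time.
class Node {
    String key;
    double sum;
    int count;

    Node(String key, double value) {
        this.key = key;
        this.sum = value;
        this.count = 1;
    }

    // Mirrors the question's setSum(): accumulate the value and bump count.
    void setSum(double value) {
        sum += value;
        count++;
    }

    // Computing on the fly means avg is never stale, no matter how many
    // times setSum() has been called since construction.
    double getAvg() {
        return count == 0 ? 0.0 : sum / count;
    }
}
```

With a stored avg field, every call to setSum() would also have to recompute it; dividing in getAvg() removes that bookkeeping entirely.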
Can you post (cut and paste) some small samples of the files you are testing with?