I've been looking at sorting algorithms (just for fun), and even came up with one that I didn't find in the list of "known algorithms". It's a variant of insertion sort (but that's not important here). It's far from a good algorithm, but I observed a strange thing with it.
The first variant is how I wrote it originally, the second variant is how I began to re-write it now.
Basically I only pulled some calculations out of the loop tests and into loop variables. I expected the second form to run marginally faster, but to my surprise it runs significantly slower (for a long[] of 100,000 elements it's almost 1 second slower on my system). I'm wondering if anyone has an explanation for this behaviour?
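The original two variants weren't posted here, so as a rough illustration only, here is the *kind* of change being described: the same insertion sort, once with the bound computed in the loop test every iteration, and once with it hoisted into a local variable (method and variable names are made up for this sketch):

```java
import java.util.Arrays;

public class SortVariants {
    // Variant 1: the outer loop test reads a.length on every iteration.
    static void variant1(long[] a) {
        for (int i = 1; i < a.length; i++) {
            long key = a[i];
            int j = i - 1;
            while (j >= 0 && a[j] > key) {
                a[j + 1] = a[j];   // shift larger elements right
                j--;
            }
            a[j + 1] = key;
        }
    }

    // Variant 2: the same sort with the bound hoisted into a local variable.
    static void variant2(long[] a) {
        int n = a.length;          // pulled out of the loop test
        for (int i = 1; i < n; i++) {
            long key = a[i];
            int j = i - 1;
            while (j >= 0 && a[j] > key) {
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;
        }
    }

    public static void main(String[] args) {
        long[] a = {5, 3, 9, 1, 4};
        long[] b = a.clone();
        variant1(a);
        variant2(b);
        System.out.println(Arrays.toString(a)); // both print [1, 3, 4, 5, 9]
        System.out.println(Arrays.toString(b));
    }
}
```

Both variants produce identical results; the only difference is where the bound is evaluated, which is exactly the sort of transformation a JIT compiler routinely does on its own.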
Did you run each sort only once, or multiple times? Try timing the sort of an array of 100,000 elements five times in a row with each algorithm (obviously you need a new array of unsorted longs each time). I could almost guarantee that by the third or fourth run within a single program execution, each algorithm will be taking the same amount of time. Initially the times may differ a bit, but once the HotSpot engine optimizes the code, they should both come out just the same.
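A minimal harness for that kind of repeated timing might look like the sketch below. It uses `Arrays.sort` as a stand-in for the sort under test (since the original code isn't shown), and generates a fresh unsorted array for each run as suggested:

```java
import java.util.Arrays;
import java.util.Random;

public class WarmupDemo {
    public static void main(String[] args) {
        Random rnd = new Random(42);
        for (int run = 1; run <= 5; run++) {
            // fresh unsorted data for every run
            long[] data = rnd.longs(100_000).toArray();
            long t0 = System.nanoTime();
            Arrays.sort(data);  // stand-in for the algorithm being measured
            long elapsedMs = (System.nanoTime() - t0) / 1_000_000;
            System.out.println("run " + run + ": " + elapsedMs + " ms");
        }
    }
}
```

Typically the first run or two are noticeably slower than the later ones, because the JVM starts out interpreting the bytecode and only compiles the hot methods after they have run enough times.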
Hmm, you were right. With four runs each, the total times differed by 1 second instead of ~4 seconds. With 9 runs, the difference in total time was negligible (with the second algorithm slightly ahead). Thanks for the hint!
I still think it's strange, though. If the bytecode of one form is shorter (fewer instructions to execute), I would expect the JVM to run it faster BEFORE any internal optimizations kick in. Unless operations like loading variables take longer than arithmetic operations..