You have a one-millisecond difference; that's well within your margin of error.
Use System.nanoTime() for performance testing.
Empty loops can be optimized away entirely, so you may end up timing nothing at all.
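For example, a minimal sketch of that kind of timing (one() here is just a stand-in for whichever loop variant is being measured, and the iteration count is arbitrary):

public class LoopTiming {

    // Stand-in for one of the loop variants. It accumulates a value so the
    // JIT can't treat the loop as dead code and remove it.
    static long one(int iterations) {
        long sum = 0;
        for (int i = 0; i < iterations; i++) {
            sum += i;
        }
        return sum;
    }

    public static void main(String[] args) {
        long start = System.nanoTime();   // monotonic clock, meant for elapsed-time measurement
        long result = one(10_000_000);
        long elapsedNanos = System.nanoTime() - start;

        // Use the result so the whole call can't be optimized away either.
        System.out.println("result=" + result + ", elapsed=" + elapsedNanos + " ns");
    }
}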
For that many loop iterations, would 1 millisecond make a difference? Is that difference worth the possibility of making the code less readable?
You would need to run the test many times to see what the average duration is.
You would also need to run one() followed by two() and then two() followed by one(), so that whichever method runs second doesn't get an unfair advantage from warm-up.
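Something along these lines, as a rough sketch (one() and two() below are trivial placeholders for the real methods, and the run count is arbitrary):

public class AlternatingRuns {

    // Placeholder loop variants; swap in the real code being compared.
    static long one() { long s = 0; for (int i = 0; i < 1_000_000; i++) s += i; return s; }
    static long two() { long s = 0; for (int i = 1_000_000; i > 0; i--) s += i; return s; }

    public static void main(String[] args) {
        final int runs = 100;
        long totalOne = 0, totalTwo = 0;
        long sink = 0;  // keep the results alive so the JIT can't discard the work

        for (int i = 0; i < runs; i++) {
            if (i % 2 == 0) {   // even runs: one() first
                long t = System.nanoTime(); sink += one(); totalOne += System.nanoTime() - t;
                t = System.nanoTime();      sink += two(); totalTwo += System.nanoTime() - t;
            } else {            // odd runs: two() first
                long t = System.nanoTime(); sink += two(); totalTwo += System.nanoTime() - t;
                t = System.nanoTime();      sink += one(); totalOne += System.nanoTime() - t;
            }
        }

        System.out.println("avg one(): " + (totalOne / runs) + " ns");
        System.out.println("avg two(): " + (totalTwo / runs) + " ns");
        System.out.println("(ignore) sink=" + sink);
    }
}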
Lucian Whiteman wrote: How can I use it to my advantage?
You shouldn't even try. First, you have no proof there IS a real difference. Did you follow the suggestions people gave: reversing the order, using System.nanoTime(), and so on?
Next...what possible situation do you have where one millisecond in this loop makes any difference? Considering you'll probably be doing stuff inside a loop that takes time (disk I/O, network communication, DB access...something), the millisecond here will be lost in all that processing.
It is simply NOT WORTH IT to fret over something as meaningless as this at the cost of making your code harder to read/maintain.
There are only two hard things in computer science: cache invalidation, naming things, and off-by-one errors