Trying to understand System.currentTimeMillis() in an animation class I'm using

 
John Quach
Ranch Hand
Posts: 50
To put it simply, I have one variable storing a value taken from the currentTimeMillis() method, and a second variable doing the exact same thing. One would think that by taking one value and subtracting it from the other, even millions of times, you'd get zero every time.

I don't completely understand this myself. All I know is that it gives the correct amount of time to put a thread in a game to sleep. Without this mysterious value, a character might animate too fast on a fast computer.

My questions are:

1. I know that older Windows OSes tend to have larger differences between two currentTimeMillis() values. I believe Win98 had a 50 ms difference and XP had 20 ms. I'm getting a difference of 0 on Windows 7 Starter, though. What does this mean?

2. There is an animation loop that uses this code:
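
(a sketch reconstructed with the variable names discussed below - the book's exact code may differ):

public class AnimationLoop {
    static final long DEMO_TIME = 5000; // run the demo for 5 seconds

    public static void main(String[] args) {
        long startTime = System.currentTimeMillis();
        long currTime = startTime;

        while (currTime - startTime < DEMO_TIME) {
            // milliseconds that passed since we last checked the clock
            long elapsedTime = System.currentTimeMillis() - currTime;
            currTime += elapsedTime;

            // (update and draw the animation here, using elapsedTime)
        }
    }
}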


I would have thought that the values would continue to be zero. But when I put in some code of my own to keep track of the numbers in Eclipse's console, the numbers keep adding up until the total becomes equal to DEMO_TIME. The book I got this code from makes little mention of this. Why does it keep adding up?


I've been trying to understand this on my own for a week. If anyone can help me, I would be very grateful!
 
Nathan Pruett
Bartender
Posts: 4121
1.) "differences" isn't really the way to think about this - it's really an issue of "timer resolution". System.currentTimeMillis() just returns the current count of milliseconds since midnight, January 1, 1970 - it gets this from the underlying OS. In some OSes (the versions of Windows you mentioned) the OS doesn't update the millisecond count every millisecond, but every 20 or 40 milliseconds. So if you had code like this:



On a system that updates every 1 millisecond, you'd end up with something like this:

0
0
0
1
0
0
1
.
.
.

Mostly 0s and 1s (maybe some higher numbers every so often if you had lots of other programs running, or heavy processing going on in the background). The 0s are there because lines of code run faster than 1 ms a line; the 1s show up when the system millisecond counter updates.

On a computer that has a 20ms timer resolution you'd end up with output like this:

0
0
0
.
.
.
0
15
0
0
.
.
.

*Lots* more 0s, with an occasional 15 (again, usually around 15 - you might end up with some 16s/17s/etc. because other things are running). Again, the 0s show up both because operations take less than 1 ms, and because even though more than 1 ms has passed in real time, the OS timer keeps returning the same value until it updates.

2.) To understand this code, plug in some fake values that are easy to work with - let's say DEMO_TIME is 5 (meaning we want it to run for a duration of 5 ms), and assume a 1 ms timer resolution (System.currentTimeMillis() returns a new value every 1 ms).

The code starts, System.currentTimeMillis() is called and returns 100. This value is saved in the variable startTime and copied into currTime.

We enter the loop and test the condition: currTime (100) - startTime (100) is 0, which is less than DEMO_TIME (5), so we enter the loop.

We call System.currentTimeMillis() again, this time we get 102. We subtract the value in currTime (100), so we get 2 and save it into elapsedTime (2ms have passed since we set currTime).

We then add elapsedTime (2) to currTime (100) and save 102 into currTime.

The loop tests the values again - currTime (102) - startTime (100) is 2, which is less than DEMO_TIME (5), so we start the loop code again.

We call System.currentTimeMillis() again and get 105. We subtract the value in currTime (102) and get 3, and save it into elapsedTime (3 ms have passed since we last set currTime, on the previous pass through the loop).

We then add elapsedTime (3) to currTime (102) and save 105 into currTime.

The loop tests the values - currTime (105) - startTime (100) is 5 - this isn't less than DEMO_TIME (5), so the loop ends and exits.
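
If you want to watch that happen yourself, you can print the running totals inside the loop - just a sketch, with a longer DEMO_TIME so there's something to see:

public class LoopTrace {
    static final long DEMO_TIME = 100; // 100 ms is enough to see the pattern

    public static void main(String[] args) {
        long startTime = System.currentTimeMillis();
        long currTime = startTime;

        while (currTime - startTime < DEMO_TIME) {
            long elapsedTime = System.currentTimeMillis() - currTime;
            currTime += elapsedTime;
            // elapsedTime is the per-pass step; (currTime - startTime) is
            // the running total that creeps up until it reaches DEMO_TIME
            System.out.println("elapsed: " + elapsedTime
                    + "  total: " + (currTime - startTime));
        }
    }
}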


John Quach wrote:
I don't completely understand this myself. All I know is that it gives the correct amount of time to put a thread in a game to sleep. Without this mysterious value, a character might animate too fast on a fast computer.



Try reading about Time-Based vs. Frame-Based Animation - the example on the site is in ActionScript, but the concepts are the same no matter what language you're using (and ActionScript is pretty close to Java syntax-wise anyway).
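
The gist in Java terms (a sketch - the speed constant and position variable are made up for illustration):

public class TimeBasedMovement {
    public static void main(String[] args) {
        final double PIXELS_PER_SECOND = 120.0; // speed in real-time units
        double x = 0;
        long lastTime = System.currentTimeMillis();

        while (x < 800) {
            long now = System.currentTimeMillis();
            long elapsedMillis = now - lastTime;
            lastTime = now;

            // frame-based would be "x += 2" every pass, so a faster machine
            // animates faster; time-based scales by real elapsed time instead
            x += PIXELS_PER_SECOND * (elapsedMillis / 1000.0);
        }
        System.out.println("crossed the screen in real time, not frame time");
    }
}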
 
John Quach
Ranch Hand
Posts: 50
Thank you. It's much clearer now.

Do all computers with the same operating system count at the same rate? In other words, will making a timer call on two Windows computers at exactly the same time give the same values?
 
Nathan Pruett
Bartender
Posts: 4121
In theory, the timer resolution should be the same - I suppose the actual hardware clock could affect this number too, but nothing is going to have a hardware clock slow enough to matter.

In practice, though, lots of different things can affect the numbers you get - for one thing, the hardware could be different (i.e. faster/slower CPUs, or more RAM preventing things from swapping in/out of memory), and/or one system could be under heavier load than the other.

Also, System.currentTimeMillis() is affected by what time the OS thinks it is - so if you change the time through "Set time and date", you will get different millisecond values. E.g. if you're running a program computing the difference between System.currentTimeMillis() calls and, while it's running, you set the time forward on your system by one day, then as soon as that is applied, System.currentTimeMillis() returns something close to an 86,400,000 ms difference (24 hours x 60 minutes x 60 seconds x 1000 ms = 86,400,000 ms).

If you can use a JVM >= 1.5, you can use System.nanoTime() - this is more precise, and it's not based on what the OS thinks the date/time is - i.e. it's a timer, not a clock.
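
For example (a sketch):

public class NanoTimerTest {
    public static void main(String[] args) throws InterruptedException {
        long start = System.nanoTime(); // arbitrary origin - only differences mean anything
        Thread.sleep(250);
        long elapsedNanos = System.nanoTime() - start;
        // nanoTime() isn't tied to the wall clock, so changing the system
        // date/time mid-run doesn't throw the measurement off
        System.out.println("elapsed: " + (elapsedNanos / 1000000) + " ms");
    }
}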
 