
System.currentTimeMillis universal?

Angela Truce
Ranch Hand

Joined: Nov 30, 2005
Posts: 47
Hello,

Does anyone know if the System.currentTimeMillis() method is universal? I.e., say I have a program that records a time using the method, and then I send something to another computer, say on the other side of the world, over the internet, and I record the time there. If I take the difference of those two times, would I get the send time, or would the times vary because of time zones and different operating systems?

Does anyone know how I can time-stamp things? I just want to be able to record the time (preferably in seconds) it takes to send something, i.e. the travelling time.

Thanks in advance!
Paul Clapham
Bartender

Joined: Oct 14, 2005
Posts: 18901
    

Yes. As the documentation for the method says:
Returns:
the difference, measured in milliseconds, between the current time and midnight, January 1, 1970 UTC.
So clearly that's independent of what time zone you are in.

As for your question about the difference if you send that number to another computer: since it's timezone-independent, the timezone and operating systems of the two computers aren't going to be a factor. What you will be measuring is a combination of the time to send the data and the difference between the system clocks of the two computers.
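To make that concrete, here's a tiny illustration (the numbers are invented for the example): a timestamp taken on machine A at send time and one taken on machine B at receive time differ by the transit time plus whatever the two system clocks disagree by.

// Illustrative only: simulates a receive timestamp taken on a machine
// whose clock runs two minutes fast, to show what the naive subtraction
// actually measures.
public class NaiveTransitTime {
    public static void main(String[] args) {
        long clockOffset = 120000;  // pretend B's clock is 2 minutes ahead of A's
        long transit = 80;          // pretend the message took 80 ms to travel

        long sentAtOnA = System.currentTimeMillis();             // stamped on A
        long receivedAtOnB = sentAtOnA + transit + clockOffset;  // stamped on B

        // The naive measurement is dominated by the clock offset:
        System.out.println("measured: " + (receivedAtOnB - sentAtOnA)
                + " ms, actual transit: " + transit + " ms");
    }
}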
Campbell Ritchie
Sheriff

Joined: Oct 13, 2005
Posts: 39828
    
Originally posted by Paul Clapham:
difference between the system clocks.
That will be the main source of error. System clocks can be out by dozens of minutes.
Angela Truce
Ranch Hand

Joined: Nov 30, 2005
Posts: 47
Hi,
Thanks for the replies, everyone.

How can the system clocks be out by dozens of minutes? That means it is dependent on the machine or something like that, so System.currentTimeMillis is not independent any more???
Jim Yingst
Wanderer
Sheriff

Joined: Jan 30, 2000
Posts: 18671
It's dependent on the machine, and the administrator. I can set the system clock on my computers to anything I want. Most people find it useful to set the system clock to something close to the actual time. But they're usually content with a time that's within a few minutes of the true correct time. Some people have their system clocks off by months or years. There's probably no good reason for that, but it happens nonetheless.


"I'm not back." - Bill Harding, Twister
Jeroen T Wenting
Ranch Hand

Joined: Apr 21, 2006
Posts: 1847
Originally posted by Angela Truce:
Hi,
Thanks for the replies, everyone.

How can the system clocks be out by dozens of minutes? That means it is dependent on the machine or something like that, so System.currentTimeMillis is not independent any more???


Of course. If someone doesn't set his computer clock correctly, you can't expect software running on that computer to get the correct time by asking the computer for the time...
I've seen computers where the system time was off by months.

While there are some free time servers available, you don't want to make a network connection every time you want to get the time. Far too expensive.


42
Ilja Preuss
author
Sheriff

Joined: Jul 11, 2001
Posts: 14112
I guess you could use a public time server for that purpose.


The soul is dyed the color of its thoughts. Think only on those things that are in line with your principles and can bear the light of day. The content of your character is your choice. Day by day, what you do is who you become. Your integrity is your destiny - it is the light that guides your way. - Heraclitus
Jim Yingst
Wanderer
Sheriff

Joined: Jan 30, 2000
Posts: 18671
[Jeroen]: While there are some free time servers available, you don't want to make a network connection every time you want to get the time. Far too expensive.

True - or, could well be true, depending on how often you need to do this. But one reasonable alternative is to check the time once from a public timeserver when the program starts up (or when a particular class is loaded), and measure the difference between system time and "true" time. Then assume this difference remains constant, and use the difference to determine true time from system time whenever you (Angela, that is) need the time after that. You could also recheck the time every n minutes, just in case. That's probably not necessary unless the program runs a very long time, or unless the user has some motivation to change the system time as a way to hack your software. But it shouldn't be too difficult to implement a recurring check, and impose some penalty on the user if you ever detect that the system time has been substantially changed while your program is running.
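Something like this minimal sketch would do the one-time calibration. It uses the Apache Commons Net NTP client, which is my own library choice for the example, not something Angela's setup necessarily has:

import java.net.InetAddress;
import org.apache.commons.net.ntp.NTPUDPClient;
import org.apache.commons.net.ntp.TimeInfo;

public class CorrectedClock {
    private static long offsetMillis = 0;  // "true" time minus system time

    // Query the time server once (e.g. at startup) and remember the offset.
    public static void calibrate(String ntpHost) throws Exception {
        NTPUDPClient client = new NTPUDPClient();
        client.setDefaultTimeout(5000);  // don't hang forever on the network
        client.open();
        try {
            TimeInfo info = client.getTime(InetAddress.getByName(ntpHost));
            info.computeDetails();       // computes offset and round-trip delay
            if (info.getOffset() != null) {
                offsetMillis = info.getOffset();
            }
        } finally {
            client.close();
        }
    }

    // Corrected time: system clock plus the remembered offset.
    public static long now() {
        return System.currentTimeMillis() + offsetMillis;
    }

    public static void main(String[] args) throws Exception {
        calibrate("pool.ntp.org");
        System.out.println("corrected time: " + now());
    }
}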
Stan James
(instanceof Sidekick)
Ranch Hand

Joined: Jan 29, 2003
Posts: 8791
We keep our servers and our user desktops as close as we can by synchronizing on a time server on our own network. But they drift by rather scary amounts if we don't reboot or resync fairly often. We get close enough to manually correlate records from multiple systems but not close enough to measure response times or even user case-handling time.


A good question is never answered. It is not a bolt to be tightened into place but a seed to be planted and to bear more seed toward the hope of greening the landscape of the idea. John Ciardi
Paul Clapham
Bartender

Joined: Oct 14, 2005
Posts: 18901
    

I have my home machine set to synchronize with some network time server once a day. So normally it has something close to the right time, except that if I have McAfee do a full disk scan, it starts running slow by a few minutes.

The usual way to find the time it takes to transmit something between machine A and machine B is to have machine A send the data to B, and then have B echo it back to A immediately. Then all the times you need to compare were generated by machine A, and it doesn't matter whether A's time is "correct" or not.
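A bare-bones sketch of that echo technique (the host name and port are made up for the example):

// EchoServer.java - runs on machine B; echoes each line straight back.
import java.io.*;
import java.net.*;

public class EchoServer {
    public static void main(String[] args) throws IOException {
        ServerSocket server = new ServerSocket(9999);
        while (true) {
            Socket s = server.accept();
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(s.getInputStream()));
            PrintWriter out = new PrintWriter(s.getOutputStream(), true);
            out.println(in.readLine());  // echo back immediately
            s.close();
        }
    }
}

// EchoClient.java - runs on machine A; both timestamps come from A's clock,
// so it doesn't matter what B's clock says.
import java.io.*;
import java.net.*;

public class EchoClient {
    public static void main(String[] args) throws IOException {
        Socket s = new Socket("machine-b.example.com", 9999);
        PrintWriter out = new PrintWriter(s.getOutputStream(), true);
        BufferedReader in = new BufferedReader(
                new InputStreamReader(s.getInputStream()));

        long start = System.currentTimeMillis();
        out.println("ping");
        in.readLine();  // block until the echo comes back
        long roundTrip = System.currentTimeMillis() - start;

        System.out.println("round trip: " + roundTrip
                + " ms, one way roughly " + (roundTrip / 2) + " ms");
        s.close();
    }
}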
Campbell Ritchie
Sheriff

Joined: Oct 13, 2005
Posts: 39828
    
Posted by Angela Truce
How can the system clocks be out by dozens of minutes?
I have one computer whose displayed time changes by an hour if I reboot from Linux to Windows or vice versa.
Rusty Shackleford
Ranch Hand

Joined: Jan 03, 2006
Posts: 490
The value returned is also OS-dependent. On Windows XP/2000 the resolution is somewhere in the neighborhood of 15 ms, on Windows 98 it is about 55 ms, and on Linux and OS X it is under 5 ms.

It might not be of much concern, depending on how accurate you need to be, but it is worth keeping in mind.
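You can observe the granularity on your own machine with a quick spin loop (just an illustrative sketch): call System.currentTimeMillis() repeatedly until the returned value changes, and print the size of each step.

public class ClockGranularity {
    public static void main(String[] args) {
        long previous = System.currentTimeMillis();
        for (int i = 0; i < 10; i++) {
            long now = System.currentTimeMillis();
            while (now == previous) {
                now = System.currentTimeMillis();  // busy-wait for the next tick
            }
            System.out.println("clock advanced by " + (now - previous) + " ms");
            previous = now;
        }
    }
}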
[ September 01, 2006: Message edited by: Rusty Shackleford ]

"Computer science is no more about computers than astronomy is about telescopes" - Edsger Dijkstra
Rusty Shackleford
Ranch Hand

Joined: Jan 03, 2006
Posts: 490
Originally posted by Campbell Ritchie:
Posted by Angela Truce I have one computer whose displayed time changes by an hour if I reboot from Linux to Windows or vice versa.


Is the correct time zone set in each OS?
Jim Yingst
Wanderer
Sheriff

Joined: Jan 30, 2000
Posts: 18671
Yeah, that sounds like a time zone thing. E.g. one computer is using daylight saving time and the other isn't, or something like that. That sort of thing shouldn't affect Angela if she uses System.currentTimeMillis(), but it affects displayed times quite often in my experience, as systems are routinely misconfigured for time zones and DST correction.
Campbell Ritchie
Sheriff

Joined: Oct 13, 2005
Posts: 39828
    
Originally posted by Campbell Ritchie:
Posted by Angela Truce


It was me, not Angela.
[ September 01, 2006: Message edited by: Campbell Ritchie ]
Jeroen T Wenting
Ranch Hand

Joined: Apr 21, 2006
Posts: 1847
Originally posted by Jim Yingst:
[Jeroen]: While there are some free time servers available, you don't want to make a network connection every time you want to get the time. Far too expensive.

True - or, could well be true, depending on how often you need to do this. But one reasonable alternative is to check the time once from a public timeserver when the program starts up (or when a particular class is loaded), and measure the difference between system time and "true" time. Then assume this difference remains constant, and use the...


That would work, assuming the system clocks of all clients running the software run at the same, predictable speed.
If they don't, you can't reliably correct the intervals you get for any error in the timing you're calculating, and you still won't have a "universal" time.
Jim Yingst
Wanderer
Sheriff

Joined: Jan 30, 2000
Posts: 18671
Yes, I suppose it depends how much accuracy is really required. For many applications an error of a few seconds really isn't that big a deal, while for some it could be critical. The more accuracy is needed, the more frequently you can check the time server. And you can measure how much drift has occurred between the time server and the system clock - if it exceeds a certain threshold, issue a warning or throw an error, so excessively imprecise clocks or tampering can be promptly detected. Doing this all very precisely could be a somewhat complicated project, but for most applications that's probably overkill, I think.
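Here's a rough sketch of such a recurring check. Instead of calling the time server every time, this local variant compares the wall clock against System.nanoTime(), which is monotonic and unaffected by clock adjustments, so a jump between the two means somebody changed the system time. The threshold and period are arbitrary choices of mine:

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ClockTamperCheck {
    private static final long THRESHOLD_MS = 5000;  // arbitrary tolerance

    public static void main(String[] args) {
        final long wallStart = System.currentTimeMillis();
        final long monoStart = System.nanoTime();

        ScheduledExecutorService scheduler =
                Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(new Runnable() {
            public void run() {
                long wallElapsed = System.currentTimeMillis() - wallStart;
                long monoElapsed = (System.nanoTime() - monoStart) / 1000000;
                long drift = wallElapsed - monoElapsed;  // wall clock vs monotonic clock
                if (Math.abs(drift) > THRESHOLD_MS) {
                    System.err.println("Warning: system clock moved by about "
                            + drift + " ms since startup");
                }
            }
        }, 1, 1, TimeUnit.MINUTES);
    }
}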
Tony Morris
Ranch Hand

Joined: Sep 24, 2003
Posts: 1608
From my previous observations, machines that are out of sync by any more than the network latency between the two can corrupt software applications that are attempting to synchronise. Examples are /usr/bin/sudo and many revision control systems.


Tony Morris
Java Q&A (FAQ, Trivia)
Jeroen T Wenting
Ranch Hand

Joined: Apr 21, 2006
Posts: 1847
Sure, you can check the timeserver more frequently as your accuracy needs go up.
But consider that applications requiring such accuracy almost always also require high performance.
And consider that calling a timeserver introduces its own inaccuracy, which may well be higher than just taking your system time and putting a procedure in place for sysadmins requiring that they keep the system time accurate.
After all, if two machines request the time from an NTP server at the same instant, they won't get the same answer back at the same instant.
Their requests will reach the NTP server with different delays, take different lengths of time to be processed by the NTP server, and take different lengths of time for the response to travel back over the network.
If network latency is high or the time server is very busy (which they would be if every piece of software constantly queried them) this could lead to inaccuracies higher than merely relying on a system clock that is synched once a day or so.
Jim Yingst
Wanderer
Sheriff

Joined: Jan 30, 2000
Posts: 18671
Yeah, on reviewing the first post of this thread, for the particular application Angela is doing it seems likely that the uncertainty due to latency will be about the same order of magnitude as the timespan being measured. So it's going to be pretty difficult getting a meaningful measurement using this technique. Oh well.
Anand Hariharan
Rancher

Joined: Aug 22, 2006
Posts: 257

Originally posted by Angela Truce:

Does anyone know how I can time-stamp things? I just want to be able to record the time (preferably in seconds) it takes to send something, i.e. the travelling time.


Search around for Lamport clocks. They sequence events not chronologically but based on a logical ordering of events. This way, you can put off fighting the issues that arise from time-stamping to another day!
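The core of a Lamport clock fits in a few lines. Here's a minimal sketch of the idea, with names of my own choosing: every process increments a counter on each local event, stamps outgoing messages with it, and on receipt takes the maximum of its own counter and the received stamp, plus one.

public class LamportClock {
    private long counter = 0;

    // Any local event, including just before sending a message.
    public synchronized long tick() {
        return ++counter;
    }

    // On receiving a message stamped with the sender's clock.
    public synchronized long receive(long senderTimestamp) {
        counter = Math.max(counter, senderTimestamp) + 1;
        return counter;
    }

    public static void main(String[] args) {
        LamportClock a = new LamportClock();
        LamportClock b = new LamportClock();
        long stamp = a.tick();          // A sends a message stamped 1
        long bTime = b.receive(stamp);  // B receives it: max(0, 1) + 1 = 2
        System.out.println("A=" + stamp + ", B=" + bTime);
    }
}

The counters give a "happened-before" ordering without any synchronized wall clocks at all.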


"Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away." -- Antoine de Saint-Exupery
 