I'm having problems with syncing time with millisecond accuracy.
Here's what I did:
I started a game on my first computer with the time read out displaying.
I started another game on my other computer, which is sitting right next to it, with its time also displaying.
I took a picture of both screens with my digital camera to see if they were close to the same time. They were very close.
I waited about two minutes and took a picture again. This time they had drifted apart by about three tenths of a second.
Something tells me I need a more accurate delta than Time.deltaTime.
Anyone have any suggestions?
asked Feb 08 '12 at 11:10 PM
You have misunderstood what Time.deltaTime is. It represents the time elapsed since the last frame. It depends on countless factors (software, hardware, other programs running, etc.) and is in no way a mechanism for keeping Unity instances in sync.
You want to use Time.time which represents time elapsed since the start of the application.
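As a quick sketch of the difference (assuming a standard Unity MonoBehaviour; the class name here is just illustrative):

```csharp
using UnityEngine;

// Illustrative MonoBehaviour: Time.deltaTime jitters with frame rate,
// while Time.time accumulates steadily from application start.
public class TimeReadout : MonoBehaviour
{
    void Update()
    {
        // deltaTime: seconds since the previous frame (varies every frame)
        // time: seconds since the application started
        Debug.Log("deltaTime=" + Time.deltaTime.ToString("F4") +
                  "  time=" + Time.time.ToString("F3"));
    }
}
```

Note that Time.time still counts from when *that* application started, so the two machines would each be measuring from their own launch moment.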
What Berenger proposes is part of the base .NET/Mono framework on which Unity runs. You can find more info about it on MSDN. Note that it can also be done in JS, with the correct syntax.
Just thinking out loud: I can't think of a way to start a process on both computers with millisecond accuracy without 'manually' syncing them once they're running. Either you're pressing buttons or sending calls, both of which will take a slightly different amount of time no matter what, won't they?
EDIT: oh yeah, you could have something monitor time closely and launch the process for you, duh.
Try System.DateTime.UtcNow, TimeSpan.TotalSeconds, and those kinds of things.
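A minimal sketch of both ideas, waiting for an agreed start instant and then measuring elapsed time off the system clock instead of frame deltas (the shared start time and all names here are hypothetical; the two machines would still need to exchange that instant somehow):

```csharp
using System;
using System.Threading;

class UtcTimingSketch
{
    static void Main()
    {
        // Hypothetical: both machines agree on a future UTC instant and
        // spin-wait until their local clocks reach it before starting.
        DateTime startTime = DateTime.UtcNow.AddMilliseconds(200);
        while (DateTime.UtcNow < startTime)
            Thread.Sleep(1); // accuracy limited by OS timer resolution

        // Measure elapsed wall-clock time with the system clock rather
        // than by accumulating Time.deltaTime each frame.
        DateTime begin = DateTime.UtcNow;
        Thread.Sleep(50); // stand-in for a couple of frames of game work
        double elapsed = (DateTime.UtcNow - begin).TotalSeconds;
        Console.WriteLine("Elapsed: " + elapsed.ToString("F3") + " s");
    }
}
```

Keep in mind this only gets you as close as the two system clocks are to each other; aligning the clocks themselves is a separate problem (NTP or a handshake over the network).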
answered Feb 08 '12 at 11:28 PM
I think Time.time is my best bet. I still need to work on some other problems with my synchronization before I start worrying about this again.
Thanks anyway Berenger.
answered Feb 09 '12 at 04:04 AM