High Precision Timer in SDL?
datahead8888
I've been using SDL_GetTicks() in my program to start work on a simple physics implementation.
To my understanding, this only has millisecond accuracy. Is there anything in SDL similar to the Win32 function QueryPerformanceCounter, or are there plans to implement one? This looks like it might be a cross-platform solution, but it looks kind of nasty: http://www.devmaster.net/forums/showthread.php?t=1407 Here, one of the posters makes reference to a call to QueryPerformanceCounter within the SDL source code: http://www.gamedev.net/community/forums/topic.asp?topic_id=471804 Are there any other reasonable options for cross-platform high-precision timers?
High Precision Timer in SDL?
Donny Viszneki
Guest
Why would you need greater accuracy than 1/1000 second for a video game?
On Mon, Dec 14, 2009 at 11:47 PM, datahead8888 wrote:
--
http://codebad.com/
_______________________________________________
SDL mailing list
http://lists.libsdl.org/listinfo.cgi/sdl-libsdl.org
High Precision Timer in SDL?
Bernd Roesch
Guest
SDL uses QueryPerformanceCounter on Windows, but currently SDL has a precision of 10 ms because of the SDL header files. You can change these values in your include/SDL_timer.h file. These are the defaults:

/* This is the OS scheduler timeslice, in milliseconds */
#define SDL_TIMESLICE 10

/* This is the maximum resolution of the SDL timer on all platforms */
#define TIMER_RESOLUTION 10 /* Experimentally determined */

And here is the code of SDL_GetTicks for Win32. It seems easy to add an SDL command that returns better accuracy:

Uint32 SDL_GetTicks(void)
{
    DWORD now, ticks;
#ifndef USE_GETTICKCOUNT
    LARGE_INTEGER hires_now;
#endif

#ifdef USE_GETTICKCOUNT
    now = GetTickCount();
#else
    if (hires_timer_available) {
        QueryPerformanceCounter(&hires_now);
        hires_now.QuadPart -= hires_start_ticks.QuadPart;
        hires_now.QuadPart *= 1000;
        hires_now.QuadPart /= hires_ticks_per_second.QuadPart;
        return (DWORD)hires_now.QuadPart;
    } else {
        now = timeGetTime();
    }
#endif

    if (now < start) {
        ticks = (TIME_WRAP_VALUE - start) + now;
    } else {
        ticks = (now - start);
    }
    return (ticks);
}
High Precision Timer in SDL?
Bob
On Tue, Dec 15, 2009 at 12:40 AM, Donny Viszneki wrote:

Performance analysis. Not to mention that with 240 Hz TVs I can see the possibility of needing a timer that can break a frame time into more than 4 parts (at 240 Hz a frame lasts only about 4.2 ms, so a 1 ms timer divides it into just four slices).

Bob Pendleton
--
+-----------------------------------------------------------
+ Bob Pendleton: writer and programmer
+ email:
+ web: www.TheGrumpyProgrammer.com
High Precision Timer in SDL?
Michael Vance (SDL)
Guest
The current SDL timer/ticks resolution makes it almost useless to me--I'm surprised anyone is actually able to do anything with it! How are you generating accurate time as an input to your simulation? You need at least microsecond-level resolution to work with frame syncing properly, do any useful intra-frame timing (like profiling), etc. Has anyone undertaken a survey of supported platforms to find out what the most common timer resolutions are? As it is, I'm using mach_absolute_time et al., since I'm writing exclusively for OS X/iPhone, and that gives me nanosecond resolution.

(Speaking of which, it looks to me like the semantics of the values returned by SDL_GetTicks changed between 1.2 and 1.3 for OS X? When I switched versions recently my timing code became highly skewed; the old implementation used the FastMiliseconds stuff while the new one just uses gettimeofday.)

m.

On Dec 15, 2009, at 1:54 PM, Bob Pendleton wrote:
High Precision Timer in SDL?
Tim Angus
Guest
On Mon, 14 Dec 2009 20:47:33 -0800 datahead8888 wrote:
Sort of related point: it would be nice to have an API function like int SDL_GetTimerResolution( void ). My understanding is that 10 ms resolution is about the best you can hope for on most platforms, but this isn't actually reflected anywhere obvious as far as I can tell. If you have the ability to query the resolution in the client, you have the potential to compensate. Furthermore, on platforms where a high-precision timer can never be guaranteed, there would be some means to indicate this to the client.