QueryPerformanceCounter *less* accurate?


Gabriel(Posted 2007) [#1]
In order to get high-res timing, I'm using the timing method Stuart C demonstrates in this thread:

http://www.blitzbasic.com/Community/posts.php?topic=46443#516927

But I'm finding it's about half as accurate as plain milliseconds: instead of the timer updating every 1/1000th of a second, I get the same time returned over and over until 2 milliseconds have passed.

It's pretty reliably bang on 2 milliseconds every time. Is there something inherently wrong with the way he uses QueryPerformanceCounter in there?
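
For reference, this is roughly how I'm checking the step size: read the counter in a loop and convert it with the frequency pre-divided by 1000, the way that thread does. This is only a sketch of the approach (not Stuart's actual code), and the Win32 extern declarations are my assumption of how they're set up:

Extern "Win32"
	Function QueryPerformanceCounter:Int(count:Long Var)
	Function QueryPerformanceFrequency:Int(frequency:Long Var)
End Extern

Local frequency:Long
QueryPerformanceFrequency(frequency)
frequency = frequency / 1000	' pre-scaled so counter/frequency comes out in milliseconds

For Local n:Int = 0 To 20
	Local count:Long
	QueryPerformanceCounter(count)
	Print Double(count) / Double(frequency)	' the same value repeats until ~2ms have passed
Next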


Grey Alien(Posted 2007) [#2]
I was just reading about this the other day. Perhaps it's due to having a dual core processor?


REDi(Posted 2007) [#3]
Haven't had any problems with the performance timer here, but then I don't have dual core.

Anyway, here's my timer code if that helps (with example)...
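
It's basically just a thin wrapper around QueryPerformanceCounter / QueryPerformanceFrequency that returns seconds as a Double. A sketch of that kind of type (not the exact code, and the Win32 externs are one way of declaring them):

Extern "Win32"
	Function QueryPerformanceCounter:Int(count:Long Var)
	Function QueryPerformanceFrequency:Int(frequency:Long Var)
End Extern

Type TPerformanceTime
	Field frequency:Long	' ticks per second, queried once
	Field startTicks:Long	' counter value when the timer was created

	Method New()
		QueryPerformanceFrequency(frequency)
		QueryPerformanceCounter(startTicks)
	End Method

	Method Time:Double()
		' seconds since the timer was created, kept as Long and Double throughout
		Local now:Long
		QueryPerformanceCounter(now)
		Return Double(now - startTicks) / Double(frequency)
	End Method
End Type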


*EDIT* tried it with this code...
Local Timer:TPerformanceTime = New TPerformanceTime
For Local n=0 To 100
	Print Timer.Time()+"   - "+MilliSecs()
Next

and got a different result on each line...

So it seems to work OK here.


Gabriel(Posted 2007) [#4]
Thanks, REDi, I get a different result on each line too, so it must be something inherent in Stuart's code and not in yours. I'll have a fiddle and see if I can spot the differences. The one thing that stands out immediately is that he divides the frequency by 1000 to return values in milliseconds instead of seconds. Perhaps it would be wiser to do that conversion when returning the time instead. I'll try that first.


REDi(Posted 2007) [#5]
My code returns the time in seconds; to change it to milliseconds you'd just multiply it by 1000, but what's the point? ;)

I can't see any reason for dividing the frequency by 1000, as the frequency is the number of ticks per second. Could be wrong though :)


Gabriel(Posted 2007) [#6]
I can't see any reason for dividing the frequency by 1000, as the frequency is the number of ticks per second. Could be wrong though :)

I'm guessing his thinking was that he could avoid multiplying by 1000 to convert to milliseconds if he divided the frequency by 1000 up front. When you then divide the counter by that pre-divided frequency, you're implicitly multiplying by 1000, because the frequency is already 1/1000th of what it would otherwise be.
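
In other words (with made-up numbers, just to illustrate the equivalence):

Local freq:Long = 1000000			' say the reported frequency is 1,000,000 ticks per second
Local ticks:Long = 1234567			' some counter reading
Print Double(ticks) / Double(freq) * 1000	' seconds * 1000        -> 1234.567 ms
Print Double(ticks) / Double(freq / 1000)	' pre-divided frequency -> 1234.567 ms, same thing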

At first I thought this might have been the cause of the problem, but it's not. Your code too returns the same time frame after frame if I plug it into my game. So evidently there is something in my game that the timers just don't like. Bugger.


REDi(Posted 2007) [#7]
:( Good luck mate!


Dreamora(Posted 2007) [#8]
If your system has SpeedStep (or a newer-generation AMD64), then most likely a loop that pushes CPU usage from 30% to 80% is the reason, because the frequency stepping changes so drastically that the value above becomes totally useless.

Do a high-speed loop at the start (1,000,000 steps, measuring how long they take) and use that to calibrate your steps, but don't rely on anything more precise than milliseconds in real time on these "intelligent" CPUs... that just won't work without strange and (for us notebook users) annoying glitches. It's no fun being told "please disable SpeedStep to run this" as some games do ^^
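
Something like this at program start, if that makes sense (a rough sketch of what I mean, nothing more):

Local start:Int = MilliSecs()
For Local n:Int = 1 To 1000000
	' the per-step work you want to calibrate would go here
Next
Local elapsed:Int = MilliSecs() - start
Print "1000000 steps took " + elapsed + " ms"	' use this as a rough calibration figure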


Gabriel(Posted 2007) [#9]
I don't have a system with SpeedStep, so that's not it. And a millisecond-resolution timer isn't really sufficient for very smooth game timing.


Gabriel(Posted 2007) [#10]
Hmm... this is strange. The time between updates on the timer is always 0.0039062500000000000 (which I think is 1/256). Just thought I'd post that in case it means anything to anyone. Such a precise value seems like it might be a clue to the issue, since it doesn't happen if I just run REDi's example above.

EDIT: And MilliSecs() is returning values with the same step. What the heck is causing this? ;/


Gabriel(Posted 2007) [#11]
OK, here's my debug log, which shows that the division is the problem. The value returned by QueryPerformanceCounter() is changing, but the result of the division stays the same frame after frame.

DebugLog:155440113948855/3000460000=51805.429687500000
DebugLog:155440114599592/3000460000=51805.429687500000
DebugLog:155440115241945/3000460000=51805.429687500000
DebugLog:155440115880840/3000460000=51805.429687500000
DebugLog:155440116515760/3000460000=51805.429687500000
DebugLog:155440117174807/3000460000=51805.429687500000
DebugLog:155440117826407/3000460000=51805.429687500000
DebugLog:155440118557050/3000460000=51805.429687500000
DebugLog:155440119350047/3000460000=51805.429687500000
DebugLog:155440120107652/3000460000=51805.429687500000
DebugLog:155440120871160/3000460000=51805.429687500000
DebugLog:155440121631645/3000460000=51805.429687500000
DebugLog:155440122398310/3000460000=51805.429687500000
DebugLog:155440123184812/3000460000=51805.429687500000
DebugLog:155440123942830/3000460000=51805.429687500000


I'm none too clear why BMax is returning inaccurate results here, nor how to fix it. It doesn't make any sense. I'm using Return Double(Time)/Frequency, just as REDi does.


Floyd(Posted 2007) [#12]
Those values have been converted to single precision and back to double, losing many digits of accuracy.

You probably have an incorrect return type for a function or method, or perhaps temporarily stored the value in a Float variable.
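
You can see the effect directly: around 51805, a Float can only resolve steps of 1/256 = 0.00390625, which is exactly the step you noticed above. A quick sketch:

Local d:Double = 51805.4278			' roughly what the division should return
Local f:Float = Float(d)			' single precision: only a 24-bit mantissa
Print "as Double:  " + d			' prints ~51805.4278
Print "via Float:  " + Double(f)		' prints 51805.4296875... - snapped to the nearest 1/256
Print "Float step: " + (1.0 / 256)		' 0.00390625, the spacing of Floats between 2^15 and 2^16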


Gabriel(Posted 2007) [#13]
OK, thanks. I've been staring at the code for the best part of a day and I can't see where I'm implicitly converting to a Float. There definitely aren't any variables declared as Floats, so it must be an implicit conversion I'm missing. I think the best solution at this point is to just rip out all the timing code and start over. If nothing else, it'll be a lot more readable if I do have to post it.
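
If I do start over, the safest shape is probably to keep everything Long or Double end to end, roughly like this (just a sketch with placeholder names, and SuperStrict so any stray Float or untyped variable gets caught):

SuperStrict

Extern "Win32"
	Function QueryPerformanceCounter:Int(count:Long Var)
	Function QueryPerformanceFrequency:Int(frequency:Long Var)
End Extern

Type THiResTimer
	Field frequency:Long	' ticks per second
	Field startTicks:Long	' counter value at creation

	Method New()
		QueryPerformanceFrequency(frequency)
		QueryPerformanceCounter(startTicks)
	End Method

	Method Millis:Double()
		' milliseconds since creation - Long and Double all the way, no Float anywhere
		Local now:Long
		QueryPerformanceCounter(now)
		Return Double(now - startTicks) * 1000.0 / Double(frequency)
	End Method
End Type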


Dreamora(Posted 2007) [#14]
Any chance of seeing the code for the above division?
Just to make sure there isn't some kind of type problem (undeclared variables only have 32-bit accuracy, and you'd most likely need to cast everything to Double before the division; BlitzMax has no Long maths if I'm not totally wrong, just Int, Float and Double).

If you have a full test routine or something that saves out its data, I could run a test for you on a Core Duo T2500 with dynamic stepping, to show how accurate the data are on current CPUs and what kind of impact it has. (From experience: more precision than milliseconds can be nice, but it can also give worse results. That's my experience from Torque, whose collision and physics can miss events on dynamic-stepping systems if the CPU frequency dynamically changes by a factor of 2 or 3, which is normal on P-M / Core Duo T