General Game Questions


degac(Posted 2006) [#1]
I have some questions about games written in BlitzMax (though I think they apply in general).
I've almost finished my first game (a puzzle game) written in BMax. Tested on an AthlonXP 1600+, an Athlon64 3500+ and an Athlon64 3800+ X2 (all WinXP), it plays well (graphics and game stay in sync).
Yesterday I tested the game on an old and crappy notebook: Celeron 800 + 128MB + Trident CyberBlade + WinME.
The game graphics are OK (the speed is nearly the same as on the new machines, or at least acceptable for that hardware), but the general game speed is VERY different.
In my game every object has its own timer, so something happens every N milliseconds.
I noticed that the only difference is in the movement of the objects/sprites: I've used Float for the .Xpixel and .Ypixel positions...

My questions are:
1. Can the hardware behave differently (is Float slower on a Celeron than on an AthlonXP)?
2. Are timers/MilliSecs() ALWAYS correct on different machines?
3. What is the best way to determine the speed of a computer/hardware system, so I can use that information in the game to 'fix' the speed?
4. Is it possible to retrieve graphics info about the GPU used (e.g. T&L support, AGP port and other such details)?
5. Finally, what do you consider the MINIMUM hardware requirements to play a game written in BMax?

(For question 3: I'm using a For...Next loop from 1 to 100000 and measuring the time difference (CPU only)... but other hints are welcome!)
(These questions are Windows-oriented, but I have an iBook @ 300MHz and I'm testing the game on it; it's faster, many times faster, than the Celeron...)
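
For reference, the test I mean is roughly this (only a sketch; the loop count and the bit of work inside it are arbitrary):

SuperStrict

' Crude CPU probe: time a simple loop and print the elapsed milliseconds
Local start:Int = MilliSecs()
Local dummy:Int = 0
For Local i:Int = 1 To 100000
    dummy :+ 1    ' a little work so the loop isn't completely empty
Next
Print "Loop took " + (MilliSecs() - start) + " ms (dummy=" + dummy + ")"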

Thanks in advance


kronholm(Posted 2006) [#2]
As for your first three questions and especially the third, do a search for delta-time on these forums :)
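
In very rough terms, delta timing looks something like this (untested sketch; the variable names and the speed value are just placeholders):

SuperStrict

Graphics 640, 480

Local x:Float = 0
Local speed:Float = 120              ' pixels per second, whatever the frame rate
Local lastTime:Int = MilliSecs()

While Not KeyHit(KEY_ESCAPE) And Not AppTerminate()
    Local now:Int = MilliSecs()
    Local delta:Float = (now - lastTime) / 1000.0    ' seconds since last frame
    lastTime = now

    x :+ speed * delta               ' scale all movement by the elapsed time

    Cls
    DrawRect x, 200, 32, 32
    Flip
Wend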


tonyg(Posted 2006) [#3]
Q4. Not from native BMax, but you can use the DX and OGL interfaces to get that sort of information.
Getcaps
Q5. It's hard to answer without saying 'it depends'. You might have quite a high-spec machine, but if you use large images you might break a 256*256 texture-size limit on that card. It will also depend on whether you're using the OGL or DX drivers. Basically, it will depend on your program.
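
On the OGL side, something along these lines should at least give you the vendor/renderer strings once a context is up (rough sketch; I'm assuming glGetString and the GL_* constants come in via the standard Pub.OpenGL module when you build without a Framework line):

SuperStrict

SetGraphicsDriver GLMax2DDriver()
Graphics 640, 480                    ' a GL context must exist before querying

' glGetString returns a C string (Byte Ptr), so convert it to a BlitzMax String
Print "Vendor:   " + String.FromCString(glGetString(GL_VENDOR))
Print "Renderer: " + String.FromCString(glGetString(GL_RENDERER))
Print "Version:  " + String.FromCString(glGetString(GL_VERSION))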


Dreamora(Posted 2006) [#4]
5) One thing is a definite must-have: a 3D graphics card. BMax does not support software fallback on DX (and the OpenGL software fallback is pure horror on Windows).


H&K(Posted 2006) [#5]
5) I disagree; my laptop doesn't have a 3D card, and I program on it. (I agree that the 2D module is slower than comparable 2D modules because it's using 3D.) But you can program for it. MaxGUI stuff works.

I would say 16-bit colours minimum, though.

3) You want to do some float calculations in the loop, I think, especially as you think float speed differs between machines.
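
Something like this, maybe (rough sketch only; the loop count and the maths inside are arbitrary):

SuperStrict

' Same idea as the For...Next test above, but with some float work inside the loop
Local t:Int = MilliSecs()
Local f:Float = 0.0
For Local i:Int = 1 To 1000000
    f = f * 0.9999 + 0.5             ' bounded float maths, so the value stays sensible
Next
Print "Float loop: " + (MilliSecs() - t) + " ms (f=" + f + ")"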


ImaginaryHuman(Posted 2006) [#6]
GL can't tell you anything more than what bit depths each of the buffers or color components has; it doesn't report anything about the underlying hardware, whether it's accelerated, which features are in hardware, etc.


Grey Alien(Posted 2006) [#7]
For the speed test you should actually try drawing graphics (without flipping) inside the loop, to see how fast it is in a real-life situation, IMHO.
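
i.e. something roughly like this (only a sketch; the counts are arbitrary and the numbers will depend heavily on the driver):

SuperStrict

Graphics 640, 480

' Time a batch of draw calls without flipping, to get a feel for raw drawing speed
Local t:Int = MilliSecs()
For Local i:Int = 1 To 1000
    DrawRect Rnd(600), Rnd(440), 32, 32
Next
Print "1000 DrawRects: " + (MilliSecs() - t) + " ms"
Flip                                 ' one flip at the end just so the frame shows up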


tonyg(Posted 2006) [#8]
GL can't tell you anything more

Blimey, moral of the story is never to assume then.


FlameDuck(Posted 2006) [#9]
1. Can the hardware behave differently (is Float slower on a Celeron than on an AthlonXP)?
Yes. Floats are probably several orders of magnitude faster on the AthlonXP.

2. Are timers/MilliSecs() ALWAYS correct on different machines?
Define 'correct'? You should note that there seems to be an arbitrary limit on how many timers you can have running at once (on Windows?).

5. Finally, what do you consider the MINIMUM hardware requirements to play a game written in BMax?
It depends entirely on the game in question. Anything worse than a first-generation GeForce card is probably pushing it.


degac(Posted 2006) [#10]
Many thanks to all for the answers!

I'm running some tests to understand the 'speed' of different CPUs so I can 'calibrate' the game speed better.
The only certain result is that float calculations on the Celeron are very slow.
As for the graphics, I just discovered that on some *very old ATI cards* everything goes to hell... I'm looking for more information about those cards.

@Flameduck

2. Are timers/MilliSecs() ALWAYS correct on different machines?
Define 'correct'? You should note that there seems to be an arbitrary limit on how many timers you can have running at once (on Windows?).



I expressed myself badly. I'm using something like this:
If MilliSecs() > Timer + Duration
    ' ...do something
    Timer = MilliSecs()
EndIf

By 'timers' I mean the 'internal' clock of the CPU/computer, not the 'timers()' of BlitzMax/Windows.
On the Celeron (and on a Pentium II) it 'seems' that MilliSecs() behaves differently...
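
To check that, I'm thinking of something like this to see how coarse MilliSecs() actually is on each machine (just a quick sketch):

SuperStrict

' Watch how far MilliSecs() jumps each time its value changes
Local last:Int = MilliSecs()
For Local n:Int = 1 To 10
    Local t:Int = MilliSecs()
    While t = last                   ' busy-wait until the reported time changes
        t = MilliSecs()
    Wend
    Print "tick step: " + (t - last) + " ms"
    last = t
Next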