Crazy looping idea

Blitz3D Forums/Blitz3D Programming/Crazy looping idea

Picklesworth(Posted 2004) [#1]
Why do games need to run at 60 FPS when a perfectly fine theater showing runs at under 30?

I think a lot of it comes down to the feel of the game, so I thought of an interesting idea: rendering at half of the program's set framerate while thinking at the full framerate.
This way, user interaction could still have a nice smooth feel (when framerates drop, nothing is smooth because of how slowly input is received), and rendering is done less often so the frame rate could get a nice boost.
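For what it's worth, the loop being proposed could be sketched roughly like this. This is plain Python rather than Blitz code, and `update` and `render` are hypothetical stand-ins for the game's input/logic and drawing steps:

```python
# Sketch of the idea: run logic every tick, but only render on
# every other tick, so logic/input stays at full rate while
# rendering runs at half rate.

def run(ticks, update, render):
    """Run 'ticks' logic updates, rendering on every second tick.
    Returns the number of frames actually rendered."""
    rendered = 0
    for tick in range(ticks):
        update(tick)          # input and game logic at full rate
        if tick % 2 == 0:     # render at half the logic rate
            render(tick)
            rendered += 1
    return rendered
```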

I don't know how this would turn out, but what do you think of the idea?


jhocking(Posted 2004) [#2]
I don't know how well this would work for a fast game. For slower games, like strategy games, this sort of thing (i.e. different calculation loops running at different rates) is common.


Ross C(Posted 2004) [#3]
Is it not because your monitor updates at at least 60 Hz? And since you're only running the game at 30 fps, it seems slow. For slow games, as jhocking says, it wouldn't matter. A lot of people say that 30 fps is fine for games. I wouldn't say first person shooters would be much good at 30 fps though.


AntonyWells(Posted 2004) [#4]
Well, generally when people talk of locking a game to 30fps, they're not talking about forcing 30 renders per second, but 30 logic updates per second. You still render as many times as you can; depending on how fast the CPU is, that will be fewer or more than 30 times (up to 60 if your GPU supports it). The anims are tweened, so they stay in sync too. But whatever the case, the logic is updated 30 times per second. (So the gameplay is the same on a P800 as it is on a P3GHz, only the P800 one will be more jerky.)
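The loop described here is the classic fixed-timestep-with-tweening pattern. A minimal sketch of it (plain Python, not Blitz; `update` and the one-dimensional state are toy stand-ins for real game logic):

```python
# Fixed-rate logic with tweened rendering: logic always advances in
# exact 1/30 s steps; rendering happens once per displayed frame and
# interpolates ("tweens") between the last two logic states.

LOGIC_DT = 1.0 / 30.0  # 30 logic updates per second

def simulate(frame_times, update, render):
    """frame_times: duration of each displayed frame, in seconds.
    Returns the total number of logic updates performed."""
    accumulator = 0.0
    updates = 0
    prev_state = state = 0.0              # toy 1-D state (a position)
    for dt in frame_times:
        accumulator += dt
        while accumulator >= LOGIC_DT:    # catch up in fixed steps
            prev_state, state = state, update(state)
            accumulator -= LOGIC_DT
            updates += 1
        alpha = accumulator / LOGIC_DT    # 0..1 tween factor
        render(prev_state + (state - prev_state) * alpha)
    return updates
```

Whatever the display rate, the logic rate stays fixed, which is exactly why the gameplay comes out the same on slow and fast machines.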

So 30fps = less CPU overhead = more frames rendered than at 60fps... hence a lot of people say 30 is best.


Beaker(Posted 2004) [#5]
A film can run at <30 fps because motion blur clues the brain into what is happening in the missing action. Computer games don't do this (much, yet) so you need many more frames. The brain notices anything less than 85 (I think it is).


poopla(Posted 2004) [#6]
All I know is the human eye only update about... 24 times per second.


GfK(Posted 2004) [#7]
All I know is the human eye only update about... 24 times per second.
ROFL!

There's so much potential for tearing you apart from just those 14 words.... I really don't know where to start!


jhocking(Posted 2004) [#8]
Beaker is correct about why film framerate can be so low. Film images include motion blur, games do not. That extra motion indicated by blur must be depicted "for real" in a game. And note that including motion blur in a game wouldn't really help since motion blur in 3D graphics is accomplished by rendering lots of frames and interpolating.
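One way to read that brute-force approach: render several sub-frame samples inside one frame interval and average them. A hedged sketch, with `render_at` as a hypothetical stand-in that here just returns a moving object's position at time t:

```python
# Accumulation-style motion blur: sample the scene at several instants
# inside one frame interval and average the results.

def accumulation_blur(t0, t1, samples, render_at):
    """Average 'samples' renders spaced evenly across [t0, t1)."""
    step = (t1 - t0) / samples
    total = 0.0
    for i in range(samples):
        total += render_at(t0 + i * step)
    return total / samples
```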


Spacechimp(Posted 2004) [#9]
Gah!! No wonder I can't hit the neighbor's dog with my rifle. I need to upgrade.

24 fps just isn't good enough....


Picklesworth(Posted 2004) [#10]
okay, that explains a ton. Thank you :D


jfk EO-11110(Posted 2004) [#11]
TV has not only motion blur but also uses interlaced frames. That means you see a half-frame (every other scanline) at 60Hz (NTSC) or 50Hz (PAL), but basically you only have 30 or 25 frames/s. You could use some kind of fake interlaced frames in Blitz too; this would give you a touch of motion blur. All you need to do is: instead of a simple Flip to show the latest render, copy the backbuffer to some image buffer, where it is safe, then CopyRect every odd line of that image buffer to the frontbuffer. While you are rendering the next frame, draw the even lines after half the frame time has passed. If you swap this every time (one time the odd lines first, the next time the even lines first), you may get a nice interlaced feeling.
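The odd/even field trick can be modelled outside Blitz too. A toy Python sketch, with lists of scanlines standing in for the real backbuffer and CopyRect calls:

```python
# Split each rendered frame into two interlaced fields and present
# them in alternating order, as described above.

def fields(frame, odd_first):
    """Return the two fields of a frame (a list of scanlines),
    as (row_index, row) pairs, in display order."""
    odd  = [(y, row) for y, row in enumerate(frame) if y % 2 == 1]
    even = [(y, row) for y, row in enumerate(frame) if y % 2 == 0]
    return (odd, even) if odd_first else (even, odd)

def present(frames):
    """Yield (frame_index, field) pairs, swapping which field
    comes first on every successive frame."""
    for i, frame in enumerate(frames):
        first, second = fields(frame, odd_first=(i % 2 == 0))
        yield i, first
        yield i, second
```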


Rob(Posted 2004) [#12]
All I know is the human eye only update about... 24 times per second.
painfully wrong.

Shattered, please stop sounding like someone who really doesn't know what he's talking about...


JoshK(Posted 2004) [#13]
You can see pretty fast. Halogen lights give me a headache, because I can see them blinking on and off.


jfk EO-11110(Posted 2004) [#14]
What Shattered was talking about is the recording framerate. I know this statement; it is a scientific insight from the time when they tried to find the optimal framerate for movies. The point is: when you record a movie and the lens is open for 33 ms, then you have the whole motion of those 33 ms on one frame, thus motion blur. Now when you have 30 FPS with full 33 ms motion-blurred recording, you won't see the difference from 60Hz with 16.6 ms motion-blurred recording. Or at least the average human eye won't see the difference.

Of course, this is not the same as crystal-clear rendered pictures without any motion blur. There you can easily see the difference up to framerates beyond 100 fps, although you need a refresh rate on the monitor that is high enough; otherwise you'll always be limited by the physical framerate.


WolRon(Posted 2004) [#15]
A film can run at <30 fps because motion blur clues the brain into what is happening in the missing action. Computer games don't do this (much, yet) so you need many more frames. The brain notices anything less than 85 (I think it is).


Exactly.


MSW(Posted 2004) [#16]
Ack! it chopped it in half...read on below


MSW(Posted 2004) [#17]
*cough* not so fast folks...

You only need around 10FPS to convince your brain that an object onscreen is moving... sometimes even less.

Take a look at animated films before you start convincing yourself that motion blur is what allows film to run at 24FPS... Even Disney flicks typically get by with only 12FPS for most everything (it's called shooting in "twos", as each drawing is held for two consecutive frames... less work for the animators that way).

A large reason why film is shown at 24FPS is cost... 24 frames of a typical 35mm film print is about a foot and a half long, all of it running through the projector in one second... double the projected frame rate and the film stock needs to be twice as long... not to mention that doing so also wears out the projector gate twice as fast.

TV, of course, runs at about 30FPS (or 60 fields per second... actually the interlacing stuff is largely a thing of the past... most modern TVs don't interlace at all).

The human eye doesn't work on any sort of frame rate... neither does your brain... it's true persistence of vision, the little receptor cells in your eyes reacting to the light that strikes them... of course they don't respond instantly, else we wouldn't have the lingering effect of "sunspots" when looking at an intense light... but if you must know how fast our eyes can see, well, it depends on both mood and available light... Using a linked series of variable-speed strobe lights has shown that people can still sense the lights flickering on/off even when the strobe speed is well over 2,000 flashes per second...

However, making a game that runs at 24FPS should be no problem... most of the apparent aversion to such a frame rate has nothing to do with players perceiving motion, but rather with how their brain has learned to process visual data when playing games displayed at higher frame rates...


MSW(Posted 2004) [#18]
:P double post


mrtricks(Posted 2004) [#19]
The reason that film was made at 24FPS was that it was found to be the *minimum* frame rate needed to kick-start persistence of vision. Add to that the motion blur someone else has talked about.

But also! I've noticed in the cinema that if the camera is panning quite fast, you DO notice a flicker. An FPS for example is going to have constantly panning cameras, which is why you notice it much more than in a film.

Having said that, some of the best console games run at about 20fps - look at Goldeneye or Perfect Dark, they slow down incredibly in places, but I still play them now.


Rook Zimbabwe(Posted 2004) [#20]
MSW is right... In the 1960s some Hollywood types did a study on this and eventually invented a process called SHOWSCAN. This was a method of showing film at an incredibly high rate of speed (I think 124FPS); at this rate the brain became more highly involved with the image, and thus the image became more REAL to viewers... Test films of a rollercoaster or a car crash had people getting physically ill in the theater. The process was discontinued.

IMAX images are supposed to be shown at something like 80fps, I think... The images are both bigger and faster than you are used to processing film images, so the experience seems more real... I find the flicker in the films very annoying.

Recently the process has become interesting to new people, as with the advent of the micro-digital age we can show images as fast as we need... I don't know if we can actually create images with a computer at 2000FPS, and I don't know if there is a monitor out there even capable of resolving images that fast... but soon.


WolRon(Posted 2004) [#21]
Here is a post I wrote once before about the subject:

Let me straighten it out a bit for you uninformed. The human eye can be fooled into seeing fluid motion at almost any framerate.
The eye begins to detect fluid motion at about 12 frames/sec.
Movies are (typically) filmed at 24 frames/sec. I don't think anyone can complain that the movies they watch are 'jerky' (except maybe for some action films where they are using a high shutter speed).
It SEEMS (large emphasis on 'seems') that games/consoles/computers need to display images at high framerates to give the impression of fluid motion. This is only partially true.
Faster framerates WILL appear smoother and smoother. And at some point (70+) the eye can no longer distinguish between frames.
A misconception about all this is that the 70+ is NECESSARY. It is not. Film framerates (24/sec.) prove this.

Here's the difference:

When filming movies, the camera shutter is open for a certain amount of time (1/24th of a second) and all motion during that 1/24th second is blurred into one image. So one FRAME actually contains all the events that have happened during that 1/24th of a second.

When a game/console/computer displays a frame (let's go with 24/sec. to keep the comparison even), TIME IS STOPPED for that frame. It shows us what happened at that instant as if we had a camera with a shutter speed of something like 1/1,000,000 of a second. The majority of the information for the remainder of that 1/24th of a second is missing.

What we end up with for the film is 24 frames that contain 100% (1 second) of data.
What we end up with for the game is 24 frames that contain 24 instantaneous screenshots of what happened in that second.

So...
When playing back the film, 100% of the second is replayed and looks smooth and fluid.
When playing back the game, 24 still images are flashed before our eyes and one of two things happens:
1. If the amount of motion between the 24 images is small enough, it will appear fluid.
2. If the amount of motion between the 24 images is too great, it will appear 'jerky'. In fact this is still completely possible even if the framerate is extremely high.

Games/Consoles/Computers are CAPABLE however of displaying fluid motion with low framerates even if the amount of motion is great. In fact they already do. Many, many games come with intro movies that do just that. Regardless of what framerate the movie is playing at, it looks fluid.

The question here is how does the game do it while rendering in-game screenshots?
What it has to accomplish is displaying 100% (or close to it) of the information during every second. It can do this in one of two ways:
1. EXTREMELY HIGH FRAMERATES displaying more and more of all the events that happened during the second. (This is what games are trying to accomplish right now and what many gamers think NEEDS to be done)
2. PROVIDE THE MISSING INFORMATION. It's the more difficult one to do but has better results. There is more than one way to accomplish this. Some graphics cards can now blur action by taking the last two frames and combining them to display a third, blurred image. Another method (and one that I haven't seen implemented yet) is to use an algorithm to display an image that is the result of an object in its initial position at the beginning of the 1/24th sec., its ending position at the end of the 1/24th sec., and all of the pixels in between the two (this may be hard to comprehend or visualize). Kind of like drawing a 'streak' as opposed to a static image.
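The first of those two methods (blending the last two frames) is simple enough to sketch; pixels here are plain greyscale numbers rather than real RGB values:

```python
# Fake motion blur by averaging each frame with its predecessor.

def blur_frames(frames):
    """Return a new frame list where every frame after the first is
    the 50/50 average of itself and the previous frame."""
    out = [frames[0]]                      # first frame has no predecessor
    for prev, cur in zip(frames, frames[1:]):
        out.append([(p + c) / 2 for p, c in zip(prev, cur)])
    return out
```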

I hope now you realize that high framerates aren't NECESSARY to display fluid motion. They are only one solution.
A 30 fps clip CAN LOOK JUST AS FLUID AS A 60 FPS CLIP for the reasons I stated above. It all depends on how much of the information is contained in those 30 or 60 frames.
And in the same respect a 60 fps clip can look more 'jerky' than a 30 fps clip.

Take care all.


MSW(Posted 2004) [#22]

The reason that film was made at 24FPS was that it was found to be the *minimum* frame rate needed to kick-start persistence of vision. Add to that the motion blur someone else has talked about.



Not at all... different film stocks actually have different projection speeds... 35mm, as at theaters, is 24FPS... some 16mm and most 8mm are meant to be projected at around 12FPS or 16FPS... the 70mm IMAX stuff is typically projected at 30FPS, with some at 60FPS...

Even then, motion blur isn't needed at all; as proof, it is typically absent from animation... again, Disney films feature smooth character animation, yet most of it is only 12FPS.

Additionally, if a film is shot at 24FPS... you will not see 1/24 of a second's worth of motion blur anyway... that is due to the camera's shutter speed (and the ability of the film stock to react to exposed light). Cameras are mechanical, and some portion of that second will be spent with the shutter closed as the film advances to the next frame... under normal conditions only about 50% to 65% of that 1/24 of a second will actually make it onto film to produce motion blur; the rest of the filmed action happens while the shutter is closed...


jhocking(Posted 2004) [#23]
"Even then, motion blur isn't needed at all; as proof, it is typically absent from animation... again, Disney films feature smooth character animation"

Animation is a bad example for a couple of reasons. Animators specifically craft the movements to look good and "read" at low framerates. This is accomplished with tricks/techniques like exaggerating movements.

Another major factor is squash and stretch. Hand drawn animation includes a significant amount of squash-and-stretch, in which things stretch out along the path of motion, and part of the reason is to replace the absent motion blur.

Both of these reasons are specific examples of classic animation principles being employed (read about them here:)
http://members.iinet.net.au/~zapo/pages/animation_principles.htm
Most of the twelve principles are things which help animation look smooth even at low framerates, and which don't apply to most in-game situations.


MSW(Posted 2004) [#24]
jhocking - Animation also includes other media such as stop-motion (AKA claymation)... the works of Willis O'Brien, Ray Harryhausen, etc... even films like The Nightmare Before Christmas have sections shot in twos (12FPS) with no motion blur...

And on the subject of games... look at all the 2D Final Fantasy-type RPGs with two-frame walk cycles that alternate every quarter second (4FPS!), yet players have no trouble "reading" that the character is walking...

The original Quake had every animated monster, weapon and player model locked at 10FPS with no interpolation (just like the FF example above... even when the game runs at a higher overall FPS, the animated entities were locked at 10FPS)... yet players had no problem "reading" the onscreen action...

The earliest CG animation in films has no motion blur... not in Tron, not in The Last Starfighter, yet the motion still appears nice and smooth.

Motion blur isn't really needed to convey smooth motion in film; it never was...


Most of the twelve principles are things which help animation look smooth even at low framerates, and which don't apply to most in-game situations.



Don't apply!?... now that is pure BS. Animation is animation, be it for film or even videogames. The principles are the same; they aren't there to hide the lack of motion blur, but to give a sense of naturalism (not realism)... stretch, anticipation, follow-through, slow in/out... all of these are principles of naturalistic animation... and they certainly apply to most, if not all, in-game situations.


jhocking(Posted 2004) [#25]
I wouldn't call any of those examples smooth. I suppose I haven't seen Tron in a while so I can't really comment on that example. However, those other examples (claymation animation, Quake, etc.) are a far cry from smooth. If you think the two-frame walk cycles from Final Fantasy are smooth you need your eyes checked.

As for the principles of animation not applying to games: yes, they do apply to animation created for games, but the quality of the animations in a game has little to do with the smoothness of the game's rendering, and they do not apply to the games themselves. The principles of animation do not apply to things like camera movement in an FPS game, and it's things like that that create issues for the smoothness of a game's visuals.


Beaker(Posted 2004) [#26]
Who said animation doesn't have motion blur? That's not true; animators often add motion blur. Have you never seen "Road Runner"?

But, of course you can animate at 4 FPS if you want as long as you 'hint' at the intended movement/action.

Try doing a camera whip pan in 4 FPS - you can do it, as long as you 'hint' at it: character looks off camera, 2 frames of blurry streaky horizontal lines, to next shot.

But that's the thing about animation: it's a series of codes designed to minimise the amount of drawing/work required.

But much of that doesn't apply to 3D games, for several reasons.

a) the author can't direct the visuals when the player is in control of the camera.
b) if you had 4 FPS in an action game you might miss some crucial piece of action between 2 of those frames! Even at 30 FPS. This is doubly true when you have game systems running at different rates: graphics, network updates, game logic. Try playing an online war game at those frame rates; you will often wonder how you died.

I don't think you can deny that motion blur benefits smooth movement in film. That just seems like arguing for the sake of it. Of course it won't matter if the camera is locked off on a sedate scene, but throw in some action or some high-speed camera moves and you will notice it (or the lack of it - ever seen digitally added camera movement without added motion blur? It looks very odd and doesn't 'read' at all).

The only reasons some film was projected at less than 24 FPS were the cost of materials, and that at the time the cameras/projectors/film stock were invented they didn't know any better.

TVs might not rely on the 50/60 Hz refresh that they used to, but the video they display does, and it still contains double frames in the form of interlaced fields.

You could argue that you don't need any frames per second to know something has moved. Think about it, an object is in one position and then miraculously is in another. You know it has moved, probably very fast. That doesn't mean to say that it moved smoothly or in a way that we experience in real life.


Rob(Posted 2004) [#27]
Don't forget frame interlacing, which is completely absent from computer graphics.

A typical 12fps Tom and Jerry cartoon will actually be displaying 2 frames interleaved, which is a major factor in the smoothness of the animation.

Therefore a 12fps Tom and Jerry cartoon approaches the equivalent of 24fps on the PC.


MSW(Posted 2004) [#28]
jhocking - smoothness alone is not good animation; that link you posted even points this out... and the principles of animation, the arcs and the slow in/out, directly apply to the camera bobbing about with the player's movement in FPS games.

Beaker - Do those animator-added motion blurs in animation like Road Runner... the speed lines and whatnot... even look like real motion blur? No, of course not... it's a gross exaggeration... the fact that you can perceive them points that out.

Bloodlocust - Tom and Jerry was originally produced on film... then transferred to video and now to DVD... by and large DVDs do not contain interlaced video; rather, it's the DVD hardware decoder that introduces interlaced fields.


Beaker(Posted 2004) [#29]
MSW

Your point being? The whole discussion here is about whether (some) games need high frame rates. I argue that they do. Do you disagree on the grounds that animation makes do with 12 FPS?

Why do TV and film use higher than 12 FPS? They don't need it, by your own argument. Maybe we should watch films in flick-book form only, or maybe even that is too much... picture books, anyone?

The drawn motion blur isn't there for mere eyecandy, it lets us know what has happened in the action.

You ask if it looks like real motion blur; I say "yes, to a child or adult caught up in the illusion of the animation". Of course, technically it doesn't, but then neither does Mickey look like a mouse, and nothing looks real, if only because real things don't have outlines.


AbbaRue(Posted 2004) [#30]
This topic has given me some good ideas.
For one, when texturing animated characters, have the texture blur in the direction of movement when turning.
Then you want crystal sharpness at the end of the movement.
Try freezing movies at different points and you'll get an idea of how to do this.
This was quite an interesting post.


Picklesworth(Posted 2004) [#31]
That's a neat idea. I hate it when animations just slow down, and that could help make it seem better.


WolRon(Posted 2004) [#32]
The earliest CG animation in films has no motion blur... not in Tron, not in The Last Starfighter, yet the motion still appears nice and smooth.

If I recall correctly, almost all of the scenes in Tron were slow in action (thereby not requiring motion blur). The only one that really wasn't was the bike scene, but then they actually DID have motion blur in a sense (the trails that followed behind them).

And I actually remember a few scenes in Starfighter that were jerky. If you don't, then you apparently can't tell the difference.


Warren(Posted 2004) [#33]
This thread is a real gem. gg all!