Fake HDR


t3K|Mac(Posted 2006) [#1]
Hi!

I am trying to implement a fake HDR effect in my game. Here is a small video (22 MB) of the first steps...

This effect is based on sswift's blur routine. Hope you like it so far.

www.t3k-entertainment.com/FHDR.mpg

OK, the effect is still too intense, but I'm working on it ;)


jfk EO-11110(Posted 2006) [#2]
Do you have a screenie somewhere? (I should still install a media player here :/ )

Fake HDR isn't hard to do, but it's tricky to make it fast. An effect that gets slower the more triangles there are to render isn't very effective. I'd rather have something that is pixel-oriented.


sswift(Posted 2006) [#3]
Hmm... Are you rendering the scene at a lower resolution, blurring that low-res image, and then stretching it over the main scene? That's the only thing I can think of that might cause all that flickering, which is rather annoying, to be honest. And I don't think reducing the intensity will fix it.


t3K|Mac(Posted 2006) [#4]
Yes, I am rendering a smaller version of the scene as an overlay image (128 pixels, because of the slow ReadPixel and WritePixel commands) - that's what's flickering.

I'd rather have something pixel-oriented too, but I don't think it's an easy job, so I'll have to stick with the other method and improve it...


jfk EO-11110(Posted 2006) [#5]
I really don't know much about pixel shaders etc., but I guess it will be pretty easy to do things like that using Max3D and its shader system (?). So I think I'll wait for Max3D, and once it's available I'll restart my engine from scratch - with a serious attitude.

OK, I already have some sort of HDR in my current engine release, but it's slow as hell and - thank god - optional.

Nonetheless, if you find a fast, non-flickering way (my implementation flickers too), then please let us know.


9572AD(Posted 2006) [#6]
HDR WTF?


sswift(Posted 2006) [#7]
HDR = High Dynamic Range

Normally the colors on your monitor are represented by the values 0..255.

But what if the colors were represented by floating-point numbers instead, so that the colors displayed on your monitor were 0..1?

And what if you could specify light sources like the sun which used colors much higher than 1?

How would you display this on a screen that only goes from 0..1, though?

Well, you could just clamp every highlight that ends up with a color greater than 1, but that wouldn't look very realistic.

Another alternative would be to make areas with a color greater than 1 "bloom" into adjacent pixels. So a pixel with a value of 10 might be blurred with a radius of 10 and its color added to the adjacent pixels, whereas a pixel with a value of 1 would not be blurred at all.

That is what this bloom demo tries to simulate: it pretends that any pixel with a color of 255 has a brightness much higher than that, blurs those pixels, and adds the blur to the adjacent pixels to brighten them.

So it kind of looks like one method of doing HDR lighting (which Blitz cannot actually do, but Half-Life 2 uses), but it's not actually HDR.
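
If you want to see the idea in code, a naive version of the extraction step might look like the function below. To be clear, this is just an illustrative sketch, not the routine the demo above uses - the name, parameters and threshold are all made up, and doing this per-pixel is exactly the kind of thing that's too slow in practice.

; Illustrative only: copies the "overbright" pixels (any channel at or
; above thresh) from src into dest, leaving everything else black.
; Blur dest, then add it over the scene, and you get the bloom.
Function ExtractBright(src, dest, w, h, thresh)
	LockBuffer src
	LockBuffer dest
	For y = 0 To h - 1
		For x = 0 To w - 1
			argb = ReadPixelFast(x, y, src)
			r = (argb Shr 16) And 255
			g = (argb Shr 8) And 255
			b = argb And 255
			If r >= thresh Or g >= thresh Or b >= thresh
				WritePixelFast x, y, argb, dest
			Else
				WritePixelFast x, y, $FF000000, dest ; opaque black = no bloom
			EndIf
		Next
	Next
	UnlockBuffer dest
	UnlockBuffer src
End Function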


Dreamora(Posted 2006) [#8]
You know that this HDR behavior is hardware-side (OK, driver-side) and not a shader?
That's how current graphics hardware is programmed to behave on 6-bit / 8-bit per color screens (which is what most here will have, 8-bit at the very best) to show "HDR overbrightness"...


DH(Posted 2006) [#9]
I tried doing Fake HDR some time ago:

[code sample from the original post not preserved]

The trick behind fake HDR is to render the scene with all objects set to black (with a white camera background). Take that render, blur it (an extreme blur), and apply it to a sprite overlay with the sprite blend mode set to add (3). Then render the scene again with all objects in their normal colors, the background being your sky or whatever, and the alpha of the sprite overlay being the amount your 'eye corona' should be opened (i.e. the eye adjusting to the light intensity would close the corona, thus fading the sprite's alpha to 0).
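
Per frame, that works out to something like the sketch below. This is a rebuilt outline rather than my original code - 'cube', 'iris#' and the rest are stand-in names, and the blur step is left out entirely:

; Rough sketch of the two-pass trick described above (stand-in names;
; the blur of the mask - the slow part - is omitted).
Graphics3D 800, 600, 32, 2
SetBuffer BackBuffer()

cam = CreateCamera()
PositionEntity cam, 0, 0, -5
light = CreateLight()

cube = CreateCube()                     ; stand-in for your scene geometry
TurnEntity cube, 30, 40, 0

mask = CreateTexture(128, 128, 256)     ; flag 256 = VRAM, for fast CopyRect
overlay = CreateSprite(cam)
EntityTexture overlay, mask
PositionEntity overlay, 0, 0, 1.05      ; just in front of the camera
ScaleSprite overlay, 1.4, 1.05          ; stretch to cover the view
EntityBlend overlay, 3                  ; 3 = additive blend
EntityOrder overlay, -1                 ; draw on top of everything

iris# = 0.5                             ; how far the "eye corona" is open

While Not KeyHit(1)
	; Pass 1: black objects on a white background, grabbed into the mask
	HideEntity overlay
	EntityColor cube, 0, 0, 0
	CameraClsColor cam, 255, 255, 255
	CameraViewport cam, 0, 0, 128, 128
	RenderWorld
	CopyRect 0, 0, 128, 128, 0, 0, BackBuffer(), TextureBuffer(mask)

	; ...the extreme blur of the mask would go here...

	; Pass 2: normal colors, with the additive overlay faded by iris#
	ShowEntity overlay
	EntityColor cube, 255, 255, 255
	CameraClsColor cam, 0, 0, 0
	CameraViewport cam, 0, 0, GraphicsWidth(), GraphicsHeight()
	EntityAlpha overlay, iris#
	RenderWorld
	Flip
Wend
End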

Limitation:
There is no way to blur the image fast in Blitz (well, no way to blur it satisfactorily). Rendering the mask at multiple resolutions onto a single texture produces grainy results (as in the above code) and doesn't give a good blur (let alone the extreme blur you need). Per-pixel operations are just too slow to even consider (even on a 128x128 image). And per-pixel operations are ultimately what you would want, since you need the average amount of light shown by the mask so you can constantly adjust the corona (the sprite alpha) to produce the real-time part of fake HDR.

Blitz3D just has too many limitations to do this even poorly.

You know that this HDR behavior is hardware-side (OK, driver-side) and not a shader?

Actually, you can do pseudo-HDR on the software end (not in hardware) pretty quickly, given the right approach.

Some reading material showing pseudo-HDR done on the PlayStation:
http://www.dyingduck.com/sotc/making_of_sotc.html


sswift(Posted 2006) [#10]
Dream:
WRONG.

http://www.gamedev.net/columns/hardcore/hdrrendering/

8 bits per color? You do know that normal 24-bit color scenes have 8 bits per color, don't you? And there is no such thing as 6-bit color. The closest you can get to that is the 16-bit 565 mode, which is 5 bits of red, 6 bits of green, and 5 bits of blue.
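
(For reference, packing an 8-bit-per-channel color into that 565 format is just bit shifting - an illustrative helper, the name is made up:)

; Packs 8-bit r,g,b into one 16-bit 565 word: RRRRRGGGGGGBBBBB.
Function Pack565(r, g, b)
	Return ((r Shr 3) Shl 11) Or ((g Shr 2) Shl 5) Or (b Shr 3)
End Function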


Vertex(Posted 2006) [#11]
Is there a bloom effect in real HDR? I think the scene is lit by a skydome with an HDR "lightmap" and then clamped (by a fragment shader?) to RGB888.
cu olli


nawi(Posted 2006) [#12]
Dark Half, I think you can optimise that: instead of using black models, just disable all lights and set AmbientLight 0,0,0 while rendering the HDR map.
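
Something like this in the mask pass, instead of recolouring every model (just a sketch, reusing the 'light', 'cam' and 'mask' names from Dark Half's setup above):

; Mask pass with the lighting switched off instead of black models.
; (Fullbright surfaces, e.g. EntityFX 1, would still need special handling.)
HideEntity light
AmbientLight 0, 0, 0                    ; unlit scene renders black
CameraClsColor cam, 255, 255, 255
CameraViewport cam, 0, 0, 128, 128
RenderWorld
CopyRect 0, 0, 128, 128, 0, 0, BackBuffer(), TextureBuffer(mask)
ShowEntity light
AmbientLight 128, 128, 128              ; restore your normal ambient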


DH(Posted 2006) [#13]
Is there a bloom effect in real HDR? I think the scene is lit by a skydome with an HDR "lightmap" and then clamped (by a fragment shader?) to RGB888.

Actually, in all the HDR games I have seen (Oblivion; HL2's Lost Coast, EP1, and some CS: Source maps), they all have a bloom as you first enter an area. As your eye (the computer's eye) adjusts to the light, the bloom goes away. It's a very cool effect!

Dark Half, I think you can optimise that: instead of using black models, just disable all lights and set AmbientLight 0,0,0 while rendering the HDR map.

True, nawi, you're right about optimizing that portion, but that still leaves the huge gaping optimization hole, which is blurring the HDR mask :-)

That said, my code was just a hack job to see if Blitz3D could do pseudo-HDR.


ImaginaryHuman(Posted 2006) [#14]
I don't really see how blurring pixel values above a threshold has anything to do with HDR, really. With real HDR you would not get texture colors maxing out, and when you darken things you would still see the bright objects.


nawi(Posted 2006) [#15]
Idea: render the HDR mask at a low resolution so you can blur it fast (like 64x64), then stack, for example, 20 sprites on top of each other with a little offset, each having a 1/20 alpha value. The sprites would be offset in a circle, so it would give rounded corners, I think.
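
A quick sketch of what I mean (assuming the 'mask' texture and 'cam' from the earlier posts; the numbers are just guesses):

; Fake a round blur kernel by stacking offset copies of the mask sprite.
Const RING_SPRITES = 20
radius# = 0.02                          ; how far the fake blur spreads

For i = 0 To RING_SPRITES - 1
	s = CreateSprite(cam)
	EntityTexture s, mask
	EntityBlend s, 3                    ; additive, so the layers sum up
	EntityAlpha s, 1.0 / RING_SPRITES   ; each layer contributes 1/20
	ang# = 360.0 * i / RING_SPRITES     ; Blitz3D trig works in degrees
	PositionEntity s, Cos(ang#) * radius#, Sin(ang#) * radius#, 1.05
	ScaleSprite s, 1.4, 1.05
Next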


Dreamora(Posted 2006) [#16]
Sswift: I know what the GPU is capable of.
Sadly, TFTs aren't anywhere near that. They have a range of 6-8 bits per color that they can really display, which is still below the GPU's integer precision, without even taking HDR into account.

http://en.wikipedia.org/wiki/High_dynamic_range_rendering

But you are right, it does not seem to be a driver thing, although I thought I read on the NVIDIA or ATI page over a year ago that this "bloom aura" behavior is driver-side when a surface is "overbrightened" and the screen is not HDR-capable (which 99% are not)... but I was not able to find it in a short search.


Bouncer(Posted 2006) [#17]
This has all been done a million times already... use the search... there's no need to post new gloom/bloom/fake HDR etc. threads or videos...


sswift(Posted 2006) [#18]
Sswift: I know what the GPU is capable of.


That remains to be seen. :-)


Sadly, TFTs aren't anywhere near that. They have a range of 6-8 bits per color that they can really display, which is still below the GPU's integer precision, without even taking HDR into account.


What does that have to do with anything?


That's how current graphics hardware is programmed to behave on 6-bit / 8-bit per color screens


Ah, I see. When you said screen you literally meant the screen. See, I don't care what the bit depth of the actual monitor is; I thought you were talking about the bit depth of the graphics buffer.

But anyway, my point still stands: the graphics hardware does not automatically do bloom. It is a shader effect, and even the Wikipedia article YOU linked to says that.

Anyway, I've never heard of a screen which IS HDR-capable, nor have I heard of drivers compensating for it on those which are not. As a developer I would not want that, because it would prevent me from making my art look the way I want it to look. They'd have to be crazy to do that.


t3K|Mac(Posted 2006) [#19]
Bouncer: no, the gloom and bloom effects always gloom/bloom the whole screen. This is not what I want - I want partial bloom/gloom, and that has not been done yet. And yes, I did use the search (lots of times) - but found nothing comparable.


DH(Posted 2006) [#20]
This has all been done a million times already... use the search... there's no need to post new gloom/bloom/fake HDR etc. threads or videos...


Yeah, perhaps a link or two rather than a smart-ass comment would be more productive?


jfk EO-11110(Posted 2006) [#21]
nawi: the problem with small renders (e.g. 64x64) is that the graphics are not antialiased, so pixels will "jump", especially on the contours of things. Blurring the render won't help, since the rendered pixels remain the only information available; true interpolation is not possible. If you scale this up to the screen resolution and use the additive blend mode for a glow effect (also when using e.g. 5 x 20% alpha layers as you suggested), you will get flickering areas in the effect.


t3K|Mac(Posted 2006) [#22]
I stopped developing this, because Andreyman's FX lib has such a function, and it's pretty fast (not just full-screen gloom - partial!). Just load the demos!