file formats

Blitz3D Forums/Blitz3D Programming/file formats

Mikeyj21(Posted 2005) [#1]
Hi guys..
Just a quick question about gfx file formats, if I may?

Whilst considering which format to use for my textures,
I did some experimentation to compare file sizes.
With a 1024*1024 plain white texture (16 bit), the four formats I tried gave the following results:
.bmp 2,049kb
.tga 2,049kb
.jpg 78kb
.png 22kb

With the same size and bitdepth, but with a very colorful picture, I got:
.bmp 2,049kb
.tga 2,049kb
.jpg 986kb
.png 1,800kb
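Those .bmp and .tga numbers match the raw, uncompressed size of the image almost exactly, since neither format compresses by default. A quick sketch of the arithmetic (in Python rather than Blitz, purely to show the numbers):

```python
# Back-of-envelope check: why .bmp and .tga land at ~2,049 KB
# regardless of image content (neither compresses by default).
width, height = 1024, 1024
bits_per_pixel = 16

raw_bytes = width * height * bits_per_pixel // 8
raw_kb = raw_bytes / 1024

print(raw_kb)  # 2048.0 -- plus a small file header gives the 2,049 KB reported
```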

Now, obviously the .jpg and .png are both compressed file formats..
I realise that the gfx card stores textures in its texture memory, but are these stored in compressed or uncompressed form? If they are stored in compressed form, is there a performance penalty from using compressed textures?

For example:
In the case of my game, one of my textures is 495kb as a .jpg, rather than the 2,049kb as a .bmp.
If the gfx card is storing these uncompressed, then the only advantage from using compressed file formats is disk space. However, if they are stored on the gfx card in compressed form, then there must surely be a speed hit for decompression in exchange for the 'extra' texture space?

Many thanks!


GfK(Posted 2005) [#2]
you'll find that (for example) a 512x512 texture will take up approximately 1mb of video memory, regardless of the file format/compression used.

The only advantage of using PNG over BMP is the amount of disk space it takes up.
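A rough sketch of that footprint, assuming the card holds textures uncompressed at 4 bytes per pixel (the usual case at a 32-bit screen depth), and ignoring mipmaps, which add roughly another third:

```python
# Approximate video memory used by an uncompressed texture.
# The on-disk file format is irrelevant once the texture is
# unpacked into VRAM.
def texture_vram_bytes(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel

mb = texture_vram_bytes(512, 512) / (1024 * 1024)
print(mb)  # 1.0 -- the ~1 MB mentioned above, before any mipmaps
```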


Mikeyj21(Posted 2005) [#3]
Ah, thanks!
So, textures ARE stored uncompressed on the card.
At least it makes choosing the filetype only a case of which size and bitdepth to use! (Am I right in assuming that an 8-bit texture takes up about 1/2 the video memory that a 16-bit texture does... it certainly takes up half the disk-space)


ErikT(Posted 2005) [#4]
If you're running in 32-bit then all textures will be stored in memory as 32-bit textures. In other words there's nothing to gain from converting your textures to 8-bit or 16-bit.


Mikeyj21(Posted 2005) [#5]
Really? The guy who did the gfx for Medal Of Honor says (I quote from his book):
'We made the majority of our textures 4 bit (16 colors). This actually helped give the game its unique look, and at the same time, nearly tripled the amount of textures we could have in the game...'

Have I misunderstood either you or him...?
Knowing me, probably!


ErikT(Posted 2005) [#6]
No, that's true for Medal of Honor, I bet. But Blitz3D stores the textures in memory according to screen bitdepth. I think it's a DirectX thing, not sure. Maybe someone who's got a better clue can answer that one.


Shambler(Posted 2005) [#7]
They were probably talking about .dds textures which Blitz doesn't support.

http://www.blitzbasic.com/Community/posts.php?topic=43695


Mikeyj21(Posted 2005) [#8]
Yep.. you guys are right!
After your posts, I did a bit of research and found that there are two Blitz commands I'd missed... TotalVidMem and AvailVidMem.
So, armed with these, I did a couple of quick tests (I tried the textures saved in 8,16,24 and 32 bit formats...) and yes, I can indeed verify that textures are stored in the screen bitdepth.
Seems like a huge waste of video memory to me! (But, I don't fully understand all the issues, obviously...)

Unfortunately, I get very weird artifacting with a 16-bit bitdepth (ever since I went to a GeForce4) :(
So, I am stuck with 32 bits... which means the amount of textures I can cram in is very limited (I hate using small textures on big models... the blurriness looks foul!).

Do you guys have any idea what sort of performance penalty you pay if you exceed the available video memory? I assume textures then have to be copied from system memory across the AGP bus, and that would be bad?

Thanks for your help!


Mikeyj21(Posted 2005) [#9]
Just thinking more about this... If I want to use 1024*1024 textures, then (given about 6meg for the front and back buffers with 1024*768*32 resolution)... that leaves room for about 15 textures with a 64meg gfx card!
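That arithmetic roughly checks out, assuming uncompressed 32-bit textures and no mipmaps (a sketch of the sums, nothing Blitz-specific):

```python
# Rough texture budget for a 64 MB card at 1024x768x32,
# assuming uncompressed 32-bit textures and ignoring mipmaps.
MB = 1024 * 1024

buffers = 2 * 1024 * 768 * 4     # front + back buffer: 6 MB total
texture = 1024 * 1024 * 4        # one 1024x1024 32-bit texture: 4 MB

remaining = 64 * MB - buffers
print(buffers / MB)              # 6.0
print(remaining // texture)      # 14 -- about fifteen big textures, as feared
```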

Surely I must have misunderstood something along the way... this can't be right?

Or do games quite happily exceed their video mem without penalty?

I'm now more confused than ever!

Cheers!


Bot Builder(Posted 2005) [#10]
Thing is, most games don't use 1024x1024 textures. Even halving that to a reasonably high-res 512x512 actually cuts the memory to a quarter. Also, given that modern cards are 128MB or 256MB, games don't usually exceed their limits.


jfk EO-11110(Posted 2005) [#11]
Not so long ago everybody used 256*256 or 128*128 textures. Using 1024*1024 textures is still kind of a waste of VRAM IMHO, unless you use it for things like lightmaps, or single-surface mapping of larger objects.


Caff(Posted 2005) [#12]
I believe there is texture compression built into DirectX in the form of DXTC, which if implemented in Blitz3D would allow far more textures in a scene. I think the compression rate averages something like 1/4 of the original bitmap size.
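For reference, DXTC works on fixed 4x4 pixel blocks: DXT1 packs each block into 8 bytes, while DXT3/DXT5 use 16 bytes per block. Against uncompressed 32-bit that gives fixed ratios of 1/8 and 1/4 respectively, which lines up with the estimate above. A quick sketch of the arithmetic:

```python
# DXTC compresses fixed 4x4 pixel blocks:
#   DXT1 -> 8 bytes per block, DXT3/DXT5 -> 16 bytes per block.
def dxt_bytes(width, height, block_bytes):
    blocks = (width // 4) * (height // 4)
    return blocks * block_bytes

uncompressed = 1024 * 1024 * 4               # 32-bit source: 4 MB
dxt1 = dxt_bytes(1024, 1024, 8)
dxt5 = dxt_bytes(1024, 1024, 16)

print(uncompressed // dxt1)  # 8 -- DXT1 is 1/8 the size
print(uncompressed // dxt5)  # 4 -- DXT5 is 1/4 the size
```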

If this, and a third set of UV co-ordinates, were added to the Blitz3D file format, it would allow for some very high quality scenes, as you could use very large lightmaps and have another layer for detail (e.g. dirt maps).

So - base texture on UV 0, lightmap on UV 1, dirt/decals on UV 2.

Of course I might be totally wrong.