Max Texture Size

BlitzMax Forums/BlitzMax Programming/Max texture Size

Grey Alien(Posted 2006) [#1]
So I've heard that 3D cards have a max texture size. What is current, and what's the best size not to go over for my game? For example, is 1024x1024 OK, or should it be less?

If you had a card that only supported 512x512 and you loaded an 800x600 title screen into an image, then I assume it wouldn't work right! You'd have to load it in parts to be compatible with those cards, or are they simply not worth supporting? Also, how would it fail: would it crash, or simply display a load of crap? Any ideas? I tried my AOTMG demo on a crappy office PC the other day and it ran, but the map screen, which has a 1024x1024 texture, just showed a multicoloured pixelly mess; maybe that's too big. Yet other 800x600 screens loaded as single images were fine...

Thanks for any feedback.


Steve Elliott(Posted 2006) [#2]
I once had a 2nd computer with a poor Intel integrated graphics card, and found large non-square textures would run without any graphical glitches, but very slowly.

I'd recommend always using square textures, and no larger than 256 X 256 for maximum compatibility.


Damien Sturdy(Posted 2006) [#3]
Heh, I've used 2048x2048 textures before now with Quad2, though on an onboard chipset; nobody whinged about it not working, but you never know.

These days 2048x2048 would be on the upper limit, I would think. The old onboard cards used system RAM and took as much of it as they needed, which would explain why the 2048x2048 textures worked.


tonyg(Posted 2006) [#4]
This (DX) will print the max texture size plus whether pow2 (power-of-two dimensions) is needed.
What you should work to is up to you. Larger texture sizes should be OK in the majority of cases but, as Steve says, if your target is low-end you'd need to work with square pow2 textures with a max size of 256*256.


bradford6(Posted 2006) [#5]
A word about quality, though: in my experience, a 256x256 .PNG looks better than some 512x512 .JPGs.

I think they are uncompressed and essentially the same within BlitzMax, but the PNGs just look better.

This is my opinion... tested on my PCs (a rather high-end desktop and an older laptop with a Radeon 9xxx chip).


Grey Alien(Posted 2006) [#6]
So, if a card could only handle 256x256 and I load in an 800x600 title screen, would it just not display then?


Tom Darby(Posted 2006) [#7]
Grey, that's a real possibility--now that I think about it, I've seen this kind of thing happen on older computers. (They choke on the background images for my game...)

I remember seeing a code snippet a while back that'd automagically chop big images up into multiple bite-sized pieces. You may want to dig for that...

Ah--nevermind: it's Tesuji's "Large 2D Image Object" code in the code archives.

http://blitzmax.com/codearcs/codearcs.php?code=1440


ImaginaryHuman(Posted 2006) [#8]
PNG is compressed, it's just not `lossy` compression, so it is possible it looks better than a larger JPEG.

On my old iBook, which had a totally minimal gfx card (an ATI Mobility thing), it reported a max texture size of 4096x4096 for OpenGL. Then my much newer iMac with a GeForce4 says 2048x2048. The iBook had a software GL driver, no hardware acceleration, which is probably why it reported larger texture sizes, and why in some cases it actually supports more features than the newer iMac (aux buffers, etc.).

I have no idea what happens when you try to upload a texture that is larger than the maximum size. Blitz would first be allocating texture space because, if I remember, it creates an empty texture first and then uploads the image to it. So if the card can only do 512x512 and you try to make a 1024x1024 texture, I have no idea what it would do. I don't know if Blitz splits it up for you, whether it just fails, whether the image gets cropped to the size of the texture, or whether it just looks like junk due to the rows being a mismatched number of bytes. It's probably unpredictable what you will see when there is a mismatch of size, so generally you should avoid it ever happening.

Also of note: baseline OpenGL requires power-of-two texture dimensions, from 64x64 upwards (and square textures are the safest bet on old hardware). Non-power-of-two textures are not supported without extensions or a higher GL version than the standard 1.2.

Local SizeVariable:Int
glGetIntegerv(GL_MAX_TEXTURE_SIZE, Varptr(SizeVariable))

That (the actual call is glGetIntegerv, not glGetEnv, and it needs a current GL context) is how you get the max texture size for OpenGL. It returns a single number, e.g. 2048, meaning 2048x2048, due to square textures.
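Put together as a tiny runnable check; this is a sketch assuming the GL Max2D driver, so a context exists before the query:

```blitzmax
SetGraphicsDriver GLMax2DDriver()
Graphics 640, 480

' Ask the driver for the largest texture dimension it supports
Local maxSize:Int
glGetIntegerv(GL_MAX_TEXTURE_SIZE, Varptr maxSize)
Print "Max texture size: " + maxSize + "x" + maxSize

' Decide up front whether big artwork will need splitting into tiles
If maxSize < 1024 Then Print "1024x1024 images will need tiling on this card"
```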

Here's what I would do (and am doing):

Think of images (which are textures) as just blank buffer space that comes in chunks of whatever the maximum texture size is, e.g. 256x256. Think of it as similar to a file read buffer, or an array that you predefine at a certain size and extend [..newsize] when it is full. So the amount of space you have in that chunk of video RAM is maxtexsize*maxtexsize. I would avoid rectangular textures unless you're requiring your customers, and their customers, to have higher versions of GL, which are less common.

Then think of the `images` that you want to throw onto the screen not as BlitzMax `Image`s, but as a custom image type which may be a collage of 1..N BlitzMax images. So say your maximum texture size is 256 and you want to upload a 1024x1024 picture: you will need (1024/256) * (1024/256) = 4*4 = 16 BlitzMax images at 256x256 each. So your custom image type will have an array of 16 Images, because the texture size is smaller than the picture size. If the picture size was <=256x256 you would only need one Image to be referenced. So think of that array of Images as an array of fixed-size buffers, several of which make up the whole image.

Then when you need to draw this large image, you position, rotate, scale, and draw each of the individual BlitzMax images that it is composed of, like a grid. You can use the image Handle as an offset from the center of the whole large picture, so that any rotation will be relative to the center of the whole thing. Also if you scale, you'll have to scale that Handle offset or you'll get gaps. It's also a good idea to use at least Floats for all the coordinates so you get sub-pixel accuracy.

So then any operation you do on the whole image you apply to each sub-image (BlitzMax Image) individually, acting as a group.

This way, no matter what the maximum texture size is, you will never have a problem with it, because you will compose your graphics using `pieces` that are as large as possible on that platform. You should always use the largest texture size allowed so you don't have to do too many switches between textures (like a context switch), which is time-consuming overhead.
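A rough sketch of that kind of wrapper type; the name TBigImage and all the details are made up here, it's untested, and the rotation/handle logic described above is omitted for brevity:

```blitzmax
' Hypothetical wrapper: splits one big pixmap into card-friendly tiles
Type TBigImage
	Field tiles:TImage[,]
	Field cols:Int, rows:Int
	Field tileSize:Int

	Function Load:TBigImage(path:String, tileSize:Int = 256)
		Local pix:TPixmap = LoadPixmap(path)
		If Not pix Then Return Null
		Local b:TBigImage = New TBigImage
		b.tileSize = tileSize
		b.cols = (PixmapWidth(pix) + tileSize - 1) / tileSize
		b.rows = (PixmapHeight(pix) + tileSize - 1) / tileSize
		b.tiles = New TImage[b.cols, b.rows]
		For Local ty:Int = 0 Until b.rows
			For Local tx:Int = 0 Until b.cols
				' Clip the window at the right/bottom edges of the picture
				Local w:Int = Min(tileSize, PixmapWidth(pix) - tx * tileSize)
				Local h:Int = Min(tileSize, PixmapHeight(pix) - ty * tileSize)
				b.tiles[tx, ty] = LoadImage(PixmapWindow(pix, tx * tileSize, ty * tileSize, w, h))
			Next
		Next
		Return b
	End Function

	Method Draw(x:Float, y:Float)
		' The first tile can sit at a fractional position;
		' the rest are exact tileSize offsets from it
		For Local ty:Int = 0 Until rows
			For Local tx:Int = 0 Until cols
				DrawImage tiles[tx, ty], x + tx * tileSize, y + ty * tileSize
			Next
		Next
	End Method
End Type
```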

Then you should also think of each BlitzMax image as being able to comprise multiple sub-images, or areas of that image. I call these sub-textures. Thus any given `picture` can be using an Image which other `pictures` are using as well. Like 50 frames of `bullets` or something can all be on one BlitzMax Image, and each bullet `picture` refers to that same Image/texture, but to a different sub-image within it. Then you'd draw only a rect of that Image for each picture. If you can keep animation frames within the same BlitzMax image, that is also more efficient and cuts down on swapping between images.
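BlitzMax's built-in anim images already work roughly this way: LoadAnimImage packs fixed-size frames from one sheet into a single Image, and you pick a frame at draw time. A small sketch (the sheet filename and layout are made up):

```blitzmax
Graphics 800, 600

' A 512x512 sheet holding 16 frames of 128x128 bullets: one texture, many frames
Local bullets:TImage = LoadAnimImage("bullets.png", 128, 128, 0, 16)

Local frame:Int = 0
While Not KeyHit(KEY_ESCAPE)
	Cls
	' Same underlying image every call, just a different sub-rect (frame)
	DrawImage bullets, 400, 300, frame
	frame = (frame + 1) Mod 16
	Flip
Wend
```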

This is the system that I'm using in my blob game. It then doesn't matter whatsoever what the maximum texture size is, because images have been abstracted as a custom type that refers to Images as just resources. I plan to put multiple 512x512 frames onto a single 2048x2048 texture to get 16 frames of animation with no swapping between textures, using sub-textures for each frame; the same system will also handle animations of small particles as sub-textures, either as a `spare part` of a larger texture or just as a smaller texture.

This then leads to whether you'd want to consider `sorting` pictures and `optimizing` them to make the most use of available free sub-texture space, or rearranging them on the texture to waste as little space as possible. It's then useful to be able to download/save an image comprising multiple sub-images, like a sprite sheet, and vice versa: loading a sprite sheet and turning it into a single image, part of an image, or broken across a few images with a `sub-texture` for each sprite.

Of course, this is somewhat more complicated internally than the easier method of just going with what texture size is popular and hoping it doesn't break etc.


Steve Elliott(Posted 2006) [#9]
Yep - just tile your graphics from 64 X 64 to 256 X 256 - or write a program to do it for you.

Personally I wouldn't trust the feedback you get from the driver. So I produce tiles manually using Photoshop's snap grid feature at a standard 256 X 256 max.


tonyg(Posted 2006) [#10]
@Grey Alien,
- Use the link to the DXCaps and run it.
- Check what your maxtexture width/height are.
- Create an image (PNG whatever) larger than the max
- Load the image into Bmax and drawimage it.
Result: on my system, a black screen, but don't take my word for it.
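Those steps as a throwaway sketch (the filename is made up; the image should be larger than your reported max):

```blitzmax
Graphics 800, 600

' Load an image deliberately bigger than the card's max texture size
Local img:TImage = LoadImage("huge_4096x4096.png")
If img = Null Then Print "Image failed to load at all"

Cls
DrawImage img, 0, 0   ' on a limited card this may come out black or garbled
Flip
WaitKey
```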


degac(Posted 2006) [#11]
http://www.blitzbasic.com/codearcs/codearcs.php?code=1440

I'm using this to handle big background images on old gfx cards. It works.


Grey Alien(Posted 2006) [#12]
Hmm, interesting feedback from all, thanks. It does seem crazy to think you can't reliably load in a title screen and show it; you have to split it into chunks! Also, the loader for that would have to be special, unless you stored your title page in several files.

Only problem with splitting images is this: if you move the image round the screen and draw at non-integer positions (for smoothness based on delta timing), the joins will antialias and you'll end up with horrible lines in the image where the textures join (just theorising here)...


Steve Elliott(Posted 2006) [#13]
Er, no. You can position the first tile at a floating point position; the others are then placed at exact offsets from it. Just add the texture size and they will line up OK.
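For instance, a hypothetical row of four 256-wide tiles held in a tile[] array; only the first coordinate is fractional:

```blitzmax
Local x:Float = 123.45   ' smooth delta-timed position, may be fractional
Local y:Float = 50.0

For Local i:Int = 0 Until 4
	' Neighbouring tiles sit at exact 256-pixel offsets from the first,
	' so the seams stay consistent at sub-pixel positions
	DrawImage tile[i], x + i * 256, y
Next
```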

The other point is, modern graphics cards are all 3D-based, so you have to think (and optimize) in those terms: every square of graphical data is 2 triangles in a 3D world.

Modern 3D hardware has no problems with huge textures, but for older 3D hardware the initial limit was 256 X 256. Seeing the difference in speed that dropping texture size and tiling makes on a poor graphics card will convince you to adopt it. You just can't think in 2D terms!


ImaginaryHuman(Posted 2006) [#14]
I would think the sub-pixel filtering will take care of any joining?

It's not so much crazy that you have to split images up, as crazy that there are hardware limits, that they're not the same across the board, and that Max usually tries to cover them up; sometimes you still have to deal with it.


Grey Alien(Posted 2006) [#15]
Wow, yeah, this is pretty much a revelation, perhaps needing a redesign...


ImaginaryHuman(Posted 2006) [#16]
That's why I shared it: it'll make your product more resilient and capable, plus provide a springboard to other new features. You can incorporate it with your particle engine and object handling, etc.