Mark's Aug 2006 worklog


WendellM(Posted 2006) [#1]

In his new worklog, Mark writes about the upcoming Max3D:

A couple of things are also becoming clearer about the initial release: It'll require a kick-ass graphics card and a kick-ass GL driver.

A Direct3D driver will happen eventually, but probably not until the whole Vista release drama is over and we know where things stand re: D3D9 vs D3D10 etc.

I'm a little concerned about the focus on OpenGL and the DirectX (Direct3D) delay. This concern comes from the stats for downloads of a freeware app/game written in BlitzMax that I put out a few months ago (which is equally available for all platforms that BlitzMax supports):

Win DirectX: 524 (65%)
Win OpenGL: 142 (18%)
Mac OSX: 78 (10%)
PC Linux: 61 (8%)

So, with respect, my experience is that the DirectX version alone is more widely downloaded than the OpenGL versions for Windows, Linux, and Mac combined, and thus it seems like DX should be the prime focus. Perhaps others (including Mark) have reasons for different views? I know that the situation regarding Vista/DX10 is cloudy, but I personally have no plans to replace XP/DX9 within the next year or so, and I expect DX9 to remain the most popular platform for the next year or two (due to MS's questionable decision to bind DX10 to Vista only).

Discussion and other views about Max3D's focus are welcome.


Smokey(Posted 2006) [#2]
OpenGL or DirectX... what's wrong with OpenGL? For me it does the job, and OpenGL is for all platforms, not only Windows. Be happy that Mark mentioned Direct3D, because originally Max was supposed to be OpenGL only.


WendellM(Posted 2006) [#3]

what's wrong with Opengl?

I have nothing against OpenGL, myself (in fact, I like that it's out from under Microsoft's thumb). But in my experience as mentioned above, 65% of users chose DirectX when offered a free choice of any graphics/platform. That's the only reason that I consider DX so important: it seems to be what people are using by a 2:1 margin over OpenGL (whether Windows, OS X, or Linux).


sswift(Posted 2006) [#4]
Those stats don't mean anything. If I was offered a choice between a Direct3D version of an app and an OpenGL version, I'd choose Direct3D too, just because it's likely to be a little faster. That doesn't mean people's cards don't support OpenGL, or that they won't download OpenGL-only games.


Smokey(Posted 2006) [#5]
Well, people use DirectX because it's built into Windows, but OpenGL is widely supported by most graphics cards. So I think it's not really people's choice: if it works, people don't care whether it's OpenGL or DirectX.


WendellM(Posted 2006) [#6]

Those stats don't mean anything. [...] That doesn't mean people's cards don't support it or that they won't download OpenGL only games.

I appreciate the theory, but do you have any numbers to back it up? :) Unfortunately, I've dealt with computers for too many decades to go with what's "best in theory." I've learned the hard way that real-world support is what matters, not what's theoretically best - or else I'd be typing this on an Amiga rather than a PC :). But if you have some evidence of OpenGL's acceptance, I'd love to see it (really).

At heart, I agree with you and Smokey that GL is pretty well supported. But I'd like to see some evidence of its acceptance in the real-world marketplace. And, with respect, I mean hard data, not just what someone feels or thinks. I know that my data above is a small sample, but it's a real sample and I haven't seen any others.


H&K(Posted 2006) [#7]
Don't tell them that it's OpenGL.


Smokey(Posted 2006) [#8]
Hmm, sure, DirectX is the requirement on the box of each game, but like I said, it's not people's choice; DirectX is more popular because it's built in. In the past I bought a game that was made in OpenGL, and when I started it I didn't notice that it was OpenGL... see, it depends on the developer.


Michael Reitzenstein(Posted 2006) [#9]
I'm not really sure being OpenGL-only matters when the engine already requires a 'kick-ass graphics card'!


WendellM(Posted 2006) [#10]
Agreed, Michael. I'd also like a little clarification as to just how much ass it needs to kick. :) I'm not expecting support back to the ancient days of the GeForce 2, but just what will any potential players/customers be expected to have? The higher it is, the smaller our potential audience is.


dmaz(Posted 2006) [#11]
Well, I think it's pretty clear that OpenGL has the largest "possible" market. If this is really going to require a "kick-ass graphics card", I don't think there is anything to worry about, because all current "kick-ass" cards have great OpenGL support.

[edit] jeez, I type slow!


Smokey(Posted 2006) [#12]
From Mark's worklog:

I am also designing Max3D with older hardware somewhat in mind, but any such version - if it ever actually happens - will have *less* capabilities than Blitz3D. Which oddly enough increases the chance of a GL Blitz3D happening one day - unless Si H's MiniB3D gets there first!



Well, if BMax is slower on low-end machines, you could use BlitzGL :) If that happens :) Blitz3D will be reborn. :)


marksibly(Posted 2006) [#13]

but just what will any potential customers be expected to have?


Initially, it will require a 'latest' generation card - eg: GeForce6???/ATI-X???


Smokey(Posted 2006) [#14]
For my part, I'm not complaining; BlitzMax will have the juice to compete with high-end engines.


WendellM(Posted 2006) [#15]

Initially, it will require a 'latest' generation card - eg: GeForce6???/ATI-X???

"Require." As in I won't be able to use it at all with my current card. That neatly settles all my doubts about Max3D: if I can't run it myself, the question of whether players/customers could run it is moot, so I won't be buying it anytime soon. There's no way that I'm going back to Blitz3D from the much-superior BlitzMax core language, so I just need to use the free Irrlicht (the upcoming 1.1 wrapper looks promising) or some other 3D solution rather than pay for BRL's this year.

Thanks for this info, Mark. Rather than wait until Max3D's expected release at the end of this year for a GL-only solution that won't run on my current hardware, I can pick an existing DX-and-GL-supporting one that runs on what I currently have. I'll just take care to code things somewhat abstractly so that I can shift over to Max3D in a year or so when the hardware that it requires has become mainstream.


Smokey(Posted 2006) [#16]
WendellM, there's also Eliza3D, which is a good mod. It uses DirectX, and I'm sure that lower-spec machines can run it. The advantage with BMax is that there will always be another alternative.


WendellM(Posted 2006) [#17]
Thanks, Smokey. I'll check it out along with the others on this thread.


Smokey(Posted 2006) [#18]
np WendellM... btw

"Require." As in I won't be able to use it at all with my current card. That neatly settles all my doubts about Max3D.

And from your sig:

(freeware: Alethiometer and Tic-Tac-Think). BlitzMax 1.22 with MaxGUI, Blitz3D 1.98
Athlon XP 3000+, 1 gig RAM, 128 meg Radeon 9600; Win XP Pro SP2 & Win 98 SE both w/DX 9.0c; Ubuntu 6.06
933 MHz PowerPC G4, 512 meg RAM, 64 meg GeForce 4 MX, OS X 10.3.9

As far as I can see, your card seems to fit the requirements.


WendellM(Posted 2006) [#19]

As I can see your card seem to fit the requirement.

Maybe I'm wrong, but I took Mark's "ATI-X???" to mean ATI's X800-to-X1950-class cards which I believe have significant improvements over the 9600. If Max3D will run on the 9600 then I may get it when released, but I don't plan to replace my hardware in the next few months due to Max3D. (I'm not concerned about my Mac, which is seriously outdated, since I don't plan on supporting Mac with BMax for too much longer: 10% isn't worth it to me.)

I stay a couple of steps behind on hardware since I don't want to outstrip my average players' machines. I've read what can go wrong when game creators use cutting-edge hardware and forget that their intended players don't. And it's cheaper, too :).

I've been a fan of Blitz since a bit before B3D's release (which I waited for, rather than getting DarkBasic or 2D BlitzBasic, and bought right when it first came out). Mark and Skid have done great things with BlitzMax, which was a big step up. I expect to go with Max3D someday, but perhaps just not right when it first comes out depending on hardware and DX support.


Dreamora(Posted 2006) [#20]
Wendell: While I agree that, theoretically, a DirectX driver would be great, there is one elementary error in your stats:

They're taken from casual gamers. Most of these "gamers" have low-end cards... low-end cards that won't work with the initial Max3D version, according to Mark's worklog. (Over 50% of those systems are on Intel chips, which are out for Max3D.)
And since especially those low-end cards don't properly support OpenGL, most problem cases are already excluded.

I myself had a 9700 Go for over two years, and all OpenGL stuff (including Doom 3 and Quake 4) ran as well as HL2 and the like, so there is no reason to fear that "real graphics cards" will have problems.

And an X1300 isn't faster than your 9600, btw :) And as the X800 is in, Shader 3 can't be a requirement either, so you fit all the needs - don't panic :)
There is no real reason to target a Shader 3 minimum requirement anyway, as there is a significant number of users with ATI 9600+ or X600+ cards, which are both non-Shader-3.


FlameDuck(Posted 2006) [#21]
At heart, I agree with you and Smokey that GL is pretty well supported. But I'd like to see some evidence of its acceptance in the real-world marketplace.
Knights of the Old Republic. Requires an OpenGL 1.4 capable graphics card.

I've read what can go wrong when game creators use cutting-edge hardware and forget that their intended players don't.
Well, poor marketing is always a serious concern for any game creator, especially since most aren't too good at it. However, I fail to see how this is affected by hardware requirements.

There is no real reason to target Shader 3 min requirement anyway as there is a significant amount of Users with ATI 9600+ or X600+ which are both non Shader 3.
Rubbish. There are plenty of good reasons to make SM3.0 a minimum requirement (like dynamic lighting).


Winni(Posted 2006) [#22]
Just one comment on the DirectX vs OpenGL debate here: DirectX runs on Windows only. BlitzMax, however, is a multi-platform product, so OpenGL is the only way to go. It's that simple.


Dreamora(Posted 2006) [#23]
FD: Yeah, great... require a shader version for exactly one or two effects, and ensure that most users, even those with high-end cards, can't use it. (The X800XT and the like are out if Shader 3 is required... and that just because of lighting and a few other things? That's pure eye candy that takes massive performance anyway... From what I've seen, you can do dynamic lighting without Shader 3 as well... I can't remember the Xbox 1 having Shader 3, after all, nor do Splinter Cell and other games with heavy use of light and shadow require Shader 3.)


popcade(Posted 2006) [#24]
I think we need a better native audio solution, too.


FlameDuck(Posted 2006) [#25]
FD: Yeah great ... require a shader version for exactly 1 or 2 effects and take care that most users even with high end cards can't use it.
What are you talking about? All high-end cards have SM3. (X800XT is more than 2 years old, and if you'd wanted SM3 you should've bought an nVidia card instead).

thats pure eye candy that takes massive performance anyway ...
First of all, per-pixel lighting makes a huge visual difference, so it's hardly just eye candy; and second of all, with SM3 you can do it efficiently on the GPU, so it's not nearly the performance hit you imagine it to be.

From what I've seen you can do dynamic lighting without shader 3 as well ...
Sure you can. But then you're doing much of the math in the CPU (SM2 does not allow for conditional branching). Doing it in shaders means you get dynamic lighting "for free".

On a closing note, let me just point out how hilarious I think it is that everyone's been whining for more advanced graphical features for as long as I can remember, and when BRL finally answer the call and decide to make a next-generation engine, people get all upset that it doesn't work on their 4-year-old video card.

My question in that respect would have to be, what the hell are people expecting?


TeraBit(Posted 2006) [#26]
I agree with FlameDuck; it makes sense to have Max3D target SM3.0 and up. It's a next-gen 3D system and is targeted at next-gen hardware.

If Max3D could only turn out marginally better stuff than Blitz3D, it would not exactly get very far anyway. It looks like Max3D is aimed at the future, and that makes sense financially and in terms of long-term viability.

Requiring recent hardware also tends to iron out driver issues, since a card that has SM3 will likely have a decent OpenGL driver too.

I've had fun making my own OpenGL 3D implementation in Max, but rather than go it alone, I have made the decision to put my projects on hold until Max3D surfaces. It looks like it's going where I want to go.


Sledge(Posted 2006) [#27]
I'm mildly confused, because I assume this means Max3D isn't going to get on with integrated GFX chipsets... but don't the new Macs use integrated GFX hardware, as well as the low-end PCs? This is a great situation if your projected development time is four or five years, but in the immediate future there are going to be a lot of these underpowered machines knocking about. It'd be a bizarre paradigm shift in the BRL philosophy that would see them left behind. :/


Oddball(Posted 2006) [#28]
Most people in the real world don't even know what DirectX or OpenGL are. I have plenty of friends who are avid games players and don't have a clue about the advantages or disadvantages of either. They only know that games need it and whether their computer has it. I wouldn't worry about it at all.


Dreamora(Posted 2006) [#29]
I agree that it should target next gen, or at least current gen.

This means shader support, and usage of Shader 2, which is standard today.

But what I do not agree with is requiring Shader 3.
I don't know - I played quite some games with dynamic light effects on my old 1.5GHz P-M notebook with a Mobility Radeon 9700, and it didn't have any problems at all. Which means either the light effects are not CPU-intensive (it was a Banias, so far lower FPU performance than other CPUs) or they are doable with Shader 2 / 2.1 as well.

Are you sure you aren't mixing up Shader 2.1, which ATI has supported for quite some time, with 3.0?


PS: Just because Shader 3 is used, those effects aren't cheaper. Let's take the 7200 / X1300 and try them on those cards. While they support Shader 3, it would be the far more intelligent decision to use the CPU, as their shader units are totally underscaled for such stuff.


ozak(Posted 2006) [#30]
For low-end systems, simply use Simon's MiniB3D. It's already pretty good, and will get to the point of almost full Blitz3D compatibility.

Chances are that Max3D will feature a similar interface, so porting should be easy. Always separate your 3D code from your game code, so you can switch the underlying stuff easily :)
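
A minimal BlitzMax sketch of that separation (TRenderer and the method names here are illustrative, not a real API; a MiniB3D- or Max3D-backed implementation would fill in the bodies):

Strict

' Game code talks only to this abstract interface; swapping the
' implementation (MiniB3D today, Max3D later) leaves the game logic untouched.
Type TRenderer Abstract
    Method RenderWorld() Abstract
    Method Flip() Abstract
End Type

Type TMiniB3DRenderer Extends TRenderer
    Method RenderWorld()
        ' ... issue MiniB3D/Blitz3D-style drawing commands here ...
    End Method
    Method Flip()
        ' ... present the finished frame ...
    End Method
End Type

' The rest of the game never names the concrete renderer again:
Local renderer:TRenderer = New TMiniB3DRenderer
renderer.RenderWorld()
renderer.Flip()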


FlameDuck(Posted 2006) [#31]
but don't the new Macs use integrated GFX hardware, as well as the low-end PCs?
Mac drivers for integrated (Intel) chipsets don't suck.

This is a great situation if your projected development time is four or five years, but in the immediate future there are going to be a lot of these underpowered machines knocking about.
Vista is going to kill them off a lot quicker than you think.

It'd be a bizarre paradigm shift in the BRL philosophy that would see them left behind. :/
BRL philosophy has always been to provide state-of-the-art technology. Five years ago, when Blitz3D was released, it too required a comparatively high-end graphics card. Just because the world has moved on in the meantime doesn't suggest a paradigm and/or philosophical shift, as far as I can tell.

This means shader support, and usage of Shader 2, which is standard today.
Standard? Set by whom? The two market leaders (AMD and nVidia) no longer produce cards that don't have SM3.0, and by the time Max3D is released, chances are you won't be able to buy such cards off the shelf either.

I don't know - I played quite some games with dynamic light effects on my old 1.5GHz P-M notebook with a Mobility Radeon 9700, and it didn't have any problems at all.
You ran DOOM 3 on that hardware without any problems? At what resolution and detail setting? Because as far as I can tell, it only barely meets the minimum requirements, never mind what they recommend. Hell, I played FEAR on my laptop and had to reduce the detail and resolution significantly to get a decent frame rate.

While they support Shader 3, it would be the far more intelligent decision to use the CPU, as their shader units are totally underscaled for such stuff.
Even if that were true (which it isn't), it's not possible to simplify it like that. There is simply not enough bandwidth available to send per-pixel lighting information of any decent resolution over the northbridge 60 times a second.


Warren(Posted 2006) [#32]
Vista is going to kill them off a lot quicker than you think.

Can't say as I agree with that. There are ridiculous numbers of people still using Windows 9x when XP has been out for how long now? And Vista is a less compelling upgrade than XP was.


Dreamora(Posted 2006) [#33]
FD: I don't say it shouldn't support it.
What I say is that it shouldn't require it as base requirement.

I know that it gives some nice new possibilities after all (the one thats most interesting out of my view is the possibility to do terrain fully shader based which should give far more details and living outdoor scenarios. something that I've been waiting for years).

If I check your systems, they lack some current tech as well, the most obvious is the lack of H.264 hardware support which is needed for Vista as well and even supported by Intel onboard. So are they crap because of that and should not have the right to run Max3D games?

I don't try to have a min req of Shader 2, because my system isn't capable of shader 3. Its because something that the currently most spread ATI cards don't support.
And mentioning that people should have bought NV is at best *****, sorry. The cards that NV brought out at that time only had one outstanding feature: brain burnt energy consumption and lower shader performance for same or higher price than ATI.

PS: I've CoreDuo T2500, 2GB RAM, 7600Go 256MB (so newest generation of NV on 90nm etc) and 1680x1050 screen resolution. So it is not like I wouldn't be able to use all the capabilities of Max3D.


TeraBit(Posted 2006) [#34]
WinXP, DX9.0b, P4/3G, 512M, Radeon 9800

Of course you realise that Mark's sig. PC only contains a Shader Model 2 Card :)


sswift(Posted 2006) [#35]
My aunt was using Windows 98 as of a few weeks ago, when I sold her a newer used PC which can run XP but probably won't be able to run Vista. So I don't think Vista is going to be taking over all the desktops anytime soon. I certainly don't plan to upgrade to it immediately. I'll probably wait at least two years to upgrade, if not more.


FlameDuck(Posted 2006) [#36]
There are ridiculous numbers of people still using Windows 9x when XP has been out for how long now?
True, but they're all running archaic software and probably wouldn't be able to run your game, even if said person did have an SM3 card and was able to find functioning drivers.

And Vista is a less compelling upgrade than XP was.
To me and you, maybe. I guess it depends on how aggressively Microsoft push Vista, and how many applications and games end up Vista-only. If Microsoft choose to turn the screws too tightly, people are only going to have two alternatives: upgrade to a Mac, or upgrade to Vista - and better the devil you know. Although I suppose the incremental improvements Microsoft products receive over the years are more significant for end users who've missed an OS upgrade the last couple of times.

the most obvious is the lack of H.264 hardware support, which is needed for Vista too
Maybe. But not for Max3D as far as I'm aware. Besides which I'll be buying a beefier desktop eventually.

I'm not arguing for a minimum requirement of Shader 2 because my system isn't capable of Shader 3. It's because Shader 3 is something the currently most widespread ATI cards don't support.
Well, that's a valid point, I suppose. Nevertheless, if the choice comes down to SM3 by Christmas or SM2 by next August, SM2 support can go jump in a lake as far as I'm concerned.

Of course you realise that Mark's sig. PC only contains a Shader Model 2 Card :)
Sure. And DirectX 9.0b. Which leaves us with two options.
a) Mark hasn't updated his computer in a while.
b) Mark hasn't updated his sig in a while.

Me? I'm going with b.

So I don't think Vista is going to be taking over all the desktops anytime soon.
Of course not. But nobody here realistically has a target market segment of 'all desktops'.


TeraBit(Posted 2006) [#37]
Sure. And DirectX 9.0b. Which leaves us with two options.
a) Mark hasn't updated his computer in a while.
b) Mark hasn't updated his sig in a while.

Me? I'm going with b.


Heh heh, agreed. Just trying to start a conspiracy theory 8)


Gabriel(Posted 2006) [#38]
Ouch, Shader Model 3.0 minimum is high. I thought I was going out on a limb with a game that is Shader Model 2.0 minimum (and won't be done for another couple of years), but no way would I go SM3.0 as a minimum. I don't know how GLSL works at *all*, but the great thing about HLSL is the ability to have multiple versions of the same shader, so you can (and my game will) use SM3.0 to have nice features "for free" (as FlameDuck put it) and then fall back to SM2.0 to have nice features "at a cost" (as I'm putting it). Even without conditional branching, you can still get some nice effects at a reasonably low cost with SM2.0.


Beaker(Posted 2006) [#39]
Mark used two important words:
eventually as in "A Direct3D driver will happen eventually"
and initially as in "Initially, it will require a 'latest' generation card"

I'm sure as high-end technology becomes more commonplace and Max3D evolves in terms of what low-end tech it covers, everyone will be happy. A lot of time will pass before then no doubt.

If you want to target low-end systems for all eternity, you can still use Blitz3D, MiniB3D (GL only), Ogre in BMax, Irrlicht in BMax, etc.


Sledge(Posted 2006) [#40]

BRL philosophy has always been to provide state-of-the-art technology.


Ha ha HAAAAAAA HAAAAA!! Seriously, which BRL are you talking about? Because it's not the one that runs this site. "State-of-the-art" is the last thing I'd associate with BRL... "tried, tested and proven", more like - which is vastly preferable anyway.


Mac drivers for integrated (Intel) chipsets don't suck.


Oh, so the niche low-end machines will be fine. Phew, I can totally stop worrying now.


Five years ago, when Blitz3D was released, it too required a comparatively high-end graphics card.


Dunno about that - I had a cheap setup that seemed to cope just fine back then. You might as well say that 3D work required a 3D card... that tells you nothing about BRL that wouldn't be attributable to a host of other companies. TGC have a much more valid claim to the whole state-of-the-art thing over that period, and are much the worse for it.

Still, fingers crossed!


Gabriel(Posted 2006) [#41]
Beaker: Yes, he did. Unfortunately, a few of us would have much preferred it if he'd used those words in the opposite sentences ;)


anawiki(Posted 2006) [#42]
I would expect an engine that is able to use the power of the latest shaders, but is also able to live without them if the graphics card doesn't have them (less detail, but your game still works).


Hotcakes(Posted 2006) [#43]
Oh, for snail's sake, people! Mark has always said Max3D will be next gen... where's the surprise in this?!? Hell, the early tech demo only worked on a GeForce 5600+ or something... this is only one small step up. And I also haven't seen Mark say -anywhere- that it needs SM3... only Flamey... last I checked, my 6600 doesn't do SM3 anyway, and that's part of Mark's listed requirements... so go figure.

the great thing about HLSL is the ability to have multiple versions of the same shader,

QFT.


SillyPutty(Posted 2006) [#44]
People just love to complain. Period.


bradford6(Posted 2006) [#45]
I see a few flaws with this initial post. Here's the download list for Alethiometer:

Windows DirectX (~590K download)
Windows OpenGL (~580K download)
PC Linux (~580K download)
Mac PowerPC OSX (~710K download)

1. DirectX is first in the list, which naturally marks it as the 'preferred' one.

2. The DirectX file is larger. (Bigger is better?)

3. Many people's machines are capable of running OpenGL, but they may not be aware of it.


IPete2(Posted 2006) [#46]
I completely agree with Bradford6.

WendellM, I bet if you removed both the OpenGL and DirectX versions and just put the OpenGL one up (unannounced), but listed a minimum spec alongside it, it would still get downloaded.

Most non-games-players (the majority of PC owners) don't know about OpenGL and DirectX anyway - and why should they?

Also - of all the completed downloads, we don't know if any didn't get it to work, and why (do we?).


IPete2.


Dreamora(Posted 2006) [#47]


3. Many people's machines are capable of running OpenGL, but they may not be aware of it.


Unless they have Intel, SiS or S3 chipsets. Then they most likely know that their system isn't capable of anything useful. (Intel especially keeps breaking its drivers again and again.)


Tom Darby(Posted 2006) [#48]
Mark, I cannot help but notice that you don't even mention whether or not Max3D will wipe for me after I'm done.

Even though you've never so much as hinted at this functionality, I'll be sorely disappointed if it isn't in the final product.

Yours sincerely,
A Very Large Rock


Smurftra(Posted 2006) [#49]
Question: are most of you making games that will rival Doom 3, Half-Life, Prey and the like?

Personally, I don't have the time/budget, so I aim at smaller games: less graphical quality, a lower sale price, but a bigger audience. And that means low-end machines.

I'm not complaining, as I think it's great that Max3D will be next-gen, but I just can't find a logical reason to use it in a commercial endeavour, as it narrows my target audience by a huge percentage. Most casual gamers have computers that are 3-5 years old.


xMicky(Posted 2006) [#50]
I totally agree with Smurftra. For me it would be enough to have a language supporting the features MiniB3D aims for, but fully compatible with the other elements of BlitzMax, especially the 2D drawing functions.


H&K(Posted 2006) [#51]
Maybe I'm missing something here, but I assumed that Max3D was going to have a staggered framework entry,
so that if you didn't invoke shaders at all, you didn't need a chipset with shaders. When you install Max3D, will Max2D suddenly demand greater chip requirements? I don't think so. Hence only the lower DX and GL versions are needed, so I think Max3D will stagger your driver needs.


tonyg(Posted 2006) [#52]
I'm waiting for the first '2D in Max3D' user module - unless, of course, Max3D comes with a 2D 'engine' of its own.


H&K(Posted 2006) [#53]
As in a replacement for Max2D? Or an exact copy of Max2D?


Dreamora(Posted 2006) [#54]
As Max3D is OpenGL, it has no problem interfacing with Max2D on the OpenGL driver.
The main difference is that it uses pub.glew, which Max2D so far does not (hence image buffers are missing).


tonyg(Posted 2006) [#55]
From the worklog

A Direct3D driver will happen eventually, but probably not until the whole Vista release drama is over and we know where things stand re: D3D9 vs D3D10 etc


Now, either Max3D will come with a built-in 2D-in-3D system or it won't.
If it doesn't, then the capability is there for somebody to do a 'SpriteCandy/NSprite' on it and create a DX9/DX10 2D engine. I know somebody could do that now, but they'd have to create a DX9 driver first, which is a bit (a lot?) more work.
I'm assuming the same is true for the OGL 3D driver.
Does that make sense?
I have no real interest in the 3D side myself, so I'm hoping Mark won't be dropping the 2D stuff.


Dreamora(Posted 2006) [#56]
Huh?
BMax has no pure 2D stuff at all. Max2D is already fully 3D, so using it from within Max3D shouldn't be a problem - at least unless a DX driver comes along at some point.

The only thing that needs to be done is a global variable that says whether it is in 3D mode or 2D; when it switches, it changes the projection mode and a few other settings. That's it.
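
A minimal BlitzMax sketch of such a switch using raw GL calls (pub.opengl is available by default when no Framework is set; the Begin3D/Begin2D names are illustrative, not Max3D API):

SetGraphicsDriver GLMax2DDriver()    ' make sure we get an OpenGL context
Graphics 800, 600

Function Begin3D(fovY:Double, aspect:Double, nearZ:Double = 0.1, farZ:Double = 1000.0)
    Local top:Double = nearZ * Tan(fovY / 2)    ' BlitzMax Tan() works in degrees
    Local rgt:Double = top * aspect
    glMatrixMode(GL_PROJECTION)
    glLoadIdentity()
    glFrustum(-rgt, rgt, -top, top, nearZ, farZ)    ' perspective projection
    glMatrixMode(GL_MODELVIEW)
    glLoadIdentity()
    glEnable(GL_DEPTH_TEST)
End Function

Function Begin2D(width:Int, height:Int)
    glMatrixMode(GL_PROJECTION)
    glLoadIdentity()
    glOrtho(0, width, height, 0, -1, 1)    ' pixel coordinates, origin top-left
    glMatrixMode(GL_MODELVIEW)
    glLoadIdentity()
    glDisable(GL_DEPTH_TEST)    ' the 2D overlay ignores the depth buffer
End Function

Begin3D(60.0, 800.0 / 600.0)    ' ... draw the 3D scene here ...
Begin2D(800, 600)    ' ... then draw the 2D overlay ...
Flip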


tonyg(Posted 2006) [#57]
Blimey, it's like pulling teeth with you sometimes.
I am fully aware that Max2D is using 3D.
If Max3D comes with a DX9 driver, will it come with its own 2D engine using DX9 capabilities? If not, as you correctly point out, the existing Max2D DX7 driver can be used.
Alternatively, it would be possible to take that DX9 driver and extend it to perform 2D functionality a la Max2D.


xMicky(Posted 2006) [#58]
When I read the postings in the MiniB3D threads, where many people failed in trying to make it work together with the 2D drawing functions, I really doubt that everything is as easy as Dreamora says above.


taumel(Posted 2006) [#59]
SM3 as a minimum would be kind of a joke for the casual market (is there any big game out there at the moment which dares to be SM3-only?!) if the intention is to release the engine by the end of this year, or the beginning of next, in a state that would be somehow useful for production.

This would leave out most of the notebook market. It would leave out most of the ATI market, as SM3 is only supported from the X1900, X1600 and X1300 cards onwards; no X800 or similar cards support SM3. nVidia users are in a more comfortable situation, but only if you take the feature list at face value and don't check whether it's implemented at a reasonable speed.

For the Apple market this would mean only the iMac, MacBook Pro and Mac Pro of the latest generation. So much for the cross-platform usability of such an engine.

Any sensible reason why this could be the strategy? One reason I can think of is that BRL doesn't want to invest the time to get its hands dirty writing shader fallbacks/workarounds for older systems, which is pretty time-consuming but also very important.


Picklesworth(Posted 2006) [#60]
Yep, advanced stuff is good, but a really good engine needs fallbacks - and lots of them (Source, for example).
Shader Model 3 will probably make it impossible for me to support this engine from the start, which is a shame.
Many other people who frequent these forums are in exactly the same position.

I hope that Mark reconsiders!


Hotcakes(Posted 2006) [#61]
Requoting myself, because not enough people pay attention to me ;]

Oh, for snail's sake, people! Mark has always said Max3D will be next gen... where's the surprise in this?!? Hell, the early tech demo only worked on a GeForce 5600+ or something... this is only one small step up. And I also haven't seen Mark say -anywhere- that it needs SM3... only Flamey... last I checked, my 6600 doesn't do SM3 anyway, and that's part of Mark's listed requirements... so go figure.


If Max3D comes with a DX9 driver, will it come with its own 2D engine using DX9 capabilities?

If you look in one of the Max2D mod dirs, you'll see a DX9 driver lying alongside the currently default DX7 driver. I don't think it works yet, but I'm guessing you can be pretty sure it will be finished by the time Max3D is out.


Picklesworth(Posted 2006) [#62]
Thanks for clearing that up, Toby :)

I was too lazy to check my sources; just relied on the knowledge of the crowd.


WendellM(Posted 2006) [#63]
WendellM, I bet if you removed both the OpenGL and DirectX versions and just put the OpenGL one up (unannounced), but listed a minimum spec alongside it, it would still get downloaded.

An interesting suggestion. Not one that I care to try, but it could be useful to see the results if anyone tried something like it: offering only a GL version (without specifying it) for a month or so, then switching to a DX version (again silently), then back to GL, or some such.

Also - of all the completed downloads, we don't know if any didn't get it to work, and why (do we?).

Correct. As it happened, I had one problem reported during pre-release from a very casual gamer with an outdated machine that the DX version didn't run on. I sent him a GL compile and it ran. That's the main reason that I went ahead and offered a GL version for Windows.

I was surprised at how rarely it was chosen compared to DX. And, from a few casual checks, at least some of the Windows OpenGL downloads happened alongside DX downloads at the same time (presumably to see which ran better on the user's system?). Again, all of this is just a fairly small test, under less-than-statistically-pristine conditions, but I haven't seen any other figures (in this thread or elsewhere), so it's all I have to go on.

Also, thanks to everyone for the helpful replies. It looks like the situation is indeed murky for the next several months to a year in this whole area (not just Max3D, but Vista/DX10 and the future of OpenGL). I'm just planning to go with some other, currently existing solution for now (since I don't want to remain 3D-less in BlitzMax during that time), see how Max3D is doing in something like a year, and perhaps switch over to it then. This may turn out not to be the best approach, but I don't know that there's much certainty in any approach right now. Perhaps we can all share reports of how different methods work out in the upcoming year or so, which might reveal the best approach(es)?


Genexi2(Posted 2006) [#64]
A couple of things are also becoming clearer about the initial release: It'll require a kick-ass graphics card and a kick-ass GL driver.

I'm just a little curious here, but is there any advantage to working "high" on reqs first, then working your way down to the lesser kick-ass cards over time, instead of the reverse?


Dreamora(Posted 2006) [#65]
It is easier to add fallbacks when something fails than it is to add "kick-ups" when a more powerful solution to the same problem becomes available, because by then the structure is most likely already centered around the weak way of doing it (which might not be the most performant on the "real" system).
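
A hedged BlitzMax sketch of that fallback pattern (the extension check is standard OpenGL; the two branches are placeholders for real render paths): probe the driver once at startup, then run the strongest path it reports.

SetGraphicsDriver GLMax2DDriver()
Graphics 640, 480    ' gl* calls need an active GL context

' glGetString returns a C string listing everything the driver supports.
Local ext:String = String.FromCString(glGetString(GL_EXTENSIONS))

If ext.Find("GL_ARB_shading_language_100") >= 0
    Print "GLSL available - taking the shader path"
Else
    Print "no GLSL - falling back to fixed-function rendering"
End If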


Hotcakes(Posted 2006) [#66]
That's definitely the impression I got from Mark's posts.


Will(Posted 2006) [#67]
A) SM3 will definitely be standard before 3-4 years have passed. How many people are still using Blitz3D? Maybe BRL, in some fit of craziness, are writing code they mean to be useful for more than 2 years!?!?! MY GOD!?

B) If it's a strong next-gen engine, BRL might be planning to offer different licensing options that would let them make a little more money off larger projects, of the type a lot of us on this forum don't do. If this is the business plan, it doesn't matter that SM3 isn't widely available THE DAY IT COMES OUT. It's likely that most large game projects starting now won't come out until most gamers in the market for big-project games have cards powerful enough for Max3D. It's not just a casual-games engine - in fact, it doesn't sound great for casual games. He never promised it would be.

C) Beaker:
"Mark used two important words:
eventually as in 'A Direct3D driver will happen eventually'
and initially as in 'Initially, it will require a 'latest' generation card'"
Some people seem to have missed this. It sure sounds like it's going to support Direct3D, and lower-end cards - just not at release. Here are BRL's options: (1) wait another year while they write all the fallbacks, while everyone leaves BRL for Unity/Irrlicht etc., and then release Max3D; or (2) release Max3D now, so we can start learning it, get good at it, and produce some nice stuff, and then release the fallbacks later, when we're about ready to release decent games with it anyway. They would have to be INSANE to pick (1).

I think a few people are panicking for the sake of panic. It's fun, but it's best done at home.

"Initially"

"Initially"

What?

"Initially"

Here's what to do, for those people who are freaking out and dying left and right about this (and props to those who aren't):

1) get a card that will let you use Max3D.
2) develop good games with it, even if they won't run with the current version of Max3D.
3) when the fallbacks are integrated into Max3D, hit the market with your nice, polished and quality games.
4) ???
5) Profit!


taumel(Posted 2006) [#68]
@Will

You make it sound like it will be better than any other current 3D engine.

Will it really be such a strong next-gen engine, and what does that mean exactly (will it only need next-gen hardware, or will it also provide next-gen concepts and results)?
Are stencil shadows still the only shadow solution?
Any deals to get something besides ODE?
Will there be a 3D editor?
...

I think it will be a decent product, but I don't expect it to be the cream of the crop.

And it won't be of use to some people at launch if it's SM3-only, because they don't have such cards (and it's not only a matter of buying a new graphics card if you're a Mac user, for instance), and because the installed user base means you can't ship with it. By the way, is SM3 needed for all functions, or only for certain shaders?

Well, it doesn't take everyone a year to do a simple 3D game or get a quick client job done. Some are already familiar with 3D concepts and only need to adapt to the engine. Sure, that takes learning time, but not a year - and then you'd be stuck waiting.

I don't say that this isn't the right strategy. Actually, it is, if you can't get everything done by the release date - but it would have been so much nicer if it were usable without such limitations. Ohh, when I just remember all the headaches of using early BlitzMax versions... Anyway, the best news in all this is that they will work on fallbacks at all.

As for your argument that people will switch if it isn't released yet: this might be right for some people, but wrong for others, because they want a complete solution from BRL, or other products are probably more expensive, or they are familiar with BRL. And you won't stop those who need a solution right now from switching by coming up with an SM3-only solution now. But this might not be such an issue either, since some people also own more than one product.

As for your advice: it will also take some further polishing on your side to adapt and test your media once fallbacks are added; otherwise you might be surprised that there isn't a profit at all. ;O)


Relax,

taumel


FlameDuck(Posted 2006) [#69]
last I checked, my 6600 doesn't do SM3 anyway, and that's part of Mark's listed requirements... so go figure.
Check again.

Personally, I don't have the time/budget, so I aim at smaller games: less graphical quality, a lower sale price, but a bigger audience. And that means low-end machines.
No, it doesn't. Just because you have lower hardware requirements doesn't mean you automatically get a larger audience. That's just bad math.

I'm not complaining, as I think it's great that Max3D will be next-gen, but I just can't find a logical reason to use it in a commercial endeavour, as it narrows my target audience by a huge percentage.
Why is that bad? The narrower your initial target audience, the easier it is to achieve early market dominance.

Most casual gamers have computers that are 3-5 years old.
No offense, but people who have computers that old are not likely to even know about your game, let alone buy it.

You make it sound like it will be better than any other current 3D engine.
You mean kinda how Blitz3D was, back when it was released?

I also don't get why it should take a year to do a simple 3D game or even get a quick client job done.
You mean a game like, say, Sphere Racer? Or Aerial Antics? How long did those take, do you reckon?


taumel(Posted 2006) [#70]
No offense, but people who have computers that old are not likely to even know about your game, let alone buy it.

I sometimes really wonder whether you say the stuff you say because you know what you're talking about, or just because you think it should be the way you think it is, or because you read it somewhere else.

Leaving aside what owners of older systems might or might not know: you do realize that a large percentage of the graphics chips sold today aren't capable of SM3 but are still powerful enough to run a decent game (notebooks, integrated chipsets, Mac minis, MacBooks), and aren't only used for datasheets.

You mean kinda how Blitz3D was, back when it was released?

Puhh, hard to remember what was available at the time Blitz3D (2001?) was released. I used another tool, which was available a) on Windows and Mac and b) able to publish off- and online. Anyway, things look a bit different today, as there are many more products around, and some of them are quite good - not to mention the unaffordable professional ones. Furthermore, some of them offer features which fit quite well into the wide gap between SM3 and DX7.

You mean a game like, say, Sphere Racer? Or Aerial Antics? How long did those take, do you reckon?

I've no idea what kind of games you're talking about, but I suspect you want to say that those games took some time to finish. But then I don't quite understand your point, as obviously there is a wide range of different kinds of games, and some need more time and some less. I suspect most of the games around here are done in under a year. At least mine are... ;O)


Will(Posted 2006) [#71]
I'm flattered that you've picked up on my "year" timeline for fallback support; however, I think we should bear in mind that it might be much sooner, but is unlikely to be much later. I increased the estimate from the 4 months I would actually expect to a year, to be safely clear of people missing the actual point.

In response to Taumel's first post:

You make it sound like it will be better than any other current 3D engine.

Will it really be such a strong next-gen engine, and what does that mean exactly (will it only need next-gen hardware, or will it also provide next-gen concepts and results)?


I doubt it will be the most capable 3D game engine in existence. I very much doubt that. I'd lay my money on the Unreal 3 engine - so why don't you pick that up instead? Oh, the $100,000.00 price tag makes that a little intimidating? I think BRL has a great shot at being the best engine that is affordable by all.

Are stencil shadows still the only shadow solution?
Any deals to get something besides ODE?
Will there be a 3D editor?


I'd love to know, but I'm not the authority. I would guess: No, No, No.

As for your argument that people will switch if it isn't released yet: this might be right for some people, but wrong for others, because they want a complete solution from BRL, or other products are probably more expensive, or they are familiar with BRL.


I'm sorry if I implied that every user would switch. I agree with you that it is "right for some people" and it's not in BRL's interest to lose them.

And you won't stop those who need a solution right now from switching by coming up with an SM3-only solution now. But this might not be such an issue either, since some people also own more than one product.


Fortunately very few people need an SM3 solution right now - the point is BRL is planning ahead. Regardless, anyone who needs any kind of 3D engine might switch if nothing comes out.


taumel(Posted 2006) [#72]
Fortunately very few people need an SM3 solution right now - the point is BRL is planning ahead. Regardless, anyone who needs any kind of 3D engine might switch if nothing comes out.

I would be very interested to know what the reasons/benefits of choosing SM3 as a minimum for Max3D are, besides time constraints, given the facts we've already talked about.

Maybe Mark could shed some *pixel precise* light on this?!

By the way, ATI today presented a new SM2 chipset: http://www.ati.com/products/radeonxpress1250mob/index.html


marksibly(Posted 2006) [#73]
SM3? Wot's that then?!?


taumel(Posted 2006) [#74]
The short form we've been using here for pixel and vertex shaders, version 3?


Smurftra(Posted 2006) [#75]
FlameDuck, I get the impression you are talking out of your hat, so I will not bother replying other than to say: I think you are very wrong in what you believe to be true.


Jim Teeuwen(Posted 2006) [#76]
Interesting read (the worklog, that is).
Personally, I'm all for OpenGL. It's easy, and it tends to run a lot smoother on my XP/GF5700 machine than any DX stuff.

It's an understandable choice too, as BlitzMax is entirely focused on cross-platform delivery. I'm guessing someone will eventually whip up a DX wrapper for the whole thing, but I doubt it's going to be needed: OpenGL is supposed to be getting a pretty nice performance boost in Vista. So there's really no need to limit oneself to just one platform.


marksibly(Posted 2006) [#77]

The short form we've been using here for pixel and vertex shaders, version 3?


Yeah, I know - I just love the 'community logic' involved:

Mark: 'It'll require a 6200'
UserX: 'It'll require SM3'
UserY: '6200 cards don't do SM3'
UserZ: 'Therefore, my 6200 isn't supported'.

Frankly, I don't have a clue about SM1/SM2/SM3 - I was just guesstimating the level of card you'll need to get a decent shader instruction count, texture accesses and, of course, performance.

The bottom line really is this'll all be quite experimental, and if you wanna be along for the ride get yourself a grunty card/driver.


Amon(Posted 2006) [#78]
I currently have the most powerful GFX card in the community, Mark. I will therefore accept your inviting me to join the beta team. :)


taumel(Posted 2006) [#79]
Yeah, I know - I just love the 'community logic' involved:

Come on... I'm sure you didn't know until now! ;O)

I have to admit it was based a bit on a bubble, but when I quickly browsed through the thread I read you saying 'latest generation', and then stumbled over FlameDuck's SM3 comment => hence the assumption for what followed... sorry if I interpreted it wrongly. But at least this clarified things?

So this would mean that all those 9600/9800/etc. owners out there are also in the game, depending mainly on the complexity of the scene?


H&K(Posted 2006) [#80]
He's saying he doesn't know yet.


Hotcakes(Posted 2006) [#81]
Check again.

Well bugger me dead and call me Dead Toby. Thanks Alive Flamey.


Dreamora(Posted 2006) [#82]
The 6200 is as much of a shader card as Intel's chips are.
They support it, but no one who knows the least thing about 3D and shaders would use them for shader applications, due to the extremely low number of shader units. (At least the 6200/X1300 have shader units... something Intel still doesn't see as useful.)


taumel(Posted 2006) [#83]
The X1300XT could be a nice little card.


FlameDuck(Posted 2006) [#84]
I sometimes really wonder whether you say the stuff you say because you know what you're talking about, or just because you think it should be the way you think it is, or because you read it somewhere else.
I'm talking about the Technology Adoption Lifecycle. Someone who has 5-year-old hardware is, at best, in the late majority on the TAL. Someone in the late majority will only buy products that:
a) Are market leaders, or come from companies who are market leaders.
b) Solve all their pains, instantly.
c) Are recommended to them by one or more others who are also in the late majority.
d) Have matured and are offered at a discount.

So while yes, the late majority is a large market share (roughly 1/3), they are more demanding and less forgiving than others - and probably not a suitable point of attack for a new product. This is why you can now buy POP:SOT for a fiver.

Leaving aside what owners of older systems might or might not know: you do realize that a large percentage of the graphics chips sold today aren't capable of SM3 but are still powerful enough to run a decent game
Like which? The non-SM3 'integrated' chipsets I have (a GeForce2 Go and whatever's part of the Intel i915G chipset) can't even play Neverwinter Nights at a respectable framerate, never mind a modern game. In fact, the only non-SM3 chipset I have that can even remotely successfully play anything released this century is the X600XT.

Puhh, hard to remember what was available at the time Blitz3D (2001?) was released.
Pretty much DarkBasic, Ogre and Quake 2/3, as far as I remember.

I suspect most of the games around here are done in under a year.
True. But I suspect some of the more commercially successful ones (no offense) take longer.

stumbled over FlameDuck's SM3 comment => hence the assumption
I wasn't aware that good GLSL compilers for pre-SM3 hardware existed.


Koriolis(Posted 2006) [#85]
(I haven't the time right now to review the countless posts, so apologies if the subject was raised already.)

If I understand correctly, GLSL is going to be the shading language Max3D relies on. What is planned concerning the DX driver? Will we have to explicitly manage two versions of each shader (HLSL + GLSL) if we want to let the user choose between DX and OpenGL? (I'm not talking about the inner workings, of course, but about custom shaders.)

I ask because, even though it seemed a bit unnatural, the former plan of exposing shaders via abstract trees at least had the advantage of enabling complete and transparent compatibility with GLSL, HLSL or whatever. Now that it's dropped - as I understand it - that intrinsic compatibility is dropped with it, right?


bradford6(Posted 2006) [#86]
My card is SM4 with Dual Pipes and Sweet Rims. Bling-Bling!

I'll happily be one of the beta testers.

Seriously,
I have been doing a lot of high-end modelling lately and would be happy to donate my work to the community for testing. Just let me know what you need (format, tri count, bones/no bones, animation, texture size, number of UV coordinates, etc.).


taumel(Posted 2006) [#87]
@FlameDuck

Not much time right now but...

a) As for your first point, I don't have statistics around which could prove this or that. I just look at the people I know, and then I have to say that not all of them are as you describe. But I don't know how large the percentage is...

b) Like the new ATI chipset, and like all the other chipsets which concentrate on providing better SM2 performance instead of producing more heat with SM3, which isn't needed that much. The performance of integrated chipsets traditionally isn't a killer, but you could make a huge number of games which look really good on them. I can even do something nice with the crappy 9200 here in my Mini.

c) As I hinted, I was with Director...

d) No offense taken, as this simply depends on the game. And don't forget that you can do several smaller games (traditional client work) in the timeframe a longer project needs, so it accumulates. Valve is going in this direction with the big games -> episodes...

I'm out...


Hotcakes(Posted 2006) [#88]
Now that it's dropped - as I understand it

I'm not sure about that. Mark says .fx files still can't be used directly so it seems to me like there is some intermediate processing going on, which means seamless HL/GLSL compatibility might remain an option down the track.


Koriolis(Posted 2006) [#89]
Sure, Toby, but the question is: will this "intermediate processing" only take the form of an internal (and inaccessible to the user) conversion from brush objects to GLSL/HLSL/whatever, or will we be able to create unlimited *custom* shaders of some form (then transparently converted to GLSL/HLSL/whatever)?
There's a pretty big difference here.

Or maybe Mark expects the integrated brushes to be varied and combinable enough to satisfy everyone, with (next to) no need for custom (fine-grained) shaders?


Dreamora(Posted 2006) [#90]
Perhaps Mark just learned from TGC, which advertised their great FX support long before it was in, and then needed over two years to get it to a "working level" where it was considered acceptable and not just usable on a few chipsets.

The NV and ATI shader-creation apps give export options other than FX, so there are other possibilities.


Hotcakes(Posted 2006) [#91]
Fair question, K... After having another read...

I'm still sticking with the idea of morph/light/brush shaders, but have ditched the intermediate code approach in favour of high level code.
...
Will we have to explicitly manage two versions of each shader (HLSL + GLSL) if we want to let the user choose between DX and OpenGL


That's definitely what it sounds like to me.

I suppose there might still be the opportunity to plug in our own intermediate code parser somewhere... assuming Mark hasn't ripped out the work (or interface) he put into that just yet.


FlameDuck(Posted 2006) [#92]
I don't get it. Why wouldn't you prefer working in a high-level shader language?


Koriolis(Posted 2006) [#93]
Simple. GLSL is for GL, HLSL is for DX, and they're not made to mix. Letting the end user (of your application) choose between DX and GL means having to write two versions of every shader.
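
In practice that looks something like this BlitzMax sketch (the useGL flag and the file names are hypothetical; it assumes custom shaders really do have to be authored once per API):

' The same effect ships as two hand-written sources, one per API.
Local useGL:Int = True    ' in a real app: a user setting or platform default

Local shaderSource:String
If useGL
    shaderSource = LoadText("lighting.glsl")    ' GLSL source for the GL driver
Else
    shaderSource = LoadText("lighting.hlsl")    ' HLSL source for the DX driver
End If
' ... hand shaderSource to whichever driver is active ...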


FlameDuck(Posted 2006) [#94]
I know which API they're part of, but since they're both high-level languages, is there any reason (other than because Microsoft hates you) why the shader instructions output by the nVidia/AMD GLSL compiler could not be used in DirectX?


Koriolis(Posted 2006) [#95]
Are you saying that it is now possible to pass a GLSL precompiled shader to DX? Or, even weirder, a plain-source GLSL shader?
Then I've been away from the 3D world for much too long. Care to point me to some web resources on the subject?

By the way, I don't hate Microsoft. But too many of their products seem to hate me.


AaronK(Posted 2006) [#96]
Been away for a long time. Good to see that the 3D module is coming along and that we're looking at a release later this year.

Good stuff Mark.


FlameDuck(Posted 2006) [#97]
Are you saying that it is now possible to pass a GLSL precompiled shader to DX?
No, I'm asking. More specifically, I'm asking why it wouldn't be possible.


Koriolis(Posted 2006) [#98]
The question isn't whether it's possible. But as far as I know, this would require writing a GLSL <-> HLSL converter, as this is not a feature that comes out of the box with card drivers. I'd be very surprised if Mark did this, but if he plans to do so, that precisely answers my question.
But I'm pretty sure I won't get any answer until Max3D is released. In any case, it seems the lib is coming along nicely; I just hope it will be open enough to let us plug into and play with the engine as we see fit.


Dreamora(Posted 2006) [#99]
And better tested, or earlier in an "EA-price beta", than BMax was. It shouldn't happen again that an unfinished beta lib is sold for full price...


FlameDuck(Posted 2006) [#100]
But as far as I know, this would require writing a GLSL <-> HLSL converter
Why? It seems like a pointless task to write a program to convert from one high-level language to another. Now, if we were talking about ATI/nVidia low-level code, I could see the point of having a converter, but surely the low-level shader instructions (whether they're compiled from GLSL or HLSL) sent to the card are not dependent on the API being used?

And better tested, or earlier in an "EA-price beta", than BMax was. It shouldn't happen again that an unfinished beta lib is sold for full price...
Speak for yourself. I couldn't care less how stable it is, as long as it's available soon. Stability can come later.


Dreamora(Posted 2006) [#101]
Sadly, the compiled code from the high-level compilers is API-dependent. Otherwise it wouldn't make sense to have two different languages if they led to the same result.

I prefer earlier as well, but not at the cost of the finished and really working version it will be 12 months later. We already played test bunny with BMax, and it took nearly 12 months until BMax was anywhere near stable enough to use for more than "fun".
And considering that Max3D will cost as much as Torque or similar (at least I'm assuming it will cost another B3D+), I'm not willing to pay 100% of that for being an EA bughunter, and I think it's fair to give 10-20% off to EA users, isn't it? (And it won't be bug-free before it's released to the broad audience. Ten testers don't have 10,000 machines to test on, it's that simple. This wrong assumption already trashed Max2D and MaxGUI.)


H&K(Posted 2006) [#102]
@Flame,

I don't follow most of it, but are you saying that the main reason there are two high-level compilers is just that there are two high-level languages? Basically, that someone chose to make two, and to make them different, yet there is no fundamental reason for this?

@Dream
"Sadly, the compiled code from the high-level compilers is API-dependent. Otherwise it wouldn't make sense to have two different languages if they led to the same result."

The compiled code does not need to be compiled from two different languages, though, does it? One high-level language should be able to produce compiled code for two different platforms - like, for example, errmm, BMax?


Dreamora(Posted 2006) [#103]
Theoretically, yes.
Practically: only if all target systems (whether software or hardware) are identical. Otherwise it is very hard to make exactly the same code work the same on all target systems without emulation (Java).
So this would mean one high-level language can only work if the underlying shader pipes etc. work identically or very similarly (otherwise the compiler won't be able to optimize, at least not in the near future - and in that case it would be useless, as shaders are meant for realtime environments, where speed and optimization are very important).

But theoretically: why do we need two APIs? We can only have one GPU rendering the screen at a time (most people, at least), so why two different APIs at all?
Right, because there are different opinions about how stuff should be done... not the best way to get highly optimized drivers etc., but it is the way it is.


FlameDuck(Posted 2006) [#104]
Otherwise it wouldn't make sense to have two different languages if they led to the same result.
You mean like .NET? There are plenty of reasons to do that - and incidentally, there are currently five widely used high-level shader languages (RSL, Gelato, GLSL, Cg and HLSL).

Basically, that someone chose to make two, and to make them different, yet there is no fundamental reason for this?
While it might be a huge simplification (there could be other reasons, like licensing issues), yes, that is pretty much what I'm thinking.

So this would mean one high-level language can only work if the underlying shader pipes etc. work identically or very similarly (otherwise the compiler won't be able to optimize, at least not in the near future - and in that case it would be useless, as shaders are meant for realtime environments, where speed and optimization are very important).
GLSL compilers are provided by hardware manufacturers. I think you have it backwards.


Koriolis(Posted 2006) [#105]
Why? It seems like a pointless task to write a program to convert from one high-level language to another. Now, if we were talking about ATI/nVidia low-level code, I could see the point of having a converter, but surely the low-level shader instructions (whether they're compiled from GLSL or HLSL) sent to the card are not dependent on the API being used?
You're still not answering my original question at all.

The issue at heart is whether you let "the end user (of your application) choose between DX or GL". In that case, you must ensure your shaders work with both APIs. Either
(1) each shader is stored as high-level source code, or precompiled to a low-level form, and then you're tied to the API; or
(2) it's stored in some card-dependent form, and then you're tied to the card.

If there's a simple answer to this issue, that's great - I'm just not aware of it.

Well, OK, one simple solution is just to NOT let the user choose; it worked great for id Software's games.


Will(Posted 2006) [#106]
I think the percentage of users who know enough to care would be very small.


Dreamora(Posted 2006) [#107]
Yes. Or who care enough about shaders at all. (I mean, besides eye-candy freaks who only want the coolest-looking game with the same lame content as id.)


LarsG(Posted 2006) [#108]
From what I can understand, the different "languages" like GLSL and HLSL (or whatever they are called)
do in fact do the same thing on all the graphics cards which support the API...

Why on earth they couldn't settle on a common open format beats me...


Panno(Posted 2006) [#109]
A cool graphics card for a PC is no problem, but what about the new Macs?


Dreamora(Posted 2006) [#110]
Because OpenGL always bitches about standards, even though they will follow the DX way sooner or later if they don't want to be kicked out of "reality".
(They normally start with great ideas, but in their own "special" way, and after DX mainstreams the stuff, they add a similar thing.)


Koriolis(Posted 2006) [#111]
Well, actually it has often been the other way around. Take transform & lighting: it has always been an integral part of OpenGL, while in the DX world it was praised as the grand new thing when it was finally introduced. Old GL programs could benefit from T&L acceleration, while old DX programs couldn't.

The "reality" OpenGL now faces is called market law.

Honestly, I couldn't care less right now, as I'm probably not going to write my own 3D engines anymore. Or, more precisely, I wouldn't care if there weren't the issue of high-level shading languages being tied to specific APIs.


Takuan(Posted 2006) [#112]
If you read that worklog backwards, you get this:

"Dear community, please remind me to buy Skidracer a Python book. He could then start to support Blender with an exporter for that cool new model format.
If you don't remind me, you will have to wait until a user-made 'finished soon' exporter is ready.
Or you'll have to model with Milk, which would result in a large community shading cubes.
Cheerio..."

No offense :D


FlameDuck(Posted 2006) [#113]
A cool graphics card for a PC is no problem, but what about the new Macs?
What about them?


Dreamora(Posted 2006) [#114]
Intel Macs have some useful graphics cards (X1600) - unless they are regular MacBooks, which were never meant for gaming.

Users of G4-based Macs and iBooks are out, but hey, who minds? :)


deps(Posted 2006) [#115]
*Raises a hand*

I have a PPC Mac mini, and not enough cash to buy a new Mac. :P

But I also have a PC, so with some luck Max3D will support my graphics card.