Why Not to Buy an Nvidia 8xxx Card for DX10

Here’s a nice movie showing off some of the DX10 effects that are enabled on true DX10 hardware, from the guys doing Hellgate: London.

I’m going to assume they’re showing this on the best rig they can buy (which would be dual 8800GTXs). If that’s not a good assumption, then the rest of this post falls apart.

Now, assuming that’s true, these DX10 effects are absolutely thrashing the framerate. Look especially at when the lights show up. I think DX10 is eventually going to be great for what it enables, but assuming this is running on reasonably decent hardware with a reasonably decent feature build (figure 75% or so of final speed), these advanced shader techniques look like they can bring current hardware to its knees. (In much the same way that Epic showed off Unreal Engine 3 stuff a year or more earlier: they could demo it on the hardware of the day, but you couldn’t actually run things on that hardware.) If that’s the case, I doubt the 8xxx cards currently out are useful as “real” DX10 cards. (Of course, this is probably moot, since we likely have quite a ways to go before many games utilize “real” DX10 features anyway.)

Some nice stuff that DX10 allows though.

So you’re making this judgment on pre-beta software, running on what was probably a beta OS with a beta DX10 runtime and early drivers?

Some of those effects being shown were damn impressive, though in subtle ways.

And on the knowledge of how much actual computation goes into fluid dynamics simulations, even the non-hardcore, “just looks pretty” kind: yes.

Seriously, if I notice slowdown, it means the framerate is hitting 10 FPS or so. So let’s assume, due to the beta OS and beta DX10 (though how “beta” the OS is at a recent event, when Vista was released today, is a bit of a semantic argument), that the final product will be 2x as fast. That’s still looking at, what, 20-30 FPS? Acceptable, but pretty much driving the hardware to its knees.

My point wasn’t really to pooh-pooh the 8800 series or anything, though. It was to say “Damn, look at some of these effects. And DAMN, look at how much power it’s going to take to run them.”

I’ll bet you one of your review 8800s against my piddling 6600GT AGP, though, that even when the game comes out, trying to enable all those effects (assuming they’re still in the game) brings an SLI 8800 rig to its knees. Especially when there’s something other than just scenery and one player on screen.

I’ve seen Hellgate running many times over the course of its development, and even when it was running a pure DX9 renderer, it often stuttered like you see in the video simply because the game hadn’t yet been optimized…for any version of DX on any system. My guess is that the final DX10 version will run much smoother than what we’re seeing now (depending on resolution, of course). Keep in mind, Flagship has only had final DX10 hardware in its hands for a very short period of time…

-Vede

A major bullet point for DX10, aside from the new graphical gee-whiz effects, is that it’s supposed to speed up your frame rates. One of the reasons it’s Vista-only is that they’ve incorporated all sorts of techniques to let games access the hardware more efficiently than DX9 does.

Also, despite the fact that the final version of Vista has been released in some corners, there aren’t even beta 8800GTX drivers for Vista available to the general public. Any drivers the dev was running were likely very crude.

Heh. I’m really not trying to be unfair to anyone here. But let’s just settle on a figure of ~10 FPS for that demo at the chunkiest times. It could run three times faster and still only be pushing 30 FPS.

Really, my sole point is “Damn… DX10 shader models enable some insane effects. But it’s also going to take a powerhouse of a graphics architecture to do this stuff.” I’m all for Flagship doing stuff that kills the card, because it shapes where the next generation of cards gets sped up. And true volumetric textures? Count me in, please!

Seriously, I’m just interested in how much more potential power is going to be exposed under DX10 and how these cards, which are stupidly overkill right now, are going to use that power. If an 8800GTX that can push something stupid like 150ish FPS in Prey right now gets brought down to 30ish FPS when all is said and done in Hellgate, and assuming that’s not happening because MS/Flagship suck, the implication is that there are some insanely complicated calculations going on in there. I think it’s cool that we’re not just getting better refraction or reflections or even HDR out of DX10, but real, true steps forward like interactive volume textures, true shadow projection, and so on.
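
To put rough numbers on that (purely back-of-the-envelope, using the hypothetical 150 FPS and 30 FPS figures above, not any measured data):

```python
# Back-of-the-envelope frame-budget math with the rough figures from the
# post above (150ish FPS in Prey vs. a guessed 30ish FPS in Hellgate).
def frame_time_ms(fps):
    """Milliseconds the GPU gets to spend on a single frame."""
    return 1000.0 / fps

prey_ms = frame_time_ms(150)       # ~6.7 ms per frame
hellgate_ms = frame_time_ms(30)    # ~33.3 ms per frame

# Same card in both cases, so roughly 5x as much work per frame.
print(round(prey_ms, 1), round(hellgate_ms, 1), round(hellgate_ms / prey_ms, 1))
```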

I’m stoked that it looks like we’re finally getting something other than “higher resolution” enabled as a sink for all that graphics power. I’d be very content running a game with FSAA, 8x or 16x AF, full volumetric textures, dynamic geometry tessellation, and 16+ bits per channel color on my measly 1366x768 TV, instead of trying to get increased visual fidelity by upgrading to a 26xx by 16xx monitor and loading 2GB of texture data into the video card to make things look “better” (yet still strangely flat).
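
For a rough sense of that tradeoff (assuming the “26xx by 16xx” monitor means something in the neighborhood of 2560x1600; that exact figure is my guess, not the poster’s):

```python
# Pixel-count comparison: the resolution bump alone means roughly 4x the
# pixels to shade every frame, before any fancy DX10 effects enter the
# picture. (2560x1600 is an assumed reading of "26xx by 16xx".)
tv_pixels = 1366 * 768         # ~1.05 million
monitor_pixels = 2560 * 1600   # ~4.10 million

print(monitor_pixels / tv_pixels)   # ~3.9
```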

The shadowing he illustrated was completely ruined by the fact that it was only casting shadows from a single, static, high light source (like the moon), with no shadowing coming from the local light sources (lamps, etc.). I noticed he switched it on and off, however, so perhaps it’s still very much under development. The raindrop effects were nice, though; that’s the kind of effect that can make a real difference to atmosphere. Smoke effects are nice, but I can’t imagine they’ll model vortices (e.g. if you shoot a rocket through a smoke cloud).

Smoke effects are nice, but I can’t imagine they’ll model vortices (e.g. if you shoot a rocket through a smoke cloud).

Wasn’t the Doom 3 engine supposed to be able to do that? Or did it get cut out?

Well, I’m glad I can get three new effects now, which will totally innovate gameplay in ways never before seen.

What’s the real impact of having the splashes from the raindrops calculated realistically, rather than just creating models with splashes on ’em like they did in Metal Gear Solid 2? I suppose an argument could be made that it’s more realistic, but couldn’t much the same effect be achieved without fancy shaders?

The shadows and smoke were fancy, though, although I fail to see how they’re going to make an impact. Can’t the smoke be done more or less exactly the same way using particles instead, only with lower visual quality? Again: what’s the advantage of doing it the “DX10 way” compared to the “old way,” apart from the fact that it looks better and eats a hell of a lot more clock cycles?
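
For what it’s worth, the per-particle math is roughly the same either way; the practical difference is where it runs. Here’s a minimal, purely illustrative sketch (plain Python, not anyone’s actual engine code) of the kind of per-particle update involved. Under DX9 a loop like this typically runs on the CPU, with the results re-uploaded to the card every frame, whereas DX10’s geometry shader and stream output stages let the whole simulation stay resident on the GPU; that’s where the “looks better but eats more clock cycles” tradeoff comes from.

```python
import random

# Purely illustrative CPU-side smoke particle update. The point isn't the
# math (it's much the same either way); it's that under DX9 a loop like
# this usually runs on the CPU, while DX10's geometry shader / stream
# output lets the equivalent per-particle work stay on the GPU.

GRAVITY = -0.5    # downward pull
BUOYANCY = 1.2    # upward drift (smoke rises)
DRAG = 0.98       # velocity damping per step

def update_particles(particles, dt):
    """Advance each smoke particle one timestep; drop dead ones."""
    alive = []
    for p in particles:
        p["vel"][1] += (BUOYANCY + GRAVITY) * dt
        p["vel"] = [v * DRAG for v in p["vel"]]
        p["pos"] = [x + v * dt for x, v in zip(p["pos"], p["vel"])]
        p["life"] -= dt
        if p["life"] > 0:
            alive.append(p)
    return alive

# Example: a few thousand particles, stepped at 60 Hz.
particles = [{"pos": [0.0, 0.0, 0.0],
              "vel": [random.uniform(-0.1, 0.1), 0.0, random.uniform(-0.1, 0.1)],
              "life": random.uniform(1.0, 4.0)} for _ in range(5000)]
particles = update_particles(particles, 1.0 / 60.0)
```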

I suppose the point is moot, though. In a few years’ time, I guess everyone will be happy that they can just program stuff like that instead of giving some poor modeller the unenviable task of adding splashes to his character model …

I’m looking forward to what they’ll eventually be able to do with this stuff.

The smoke really didn’t look all that cool to me. If they do manage to add vortex effects and such, though, I’d definitely reconsider.

The shadow effect was cool, except that, as has been pointed out, they were only using a single light source the whole time.

The rain effect was pretty sweet.

Err… how does a single light source give you penumbras and all the other soft-shadow bits? It’s impossible for a point light to throw “real” soft shadows. So while it may have only been a single source, if they’re actually modelling it as they said they are, then the effect is actually coming from an area light. If they’re doing real-time area lighting, then that’s actually pretty cool.
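
For reference, the geometry behind that point: with an area light, similar triangles between the light, the blocker, and the receiver give the usual estimate of how wide the soft edge gets (this is the approximation used by techniques like percentage-closer soft shadows, not necessarily whatever Flagship is doing):

$$ w_{\text{penumbra}} \approx \frac{d_{\text{receiver}} - d_{\text{blocker}}}{d_{\text{blocker}}} \cdot w_{\text{light}} $$

A true point light has $w_{\text{light}} = 0$, so the penumbra width collapses to zero and you get a hard edge; any visible softness implies a light with real area (or some trick faking one).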

mouselock’s post reminds me of the time when, as a poor high schooler working at EB, I plunked down $300 to get a new GeForce 256 card. With the T&L that everyone was talking about. Don’t get me wrong, it was a good card and served me well. But very few games took advantage of T&L and all the other fancy features. By the time those games did arrive, the card didn’t have the raw power to handle them well. Seems reasonable that the same thing might happen again - except the cards are a lot more than $300.

I’ve been thinking that might be the case. Plus, I’ve read over and over from “developers” that Nvidia’s DX10 solution sucks compared to what ATI is coming out with. Oh, and I was “lucky” enough to get a GeForce SDR, which just sucked.

That’s actually the point I was originally trying to make. I suspect that the 8xxx cards currently out are tremendous overkill for 99% of the DX9 games, and yet could easily end up pretty underpowered for DX10 stuff if these effects are any indication. DX10 seems like it’s going to introduce quite a few pretty large leaps in graphics fidelity, but I expect a high hardware cost and adjustment curve to come with that increase.

Like most things, only time will tell. I’m thinking drivers are going to play a large part in the actual real-world performance.

This is also assuming the graphics programmers know what the heck they’re doing.

I’d like to think I follow the graphics industry fairly closely, and I’ve never heard even the slightest inkling that “developers” think R600 is going to blow G80 away in DX10 performance. Who knows, maybe it will, maybe it won’t (I tend to think process tech will prevent one from being drastically superior), but I’d like to see a link to a game developer publicly stating such claims.

There’s no way I can see myself buying a current 8800 card, at least not until the rumored toned-down 8600s or whatever show up. I experienced the same thing as mouselock and jfletch with my GeForce 1 card.

My problem is choosing between an X1950 Pro, a 7950 GT variant, or going really cheap with a 7900 GS until those new cards show up (1280x1024 screen res for now, likely 1680x1050 six months from now). I’m leaning towards the X1950 Pro or the 7950 because I doubt I’ll be going to Vista and/or needing DX10 for at least 1.5-2 years. I just want something that will run NWN2, CoH, WoW, and Supreme Commander when it comes out, at decent rates on those screen resolutions.