DX10 on Company of Heroes - Ouch

The 1.7 patch, which adds DX10 support to CoH, was released today.

You’ll need a confluence of hardware, OS, and drivers to use it: Vista, of course, a DX10 card, and if you run Nvidia, these new drivers. The patch is available here, and the changelog is here.

Now, surely, DX10 will improve performance!? Not quite. It puts a hurtin’ on my rig.

I’ve run the CoH in-game benchmark a few times, and here are some results from my rig (C2D E6700, 3GB DDR2-800, 8800GTX).

First of all, you now have a shader dropdown with options of Low, High, and DX10. There’s also a new Ultra setting for textures, where I believe High used to be the max. CoH pops up a warning of impending system instability if you attempt to run Ultra textures without DX10 shaders. Sure enough, Shaders on High (DX9) and Textures on Ultra crashed the game.

A few results:

1920x1200, Shaders on High (DX9), Textures on High, AA 2x, everything else (including Model Detail) at max:
Avg: 75.8, High: 170.8, Low: 21.9

1920x1200, Shaders on DX10, Textures on High, AA 2x, all else at max:
Avg: 34.3, High: 80, Low: 10.9

1920x1200, Shaders on DX10, Textures on Ultra, AA 2x, all else at max:
Avg: 34.1, High: 72, Low: 5.7

Now, there are some nice DX10 lighting effects, and the Ultra textures are starkly sharper, but it’s difficult to choose DX10 when the game runs so much more fluidly under the DX9 path.

Edit: Scores updated w/ vsync disabled. Thanks Jason Cross.

But what are the numbers for 1920x1200, DX10 shaders, Textures on High?

That would be the one to compare with your DX9 numbers.

It would be hilarious if those features were also ported to DX9 and the engine still performed better.

snicker

Much of the point of DX10 is to let you access very large chunks of graphics data far more efficiently than DX9, which is likely why switching to Ultra just crashes under DX9 but works under DX10. In cases where the engine doesn’t need such large amounts of data, or isn’t written from the ground up to really take advantage of DX10, I wouldn’t be surprised if DX9 were faster, simply due to driver stability (DX9 has been around a long, long time) combined with things like the fact that DX9 can kludge shader precision for speed, while DX10 places severe limits on such shenanigans. But this isn’t a bad thing in the long run. DX10 has real value when it comes to the next-generation huge-dataset, unique-texturing games of late 2008/2009.
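To make the “big chunks of data” point concrete, here’s a rough D3D10-style sketch (nothing to do with CoH’s actual code; the function name and sizes are purely illustrative): you describe one large immutable buffer, hand all the data over at creation, and never touch it from the CPU again, so the runtime can do its validation up front rather than policing the resource on every draw.

    // Hypothetical D3D10 sketch: one large immutable vertex buffer, described
    // and filled once at creation time. Error handling trimmed for brevity.
    #include <d3d10.h>

    ID3D10Buffer* CreateLargeStaticBuffer(ID3D10Device* device,
                                          const void* vertices,
                                          UINT byteSize) // e.g. tens of MB of geometry
    {
        D3D10_BUFFER_DESC desc = {0};
        desc.ByteWidth      = byteSize;
        desc.Usage          = D3D10_USAGE_IMMUTABLE;  // GPU-only after creation
        desc.BindFlags      = D3D10_BIND_VERTEX_BUFFER;
        desc.CPUAccessFlags = 0;                      // the CPU never maps it again

        D3D10_SUBRESOURCE_DATA init = {0};
        init.pSysMem = vertices;                      // all the data, supplied up front

        ID3D10Buffer* buffer = NULL;
        device->CreateBuffer(&desc, &init, &buffer);  // validated here, not per-draw
        return buffer;
    }

Under DX9 you’d be juggling pool flags and recreating default-pool resources after a lost device for the same thing, which is part of why managing really big data gets cleaner in 10.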

Anyway, I agree with Jonathan, where are your DX10 High numbers?

Oops. I ran that DX10 High test, but forgot to post the results. It was late, and the Qt3 posting editor failed to notice the omission. I’ll have to talk to him about that later. Meanwhile, I’ve edited the OP to include the results. There’s not much difference between High & Ultra under DX10.

It’s too bad you don’t have a 2nd GTX, Mono. I’d like to see how well CoH scales with SLI. But thanks for posting those results.

Mono, was it an Nvidia card? If so, open the Nvidia control panel, select the general options for AA, set it to 4x (or whatever you want), and tell it to override program settings. You will get a dramatic increase in FPS.

Yeah, as I noted in my specs. I’ll give the driver-controlled AA settings a try a bit later, but don’t expect much of a difference, since I was only running 2xAA.

30 fps on an 8800GTX? Awesomeness, looks like the Vista drivers still suck hard.

Can you post screenshots of DX9 lighting and DX10 lighting?

edit: Nevermind, found some here.

Well, you’ll note the fps is practically double that amount without DX10, but yes, there is a dropoff in DX9 between XP & Vista.

These are all Vista benchmarks, right? Aside from the expectation that XP benchmarks would be higher, I would also expect DX10 benchmarks to be lower anyway, since it’s not as old and drivers haven’t been optimized as well for it yet.

But as a fellow 8800 owner I am disappointed by the Vista crap performance, which is pretty much the #1 reason I am not switching to it anytime soon.

Get a clue. ALL of the tests were run on Vista. The difference is between DX9 and DX10 with the same drivers on the same OS.

Yeah, those are pretty disappointing results. Microsoft and Nvidia really need to get on the ball and sort out all the Vista issues pronto. Users don’t give a shit whose fault it is, so pointing fingers back and forth is of no use; just get that shit sorted. The longer it takes, the more both Nvidia’s driver team and Microsoft’s Vista team look like a bunch of tools, with the end result being slow Vista adoption among one of the groups who are historically early adopters, and an ever-worsening view of Nvidia’s (in)ability to ship good drivers anymore.

For a while there, I thought the duopoly of Nvidia and ATI was great because there wasn’t a monopoly, but there also wasn’t a huge array of funky chipsets for developers to worry about during dev and QA… now that both companies have significant issues simultaneously, my mind is changing on that.

I’m not certain I’ve done the comparison justice, but a Relic dev specifically referenced the Mission 1 bunker explosion as an example of the changes from DX9 to DX10.

The top image is DX9, bottom is DX10:


Just out of curiosity, isn’t it possible that some of the advanced effects really are that demanding? Some of the stuff DX10 does (well, allows) is not only impossible in DX9, but really damned computationally demanding. (Real volumetrics spring to mind.)

I don’t play CoH but uh… does it just not have shadows at all?

Those comparison pictures aren’t of the same moment (the 2nd one seems to be a couple frames later since the soldier is kneeling with his gun up). The explosion might look exactly the same in DX9 if they took it at the exact same moment.

Well, they have to start somewhere, right? If they’re going to convince us gamers that DX10 is something we “just can’t live without” and that Vista is not a bloated system hawg that does nothing to improve the average PC gamer’s experience, then some high-profile PC games have got to either look better (they seem to be pulling that off, at least in places) or perform better (uh, um, better try again, Microsoft). Otherwise a lot of folks like me are not going to let go of XP until they pry it from our cold, dead fingers. :)

I know Microsoft wants to foist DX10-only and Vista-only games on us to force us, kicking and screaming, to switch, but most of the dev/publisher interviews I read make it clear that most think it’s financial suicide to lock out gamers with slightly older machines or with “just” Windows XP.