NVIDIA: 3DMark doesn't like us!

http://gamespot.com/gamespot/stories/news/0,10870,2910707,00.html

On a related note, Nvidia has contacted us to say that it doesn’t support the use of 3DMark 2003 as a primary benchmark in the evaluation of graphics cards, as the company believes the benchmark doesn’t represent how current games are being designed. Specifically, Nvidia contends that the first test is an unrealistically simple scene that’s primarily single-textured, that the stencil shadows in the second and third tests are rendered using an inefficient method that’s extremely bottlenecked at the vertex engine, and that many of the pixel shaders use specific elements of DX8 that are promoted by ATI but aren’t common in current games.

Well, that’s funny, because HardOCP was saying the same things for completely different reasons. E.g., they don’t want to use 3DMark2003 as a video benchmark because nVidia has “special” drivers which cause the GFFX to beat the Radeon 9700, even though the individual low-level test scores are lower on the GFFX. I guess they wanted a pure DX9 test where the 9700 kicks everything’s ass? I’m not sure.

What exactly are we seeing here? First let’s look at the 9700 Pro: it is a strong performer from the start, and even the new Catalyst 3.1 drivers do not improve the score noticeably. There is a very small 0.004% increase in performance at 1280x1024 with the new Catalyst 3.1 drivers. But the GeForceFX, with a new set of drivers, has now beaten the 9700 Pro in the overall 3DMark result. Remember, only the four game tests make up the overall 3DMark score, so all one must do is make sure those game tests run faster in order to get a higher 3DMark. The most important thing to notice is that the individual feature tests have not improved much at all. Vertex Shader and Pixel Shader 2.0 speeds are only a few FPS faster, still giving the lead to ATI in these tests, most notably the Pixel Shader 2.0 test.

With all that we have seen so far, 3DMark does not appear to actually give us any indication of how video cards are going to compare in any real games. It produces an overall 3DMark score taken from unbalanced game tests. Furthermore, as we have seen directly above in the [NVIDIA] benchmarks, drivers can be optimized to run the game tests faster. The problem is that this is just a benchmark and not based on any real-world gaming engines. Therefore, while you may get a higher 3DMark number, you will not see ANY increase in performance in any games that are out there.

Anyway, I don’t really care. The hardware is so ridiculously far in front of the software at this point. When we have to come up with edge cases like benchmarking at 1600x1200, or benchmarking at 1280x960 with insanely high levels of anisotropic filtering and AA to get any meaningful distance between the benchmark scores… well, that’s masturbating with hardware.

that’s just kyle shooting his mouth off, as he’s apt to do every 2 weeks in a cry for attention. in more interesting news, a dust mote settled on my nose.

They’ve got a point in that the second test is inelegant in its design. The leaked Doom III alpha runs better on my hardware.

nVidia’s main problem, and their beef, is that a couple of the tests try to use pixel shaders version 1.4, which ATI has supported since the 8500, but which nVidia still doesn’t support even in the GeForce FX. If 3DMark doesn’t find 1.4 support, it drops down to 1.1 (skipping 1.3, which nVidia does support), which requires almost twice as many polys per scene. Futuremark should’ve been more forward-thinking in their design, though, making most of the benches vertex/pixel shader 2.0, then dropping down in versions until you find compatibility. But then again, nVidia probably should have bitten the bullet and included pixel shader 1.4 support. You’d think their whiz-bang, 2.0+ NV30 could cut the mustard…
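The fallback logic described above can be sketched as picking the highest supported version from a preference list. This is a hypothetical illustration, not Futuremark’s actual code; the function name, the version sets for each card, and the preference lists are all assumptions based on the behavior described in the post.

```python
# Hypothetical sketch of the shader-version fallback described above.
# 3DMark03 reportedly checks for pixel shader 1.4 and, failing that,
# drops straight to 1.1 (skipping 1.3). The proposed alternative is to
# start at 2.0 and walk down through every version until one matches.

def pick_shader_version(supported, preference):
    """Return the first version in `preference` that the card supports."""
    for version in preference:
        if version in supported:
            return version
    raise RuntimeError("no compatible pixel shader version")

# 3DMark03's reported behavior for these tests: try 1.4, else use 1.1.
futuremark_pref = ["1.4", "1.1"]

# The "more forward-thinking" scheme suggested in the post:
proposed_pref = ["2.0", "1.4", "1.3", "1.2", "1.1"]

# Assumed capability sets for illustration:
geforce_fx = {"2.0", "1.3", "1.2", "1.1"}   # no 1.4 support
radeon_8500 = {"1.4", "1.3", "1.2", "1.1"}  # no 2.0 support

print(pick_shader_version(geforce_fx, futuremark_pref))  # "1.1"
print(pick_shader_version(geforce_fx, proposed_pref))    # "2.0"
print(pick_shader_version(radeon_8500, proposed_pref))   # "1.4"
```

Under the benchmark’s scheme the FX lands on the slower 1.1 path despite supporting 2.0; under the top-down scheme both cards get their best available version.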

Overall, I’m unimpressed. I tried for a couple of hours to get the goddamn thing from one of the conventional mirrors before I noticed Gamespot had it mirrored as a free download. Then I was cruising at 200K. They cut out a lot of functionality from the free version that really hampers its utility. You can’t change the resolution, nor can you choose which tests to run. You can’t even view individual test scores without connecting to their online results browser, which is a pain in the ass. I liked being able to run just the fillrate tests quickly to check changes when I overclock, change AA settings, etc. Looks like I’ve got a reason to keep 2001SE around.

I don’t really like Kyle, but judging from his rant he’s actually pro-NVIDIA on this issue rather than pro-ATI. Of course, that’s Kyle… I’ll just skip the rest of this thought.

“Futuremark should’ve been more forwardthinking in their design, though, making most of the benches vertex/pixel shader 2.0, then dropping down in versions till you find compatibility.”

I thought this was supposed to be their DX9 benchmark? If it only does up to version 1.4 and not 2.0, then they can’t claim it’s a DX9-level benchmark.

It does go to 2.0 in the Nature 2 test.

Yeah, but even then only for some of the stuff, like the water and the leaves; the grass is using an older version.