ATI's R600 might not be all that great

Unfrozen caveman hardware writer!

LOL

In all seriousness, it has nothing to do with me not trusting the dude. It has to do with realities like: if they’re not working with AMD/ATI on R600 stuff (and they’re not if they haven’t signed the NDA and agreed to the embargo date), then where are they getting their drivers from? What’s the BIOS revision on their cards? There are considerations like how you set AA preferences in the control panel, which can still affect all games (and performance in those games) even when you check the whole “let the app decide” thing. Do they have their AA set to a different resolution filter that’s affecting performance without them knowing it?

But most of all, I have cards and those numbers don’t match what I’m seeing. People won’t have to wait too long for the flood of reviews; we can all argue about the card’s merits then. :)

YES. I’d buy one of those Athlon 64 X2 6000s today if I could get it in 939. As it is I’m looking at a whole new rig, so I may as well go Intel.

But whatev. I wouldn’t mind going back to ATI if they can woo me with their sexy new hardware. Woo me, ATI!

You’re the second person I know to be in the know who’s said this about these early benchmarks. Scooping everyone doesn’t mean much if your numbers lack credibility. We’ll have a broad range of sources soon enough. It’s never been good practice to base hardware buying decisions on a single review source anyway, just because testing methodology is so different across the board.

Even though I’m all DX10’ed up around here, I’m intrigued, as it looks like Nvidia and ATI have swapped places with regards to driver quality lately.

Though the R600 may throw ATI into the same boat as Nvidia for a while, depending on how radical an architecture change it is.

The benchmark results in a quite reliable tech magazine over here indicated that the HD 2900 XT is comparable to the 8800 GTX or GTS in apps such as Oblivion and Prey, as long as AA and AF are disabled. It took a serious hit in Oblivion with transparency/adaptive AA enabled, though. (Drivers are to blame, according to AMD.)

Oblivion (8800 GTX | 2900 XT), HDR enabled

FPS @ 1280x1024 (AA and AF disabled): 48 | 49
FPS @ 1280x1024 (TAA 8x, no AF): 39 | 17

Prey (8800 GTX | 2900 XT)

FPS @ 1600x1200 (AA disabled, AF 16x): 107 | 98
FPS @ 1600x1200 (TAA 8x, AF 16x): 50 | 43

Results under XP and Vista were pretty much the same.

They also tried some DX10 demos from the D3D 10 SDK (8800 GTX | 2900 XT)

Default/PipeGS: 64 | 159
CubeMapGS, Car, no instancing: 11 | 23
CubeMapGS, Car, instancing: 11 | 18

Also, power consumption of the 2900 XT peaked at 215 W.

-Julian

Also, a few more tests to be found here.

-Julian

Dang, that sucks. I was hoping ATI would be more competitive, but it’s slower, more expensive, and draws a lot more power.

I really don’t want to switch to NVIDIA because their drivers are such a trainwreck on Vista.

I can’t speak for HardOCP, but in the majority of my tests the 2900 XT was considerably faster than the 8800 GTS (its real competition; it’s way cheaper than the GTX, and a smaller card, too).

It certainly needs a driver revision or two to fix performance in a few areas, and it definitely has a “too much power” problem. Well, I guess “problem” is relative. The power thing is going to affect some people more than others. It has a noise problem that ATI says is a driver issue, too.

I’d say it shows a whole lot of potential at the $400 mark, but I’d wait a couple months to see how the drivers shake out. Besides, the cool pack-in isn’t even out yet (everyone gets HL2 Ep2, Portal, and TF2). And if it doesn’t sell well because of the power thing, the price may drop.

What’s disappointing is that we don’t really know how DX10 performance is going to shake out. It would be nice to have some real DX10 apps to test these cards on.

Man, when they’re able to build a 65nm version of this, it’ll rock. I’d like to see that sooner rather than later.

What’s disappointing is that we don’t really know how DX10 performance is going to shake out. It would be nice to have some real DX10 apps to test these cards on.

Even without apps, aren’t there synthetic benches though? Or are they all braindead (i.e. testing only a single aspect of DX10 compliance at a time, rather than multiple aspects at once)?

From the little reading I’ve done, it looks like ATI chose a significantly more “forward looking” architecture than NVidia (e.g. tessellation capability, complex shaders, etc.). Kind of a double-edged sword, since by the time DX10 games start showing up en masse, AMD will have at least had time for a refresh (if they can limp along until then while getting clobbered by Intel on the CPU front AND NVidia on the GPU front).

Yeah, they’re mostly sample code meant to illustrate a particular feature of DX10. Like “how to use geometry shaders to extrude shadow volumes.”

A DX10 benchmark that more realistically recreates what games are gonna do, or that just plain is a game, we don’t have. There’s a sort-of borked Call of Juarez DX10 benchmark made available by ATI, which had a multisample AA filtering problem, so I wouldn’t rely heavily on it until the DX10 patch for the game is out in public.

What, you’re surprised? How many times does CCZ have to prove on these forums that he’s an imbecile before people stop expressing shock and astonishment at his expressing idiotic opinions? Newsflash, my friends, dumb people say dumb shit.

Guess the days of powerful cards taking up only one slot and not requiring a power plant to run are over.

If AMD can sell cards at the 8800 GTS price point which are a little bit better than an 8800 GTS, then that’s really all they need to do. They’ve certainly got fans of their products, and there are a lot of people who just plain dislike nVidia’s products and drivers.

nVidia has certainly earned a fair amount of enmity amongst the early adopter market with their horrific driver support under Vista.

So what if they don’t make a product to compete with nVidia’s $600 card? That’s such a small segment of the overall market that it isn’t a necessity to win that battle if you can carve out a chunk of the mid-range market.

Like, a decade ago, dude.

Bragging rights - it’s all about the bragging rights ’round here.

The power and noise issues would be deal breakers for me. A 65nm version of this card, though… I’d be all over it.

Same. Though Jason’s article seems to list the HD 2400 and HD 2600 parts as shipping in June and using a 65nm process. If that’s true, then how far behind would the 65nm HD 2900 be?

Or are the 2400 and 2600 ‘shipping in June’ the same way the 2900 was shipping last fall?

It better be before the end of the month. I decided I will stop waiting and buy a new box by the end of the month. I am going to get an 8800 GTX card, and I really want a competitive price cut from NVidia before then.

Don’t hold your breath. If AMD isn’t shipping the midrange 65nm stuff until later in June, you can bet the high end 65nm is later than that.

I am sorely tempted to go with the 640 MB 8800 GTS and overclock it. They’re down to about $330 now after rebate. The overclocked perf is within 10%-20% of the 8800 GTX at the max resolution I can use, 1600x1200, in most games. And the power requirements are pretty reasonable for a cutting-edge video card.

I’d also have to buy an 8500 GT for my third monitor (I have learned the hard way to be very careful about mixing video card brands or families under Vista), so that’s another $100.

But every time I look, I’m turned off by reader reviews commenting on broken-ass Vista drivers. One review posted 5/12 said the current NVIDIA Vista drivers don’t retain overclocking settings after a reboot.

Sigh.