Fuck ATI: Do I Nuke the Site From Orbit?

I hate to bring this up at the risk of sounding like some kind of fanboy, but I like PhysX. Yeah, I said it. I wish more games would use it. I think of how neat Skyrim would be if more leaves fell from trees and bounced off of your character and other things in the environment.

To be fair about the flickering shadows in your Whiterun residence, that’s a problem with the cooking pot grate. Also, I do get some shadow/texture flicker in distant mountains as well whenever I’m moving. I think the issues with these specific points you bring up are Skyrim engine or world development problems.

But yeah, people are still having issues related to specific ATI cards with Skyrim, which just seems like a foreign concept. nVidia problems seem to happen because of drivers, something that affects pretty much every card. ATI might have a mess going on beyond the driver level, which of course is affecting the driver level.

Oh, that’s what’s been going on. Been driving me batty! Off to download me some beta drivers…

Unfortunately, nVidia’s “The Way It’s Meant to Be Played” program is the reason I refuse to switch to their hardware. In order to make up for inferior, late and overpriced hardware over the last decade, nVidia has leaned hard on their co-marketing deals with increasingly scummy results.

More often than not, the reason AMD doesn’t have optimized drivers ready at release for these big games is that nVidia’s co-marketing contract with the developer prevents them from sharing pre-release versions with AMD. You also have cases where developers integrate nVidia-written code into their games that specifically sabotages performance on AMD hardware for no visual gain (Crysis 2 and HAWX 2 DX11 support), or arbitrarily locks features using hardware vendor IDs (Batman: AA’s MSAA). They are also guilty of basically hobbling the software implementation of PhysX in order to make the GPU-accelerated version seem better than it is.

I’m currently thinking of upgrading my videocard, but despite how tempting that overclocked 560 Ti with a free copy of Battlefield 3 is, I’m just not sure I can bring myself to support a company that acts like that.

Yeah, it’s pretty good. I saw what a difference it made in Batman Arkham Asylum (don’t know if it’s the same for City) and got really bummed out that I was playing it on an ATI system and shelved it.

To be fair about the flickering shadows in your Whiterun residence, that’s a problem with the cooking pot grate. Also, I do get some shadow/texture flicker in distant mountains as well whenever I’m moving. I think the issues with these specific points you bring up are Skyrim engine or world development problems.

Sure, same with the shadow issues in general, but they seem to be worse on ATI cards - those crazy concentric circles you get from the cooking pot grate are pretty psychedelic on ATI systems.

As you said, it’s just unacceptable that ATI’s drivers seem to improve performance on the 6000 series cards, but not the 5000s, and screw up the 4000s, etc. – WTF? If the hardware is apparently so incompatible then have separate drivers for each card series, instead of just cramming different options until something seems to stick or at least diminish the complaints.

I also agree with what Brad G is saying - i.e. that Nvidia may be getting its edge through commercial relationships that are punitive to ATI, and if that’s the case I hope ATI can stop that sort of anti-competitive behavior – but I’m tired of being the victim of that war by being encumbered with those disadvantages as well. I just want the products that deliver the best consistent experience, and right now I’m increasingly realizing that’s not ATI.

I don’t care, I just want my games to run.

Then maybe you should hold it against the party that is conspiring to undermine the experience of half the paying customers? Really it’s the developers who shouldn’t be putting up with nVidia’s shenanigans, but I think these deals are probably signed at the publisher level. I’m sure the suits at that level think PC customers should be glad they’re getting a port of the game at all.

Who cares? Intentionally buying a product you know to be inferior is faulty logic. You can call it an issue of principles, but what’s really happened is that ATI has been too slow to adapt to a clearly brilliant move by Nvidia. Seeing a guy in second place and blaming the guy in first place for putting him there is just a backward way of looking at things. You’re actually fucking ATI over in the long run by giving them money, which makes it seem to them like what they’re doing is working (it’s not).

Also this.

If I buy a Toyota only to find out that Honda is paying the guy at the gas station to put sugar in my tank, my grievance is not with Toyota. You might as well be trying to defend a protection racket. nVidia isn’t saying they’re gonna crap up your game, but games get crapped up… So pay them, if you know what’s good for you.

AMD has tried to play into this ‘made for X’ crap too. Deus Ex had their branding on it, and ran better on Radeon cards.

Is there proof of this, by the way? I remember something years ago, but Brad’s making it sound like there’s a pretty big racket going on, when the simplest answer would be that ATI has simply not kept up to the latest releases with their drivers.

Not every new game has an nVidia logo, either.

The developer outreach Nvidia provides has made life much better for Nvidia customers. ATI has failed to catch up. That’s not anyone’s fault but ATI’s, and the customer suffers for it.

Pay the guy with the superior product if you know what’s good for you? I’d call that totally rational, not some half-baked morality myth where ATI is this poor waifish victim that only gets the scraps Nvidia deigns to throw at them and thus it’s your job to buy ATI’s shittier product to fight the power.

Fuck that. You want me to buy ATI ever again? Have ATI make a superior product.

Also, didn’t John Carmack make some reference recently about Nvidia being much easier to work with as far as responsiveness to their questions during development and so forth? That sounds to me like ATI just dropping the ball.

Say what you like about Rage, but when it comes to technical stuff I’ll take his word for it over some shadow conspiracy theory.

Well, SLI/Crossfire is a very unsatisfactory solution. Unfortunately, there is no single good answer to its issues, and you can’t yet hard-bake the solution into DirectX.
Rumour has it that it’s something they’re trying to do in the DX12 architecture.

If you mean more generally, the problem is this: DX9c is old, and much extended in non-standard ways. But you can’t realistically develop DX11-only games yet (unless it’s BF3… and even then; there are just over a dozen DX10/11-requiring games, TOTAL). Even Steam’s survey shows just over 50% support, and you want 90%+. Microsoft basically fucked the hell up when it tied its DX10 architecture into the Vista driver model, and refused to back down.

Did DX3 actually have any problems on nVidia?

Sure, they have a program and a strict set of guidelines to use standard API calls and never do anything to purposefully harm performance on competing products.

Cases like Batman: AA’s MSAA being artificially disabled on AMD cards are pretty well documented.

There was the famous case of DX10.1 support being patched out of Assassin’s Creed 1 back when only ATI hardware supported the spec. At the time nVidia claimed it was unrelated to their marketing deal with Ubi, and that they never pay any money for those deals, but it later came out that Ubi had received 2 million dollars from nVidia.

Lately the controversy has surrounded the way nVidia co-marketed games implement tessellation. The Fermi GPUs have really powerful tessellators so they’ve been having developers turn up the dial on that far past a reasonable level, as in Crysis 2 where flat planes that could be represented by two triangles are subdivided arbitrarily and for no benefit.

And let’s not forget nVidia has a pretty long history of egregious driver cheating in various artificial benchmarks. Not to mention accusing AMD of cheating over optimizations they themselves endorse.

Look, it’s no secret that AMD isn’t as deeply invested in developer relations as nVidia. They don’t employ as many people for that purpose, and they certainly aren’t writing 2 million dollar checks. But that in no way excuses the culture of deceit and sabotage that seems to infest the nVidia camp.

Personally I think it’s a combination of factors. Nvidia’s consistently supported a larger devrel team and as the years go by that pays off more and more for them in terms of new releases simply working better out of the box. Face it, if you’re a developer and you can get quicker responses from team A, you’re going to appreciate that, you’re going to build a relationship with those guys. It’s simply human nature. Also, I think I’ve heard Terry Makedon is off the Catalyst team (not sure if he’s still with AMD), so that doesn’t bode well unless they can replace him with someone of roughly equal caliber. This situation is why I tend to run with GTX boards (a 570 right now), I simply want new games to work correctly.

That said, some of Brad’s negative comments are accurate. But Nvidia is hardly the only company that does such things.

Just be careful with this, nvidia’s recommended steps are something like:

If you installed previous drivers (anything newer than 275) and still have the install files hanging around on your system, you need to repeatedly uninstall them, reboot, check whether they come back, and uninstall again, until either the native Windows driver or 275-or-lower is the installed set. Then install 290.36, do a custom install, and click ‘clean installation’ from the custom install screen.

I only ever had the 285 set installed, and I was just able to uninstall the main graphics driver, rename the old c:\nvidia folder, reboot, and install the new set using the custom install/clean install method, and it automagically removed everything else (the HDMI audio driver, PhysX, etc.) fine by itself.

The other thing is, I know a few games really don’t enjoy having the hdmi audio adapter (for either amd or nvidia cards) or the nvidia “3d vision” driver support installed, so I never installed that shit.

As I said, I think it’s more an issue that nvidia actually has useful people for game developers to chat with than that their drivers are some perfectly reliable product…

Yeah, it does feel pretty good being that righteous dude (I owned just about everything that failed hard with good tech - Rendition, PowerVR, etc.), but you know, at a certain point it is worth doing yourself the favour of buying something where shit “just works”.

Well, I’ve had three ATI/AMD cards in a row without issue. It’s not like the question is do I buy broken or do I buy evil. It’s do I buy the good hardware that’s not quite as good a value for the money but has better release day game optimizations versus do I buy the good hardware with slightly better value for money but game optimizations might lag by a few weeks or months?