Half-Life 2 Issues: AA not possible?

Interesting bit of info that has been popping up on Half-Life message boards that serves as a sidebar to the endless “should I buy Nvidia or ATI for HL2” questions. Apparently someone questioned a Valve guy about why no anti-aliasing was used in the videos, and the answer was because AA didn’t work properly with HL2. He notes that this is a hardware (!) not a driver issue, so if you were expecting to run the game with 8xAA and 16xAF, you’re probably out of luck:

There are problems with the way that current hardware implements FSAA. If you enable it, you will see a lot of artifacts on polygon boundaries due to the way that they sample texture subrects with FSAA enabled. We are working with the hardware companies and the DirectX team to make sure that future hardware doesn’t have this problem.
Gary

  1. Is this a problem that can be fixed with new drivers, or would we have to buy a whole new card to rectify it? If so, are there any cards on the horizon that would offer it?

Drivers aren’t likely to fix the problem, with the exception of the ATI 9500-9800. There’s hope there for being able to use FSAA properly. You are out of luck on NVidia unless either NVidia or we come up with some clever way of solving this problem.

  2. Is this a problem unique to hardware + Source?

It’s a problem for any app that packs small textures into larger textures. The small textures will bleed into each other if you have multisample FSAA enabled. The best thing to do right now is either buy an ATI card in the hopes that it will be solved there, or wait until the next generation of cards comes out.

http://www.halflife2.net/forums/showthread.php?s=6292e480d4992f708d893a011d25e808&threadid=2622&perpage=15&pagenumber=1
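For anyone wondering what “packs small textures into larger textures” actually means: it’s the texture atlas / lightmap page trick. With multisample FSAA, the texture coordinates used at a polygon’s edge pixels can land just outside that polygon’s subrect in the atlas, so the sampler pulls in texels from the neighbouring subrect - that’s the bleeding Gary describes. A common band-aid is to inset each subrect’s UVs by half a texel (often with a gutter of duplicated border texels as well). Rough sketch below, purely illustrative with hypothetical names, obviously not Valve’s actual code:

[code]
// Purely illustrative, hypothetical helper - not Source/HL2 code.
// Computes UVs for a sub-texture packed into a larger atlas, inset by
// half a texel so samples at the subrect border are less likely to
// fetch texels from the neighbouring sub-texture. This reduces (but
// does not eliminate) bleeding under multisample FSAA.
#include <cstdio>

struct UVRect { float u0, v0, u1, v1; };

UVRect atlasSubrectUV(int x, int y, int w, int h, int atlasW, int atlasH)
{
    const float halfU = 0.5f / atlasW;   // half a texel in U
    const float halfV = 0.5f / atlasH;   // half a texel in V
    UVRect r;
    r.u0 = (float)x / atlasW + halfU;
    r.v0 = (float)y / atlasH + halfV;
    r.u1 = (float)(x + w) / atlasW - halfU;
    r.v1 = (float)(y + h) / atlasH - halfV;
    return r;
}

int main()
{
    // e.g. a 64x64 lightmap page packed at (128, 0) in a 1024x1024 atlas
    UVRect r = atlasSubrectUV(128, 0, 64, 64, 1024, 1024);
    std::printf("u: %.5f .. %.5f   v: %.5f .. %.5f\n", r.u0, r.u1, r.v0, r.v1);
    return 0;
}
[/code]

The cleaner fix is centroid sampling, which keeps the shading sample inside the covered part of the triangle, but that needs hardware/driver support - presumably the “hope” for the 9500-9800 that Gary mentions.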

Another reason to be glad I bought a 9800, I guess.

In several threads they (Valve) state that if you’re going to buy a rig specifically for HL2 then to go with an ATi, not Nvidia. The game is optimised for Radeon in the same way that GSC is optimising STALKER for Nvidia.

Tough choices.

ATI: The way it’s meant to be played.

ATI: We never tried to sell you the offspring of a graphics card and a leaf blower.


ATI: This time our drivers don’t suck.*

[size=2]*Well, at least not as much.[/size]

Having spent a sizable chunk of change on two separate generations of ATI cards in the past – and found both generations (Rage and original Radeon) to be great cards with perhaps the worst drivers ever written – I’m afraid I’m going to have to suck it up and play without AA. I don’t think I’ll ever buy an ATI card again. Of course, with the screwy naming schemes on the GeForceFX cards, I don’t know if I’ll buy one of them, either, since I can’t figure out which is the “one step above budget” that has been my rule since time immemorial…

Somehow, I sense the compassion from the rest of you. Or maybe that’s just Saturday morning BO…

edit: for formatting and a little extra content.

Does anyone else think the optimization of a game for a single line of video cards is a BAD thing?

Most of us have been saying the same thing since EA’s 3dfx-only days. Look what that got us today… a library of games that don’t friggin’ run properly anymore, because no one (at least, no one with a decent computer) has a friggin’ Glide-card now.

You just have to have a second computer with a Voodoo5 for all your non-primary gaming needs. This is really just more evidence that you should never buy hardware until you need it.

I’m going to wait on a response directly from ATi, nVidia and/or Valve before I trust second-hand info posted to a 3rd-party message board :D. The details I’ve read so far have the faint but distinct smell of FUD. Those video clips, being video clips, don’t depend on the viewer’s hardware and could have been easily antialiased for your viewing pleasure. I suspect Valve just used lower quality to show what the game is likely to look like on a midrange rig.

/devil’s advocate

I’ve never heard anything about this before.

Unless I’m remembering incorrectly, Valve flew their E3 demo on a 3GHz P4 and a RADEON 9800 PRO. I saw that demo first-hand and have also watched the new movies. I find it difficult to distinguish between the live and the canned. I believe these movies to be representative of Half-Life 2’s graphics on current top-of-the-line gear, sans anti-aliasing.

-Vede

Huh. The huge Fileplanet vid looked quite nice compared to the recent Bink clips, but I trust you, since you were there and all. Was the E3 demo really that jaggy in person?

OK, question… since most people here saw the videos or the show itself - does anybody think that a current video card could hope to run HL2 at full detail with FSAA and anisotropic filtering?

Hard to say, Tom. It’s been a couple of months since the show, so my recollection is a little cloudy. I remember thinking to myself at the time that I was surprised they hadn’t enabled AA - right up until the demo started chugging a bit. Then I assumed they hadn’t enabled it so that they could maintain acceptable frame rates. But I can’t say for certain whether the Binks are more or less jaggy. They might be a little more…but then the demo at E3 might have been running at a higher resolution, too. I’d have to go back to the office and check my notes to be sure.

In my previous post I was speaking more in terms of shaders and texture detail levels. In that regard, I’m relatively certain that those Binks show us what the game will look like on PCs with high-end, DX9-class video cards.

-Vede

I think the E3 demo was rather jaggy, but the effect was covered up by the plasma screen’s blur. It also had several ugly stuttering moments, and that was presumably at a fairly low resolution (after all, it’s just a plasma screen), so I seriously doubt anyone will be running HL2 at full detail with FSAA on even a 9800 Pro or 5900 Ultra.

I can’t understand why running FSAA has turned into some kind of religious mantra.

I have used FSAA and find it not worth the effort. It slows my games and gives everything a fuzzy look. So I simply turn up the resolution - looks much better and everything is nice and sharp.

Personally, I don’t use AA much either. A lot of games don’t expose it in their in-game options, so I get annoyed having to switch the D3D settings in my drivers to enable it for the games where I can get away with the performance drop, then switch back for the more intensive ones. Trying to figure out a performance-acceptable combination of AA/AF/resolution is enough of a pain without having to restart the game over and over. Not to mention that games like BF1942 still have driver issues with AA on my card.

I certainly don’t understand the Beyond3D forum IQ perfectionists that refuse to play a game without it.

But there are definitely occasions where it’s a better option than raising the resolution: games whose GUIs don’t scale, leaving ridiculously small fonts at high resolutions; a high-end graphics card paired with an LCD monitor limited to 1024x768; and so on. Word from the devs is that the GUI for HL2 will scale, however.

Well, as a Beyond3D staffer I have to confess to being one of those AA nutcases who can’t play without it. Ever since I laid hands on a V5, good anti-aliasing has been an absolute must for me to enjoy a game. And higher resolutions don’t get the job done either. It’s just a subjective preference.

AA - for flight sims you can’t go without it, but it matters much less in any other genre.