GeForce FX

Well, official info on the NV30 is coming out (read a piece on it at Tom’s Hardware), so I thought I should start up a thread on the latest uber card. Its feature specs beyond DX9 are impressive, but how long after its release before we really see anything use it? Performance-wise it seems a ways ahead of the R300 in some areas but not so much in others. So give your thoughts.

Some links:

http://www.anandtech.com/video/showdoc.html?i=1749

http://www.tomshardware.com/graphic/02q4/021118/index.html

These tidbits of information sure will tide me over for the 3 months we still have to wait before an actual release.

I expect ATI will have a tech refresh around then with a higher-clocked core and faster memory. nVidia has the upper hand with the complex shader programs. Unfortunately for them, I doubt that will make a practical difference until after the next generation comes out.

nVidia’s finally on board with adaptive filtering. Guess they decided it wasn’t cheating after all. Color compression is interesting, and it sounds like it’ll be useful for FSAA, but I’ve heard that the R300 does color compression during FSAA too. The gamma control thing is interesting as well.

The most persistent rumor I’ve heard in the weeks leading up to Comdex is that the NV30 isn’t in stores yet because of difficulties TSMC is having with the .13 micron process. In fact, the delay is allegedly so significant that NVIDIA is still on schedule to tape out NV34 (basically a speed boost… like GF2 or GF4) soon and could release it in 1H '03, right after NV30 hits stores in numbers. Assuming that the .13 micron process gets sorted out.

IIRC, ATI is sticking with .15 micron even with R350. They clearly out-maneuvered NVIDIA by staying with .15 for R300, since NVIDIA basically lost almost an entire product cycle with NV30. Question is, if .13 micron does sort itself out, will that give NVIDIA the opportunity to pull ahead of ATI again?

Hmmm… so much speculation, so little fact - all off one rumor. I’m going to start applying for financial analyst positions.

Despite ATI currently being ahead in both perception and technology, Nvidia actually gained market share over ATI recently.

I haven’t heard it said, but I would imagine that comes from the ubiquity of their MX line as a cheap but good solution, and reviewer recommendations of the Ti 4200 as a best price/performance card.

As you say, Jakub, the switch to the .13 process should eventually, theoretically and probably in actuality, give Nvidia an edge in being able to push clock speeds on that line. On the other hand, I wouldn’t doubt ATI has a revamped 9700 ready to debut right as the GeForce FX hits shelves, along with an array of tempting price cuts on the other Radeons.

ATI has a hole in the midrange. The 9700 is the best high-end line and the 8500/9000 are the best under a hundred bucks. The 9500 and 9500 Pro will be available, I think, December 8th, and they’ll take the 4200 head-on. And there’s really nothing nVidia can do for the next 3-4 months.

Tech Report has an interesting editorial about how the GF FX in no way destroys the 9700 like it was supposed to. All nVidia has done is release an R300-class part six months later than ATI. http://www.tech-report.com/etc/2002q4/geforce-fx/index.x?pg=1

According to them, the 9700 does the gamma thing and the color compression thing too. All nVidia has to toot their horn about is support for super-complex shaders you’ll never need, and really high frequencies. And who knows what ATI will do before February? nVidia’s greatest asset at this point is really the quality of their drivers. And even those can be hit or miss. The latest WHQLs won’t let me run at an 85 Hz refresh rate like I could previously.

I think the article does a great job of cutting through the paper-launch hype crap, which is currently making bigger rounds on the gaming web sites than even the 9700 “ATI day” did, and that event had real hardware running to back up the excitement. But then it falls into the opposite trap of criticizing hardware that isn’t out yet because it will supposedly be too loud or not fast enough relative to the competition.

I think it’s too easy to discount features that won’t be utilized until future games arrive, and frame rates so fast you don’t need them yet. Those are what every generation of 3D accelerator has sold on; or rather, how they sell is this: a year later, those future games are here, and your old graphics card’s frame rates are so low that you need to upgrade to the card that was so ridiculously powerful at the time.

ATI has been consistently losing market share for years – the correct spin on this quarter’s performance is that the 9700 Pro allowed it to avoid losing even more share. Nothing comes close to the GF4 4200 in terms of performance for the price – it’s definitely the sweet spot for cards. In spite of being an inferior product, the GF4 MX is also selling in huge numbers because of bundling deals with Dell, etc., where it’s the default card on almost every system.

But it’s amazing that the ATI 9700 Pro essentially came out of nowhere and will dominate performance at retail for at least another four months (including the Xmas season, although that isn’t that important for hardware). 7+ months of having the best performance, at prices below the GF4 4600, is a great coup for ATI.

Stefan

As the 9700 Pro users will quickly find out, though, ATI’s support and service is horrible. I suspect it’s a lot of newbies buying ATI cards, either because it’s cheap and fast or because it’s an alternative to Nvidia. Who really knows why.

If ATI keeps up the service and support, they’ll have pulled off a great coup. If they don’t, well, they might be burning another generation on their brand and kill themselves in the end. Who knows, really; time will tell.*

*I am not against ATI. I think they make wonderful cards. They’re just not the best at supporting them.

I think ATI has come around a little in support. When I first bought my 9700, it had all sorts of problems with the drivers, but within a month they had released drivers that cured all of them. I felt confident enough in their drivers to download the DX9 betas, which work like a charm.

The GeForce FX is a totally new architecture for nVidia. They might have a few bumps getting the drivers right for that, as well.

I would have to agree about ATI and the FX launch. You have to remember the original GeForce launch was not without a lot of the same issues as the 9700 launch: problems with power supplies, drivers, motherboard incompatibilities, etc. So I would also be surprised if the FX launches perfectly, since it is a totally new design. Nvidia will likely be more on top of driver issues, but that is not all they will have to contend with, if the GeForce and 9700 launches are any indication.

– Xaroc

According to this report from Merrill Lynch, the NV30 has been moved back to April. Back to waiting.

There’s a lot of speculation that Merrill Lynch got the part designations mixed up.

http://www.tech-report.com

I highly doubt it’ll take until March to get your hands on an NV30. Bring your wallet, though-- <smashtv>you’re gonna need it!</smashtv>

ATI took the opportunity of our meeting with them at Comdex to hammer NVIDIA about their bandwidth claim. This is going to be the focal point of ATI’s campaign for a long time to come, but I suspect they’ll have to concede and re-market their cards with “revised” bandwidth, rather than the other way around. Same situation as Intel and AMD. Intel should be the ones using a “P” rating since Athlons perform far closer to historical clock-for-clock values than P4s, but since Intel holds market share…

Yeah, I knew that 48 GB/s figure was complete BS when I first saw it on the Register (or was it the Inquirer?). ATI clearly wins the memory bus portion of the challenge, and I think they’re justified in questioning the usefulness of nVidia’s large instruction counts. We’re a couple of generations away from vertex and pixel shader programs that complex being practical. The GF FX doesn’t have much going for it besides clockspeed and drivers.

nVidia cheerleaders have been trumpeting the NV30 for a while as a part that would eviscerate the R300, and that simply isn’t the case; I’ve been warning people about this for a while. Word on the street for a long time has been that the NV30 would be riding a 128-bit bus with DDR-II, still less bandwidth than ATI’s 256-bit monster. For a while nVidia has been trying to de-emphasize bandwidth in favor of what they call programming efficiency. This was certainly to soften the blow when the GF FX finally did show up trailing the 9700 in that respect. Clock for clock, the R300 has more powerful geometry processing as well: the peak rates they’re claiming for a 500 MHz part barely surpass the numbers for the 9700 Pro, a part almost 200 MHz slower.

So when the part doesn’t obviously crush the competition in the traditional performance quantifiers, we get lots of marketing crap about color compression (which ATI has), 16 texture “operations” per “pass” (ATI has that too), and CineFX, which as far as I can tell is a meaningless term that provides no benefits whatsoever (aside from looking good in a PowerPoint presentation and on the back of a retail box). Presumably ATI will crank the clockspeeds and give their product even more memory bandwidth by Feb-March-April, and nVidia will be embroiled in a real knife fight.
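
For what it’s worth, the raw numbers behind that comparison are easy to sketch. Here’s a minimal back-of-the-envelope calculation, assuming the commonly reported specs (128-bit bus with 500 MHz DDR-II on the NV30, 256-bit bus with 310 MHz DDR on the 9700 Pro) rather than anything official:

```python
# Rough raw memory bandwidth math. The bus widths and clocks below are the
# commonly reported specs, not official figures, so treat them as assumptions.

def raw_bandwidth_gbs(bus_width_bits, memory_clock_mhz, transfers_per_clock=2):
    """Raw bandwidth in GB/s: bytes per transfer * transfers per second."""
    bytes_per_transfer = bus_width_bits / 8
    transfers_per_second = memory_clock_mhz * 1e6 * transfers_per_clock
    return bytes_per_transfer * transfers_per_second / 1e9

print(raw_bandwidth_gbs(128, 500))  # NV30: 128-bit DDR-II @ 500 MHz -> ~16.0 GB/s
print(raw_bandwidth_gbs(256, 310))  # 9700 Pro: 256-bit DDR @ 310 MHz -> ~19.8 GB/s
```

Under those assumptions the gap between the two raw numbers is roughly the ~4 GB/s head start mentioned further down; the 48 GB/s figure only makes sense as an “effective” number after compression and Z-buffer tricks are counted.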

I’m not so sure I’ll take ATI’s side in the bandwidth wars. While I doubt you’ll ever see an effective 48 GB/s of bandwidth, you will see more than the raw physical rate the memory is supposed to provide. Unless NVIDIA is completely bullshitting everyone about the framerate claims, we are looking at a card with at least twice the power of the 4600.

48 GB/s sounds pretty ridiculous, though. I mean, if that’s a near-peak theoretical number using all the cache, Z-buffer, and instruction tricks, sure. But real-world, I’m guessing it will perform like a card with about 25-30 GB/s.

But ATI is using all the same tricks. There are going to be varying degrees of effectiveness in the implementations, but both companies are employing the same types of technology. ATI just happens to have a 4 GB/s head start. Next year’s GF FX may be faster than September '02’s 9700 Pro, but I think that’s basically a function of a higher core clock and perhaps better-optimized drivers. But we won’t even really know that until we see benchmarks that aren’t from nVidia’s marketing dept.

ATI is not using the “same tricks,” if only because both companies have patented every last combination of transistors anyone could imagine. NVIDIA might fudge the numbers a bit, but they’re not going to blatantly lie. The worst that could happen is some sort of benchmark-specific driver that they can pull out at whim.

Sure they are. Name something nVidia’s Lightspeed Memory Architecture does that isn’t represented in ATI’s Hyper-Z. They’re both compressing as much as possible, doing fast Z clears, doing visibility checks, etc.
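
To illustrate why those tricks matter at all, here’s a purely conceptual sketch of an early depth test in a toy software rasterizer; it’s nothing like either vendor’s actual hardware, just a picture of how rejecting hidden pixels before shading them is what saves memory traffic:

```python
import numpy as np

# Toy illustration of early Z rejection. Not a model of Hyper-Z or Lightspeed
# Memory Architecture; just the general idea behind "visibility checks".

WIDTH, HEIGHT = 64, 64
z_buffer = np.full((HEIGHT, WIDTH), np.inf)      # "fast clear": one cheap fill
frame_buffer = np.zeros((HEIGHT, WIDTH, 3))

def shade(x, y):
    # Stand-in for an expensive pixel shader plus texture fetches.
    return np.array([1.0, 0.5, 0.2])

def draw_pixel(x, y, depth):
    # Early visibility check: if the pixel is behind what's already there,
    # skip the shading work and the framebuffer write entirely.
    if depth >= z_buffer[y, x]:
        return
    z_buffer[y, x] = depth
    frame_buffer[y, x] = shade(x, y)

draw_pixel(10, 10, 0.5)   # visible: shaded and written
draw_pixel(10, 10, 0.9)   # occluded: rejected before any expensive work
```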