…as the largest video card company, measured by current market value. That’s a pretty amazing story, considering that only a year ago Nvidia was four times the size of ATI.
One generation of cards, plus the Xbox2 contract, allowed ATI to double in value while Nvidia simultaneously lost half of its value.
No, it’s because I got fed up with the crappy ATI 7000 series I used to have and its poor drivers and got a GeForce 4. ATI saw me do that and cleaned up their act.
This is why Warren Buffett stays away from tech stocks.
Not completely, though. He put some money into Level 3 a while back, which, surprisingly, landed a deal with Microsoft for the Xbox Live infrastructure not long afterward, among other things.
They were betting that TSMC would have its .13 micron process ready. TSMC failed to deliver, and the first-gen GF FX line couldn’t go into production until TSMC could manage decent yields, which delayed the GF FX launch by six months. Those six months proved to be the difference, because they gave the 9700 and 9500 lines three months of unfettered dominance; those cards didn’t require a .13 micron process to run properly.
You’ll notice how quickly NVIDIA came up with replacements for the disastrous first-gen FX line, since of course the designs were long-since completed and the only thing holding up the launch was the presence of the original FX.
I don’t have a favorite among either company, so please keep that in mind before you scream bias. However, I wouldn’t count out NVIDIA at all. The few glimpses I’ve had of their corporate culture suggest a lean, aggressive competitor. Given their problems with Microsoft, the Xbox deal was a misstep. ATI might have similar issues in the future.
Perhaps. But I don’t think it was just the GF FX delay. The 9700/9800 cards are just plain superior in terms of tech. Add to that the fact that the GF3 was all they had for over two years, and I’m not so sure I’d say it was just a gamble. All of their cards before the FX were just GF3s on steroids. Two years before any kind of tech advancement? That seems somewhat lax to me.
didn’t the acquisition of 3dfx’s intellectual property screw Nvidia? i mean, they had some kick-ass design teams doing this great work, but then all of a sudden they buy all this new technology from 3dfx and then try to weld that with their existing technology, and that was bound to be problematic. after all, nvidia and 3dfx had used totally different approaches to the problem, and trying to reconcile the technology to make it work together had to create a frankenstein of a chip.
yeah, the xbox thing kinda hurt nvidia. they had to devote dev resources to get the chip done in time for microsoft (if you read dean takahashi’s book, there were some really hairy moments for nvidia when they had a huge bug in the chip they couldn’t figure out and the deadline was literally days away. they burned the midnight oil like crazy over there until one of their engineers realized what was going wrong.) but i imagine that trying to incorporate 3dfx’s tech was probably worse.
[quote]Add to that the fact that the GF3 was all they had for over two years, and I’m not so sure I’d say it was just a gamble. All of their cards before the FX were just GF3s on steroids. Two years before any kind of tech advancement? That seems somewhat lax to me.[/quote]
I may be wrong, but wasn’t there a two-year gap between the introductory GeForce and the GF3? And wasn’t the GF2 basically a 'roided out GF1?
My point is that Nvidia’s philosophy may be to introduce a chip, stretch it out for a couple of years (more RAM, increased core clock rates), all the while developing that chip’s successor. If that time frame gets skewed (for example, by the TSMC stuff Jakub alluded to), especially at the introduction to a new chip cycle, then Bad Things ™ can happen.
I need to get a new video card in the next 2-3 months for my gaming machine (the Ti4400 is going to the “backup” machine ;) )
Midnight Son mentioned his “deeply discounted” 9700 Pro. I’m wondering what the real gear heads on the board think the good mid-range (pricewise) card is these days. Assume you’ve got $200-$250 to spend… What would you buy?
[quote]Do you have a link? It’s not that I doubt you, but that I would be interested in reading the article. :)[/quote]
It was in the financial papers yesterday. You can independently verify it by just checking the market caps at Nasdaq, but I don’t have a link handy – read it in www.nationalpost.com, however.
I’ve been very happy with the Radeon 9700s I got. I have one 9700 lite TX model that performs BETTER than the GF4 Ti 4600. My AIW Radeon 9700 totally rocks: it pulls double duty, gaming and recording digital cable shows. Also, ATI figured out that making stable drivers is a good idea.
This is true, but what I find particularly interesting is how well a Radeon 9800 Pro (at $100 less, built on a .15 micron process, only one slot wide, and at lower clock speeds) stacks up against the “got it right” GeForce FX 5900 Ultra.
There’s no denying that the NV30 (5800) cards were the wrong technology for the time, and the Radeon 9700 Pro cards were far better than the rest of the graphics industry was expecting.
But I think it’s more than just a misstep on NVIDIA’s part. I think it’s that combined with a real change in ATI. It’s not just the engineering talent they acquired with ArtX, it’s the corporate culture change that Dave Orton brought in.
I agree that it’s way too early to count out NVIDIA. Jen-Hsun Huang is, by all accounts, an excellent motivator. ATI’s success over the last year is probably just the kick in the ass NVIDIA needs to redouble its efforts and come back swinging, plus one or two other cliches.