Help save AMD!

So… AMD just took a $1.2 billion loss. We need AMD to survive unless we want Intel's virtual monopoly to become fact. So.

Perhaps I need to sell my computer and build a new AMD-based one!

I can just sell the whole thing I have now, case and all, and start over, or I can sell off the Intel/Nvidia pieces and start anew. Either way, I think I need to build an AMD-based box.

I’d like to stick with their energy-efficient lines at first. If I keep the case, which is a mATX box, I’ll have to keep the total power use under 350 W.
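To sanity-check a parts list against a limit like that, here's a rough back-of-the-envelope sketch in Python. Every wattage figure below is an illustrative placeholder, not a measured spec for any particular part:

```python
# Hypothetical power-budget check for a small mATX build.
# All component wattages are assumed/illustrative, not real measurements.
POWER_BUDGET_W = 350  # the case/PSU limit mentioned above

parts = {
    "CPU (assumed TDP)": 65,
    "GPU (assumed peak draw)": 110,
    "Motherboard + RAM (assumed)": 40,
    "Drives and fans (assumed)": 30,
}

total = sum(parts.values())
print(f"Estimated draw: {total} W of a {POWER_BUDGET_W} W budget")
print("Fits." if total <= POWER_BUDGET_W else "Over budget!")
```

Peak numbers are the ones that matter here, since an undersized PSU fails under load, not at idle.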

How does the 4850 compare to the 9600 GT in power use?

I’m running a Shuttle XPC (SN27P2) with a 4850 and an X2 6000 in it. I think the Shuttle power supply is something like 250 watts, and it handles it perfectly. That’s a nice AMD machine you can build on the cheap, and it’s tiny and quiet.

On the other hand, if I were building a new machine now, I’d be pretty tempted to go Intel. Phenom processors look pretty poor across the board, so I don’t have a decent upgrade path.

AMD will make some good $$$ off the new 4850; it’s by far the best bang for the buck. It’s a good thing they acquired ATI: they should turn a profit there, and hopefully they’ll bounce back once their next architecture comes out (2009?).

The 4850 is awesome. I replaced an X1900 with it, and it’s faster, quieter, smaller, and uses less power.

AMD’s a scrapper. They’ve been here before, and they’ve pulled it out. If they can’t beat Intel pound for pound, they’ll just dominate a different weight class, or compete hard on pricing.

That being said, I’m not going back to AMD until they make their own chipsets. I stuck with AMD based on price/performance for many, many years. Nearly a decade, actually. But all the chipset issues I had to deal with really just killed it for me. I bought a Core 2 Duo a year and a half ago, and I’m not going back.

Intel has more income than AMD has revenue, and 30X their market cap. It’s hard to imagine how AMD could ever catch up at this point.

What’s wrong with nForce chipsets on the AMD platform? I never had any issues with any of them, up to and including the 7xxx series.

I think ExtremeTech is going to publish an energy-efficient PC build article soon. According to the podcast, it will use the 45 W dual-core AMD CPUs as well as their chipsets.

Nvidia chipsets tend to draw more power than Intel and AMD ones.

Build this! It freakin’ rocks as a secondary box – far better than any Intel combo on a bang per dollar/watt basis!

http://www.codinghorror.com/blog/archives/001107.html

Cheap, too. Just slap a 4850 in there (and a slightly upscale CPU) and you’ve got a gaming box that can handle anything.

I have that box down to FORTY-FOUR WATTS at idle now (I went with the fanless Scythe Mini). 44 watts, people! That’s absolutely incredible for that much GPU/CPU power.

I’d say the hot card to get right now is the 4870. Maximum PC’s EIC recommends it. It delivers about 75% of the performance of Nvidia’s $600 card at half the price.

Not to derail this thread too much, but Nvidia just dropped the price of the GTX 280 to $499 and the GTX 260 to $299.

… Make that 75% of the performance of Nvidia’s $500 card, at 60% of the price. Yeah.
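For what it's worth, those two figures reduce to a simple performance-per-dollar ratio. The 75% and 60% numbers are the ones quoted above, not benchmark results, so treat this as a sketch:

```python
# Rough value comparison using the figures quoted in the thread.
# Prices and the 75% performance estimate are approximations, not benchmarks.
gtx_280_price = 500                      # Nvidia's card after the price drop, USD
hd_4870_price = 0.60 * gtx_280_price     # "60% of the price"
relative_perf = 0.75                     # 4870 at ~75% of GTX 280 performance

# Performance per dollar, normalized so the GTX 280 = 1.0
value_ratio = relative_perf / (hd_4870_price / gtx_280_price)
print(f"4870 delivers about {value_ratio:.2f}x the performance per dollar")
```

So even after the price cut, the cheaper card still comes out roughly 25% ahead on value, under these assumed numbers.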

They do make their own chipsets. They’re based on ATI technology.

This notion of corporate charity, or supporting the little guy… well, it’s a little weird. Your purchase won’t matter.

Don’t bother to vote either.

I’m going to help save AMD by buying an AMD HD 4870 X2 as soon as one comes out! The fact that it’s going to be the best card on the market is just an afterthought.

People say the 4850 is a good deal, but I can’t say I’ve read anything suggesting that AMD’s CPUs are worth a crap.

If you’re primarily a gamer, you’ll be GPU-limited FAR before you’re CPU-limited in most modern games. General advice: spend as much as possible on your video card, skimp on the CPU, and upgrade it sometime in the indeterminate future.

Even if you’re not a gamer, the CPU matters a lot less than you’d think unless you’re doing something like raytracing or video encoding. And even that parallelizable crap, once it moves to GPUs, will clobber CPUs of ANY speed.

I mean, how fast does your CPU need to be to open webmail and browse the web, which is what 95% of users appear to be doing exclusively nowadays?