Upgrading memory and video to run Oblivion and FEAR

I couldn’t convince myself to upgrade for one game, but now TWO games are out there taunting me. Just wanted to run my idea for upgrading my system to run Oblivion and FEAR past you guys, to be honest.

Current system:
AMD Athlon 64 3200+ Newcastle (Socket 754)
ATI X800 Pro (AGP)
Two 512MB sticks of DDR400

I want to upgrade the memory. I currently have 1GB total. I can either go to 2GB running at DDR400, or 3GB running at DDR333. For Oblivion, which is better?

For the video card, I’m thinking the 7800 GS. I’ve read on several forums that upgrading from a 6800 or X800 to a 7800 GS is like night and day for Oblivion and FEAR (hope they’re right). I thought about upgrading to PCI Express, but to do that I would realistically be looking at a new motherboard, CPU, power supply, and then the video card. A $400-$500 upgrade turns into a $1000 to $1200 upgrade. Which wouldn’t be so bad, except I know I’ll want to replace everything next year when Vista ships, sitting on the new platforms and pretty DX10 cards.

Is there really a big performance gain with 2GB vs. 1GB? I’m thinking you can just get a new video card and you’ll be set.

The bump from 1GB to 2GB is more useful than the bump from 2GB to 3GB.

Oblivion seems to be the first game that really can take advantage of the extra memory. Some people have claimed that moving to 4GB is even beneficial.

Well, what I got by bumping from 1GB to 2GB wasn’t a performance gain so much as a responsiveness gain. For example, I could ALT-TAB while playing Planetside and I’d get to my desktop immediately. With only 1GB of RAM, the hard drive would thrash for 30 seconds before the desktop would appear.

In other words, it alleviates hard drive thrashing.

I don’t know what kind of performance improvements people might say they are getting with more RAM for Oblivion, but I can say that my hard drive does not thrash in Oblivion at the moment, and so I don’t know what benefits more than 2GB of RAM could provide in this circumstance.

I’d hold off on the video card. I have an X800 XL and it runs both games well. While the 7800 is a great card, I’m holding off on an upgrade until the DX10 cards come out.

Rob: your system is not unlike mine. I’ve got an A64 3000+ (S754) AGP PC which I’ve had about 18 months, IIRC. When I made the step up from a 9800 XT to 6800 GT last year, the performance gain in most of my games was substantial; I could crank up the detail settings with no loss in framerate, if nothing else. Since the 7800 GS is even faster than my 6800GT, I’m willing to bet you’ll see a pretty big jump in performance if you play at higher res & detail settings.

OTOH, upgrading from 1GB to 1.5GB has had a much more modest effect on my games, which is a bit disappointing: as Roger says, there’s less disk thrashing, especially when alt-tabbing or running other programs; but most of my games haven’t been any zippier. That said, I don’t have Oblivion, so I couldn’t tell you if it would benefit. And I haven’t played FEAR since upgrading, so I don’t know if extra RAM helps it, either.

Personally, if you’re looking to upgrade now on a tight-ish budget, I’d get a 7800GS (and maybe the extra gig of RAM), then save my pennies until I could afford a full-fledged system upgrade next year, when Vista, DDR2-compatible Athlons, and DX10 video cards are on the market; and SLI / Crossfire have had more time to mature, possibly making dual-video card systems more attractive in a year or so.

Get the RAM, see how the game performs, then consider the videocard. 2GB of RAM is better than 1GB, though it won’t be some sort of magic fix. (I tweaked my INI file for Oblivion to have it use more RAM, though I didn’t notice any real improvement.)
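For reference, the “use more RAM” tweak is just editing a few settings in Oblivion.ini (under My Documents\My Games\Oblivion). The setting names below are the real ones, but the values are only the sort of bumps people commonly suggest for a 2GB box, so treat them as a starting point rather than gospel:

; how many interior/exterior cells the game keeps cached in RAM
uInterior Cell Buffer=16
uExterior Cell Buffer=102
; preload cap in bytes (this example is ~100MB; the stock value is much smaller)
iPreloadSizeLimit=104857600

Back up the INI first; as I said, I didn’t see any real improvement from it anyway.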

If you’re serious about upgrading for Vista, however, then I’d wonder why you’d buy a videocard today knowing you’d be replacing it in less than a year.

Mostly because FEAR has always run like a dog on my system, and fighting off three or more enemies up close in Oblivion at single-digit frame rates is no fun.

I buy/build a new system on average every 18 months. I’m on the 23rd month of the current system, so I am past due. If it wasn’t for Vista coming out, I would have just built a new one and been done with it. Now I’m looking at “just getting by” solutions.

FEAR and Oblivion are pretty much the worst offenders right now, though patches have made FEAR perform a lot better today than it did at launch.

But you already know that. And yes, a 7800GS will be a significant improvement. But if you’re already resigned to upgrading some part or all of your system for Vista, why not consider the motherboard/CPU and a moderately priced PCI-e videocard like the GS today? You’ll get X months of fun, then you can buy a DX10 card and use it with the CPU/mboard you buy right now.

Hmmm, I should look into FEAR patches. I think I’m playing unpatched; I thought they only improved multiplayer.

I would do the motherboard/CPU/videocard/power supply upgrade except that both AMD and Intel are about to jump platforms as well. Why replace the motherboard and CPU now if I’m just going to have to replace them again? If I knew I was going to keep the PCI-e card for more than 12 months, I could swallow the motherboard/CPU pill. Yet looking at Crysis, I just know nothing currently available in the videocard arena can run that decently.

Now I’m fairly sure that I can run Oblivion and FEAR if I run them in ugly mode. Heck, I probably can run Crysis in ugly mode on my current hardware. However, that’s not the point. I’m not in this hobby just so I can skirt by on lowest detail only to “proudly” claim I’ve run the same system for 4 years. :/

It’s a really sucky time to have games that beg for upgrades while all the upgrade paths are so unclear.

The first patch caused fairly major jumps in my frame rates. Definitely check that out before upgrading anything.

Oblivion is probably as good a test as anything right now, except for maybe the “whoa, this must really be next-generation ‘next-generation content’ since it brings an SLI system to its knees” option in Tomb Raider Legend.

I would do the motherboard/CPU/videocard/power supply upgrade except that both AMD and Intel are about to jump platforms as well.

The AMD stuff is pretty late, and I personally would rather they work out the kinks with the first-generation motherboards than wait for the new socket design. A good dual-core CPU today should last you your typical upgrade cycle.

It’s a really sucky time to have games that beg for upgrades while all the upgrade paths are so unclear.

Yeah, it’s a particularly bad time right now.

But if you’re looking too far down the road, it’s always unclear. I wouldn’t even think about Crysis, since it’s a 2007 (at best) game. And nothing will use or require DX10 for a while.

I’m running the Radeon 8500 that I picked up to run Morrowind, so it seems like Oblivion will hasten my way to a video card upgrade. I’m considering trying to limp by on the 8500 using the Oldblivion patch I’ve read about. Has anybody here tried it out, or am I going to be posting my specs here asking for upgrade recommendations?

An 8500? You’re joking?

It is time, Timemaster Tim.

Well, my gaming has gone way down, so I just don’t find I need to keep up. Civ IV is happy as a clam running on the 8500, and Oblivion is currently the only game I simply must play.

I have no idea how much texture memory Oblivion uses for the different texture sizes (small, medium, and large). When Oblivion first auto-detected my video settings it set the texture size to medium, but I think that’s because it thought I only had one 6600 w/256MB instead of two. I cranked the texture size up to large and it looks far better, in my opinion, than the medium textures. My guess is you need 512MB to use large textures without any performance issues, but I’m probably wrong about that.

It jumped me straight to Large with my 256MB 7800 Go GTX. Game runs fine.

It uses less than 256MB at the large texture setting. If you hit the tilde (~) key and press Scroll Lock to cycle through the various debug info, it will show you how much memory Oblivion is consuming.

What a 512MB card would help with, though, is if you plan on using a lot of mods, as some of them combined push usage over 256MB, like some of the new texture-replacement mods.

Oblivion is a very GPU-heavy game, so that’s where you’ll see the most immediate impact. But I’d be careful, it’s a bad time to upgrade. My 6800 GS runs the game fine with large textures and all that, but it needed a bit of tweaking. I also ran FEAR like a motherfucker. I’m going to be holding onto this videocard and my current CPU (3200+ S939) until the middle of next year, when DX10 is around, but I have that luxury. As long as 939 doesn’t fall off the face of the Earth, which I think AMD wants to happen (fuck AMD).

Upgrading from 1GB to 2GB doesn’t make a huge difference with Oblivion. Slightly better load times, less stuttering, that sort of thing. The only game that in my mind really, really hungers for 2GB is Battlefield 2 - that game runs so much better with the extra RAM.

“just getting by” is probably a good idea, if you can grit your teeth for the next several months.

When those Conroe Intel CPUs ship, you can get one of those and get better-than-Athlon64 gaming performance, the best media performance, and upgrade to a good PCIe graphics card and so on. And be so ready for Vista.

  • disclaimer: I have not tested a Conroe-based CPU in-house, but I’ve seen it running and it’s mighty impressive