Games that perform better than their predecessors?

Over the past year or two I’ve noticed a wonderful new trend. For me the problem with PCs has always been having to compete in the hardware race, as each year’s worth of games requires more advanced hardware to play properly. But lately I’ve noticed that not only is my two-year-old laptop able to play current-gen games without making sacrifices in visuals, it actually plays newer games better than older ones. Fallout 3 and Skyrim play better on my PC than Oblivion does. And not only does Mass Effect 2 look far nicer, it also runs far smoother than the original Mass Effect! Have you noticed any other games following this pattern? I hope it continues!

I’m pretty sure you can blame consoles for that trend. Expect a reversal when the next generation of consoles is released…

Your examples are CONSOLE games. They’re made for consoles whose hardware doesn’t change for four years or more.
Your average laptop, OTOH, keeps getting better, as previous-gen tech becomes cheaper and ships in laptops by default, replacing the generation before it.
It’s not really a trend.
Also, more and more games are adopting multi-core/multi-threaded designs, whereas older games still ran on a single core and didn’t exploit the hardware’s full potential.
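A minimal sketch of the idea (not from any particular engine): spread per-entity work over however many cores the machine has, where an older engine would run one big loop on one thread.

```cpp
// Minimal sketch: split per-entity updates across all available hardware threads,
// where an older single-threaded engine would do everything on one core.
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

// Stand-in for real per-entity work (AI, physics, animation, ...).
static void update_range(std::vector<float>& positions, std::size_t begin,
                         std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i)
        positions[i] += 1.0f * dt;
}

void parallel_update(std::vector<float>& positions, float dt) {
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = (positions.size() + cores - 1) / cores;
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < cores; ++t) {
        const std::size_t begin = t * chunk;
        const std::size_t end = std::min(positions.size(), begin + chunk);
        if (begin >= end) break;
        workers.emplace_back(update_range, std::ref(positions), begin, end, dt);
    }
    for (auto& w : workers) w.join();   // one frame's worth of updates, in parallel
}
```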

I don’t think the trend is going away. PC game devs/publishers are realizing that if a game can’t be played on someone’s comp, that person doesn’t buy it.

I don’t think enough people buy games just for the bleeding-edge graphics to make it worth pricing out/locking out a significant number of consumers.

Skyrim and Fallout 3 are on the same engine as Oblivion - no doubt it got optimized over the subsequent games. Same with ME. Practice makes perfect and all that.

Supreme Commander 2. Admittedly, they scaled back the ambition of the first game’s engine quite a bit to make it happen, but you can easily get rock solid framerates in that one when the first would barely chug along.

Another possible reason is that, all else being equal, games should perform better if they use DirectX 10+ properly rather than DirectX 9.

I totally agree. I certainly never had a problem with a developer focusing on having “state of the art” everything, UNTIL they later complained about low sales.

I’d love to be able to make 64-bit only DirectX 11 games. Would make life easier. But you’d lose a lot of people for now. It’ll come soon enough.

How is it easier, Brad? I can see how 64 bit might allow faster processing and such, but is DX11 easier to program for?

> I’d love to be able to make 64-bit only DirectX 11 games. Would make life easier. But you’d lose a lot of people for now. It’ll come soon enough.

Are you sure?

I just looked at the Steam hardware survey, and it says “51.94% are DX10, 30.56% are DX11”; a further 10.86% have a DX10 GPU but are using Windows XP, and only 6.18% are still using DX9 GPUs.

DirectX 11 works just as well on DX10 hardware (except for the DX11-only features, of course), so it seems only 5-15% of the market would be lost, and that figure will obviously shrink over time, especially among people who buy cutting-edge games.
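For what it’s worth, the mechanism is D3D11’s feature levels: the same DirectX 11 code path can target down-level GPUs. A minimal sketch (error handling omitted, and the function name is just for illustration):

```cpp
// Sketch: one DirectX 11 code path that also runs on DX10-class GPUs via
// feature levels, which is why "DX11 API" doesn't mean "DX11 GPUs only".
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

ID3D11Device* CreateDeviceForWidestHardware() {
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_0,   // full DX11 GPU
        D3D_FEATURE_LEVEL_10_1,   // DX10-class GPUs run the same code path
        D3D_FEATURE_LEVEL_10_0,
    };
    ID3D11Device* device = nullptr;
    D3D_FEATURE_LEVEL chosen;
    D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                      levels, ARRAYSIZE(levels), D3D11_SDK_VERSION,
                      &device, &chosen, nullptr);
    return device;   // nullptr if not even feature level 10_0 is available
}
```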

Regarding 64-bit: if the game is written properly, it can trivially be compiled and shipped as both 32-bit and 64-bit versions.

So if someone is starting now on a 3D game, DirectX 11, compiled as both 32-bit and 64-bit versions, seems to make quite a bit of sense.
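The “written properly” part is mostly about never assuming a pointer fits in an int. A toy sketch of code that builds cleanly for both targets (the same source then just gets compiled twice, e.g. once each from the x86 and x64 MSVC toolchains):

```cpp
// Toy sketch: source that compiles and behaves correctly as both 32-bit and 64-bit.
#include <cstdint>
#include <cstdio>

int main() {
    int value = 42;
    int* p = &value;

    // Broken on 64-bit: int id = (int)p;  -- silently truncates the pointer.
    // uintptr_t is guaranteed wide enough to round-trip a pointer on either target.
    std::uintptr_t id = reinterpret_cast<std::uintptr_t>(p);

    std::printf("this build is %zu-bit\n", sizeof(void*) * 8);
    std::printf("pointer round-trips: %d\n", *reinterpret_cast<int*>(id)); // 42 on both
    return 0;
}
```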

Yes, and it’s also much more powerful and allows drivers with less CPU overhead.

I’m holding off on 64-bit until there’s wider adoption and support in the market.
Plus, backward compatibility/emulators of sorts, so I can keep playing the older games I own and enjoy dusting off every now and then (you’d be surprised how much fun D1 is, for example, once a year).

I complained for years that the ongoing death of PC gaming was mostly due to the ridiculous rise in system requirements.

It is no coincidence that we’re seeing a renaissance in PC gaming at a time when hardware requirements are being held back by the lengthy lifetime of the current console generation. Even developers who have traditionally pushed the limits of PC hardware are finding that they can’t ignore the consoles, and the result is games that run quite nicely on even fairly modest PCs.

Sure, but what’s the point? If the game is designed to run as a 32-bit version at all then that’s the best version, even on 64-bit Windows. 32-bit versions are smaller and faster. The only reason to go 64-bit is to exploit the larger memory space, and then you can’t have a 32-bit version anymore.

This.

Not so much programming-wise, but it provides a lot of stuff for free. For example, fonts. That may seem like no big deal, but font handling in games is a pain in the butt, and you get it for free in DirectX 11. Also, DirectX 10+ makes it easy to keep textures just on the graphics card and not in main memory, which is a pretty big deal these days.
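To make the texture point concrete, here’s a minimal sketch of the D3D11 side (helper name and omitted error handling are mine): an immutable texture that the driver is free to keep purely in video memory, with no D3D9-style managed copy sitting in system RAM.

```cpp
// Minimal sketch: a GPU-read-only texture in Direct3D 11. With IMMUTABLE usage
// the contents are fixed at creation, so the runtime doesn't need to keep a
// system-memory copy around the way the D3D9 managed pool did.
#include <d3d11.h>

ID3D11Texture2D* CreateGpuOnlyTexture(ID3D11Device* device, const void* pixels,
                                      UINT width, UINT height) {
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width            = width;
    desc.Height           = height;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 1;
    desc.Usage            = D3D11_USAGE_IMMUTABLE;        // GPU-read-only
    desc.BindFlags        = D3D11_BIND_SHADER_RESOURCE;

    D3D11_SUBRESOURCE_DATA init = {};
    init.pSysMem     = pixels;
    init.SysMemPitch = width * 4;                         // 4 bytes per RGBA8 texel

    ID3D11Texture2D* tex = nullptr;
    device->CreateTexture2D(&desc, &init, &tex);
    return tex;                                           // caller checks/releases
}
```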

The point is to be able to make use of the memory difference. I’d like my games to be able to have a lot, LOT more monsters, aliens, and enemies at once in much larger worlds, while having them be very high-detail, complex units, and that is very hard to do in a 32-bit game.
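Back-of-envelope numbers (all figures here are illustrative assumptions, not from any real engine) show the wall pretty quickly:

```cpp
// Illustrative arithmetic only: rough unit counts per address-space budget.
// The ~512 KB per-unit cost is an assumed figure for the sketch.
#include <cstdio>

int main() {
    const double mb          = 1024.0 * 1024.0;
    const double unit_bytes  = 0.5 * mb;            // assume ~512 KB per high-detail unit
    const double budget32bit = 2.0 * 1024.0 * mb;   // ~2 GB usable in a 32-bit process
    const double budget64bit = 16.0 * 1024.0 * mb;  // 16 GB of RAM in a 64-bit build

    std::printf("32-bit: ~%.0f units max\n", budget32bit / unit_bytes);  // ~4096
    std::printf("64-bit: ~%.0f units\n",     budget64bit / unit_bytes);  // ~32768
    // And the 32-bit budget also has to fit textures, meshes, audio, and the engine.
    return 0;
}
```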

Well, if consoles are to blame, then I would like to say THANK YOU, consoles! I always used to say (from a total layman’s perspective) that PCs made PC developers lazy, because they could always rely on next-gen tech to make their games look better, whereas console developers had to squeeze every drop of power they could out of their fixed hardware to get graphical improvements. I like the second option much more :)

Uh, PC development actually began with exploiting every byte out of 64K of memory…
It’s not that PC developers are lazy, it’s simply that there are more of them, with varying quality, compared to the few exclusive wizards of the past.
Today you have many, many programmers, and a big portion of them aren’t that good: they try to just get by, and they write sloppy and/or unoptimized code because the hardware allows it, as its progression creates tolerance for resource-wasting code.

64 is twice as large as 32, so 64-bit games are twice as good. Similarly, 32-bit games were twice as good as 16-bit games while 16-bit games were twice as good as 8-bit ones. Only on the PC do you get this trend. On the console, PlayStation 2 games were twice as good as PlayStation 1 games, but PlayStation 3 games are only 33% better than PlayStation 2 games. Micro$oft was going to fix this by going from the Xbox 360 to the Xbox 720, but that was before the GFC (global fundraising crisis). Now you’ll be lucky to get the Xbox 540, never mind 720. Trust me, I’m a statistician.