Stardock Consumer Report for 2012

You’re talking about an OS that until recently needed a reboot to change the network address.

Production costs for “cutting edge” were a hell of a lot lower in 1990. The break-even point is entirely different.

Origin also existed in a time when it was perfectly normal (and necessary, to a degree) to buy a new PC every other year just to be able to play newish games.
33MHz. 66MHz. 100MHz. 133MHz. 166MHz. And so on.
For a while, having a new GPU was equally important, but I think that was actually post-Origin already.

These days, however, it’s different. When I picked up BSI I was visiting my parents, and I still have my old gaming rig there - about seven or eight years old, I think: a wimpy dual core with 4 gigs of RAM running Vista (and XP on another partition) and a GPU that was cost-efficient at the time (and whose drivers haven’t been updated in an eternity). Much to my surprise, BSI ran perfectly fine on it. It sure looks prettier on my actual PC, but it doesn’t run any better. Both at 1900x1020.

Personally, I blame the weakness in desktop PC sales MUCH more on the fact that these days even a gamer hardly NEEDS to upgrade in a remotely timely manner than on all those tablet shenanigans.


rezaf

I have to wonder whether or not we’d be having this conversation if Microsoft had no game consoles to protect.

There was an article recently on Eurogamer about how to possibly match / outmatch a PS4 / next Xbox today by carefully choosing PC components.

My current PC (Core 2 Duo E8500+, 4 GB RAM, Radeon 4870) is getting old now, and I’ve refrained from trying to play certain games because I want to play them with most / all bells and whistles at 1680x1050. On the other hand, I have a backlog that will last me a lifetime, so I’m in no hurry to upgrade.
Now imagine if there are components available by the end of this year that match a PS4 / next Xbox and are reasonably priced (I’m not talking Titan)! That would mean I might have no reason to buy a new system for another 5-6 years, until the next-next-gen comes along.

There is also a problem for Microsoft as I will probably go from my XP / Win7 installation to a Win7 / Win8.1 installation and then be done with their operating systems for 5-6 years as well. I have a Win7 license and got a Win8 license dirt cheap at release (I assume Win8 users will be able to get Win8.1 free unless Microsoft plans to piss off people even more).

I will most likely wait until the PS4 is out and / or the next Xbox is revealed before buying my next system. Last time I made the mistake of going with a Core 2 Duo instead of a Quad, as most people told me that no one would need a Quad for years (GTA IV says “Hello!” :p ).

Windows 8.1 will be a retail upgrade, not a service pack, the same way Windows 7 was considered a point release of Vista internally.

Just for clarification, Windows 8.1 equals the “Windows Blue” thing that’s making the rounds, or rather the first step of it, right?


rezaf

Yeah, Windows Blue and Windows 8.1 are the same thing. Presumably neither is the final name, but it is the next full version of Windows following Windows 8. I wouldn’t expect any favors from MS on pricing for existing Win 8 users, either. I certainly didn’t get any consideration when 7 came out as someone who paid for the Ultimate version of Vista. They were only giving such low prices for Windows 8 because it was such a trainwreck.

Every Origin release was the equivalent of Crysis back in the day, and those are exactly the kinds of games that can and should push hardware. They’re showcase games.

We’re not talking about those kinds of games here. We’re talking about more niche-y games that are often played by people who aren’t staying on the bleeding edge, because their genres don’t really demand that kind of horsepower. That audience is strong in places where you don’t really think about PC gaming, like Turkey and South America. They play tons of strategy games and RPGs in those places, and they’re a huge growth area for F2P games and browser games because both types typically run on slower hardware.

North America is no longer the most important PC gaming market; the rest of the world dwarfs it in absolute numbers and in willingness to actually spend money on PC-only games. You cater to one while ignoring the rest at your own peril. North America is quickly shifting to phones and tablets; desktop sales are dying, and laptops are holding steady but may start flattening out here.

Why in hell would you have a Windows 7 32-bit install on a 64-bit machine?

Shut up! I hate you!

Well yes, but you agree it’s something they should do, right?

OEMs installing it, the lack of a 32-bit to 64-bit upgrade path since XP… corporate installs, people not knowing better, even.
My point is that it should be an easy fix. Let people get a 64-bit OS running without needing to wipe and reinstall.

Considering all the crap that most people have on their computers, a fresh installation would probably be just what the doctor ordered. A migration tool that could easily ‘grab’ the user content and extract it to a USB device, a VHD or something similar - or even a new partition on the HD after shrinking the installed one to make space, or upload it to SkyDrive (with encryption + password) for retrieval later on - should take care of any issues with “oh noes, I have to reinstall Java v1.6U24 and Adobe Acrobat 6”. Perhaps the tool could even verify installed packages and warn the user about security issues - akin to Qualys BrowserCheck - so they could migrate what they want (that is, what is safe) and have the rest available for ‘one click installation’ once they are riding the wave of the future.

Having said that: most users would probably just want to press a button and have it all handled, including the 36,743 “Copy of” files, the .~tmp and *.jpg.pif files, the “Shortcut (6)” files, the Eggplant theme, the insecure applications, etc.
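To make the idea a bit more concrete, here’s a minimal sketch of the “grab the user content” step. The folder list, the target drive, and the lack of encryption are all assumptions for illustration only - this is not how any real Microsoft tool works:

```python
# Hypothetical sketch of the "grab the user content" step described above.
# Folder list and target drive are illustrative assumptions; a real migration
# tool would add encryption, package verification, SkyDrive upload, etc.
import shutil
from pathlib import Path

USER_FOLDERS = ["Documents", "Pictures", "Music", "Desktop"]

def grab_user_content(user_profile: Path, target: Path) -> None:
    """Copy the user's content folders to a removable drive, VHD mount or new partition."""
    for name in USER_FOLDERS:
        src = user_profile / name
        if not src.is_dir():
            continue  # folder doesn't exist on this install, skip it
        print(f"Copying {src} -> {target / name}")
        shutil.copytree(src, target / name, dirs_exist_ok=True)

if __name__ == "__main__":
    # "E:/migration" stands in for the USB device / VHD / shrunken partition.
    grab_user_content(Path.home(), Path("E:/migration"))
```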

I think you have a bit of a rosy view of that which you’re suggesting here.

Basically every strategy game keeps everything in memory at all times - because there’s really no other choice. As KevinC pointed out, you can jump to any section of the map at any time, and at that location you might see virtually any unit in the game. This is not a challenge that games like Diablo 3 have to tackle.

Sure, you might argue that in the first few turns you don’t see the late-game stuff so why not just load that up when you need it? The problem is that engine architects must build for the worst-case scenario, not the best-case one. If it’s possible on the very last turn to have 95% of the units on the map, then your job is no simpler if that’s not also a requirement early on.
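To put that worst-case point in code, here’s a toy sketch - the unit names and asset sizes are completely made up, it’s just the shape of the problem:

```python
# Toy illustration of the "design for the worst case" point above.
# Unit names and asset sizes are invented; a real 4X has hundreds of entries.
UNIT_ASSET_SIZES_MB = {
    "warrior": 12,
    "archer": 14,
    "tank": 30,
    "stealth_bomber": 45,
}

def preload_all_units(catalog):
    """Load every unit asset up front: any tile you jump to may show any unit."""
    loaded = {}
    for name, size_mb in catalog.items():
        loaded[name] = bytes(size_mb * 1024 * 1024)  # stand-in for reading the real asset
    return loaded

assets = preload_all_units(UNIT_ASSET_SIZES_MB)
print(f"Resident unit types: {len(assets)}, ~{sum(UNIT_ASSET_SIZES_MB.values())} MB total")
```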

You suggest that assets be built on-the-fly. This is certainly possible, but the more you do the more expensive it gets. If this generation process is even marginally too slow for some users it becomes completely unusable. And if that happens, the price is either cutting assets from the game or redoing them all to take up less memory. Not a risk you want to run.

As you note, Civ 5 does keep a low-res cached version of the terrain in memory at all times and then builds a high-res one as the camera moves over an area. The fidelity of the map dictates that this was the only way it could have been done. However, the entire set of units, buildings, etc. is actually stored in memory at all times, as it doesn’t really make sense to keep “half” assets of objects around. Not only would doing so greatly increase the workload for the art team, but you’d also need to transfer any state data (e.g. animations) over between the two seamlessly enough that players don’t notice.
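A rough sketch of that two-tier pattern - low-res resident everywhere, high-res built only for what the camera can see. This is just the general idea, not Firaxis’s actual code:

```python
# Sketch of the low-res-resident / high-res-on-demand pattern described above.
# Not Civ 5's implementation - tile names and "build" steps are placeholders.
class TerrainCache:
    def __init__(self, map_tiles):
        # A low-res stand-in for every tile stays in memory at all times.
        self.low_res = {tile: f"lowres({tile})" for tile in map_tiles}
        self.high_res = {}

    def on_camera_moved(self, visible_tiles):
        # Build high-res data for what's on screen, evict everything else.
        for tile in visible_tiles:
            if tile not in self.high_res:
                self.high_res[tile] = f"highres({tile})"  # the expensive build step
        for tile in list(self.high_res):
            if tile not in visible_tiles:
                del self.high_res[tile]

    def render_tile(self, tile):
        # Fall back to the resident low-res version while high-res isn't ready -
        # that window is exactly where "pop-in" lives.
        return self.high_res.get(tile, self.low_res[tile])

cache = TerrainCache(["A1", "A2", "B1", "B2"])
cache.on_camera_moved(["A1", "A2"])
print(cache.render_tile("A1"), cache.render_tile("B2"))
```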

And “just” the map work required some of the most skilled programmers in the industry, using low-level features of modern video cards that even recent AAA shooters rarely, if ever, touch. Firaxis was incredibly fortunate to have a former Microsoft DirectX architect on board to steer this groundbreaking effort. Not every studio is quite so lucky.

Now, that having been said, development teams certainly have complete control over the style and cost of their art assets. Yet another reason why I’m a fan of 2D games!

  • Jon

While I’d agree it’s something that would be awesome, I think, just from a driver perspective, it’s an impossible battle at this point to try to do it without 10+% of existing installs puking on the process. There are so many driver/hardware configs out there that there’s no way they could test even a tiny fraction of them…

Jon, first of all, thanks for taking the time to write this all down - it’s interesting to read the “inside perspective” on those things.

It’s hard (and pointless) to argue, from what’s basically an armchair perspective, with someone who has actually shipped strategy games. I just can’t help but wonder how it works in other games.
You say Civ5 was a bleeding-edge graphics effort - but this “pop-in” effect was actually far too noticeable early on - and, I know you should know better than I do, but it wasn’t just low-res vs. high-res. Well, I guess it depends on how you define low-res. I remember scrolling, and I always had seemingly untextured models for a second before they slowly (well, over the course of one to three seconds, depending on the zoom level), one by one, were replaced by actual nice-looking models.
This made the “feel” of the map much inferior to Civ4, which never had that issue.
Much later, when I picked up the game again for G&K, the effect wasn’t nearly as noticeable (still on the same computer, of course).

As for keeping EVERYTHING in memory, I’m really surprised to hear it actually works this way. I can’t imagine MMOs, where a TON of actors can be on screen at any time with a huge variety of completely customizable gear, really going that route as well.
I mean, you wrote all strategy games do it, but if there are techniques to deal with such requirements, they should be useful in any genre, no?

And I’m sure there are solutions to the various problems that present themselves (some of which you have pointed out) - especially in a turn-based game. For example, I thought Stardock’s idea to just fade to a painted map when zooming out was cool. The end result … could’ve been better, but the idea was really neat. And it saves the engine some work when zoomed out, I’d imagine.

Btw., I’m also a fan of 2D games - the better ones look great to this day, while early 3D looks horrible and even later 3D without an AAA budget can look rather crude. I’d even say 3D needlessly complicates games like Civ, because clarity is traded for single-shot eye candy, but I’m surely in a minority there.

In any event, it wasn’t really my idea to go down this technical route - my main point, and it still stands, is that, for whatever reasons, developers these days shy away from complicated games with a lot of possibilities and gravitate towards “simpler is better” - you’re a major advocate of this idea yourself - and this is what keeps grand games from being made, not memory limits.
I’m on Win7 x64 with 16 gigs of RAM, so I personally couldn’t care less if x86 support was dropped.

Finally, thanks again for your opinion on those things - much appreciated. The same to Brad, of course.


rezaf

From my understanding of things, the shared video and system memory is going to be a hard thing to duplicate on a normal PC if it’s used to its full extent. Imagine scene changes that don’t need textures loaded to the video card across the bus, but just need a pointer passed to tell the GPU where to start rendering. I don’t think DirectX has any provision for shared-memory tricks like that, does it? I’d think it would take an ungodly PC to be able to match the performance people will be able to squeeze out of the PS4 once programmers get up to speed on it.

As someone who rolled their own Linux distribution at one time, I would say it’s probably not worth the effort for them to do this, as it’s likely a tiny minority who would want it. It would also be rather costly, as they would need to install all the 64-bit binaries and libraries, which is effectively a complete reinstall. I certainly wouldn’t budget the time for someone to script and test the entire installation procedure unless a very solid business case was made for the demand being there. They already provide a method of backing up all your user files and settings; I think that’s the best you’re going to get. If you want to make something like this a lot easier on yourself, stop putting everything on C:

My pleasure. It’s a great discussion and I wanted to throw my two cents in, since it’s a topic near and dear to my heart. It’s important that everyone acknowledge the technical limitations that exist, because that allows you to design for them rather than around them.

I’ll probably stick to 2D forever - and there are drawbacks to this approach as well (such as combining things in interesting ways… a mix-and-match equipment system is virtually impossible in a 2D game). But at the end of the day I’m a designer, and I want as much flexibility from my technology as possible. 2D is less expensive on both the art and tech side, and opens the door to a wide variety of games that I could never even consider were 3D the only option.

Anyways, I digress!

It’s hard (and pointless) to argue, from what’s basically an armchair perspective, with someone who has actually shipped strategy games. I just can’t help but wonder how it works in other games.

You say Civ5 was a bleeding-edge graphics effort - but this “pop-in” effect was actually far too noticeable early on - and, I know you should know better than I do, but it wasn’t just low-res vs. high-res. Well, I guess it depends on how you define low-res. I remember scrolling, and I always had seemingly untextured models for a second before they slowly (well, over the course of one to three seconds, depending on the zoom level), one by one, were replaced by actual nice-looking models. This made the “feel” of the map much inferior to Civ4, which never had that issue.

Much later, when I picked up the game again for G&K, the effect wasn’t nearly as noticeable (still on the same computer, of course).

This is the challenge of PC development when compared with closed platforms like consoles. Phones are starting to drift into the same territory as PC now that the market is becoming so fragmented. But mobile games… shudders

As for what was causing your particular issue - I have no idea. It could be related to drivers, a specific bit of code in the engine or perhaps optimization. Drivers are often the culprit, particularly in games that use some of the fancier, less-often-used features (like Civ 5).

But it just goes to show what can happen if your sophisticated build-it-on-the-fly processing doesn’t work quite right.

As for keeping EVERYTHING in memory, I’m really surprised to hear it actually works this way. I can’t imagine MMOs, where a TON of actors can be on screen at any time with a huge variety of completely customizable gear, really going that route as well.

I mean, you wrote all strategy games do it, but if there are techniques to deal with such requirements, they should be useful in any genre, no?

It very much depends on the design of your game.

If you can “teleport” to anywhere in the world and the expectation is that this will be instantaneous, then everything has to be immediately accessible. This is the major challenge posed by 4X game development, and why games that “look so simple” can be such performance hogs.

But if you have zones or levels, or it simply takes a while to get from here to there, you have the option of loading assets in and throwing them out when they’re no longer needed as the game continues running. This is the only way open-world games like GTA and Skyrim are possible. Even then, if you go into a building or something, there is some load time, as keeping building interiors in memory at all times isn’t feasible with the small amount of RAM the current generation of consoles provides.

4X games could utilize this technique, but only if people were willing to wait several seconds every time they clicked on the minimap, or to just scroll around slowly like a character in an RPG. I’m sure some people would be fine with this, but it would be an unacceptable trade-off for most.
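For contrast, here’s a bare-bones sketch of the zone-style streaming approach described above - the zone names and the “load” step are placeholders, it’s just meant to show where the loading hitch ends up:

```python
# Bare-bones sketch of zone-based streaming: keep the current zone and its
# neighbours resident, evict the rest. Zone names are placeholders.
ZONE_NEIGHBOURS = {
    "city": ["outskirts"],
    "outskirts": ["city", "wilderness"],
    "wilderness": ["outskirts"],
}

class ZoneStreamer:
    def __init__(self):
        self.loaded = set()

    def enter_zone(self, zone):
        wanted = {zone, *ZONE_NEIGHBOURS[zone]}
        for z in wanted - self.loaded:
            print(f"loading assets for {z}")    # this is where the load hitch lives
        for z in self.loaded - wanted:
            print(f"unloading assets for {z}")
        self.loaded = wanted

streamer = ZoneStreamer()
streamer.enter_zone("city")        # slow the first time
streamer.enter_zone("outskirts")   # only the delta gets loaded
# A 4X minimap click is effectively enter_zone() to an arbitrary, far-away
# zone, so the whole delta would have to load while the player waits.
```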

And I’m sure there are solutions to the various problems that present themselves (some of which you have pointed out) - especially in a turn-based game. For example, I thought Stardock’s idea to just fade to a painted map when zooming out was cool. The end result … could’ve been better, but the idea was really neat. And it saves the engine some work when zoomed out, I’d imagine.

Btw., I’m also a fan of 2D games - the better ones look great to this day, while early 3D looks horrible and even later 3D without an AAA budget can look rather crude. I’d even say 3D needlessly complicates games like Civ, because clarity is traded for single-shot eye candy, but I’m surely in a minority there.

Immersion can be provided in many ways. I like both approaches, but many people who play games mainly for the “feel” aren’t willing to play 2D games any more. There’s no arguing that the look has a dramatic impact on this, so I can’t begrudge them too much. It’s just something you have to accept as a creator of 2D games.

Personally, I find the benefit of dramatically lower development costs to be more than worth the lost sales. Then again, I enjoy more niche, exploratory games, so there’s no chance of one becoming a multi-million seller anyway. Other studios can’t take that risk.

In any event, it wasn’t really my idea to go down this technical route - my main point, and it still stands, is that, for whatever reasons, developers these days shy away from complicated games with a lot of possibilities and gravitate towards “simpler is better” - you’re a major advocate of this idea yourself - and this is what keeps grand games from being made, not memory limits.
I’m on Win7 x64 with 16 gigs of RAM, so I personally couldn’t care less if x86 support was dropped.

I agree, this has been a trend over the past twenty years, and it’s almost entirely due to cost. $500k projects allow the opportunity to experiment much more than $50M projects. And I don’t blame the people writing these massive checks for playing it safe - honestly, if I had $100M in the bank, about the last place I’d invest in would be games.

  • Jon

Is the 2D lost-sales argument as big of a deal now for non-AAA games, especially those that cater to an older crowd? Those of us who grew up on sprites for the most part don’t have a problem with them today. Also, some in the younger generation do appreciate good sprite work. The successful crowdfunding campaigns for Skullgirls and Cryamore are examples of that.