Stardock Consumer Report for 2012

Sounds like another Indie Fund, which is probably a good thing, depending on the specific terms of the funding offered.

It’s more than just having individuals look unique. Look carefully at, say, Shogun 2: the terrain and the detail on the units themselves. Also note that the battles are pretty much the whole game, with relatively few types of units. Because they’re such good game designers, the game is fantastic. However, there are many types of games we just can’t make until we have more memory. For example, without 64-bit, Shogun 5 isn’t going to look noticeably better than Shogun 2 and its scope won’t change.

Sins of a Solar Empire is a really good example. I’d love to see a fourth race added, but even as-is, we had to lower the texture resolution somewhat in Rebellion in order to fit the new ships.

Furthermore, going the uber-beautiful route has other implications - look at your own WoM as an example - you were stingy with the races because you felt you had to save on models.

It wasn’t a cost issue. It was a memory issue. We simply couldn’t store unique equipment and such in memory for more than 4 models (Men and Fallen, male and female). In the next fantasy game we make, we’re going to just require 64-bit so we can have truly unique-looking races.
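To put rough numbers on that claim (these figures are purely illustrative, not Stardock’s actual asset budgets): a 32-bit Windows process only gets about 2 GB of user address space, and uncompressed texture data for a visually unique model chews through it fast. A minimal back-of-envelope sketch:

```python
# Back-of-envelope texture memory for visually unique unit models.
# All sizes and counts here are hypothetical, for illustration only.

BYTES_PER_TEXEL = 4  # uncompressed RGBA8

def texture_bytes(width, height, mipmaps=True):
    """Bytes for one texture; a full mip chain adds roughly one third."""
    base = width * height * BYTES_PER_TEXEL
    return int(base * 4 / 3) if mipmaps else base

# One "unique-looking" model: diffuse/normal/specular maps for the body,
# plus ten equipment pieces, each with its own three maps.
body_maps = 3 * texture_bytes(2048, 2048)
equipment = 10 * 3 * texture_bytes(1024, 1024)
per_model = body_maps + equipment

budget = 2 * 1024**3  # ~2 GB of user address space on 32-bit Windows
print(f"per model: {per_model / 1024**2:.0f} MB")
print(f"models that fit in half the budget: {budget // 2 // per_model}")
```

Under those made-up sizes, roughly four unique models fit before textures alone eat half the address space, which lines up with the “4 models” constraint described above.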

Now, if you could have had a myriad of actors on screen, each with perfectly motion-captured movements and individually rendered armor and weaponry, plus enough memory and CPU power to simulate each actor’s level of tension, individual stamina, morale, adrenaline level, whatever - would you actually have included all of that? Or would the simple fact that implementing it costs development time, which can’t be allowed to go totally overboard, have put a very heavy damper on what actually gets added, CPU and memory limits notwithstanding?

I’d say the low-hanging fruit is that we just can’t store that much stuff in memory right now. We had to limit map sizes in FE simply because of memory. We wouldn’t even bother worrying about the other factors until we can deal with memory.

Somehow, FPS/TPS developers have managed to craft breathtaking visuals despite still having to live with a 2 GB architecture. Visuals (in the best games) have massively improved since the early days of 3D engines around 2000.

They’re only showing a handful of units at once and you’re seeing less and less variance. Consider how many units you see at a time in Battlefield 3.

Memory = Quality × Quantity. Strategy games always have a large quantity of objects to deal with, so we find ways to lower the number of objects (MOBA games, for instance).
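That equation makes the tradeoff concrete: at a fixed memory budget, every increase in per-object quality directly caps the object count. A toy illustration (all numbers made up):

```python
# Memory ≈ quality × quantity: at a fixed budget, raising bytes-per-object
# ("quality") caps how many distinct objects fit ("quantity").
# The budget and per-object sizes below are made up for illustration.

budget_mb = 1500  # hypothetical asset budget inside a 2 GB process

for per_object_mb in (5, 25, 100):
    max_objects = budget_mb // per_object_mb
    print(f"{per_object_mb:>3} MB/object -> at most {max_objects} object types")
```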

Yet for every game that does something groundbreaking, you see half a dozen games that ignore that stuff - most likely because they are unable to fit it into their development budget.
So I believe that if (and it WILL happen sooner or later) everyone has a 64-bit OS with gigs and gigs of memory and a CPU with 4 or more cores, game developers will find another scapegoat to blame for being unable to make a superior game.

I can’t predict what other developers would do. I can tell you that we’re just going 64-bit regardless because we can’t go any further with 2 gigs. LH will be our last PC game that supports 32-bit because otherwise we’re just nibbling around the edges.

I don’t know about you, but I want to play against Orcs, Trolls, Goblins, Faeries, Halflings, etc. that look visually distinct and have their own unique stuff and play style. To do that, we have to have more addressable space to work in.

And it won’t even be a scapegoat, just simple fact. Beyond FPS and TPS games, how many games will get the super-beautiful and slick treatment the new Tomb Raider enjoyed? Even comparable games with comparable numbers of actors, like RPGs?
These things CONSUME development resources (= money), and a game must be MASSIVELY successful to recoup the investment. Take a look at how disappointed Square Enix was with Tomb Raider’s sales.

RTSes were very, very popular for a long time, and they didn’t stop being popular - look at how popular Age of Empires 2 HD is. The problem is that people wouldn’t be satisfied with a 32-bit Age of Empires 4, because it couldn’t visually look any better than Age of Empires 3 (which forced the player to stay zoomed in because it needed to offload memory).

Added to that is the aforementioned issue of developers shying away from complicated systems in the first place - XCOM having far fewer features than X-COM.

XCOM is an interesting case study on the state of our industry. For XCOM, Firaxis licensed the Unreal engine which is designed for first person shooters. So a lot of the design of XCOM had to be adapted to what the engine would allow.

Or the many envisioned systems in WoM that got axed in favor of a more streamlined game experience. SimCity 5 being more of a SimTown successor. The simple fact that a fair number of ambitious 90s games remain without a proper successor to this very day, even though THEY had to cope with single-core machines often running at less than 100 MHz with a few MB(!) of memory.

SimCity 5 is another good example. Why are the cities so small? Why are the textures on the buildings so much poorer than in the alphas? It’s because of memory.

Yeah, if we had developers going 64-bit only today, we’d have sequels to all these games tomorrow. Sure.

Not tomorrow. But I suspect 2015 is going to start seeing some interesting new strategy games show up.

Will one of those be under the Stardock banner, by chance?

//shameless fishing for information.

Thanks Jeff.

I think what you’re seeing is the difference between a publicly traded company and a privately held one. A publicly traded company is supposed to do whatever it takes to increase shareholder value. By contrast, a privately held company is supposed to do whatever its shareholders think should be done.

Most of what’s already been invested in the past year or so won’t really show up for a couple years yet. But I think we have an opportunity, especially with all the exciting changes going on in our industry, to do something that will “play it forward” for the next-gen. I don’t think it’s healthy if the only decent budget games are made by a handful of mega companies churning out the latest entry in their franchise.

And there are a lot of talented people out there right now. You read about these layoffs and it’s heartbreaking to think of all the talent and experience that might be permanently lost. And as the tech requirements get more severe for new entries, the days of being able to start from scratch are coming to a close - as I mentioned earlier, Firaxis licensed the Unreal engine to do XCOM.

If you lose too much of the talent and experience in our industry, it’ll get harder and harder for us to create new types of high-budget games, because you’ll be stuck having to make games that can realistically be built with Unreal/Unity/DICE/CryEngine/Gamebryo.

Heh. I never commit to dates anymore. :) I learned my lesson.

Supreme Commander runs out of memory because it isn’t particularly efficient, not because it’s impossible to have a game of that type work in under 2 GB. SupCom could use up 2 GB of RAM to handle Seton’s Clutch with 4000 units; SupCom 2 can handle it in 400-600 MB of RAM. And despite what people think, the map is exactly the same size in SupCom 2; it’s a direct import.
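A common way to achieve that kind of reduction is to split per-type data from per-instance data, so thousands of units share one immutable blueprint. Whether SupCom 2 does exactly this is my assumption; the sketch below just illustrates the general technique:

```python
# Flyweight-style split: shared, immutable per-type data vs. tiny per-unit
# state. A generic sketch of one common memory-saving technique; whether
# SupCom 2 works exactly this way is an assumption, not a confirmed fact.
from dataclasses import dataclass

@dataclass(frozen=True)
class UnitBlueprint:            # one per unit *type*, shared by all instances
    name: str
    max_health: int
    speed: float
    mesh_id: int                # index into shared mesh/texture pools

@dataclass
class UnitInstance:             # one per unit on the map; keep this small
    blueprint: UnitBlueprint    # reference to shared data, not a copy
    x: float
    y: float
    health: int

tank = UnitBlueprint("tank", max_health=300, speed=2.5, mesh_id=7)
army = [UnitInstance(tank, x=float(i), y=0.0, health=300) for i in range(4000)]
# 4000 units share a single blueprint; only position/health is per-unit.
print(len(army), "units sharing", 1, "blueprint in memory")
```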

At the very least, 64-bit OS adoption numbers seem to be trending favorably among gamers:

http://store.steampowered.com/hwsurvey

64-bit to 32-bit ratios (converted to percentage shares below):
1.4:1 for Vista
4:1 for Windows 7
13:1 for Windows 8
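Those ratios translate into 64-bit shares like so:

```python
# Convert the Steam survey's 64-bit:32-bit ratios into 64-bit share.
for os_name, ratio in [("Vista", 1.4), ("Windows 7", 4.0), ("Windows 8", 13.0)]:
    share = ratio / (ratio + 1) * 100
    print(f"{os_name}: {share:.0f}% 64-bit")  # 58%, 80%, 93%
```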

The main reasons why 32-bit OSes still exist and why there’s no in-place upgrade path from 32-bit to 64-bit are:

  1. 32-bit Intel Atom CPUs
  2. Most machines that come with 64-bit CPUs are coming with 64-bit Windows, so that upgrade path doesn’t matter
  3. Most customers don’t upgrade anyway

Stardock is also completely disingenuous when they talk about how Microsoft “didn’t give” DirectX 10 to Windows XP. It’s not a matter of “giving” or being nice and favorable. Windows Vista introduced a completely overhauled driver model, built from the ground up for a brand-new OS, and DirectX 10 fully relied on that new driver model. Saying that Microsoft should have “given” DirectX 10 to XP users is akin to saying Microsoft should have given Vista to XP users. XP was based on such ancient technology by the time DirectX 10 came out that it would have been overwhelming, infeasible, and relatively pointless to figure out a way to fit it all on there.

Besides, it’s not like the lack of XP support was the main limitation preventing DirectX 10 from taking off. It’s the simple fact that the vast majority of PC games are cross-platform with consoles, so there was little incentive to build support for it in the early days.

Hmmm, well you would know better than me, LMN8R, but that POV is hardly unique to Stardock… it is shared by basically everyone I know. The way MS went about it sure made it look an awful lot like they were trying to drive Vista upgrades.

Of course new features in a new OS are an attempt to drive purchases and upgrades to that OS. But focusing exclusively on DirectX as one of those features and saying it should have been back-ported ignores the billion other things that feature inherently requires to function properly, to the point where you’re essentially asking, “Hey, you know that new OS? Can you back-port that new OS to the old OS?”

There are so many dependencies and such an overwhelmingly huge level of architectural change that this isn’t a simple matter of a crude business-level decision driving a technical decision. I’d argue that it’s an incredibly good thing for the PC market as a whole. Had Microsoft decided to work some magic and do that extraordinary amount of work, it would have either come at the expense of DirectX itself (since it would likely have needed to be extremely compromised to achieve this), or it would have taken so long that it would have delayed Windows 7 significantly.

The people with the “point of view” that this was a crude business decision also had the “point of view” that DirectX 10 would be “hacked” into XP. That absolutely never happened, because it was so technically infeasible to accomplish. Every attempt ended in utter disaster, yielding horribly buggy and unacceptable results.

Brad, first of all thanks for your very extensive reply.

Wait, what?!? Why would you need to store all that stuff in memory all the time? Why is this in any way more complicated than, say, Diablo 3, which has a metric ton of creatures in it, all of which look way better than anything you can see in WoM? And if you push it, you can see almost as many actors on screen at once as in a WoM battle, too.
Furthermore, D3 has tons and tons of visually distinct equipment - MUCH more than WoM - and you can mix and match it as you please, as should be the case in a 3D-based system.
After these explanations, I’m not sure you tried to use the canvas you have in the most efficient way before demanding a bigger one.

I’d rather cite Diablo 3. If I played any, I’d possibly cite some true MMORPGs, some of which are bound to have tons of actors on screen at once, all belonging to a variety of races and equipped with gear from an immense selection. How can they even function if you run out of memory after four models and a few handfuls of equipment?

Well, good luck with that. I’ll wait for that game you’ll make to see how much of your plans come to fruition.

You mean like in Master of Magic? :p
Must be 3D you say? How about in Warlock? Or in Fall from Heaven? Or in Mark of Chaos? Or in Rise of Legends? Or in Spellforce?
If only any of those games had managed to put more than two humanoid species into memory - think of the possibilities!
Wait a second …

That’s a bold assumption. Firstly, much to my dismay, yes, RTSes are much less popular than they used to be. And, I guess it’s arguable, but I think StarCraft 2 looks better than AoE3. It has a number of quite different units, too - sometimes a large number are even on screen at once.
I have a VERY hard time believing memory limitations were the reason no AoE4 was made…

Jake Solomon claimed they had the original game working and only then started removing stuff to streamline the experience. I strongly suspect this is not entirely true, but that’s what he claimed.
Other than that, yeah, possibly Unreal Engine limitations played a role - for example, as to why there are no randomized maps.

Hmm. I seem to recall a previous SimCity game with many more building types and decent textures, both piled upon almost endlessly by a very active modding community. Must’ve dreamed that up.

We’ll see, I guess.


rezaf

SupCom 2 has a smaller unit roster. SupCom has what, nearly 400 different units? And SupCom 2 has about half that per side, and 3 sides rather than 4 - so more like 150 units…

LMN8R - And so? There should STILL be an in-place 32-bit to 64-bit upgrade path for Windows. Even if it’s a virtual reinstall, as long as people keep their data… (I’ve been asked to do it before; I just refuse - it’s a wipe-and-start-over scenario that’s far more pain than it’s worth.) Even today, the Steam survey shows over a quarter of users’ systems are not DX10-capable! If I were looking at making a PC game, a non-AAA game, then 32-bit and DX9c is still a no-brainer.

Rezaf - There are only a handful of creatures on any given level in D3. In a multiplayer RTS, ALL the units can be built. The “units on screen” issue isn’t the same thing at all - that’s generally GPU-bound.

Both SupCom and SupCom 2 ran on the 360; in the case of the original, the map detail and resolution were changed, not the quantity of units, since the units consumed much of that RAM. Also, the Lua code that ran most of the unit logic was moved into native code for SupCom 2, a huge savings in both RAM and required CPU power.
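Why does moving logic out of a scripting layer save memory? Every live scripted object drags dynamic runtime state along with it, while compiled code can use a fixed layout. The SupCom 2 internals aren’t public, so this only illustrates the principle, using Python’s dynamic vs. fixed-layout objects as a stand-in for Lua vs. native:

```python
# Rough illustration of why moving per-unit logic out of a dynamic script
# layer saves memory: dynamic objects carry a per-object attribute dict,
# fixed-layout objects don't. (The real Lua-to-native numbers for SupCom 2
# aren't public; this only demonstrates the principle.)
import sys

class ScriptedUnit:             # attributes live in a per-object dict
    def __init__(self):
        self.x, self.y, self.health, self.target = 0.0, 0.0, 300, None

class FixedLayoutUnit:          # fixed layout, no per-object dict
    __slots__ = ("x", "y", "health", "target")
    def __init__(self):
        self.x, self.y, self.health, self.target = 0.0, 0.0, 300, None

s, f = ScriptedUnit(), FixedLayoutUnit()
print(sys.getsizeof(s) + sys.getsizeof(s.__dict__), "bytes (dynamic object)")
print(sys.getsizeof(f), "bytes (fixed layout)")
```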

Rezaf, an ARPG like Diablo 3 and a strategy game like Elemental or Civilization or Supreme Commander are orders of magnitude apart in how many units they display and track at once. I highly doubt Brad is talking about a tactical battle being limited by 2 GB of memory; he’s talking about the game in general. All those models and units are running around the world map. In an ARPG, you only ever display what, 20-30 units at once? Everything else is safely put to sleep, or kept in different zones altogether, until activated by an approaching player. You also don’t have all the other factors involved, like AI for multiple civs, etc.
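That put-to-sleep-until-a-player-approaches behavior can be sketched generically (this is a textbook version of the technique, not any particular engine’s code):

```python
# Proximity activation: only entities near a player are simulated each tick;
# everything else sleeps. A generic sketch of the technique, not any
# specific engine's implementation.
import math

ACTIVATION_RADIUS = 50.0

def tick(entities, players):
    """Run one simulation step; only entities near a player wake up."""
    awake = 0
    for e in entities:
        near = any(math.dist((e["x"], e["y"]), p) < ACTIVATION_RADIUS
                   for p in players)
        e["awake"] = near
        if near:
            awake += 1  # a full AI/animation update would run here
    return awake

entities = [{"x": float(i % 1000), "y": float(i // 1000), "awake": False}
            for i in range(10_000)]
print(tick(entities, players=[(10.0, 0.0)]), "of", len(entities), "entities awake")
```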

What benefit do you get from two cores feeding the GPU that couldn’t be adequately addressed by having one core do geometry/lighting/etc. setup and another core just pipelining that setup to the GPU? Do modern GPUs have the ability to separately process two sets of independent data (say, a thread per buffered frame or the like)?
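For what it’s worth, the one-core-sets-up, one-core-submits arrangement described there is a classic two-stage pipeline. A minimal sketch, with a bounded queue standing in for the buffered frames and a no-op standing in for actual GPU submission:

```python
# Two-stage frame pipeline: a setup thread builds frame data while a submit
# thread feeds the (stand-in) GPU the previous frame. Sketch only; real
# engines do this with graphics-API command buffers.
import queue, threading

FRAMES = 10
pipe = queue.Queue(maxsize=2)   # depth 2 ~= double-buffered frames

def setup_thread():
    for frame in range(FRAMES):
        draw_list = [("mesh", frame, i) for i in range(100)]  # fake setup work
        pipe.put(draw_list)     # blocks if the submit side is 2 frames behind
    pipe.put(None)              # sentinel: no more frames

def submit_thread():
    while (draw_list := pipe.get()) is not None:
        pass                    # real code would submit draw_list to the GPU

t1 = threading.Thread(target=setup_thread)
t2 = threading.Thread(target=submit_thread)
t1.start(); t2.start(); t1.join(); t2.join()
print("pipelined", FRAMES, "frames")
```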

Kevin, I’m aware of this, which is why I also cited some strategy games further down. Also, technically, all those models and units are “running around” on Diablo’s map just as well. It’s just that in Diablo, new models are “activated” when the player moves near them, while in WoM this has to happen when the cursor moves close to them.
The reason I picked Diablo over the strategy games is the modular nature of WoM’s units, which happens to also be the case in Diablo 3 (to a MUCH larger degree, even). You can take a number of base models (male and female versions of all 5 classes) and equip them with a selection from a gargantuan array of weapons and armor of different types. On top of that, you can add the huge selection of actual monsters that have unique looks.
Of course an efficient engine needs to “build” these models on the fly, and not keep them all (or even all possible combinations) in memory all the time.
In early Civ5, you could actually see this effect in action whenever you scrolled - you could literally see the models being created and textured as you moved the screen.
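That compose-on-demand approach can be sketched like this: build a combined model only when it’s first needed and cache it, instead of pre-baking every base/equipment combination (a generic illustration, not WoM’s or Diablo 3’s actual code):

```python
# Compose unit models on demand and cache the results, rather than keeping
# every base-model/equipment combination resident. Generic illustration of
# the technique described above, not any specific engine's code.
from functools import lru_cache

@lru_cache(maxsize=64)          # keep only recently used combinations
def build_model(base: str, equipment: frozenset) -> str:
    # A real engine would merge meshes and composite textures here.
    return f"{base}+{'+'.join(sorted(equipment))}"

# Millions of possible combinations, but only on-screen ones get built:
print(build_model("female_barbarian", frozenset({"axe", "plate"})))
print(build_model("female_barbarian", frozenset({"axe", "plate"})))  # cache hit
print(build_model.cache_info())
```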

As for AI and whatever else might be notable - I referenced all this in an earlier post, but Brad specifically explained that graphics are the main problem; everything else is negligible.


rezaf

Yeah, Intel’s been super shitty with 64-bit support, and that’s to blame in a lot of ways for how long the Windows transition has taken. There’s been no technical reason for them not to support x86-64 across their entire product line; it’s just been an arbitrary feature they’ve used to differentiate price points, much like they’ve done with hardware virtualization and other things.

Those luddites are still playing CS 1.6 and won’t be interested in anything else.

There’s no good justification for games that don’t require high-end 3D hardware to require 64-bit CPUs. “The world” is a bigger overall market than North America, growing at a much faster rate, and a few generations behind on hardware. Going 64-bit only right now would pretty much take you out of multiple markets.

There’s always the possibility that more people would upgrade to “the future” if there were more applications that required it. I.e., someone has to go first and innovate so the rest can follow.

But from the looks of it, the future will be tablets and mobile thingies running apps in walled gardens and 64-bit PC/Options will be a niche.

I disagree. Origin Systems did VERY WELL not catering to the installed base.