Sweeney: "The World Should Revolve Around My Needs"

Interesting interview here with Tim Sweeney of Epic, whose tagline is “PCs are good for anything, just not games.”

Summarizing perhaps a bit unfairly, here’s what he says:

(1) People aren’t buying expensive enough PCs.
(2) Even the expensive PCs aren’t good enough to run his games.
(3) People who buy cheaper machines with Intel integrated graphics are giving their money to Blizzard instead of Epic.
(4) This aggression cannot stand. The solution is that everyone except us should change what they’re doing and buy machines with more expensive graphics hardware.

What I find sort of amusing about this discussion is that the only time the word “stability” enters it is when Sweeney talks about Epic’s development environment (“Do as we say, not as we do!”). This reveals a fascinating truth that doesn’t get much discussion among hardcore gamers: cutting-edge graphics solutions tend to make your computer crash more. (footnote 1)

The high-end video card market is a dog-eat-dog market where the margins are high. Both hardware and drivers are revved frequently, more frequently than the marketing names of the cards might indicate. As a developer, if I had a buck for every time I encountered a situation where software behaved differently on two allegedly “identical” video cards, I’d be rich. Likewise, I’ve heard my friends in the Microsoft OS group talk in great detail about graphics drivers negatively impacting system stability (this detail usually involves phrases like “utter and complete shit”, sometimes coupled with the word “pigfuckers”).

All of which is a way of saying: complain about integrated graphics all you want, but computer and electrical engineering is, inherently, all about tradeoffs. What Intel is selling in their integrated chipset is more than just “crappy performance and low cost”. They’re selling some specific level of performance (“Good enough for X% of PC users to accomplish everything they need.”), along with low cost, low space, low power usage, low heat, and a certain level of stability.

Games are, for most of us, only a part of what we use our computers for. It’s not clear to me that a computer manufacturer – or a consumer – who accepts the tradeoff offered by the Intel chipset is making a bad choice. And if you insist on defining “the PC games market” as “those consumers who are willing to pay more money for a faster but louder, hotter, larger, less stable product,” then no wonder you think the market sucks. The market you picked sucks.

In summary: I wish Epic were a publicly traded company, so that I could short them.

  • Footnote 1: Cue response of “Well, my computer has never blue screened, so you must be doing something wrong.”

Actually, it’s not that it’s a bad market, just a limited one. The real failure is that the video card makers haven’t been able to get Intel to integrate at least their older hardware into cheap Intel motherboards. The Intel integrated graphics are crap, and have been crap for nearly a decade, but no one seems willing to do anything about it. This is really limiting our PC games market. Even grandma’s cheapass old computer has a P3 in it these days, which can run just about any current game; it’s the on-board graphics that are insufficient.

Of course, there’s the argument that people with low-end PCs won’t buy high-end games anyway, but that seems something of a catch-22.

Bruce

I guess I’m skeptical of the concept that the market should conform to the product, rather than the product conforming to the market.

If integrated graphics are the future of the PC market, then companies that want to sell games in the PC market have to design their games accordingly. Somewhere in Ford there might be a car designer who is bitter that most consumers can’t afford a car with a 1500 horsepower engine, yet we don’t claim that the automobile market is dead. We just adjust our expectations accordingly.

But it’s not really a matter of price, although that’s part of it.

It’s more like, suppose I sell HD programming. Now, only those people with HD TVs can enjoy my programs. Over the past decade I’ve been involved in a race: I make better and better HD programming, and my target market buys better and better HD TVs to enjoy it.

But the problem is, 95% of all the homes built are built by one company, and that one company has put default cheapo standard-definition TVs in all their homes. So yeah, I can make SD programming, but then I’m not making HD programming, which is my area of expertise. Instead, right now I’m trying to get more people to buy HD TVs, but ultimately it would help a lot if I could get the one company making new homes to at least put in one of my five-year-old HD TVs instead of SD ones.

Bruce

It’s not a catch-22; these people aren’t passing on games because of their limited hardware, they’re passing on them because they have no interest. You can buy your grandma a $5k behemoth and she’ll still just surf and e-mail with it.

The PC games industry is suffering, and it seems hell-bent on blaming the market (and piracy) for its woes. The PC games market is an opt-in market; people have to pony up for the hardware. This is no different from the console market (which is thriving), where people have to buy the console. They can’t change the market, so the industry is going to have to adapt. For developers, that means targeting lower-end machines; the high-end market just isn’t big enough anymore to support the high cost of development.

Looks to me like you are blaming the wrong party here. It’s not like the PC manufacturers don’t offer some models with decent graphics integrated. They do, but those seem to be only a small fraction of their sales. That tells me that the buying public doesn’t want to pay any cost premium for 3D graphics. They express this by buying systems with graphics subsystems good enough to do everything but play 3D games.

Every PC sold isn’t necessarily part of the PC games market. This fact needs to be accepted. The PC market has spoken, and raging against it isn’t going to get you anywhere. As Peter said, you can’t force the market to conform to your product’s expectations.

I think the PC game devs need to stop pissing in their own market before they can fix the situation. The graphics hardware manufacturers have been advancing at a runaway pace in order to keep the hardcore geeks on the oh-so-profitable upgrade treadmill. They’ve deliberately created a market that moves too fast for the mainstream to follow, which is why the mainstream isn’t trying.

The PC game devs can put a stop to this, though, if they make a concerted effort and get some help from MS. They just have to say, “Hey, hardware companies, this is standard capability model X, and we are developing for this and only this for 5 years.” Having MS throw in a stable version of DirectX to last for those 5 years would be the other part of the puzzle.

Though of course that would overnight slaughter the market for the Nvidias and ATIs of the world, so the game devs would have to win a really nasty fight for that to happen. Which I don’t think is realistic, but it’s nice to dream.
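Purely as an illustration of the “capability model X” idea (every name and number below is hypothetical, not anything the devs or MS have actually proposed), a frozen baseline would just be a fixed set of minimums that any game checks against at startup:

```c
/* Hypothetical sketch of a frozen "capability model".  Every field and
 * number here is invented for illustration; the point is only that games
 * would target this fixed baseline, and nothing newer, for five years. */
#include <stdbool.h>
#include <stdio.h>

struct gpu_caps {              /* what the machine's driver reports */
    int shader_model;          /* e.g. 2 for SM 2.0-class hardware  */
    int video_memory_mb;
    int max_texture_size;
};

/* "Capability Model X": agreed upon once, then frozen. */
static const struct gpu_caps CAPABILITY_MODEL_X = {
    .shader_model     = 2,
    .video_memory_mb  = 128,
    .max_texture_size = 2048,
};

static bool meets_model_x(const struct gpu_caps *hw)
{
    return hw->shader_model     >= CAPABILITY_MODEL_X.shader_model
        && hw->video_memory_mb  >= CAPABILITY_MODEL_X.video_memory_mb
        && hw->max_texture_size >= CAPABILITY_MODEL_X.max_texture_size;
}

int main(void)
{
    /* Pretend these numbers came from the driver for an integrated part. */
    struct gpu_caps integrated = {
        .shader_model     = 2,
        .video_memory_mb  = 64,
        .max_texture_size = 2048,
    };

    printf("certified game capable: %s\n",
           meets_model_x(&integrated) ? "yes" : "no");
    return 0;
}
```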

God damn car makers driving the buggy whip manufacturers out of business.

Am I the only one that didn’t read any of that out of Tim’s interview? Basically it seemed to come down to common sense, and he even admits as much – high-end games require big powerful machines that most consumers don’t own. ZOMG STOP THA PRESES! It takes a bit of effort to contort that into ‘blaming the consumer’.

I don’t understand why they can’t or won’t do this. Surely it would save money on development costs and ensure a larger market for their products. I see the high end video card/PC market as being driven primarily by the games market, not the other way around.

Why can’t Valve, Epic, id, Blizzard, EA, etc. get together and proclaim to the hardware market what they are going to do? Surely there would be developers who didn’t follow, but they would be catering to the niche market. This problem is similar to the one-console-future thing, in that the machines exist only to run the software for them, so the software makers should be in a position to flex their muscle and get what they need. Sure, nVidia won’t be thrilled (AMD is moving more towards integrated stuff, yes? And historically ATI was most dominant in the OEM market), but why should the software companies care? nVidia can come out with a product that meets spec or be left by the wayside.

edit: Also, with multicore systems being the norm now, why is it not possible to use a second core to do the video card work? This is an honest question; if the two are completely different beasts and it isn’t possible, feel free to tell me why. It seems, though, that the trend is to offload physics and sound to the other cores now, so why not video as well?

I think you are wrong on several points here. Yeah, grandma doesn’t want to play shooters. But I do, and I can’t buy them because my PC isn’t fast enough. We aren’t talking about non-gamers here. We’re talking about people who would like to be able to play games without buying a $2k machine to do it.

The comparison between consoles and PCs is off too. It doesn’t cost me nearly as much to ‘pony up’ for a console as it does for a good PC. I have a 360 Elite, and it cost me 1/4 of a good mid-range PC that could run Epic’s latest games. It runs Gears of War just fine, and anything else made for the 360 (haven’t tried Bully yet, though!). Having to buy a console isn’t even close to having to buy a gaming PC.

This, in fact, is part of the problem. People know they can buy a great console for $500 or less. They think they should be able to do the same with a gaming PC. But a $500 PC will have integrated video that can’t run these games.

Agreed. After reading your post, I actually followed the link to read the actual interview, and the original post is a bit of a stretch compared to what Tim Sweeney actually said, which is all perfectly reasonable.

That’s because grandma has been educated that those games aren’t for her, and if she did buy one, she couldn’t play it anyway.

Now you’re getting close. What’s the integrated video hardware on those consoles, hmm? Exactly.

Yes, it’s true, many grandmas would rather play on Pogo. That doesn’t mean there isn’t an underlying hardware issue that hurts the PC gaming market.

Bruce

The only thing that can make your PC fast enough is more money; that’s not going to change, ever. If you can’t afford to be part of the market, then you should be championing my suggestion of developers aiming lower on their hardware specs; it will result in you being able to play more games.

Forcing fancy 3D graphics down everyone’s throat isn’t going to make PCs less expensive. You’re still going to need that $2k rig to run cutting-edge games, and it just turns that $500 machine into a $600+ machine for no real reason. The point is the PC market is what it is, and PC game developers need to make products accordingly.

Read the interview with Tim Sweeney in the original post. He actually mentions this as one possible future for PC games: switching to software rendering on the CPU instead of hardware rendering on the graphics card. Since CPUs are getting more and more powerful, software rendering is becoming a good possibility again.
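For anyone wondering what “software rendering on the CPU” actually means, here is a minimal toy sketch (not Epic’s engine, just the general idea): the game fills a pixel buffer itself with ordinary CPU code, in this case rasterizing one flat-shaded triangle with edge functions, and only then hands the finished frame to the display. Work like this is exactly the kind of thing that can be split across cores, which is what the multicore question above is getting at.

```c
/* Toy sketch of CPU-side (software) rasterization: no graphics card
 * involved, just a pixel buffer filled by ordinary code.  Rasterizes one
 * flat-shaded triangle using edge functions over its bounding box. */
#include <stdint.h>
#include <stdio.h>

#define WIDTH  320
#define HEIGHT 240

static uint32_t framebuffer[WIDTH * HEIGHT];   /* 0x00RRGGBB pixels */

static int imin(int a, int b) { return a < b ? a : b; }
static int imax(int a, int b) { return a > b ? a : b; }

/* Signed area term: tells which side of edge a->b the point c lies on. */
static int edge(int ax, int ay, int bx, int by, int cx, int cy)
{
    return (bx - ax) * (cy - ay) - (by - ay) * (cx - ax);
}

static void fill_triangle(int x0, int y0, int x1, int y1,
                          int x2, int y2, uint32_t color)
{
    /* Bounding box of the triangle, clamped to the framebuffer. */
    int minx = imax(0, imin(x0, imin(x1, x2)));
    int maxx = imin(WIDTH  - 1, imax(x0, imax(x1, x2)));
    int miny = imax(0, imin(y0, imin(y1, y2)));
    int maxy = imin(HEIGHT - 1, imax(y0, imax(y1, y2)));

    for (int y = miny; y <= maxy; y++) {
        for (int x = minx; x <= maxx; x++) {
            /* Inside if the point is on the same side of all three edges. */
            int w0 = edge(x1, y1, x2, y2, x, y);
            int w1 = edge(x2, y2, x0, y0, x, y);
            int w2 = edge(x0, y0, x1, y1, x, y);
            if ((w0 >= 0 && w1 >= 0 && w2 >= 0) ||
                (w0 <= 0 && w1 <= 0 && w2 <= 0))
                framebuffer[y * WIDTH + x] = color;
        }
    }
}

int main(void)
{
    fill_triangle(40, 30, 280, 60, 160, 220, 0x00FF8800u);

    /* Stand-in for "present the frame": just count the pixels we touched. */
    int lit = 0;
    for (int i = 0; i < WIDTH * HEIGHT; i++)
        if (framebuffer[i]) lit++;
    printf("software-rasterized %d pixels\n", lit);
    return 0;
}
```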

Again, catch-22. Premium 3d games aren’t for them, so they don’t think they need the premium 3d hardware. If they had the hardware, we could make games for them that they might actually want.

But the onboard Intel graphics can do some 3D. Why? Those same people would have been happy with 2D, and in fact were back during the Windows 95 days. You could always do the 3D in software. Intel correctly identified a lowest-common-denominator need for 3D graphics, but that baseline hasn’t really changed because Intel stopped competing with the 3D graphics chip makers. It’s those guys (and to a lesser extent, Microsoft and other game developers) who have failed to get Intel to integrate better on-board graphics, thus limiting the market.

Yes, there are many other reasons why the PC games market is limited, but don’t pretend that cheap onboard graphics isn’t one of them.

Bruce

I agree on a high level, but I think the PC game devs have some measure of control over the PC market via what hardware they target. If they targeted a reasonable set of hardware in terms of cost and capabilities and stopped catering to the ultra high end, the market might well shift. The PC hardware manufacturers may just decide it is worth a few bucks more to be able to add “certified game capable” to the list of specs for a mainstream PC.

That’s my theory anyway.

Oh, I agree it’s very limited because of cheapo onboard graphics. But I’m interested in exploring the factors that led us to the situation where cheapo onboard graphics are the preferred solution for the masses. Where exactly did those consumers get alienated, how can they be recaptured into the gaming market, etc.?

You really aren’t listening, are you?

Consider this analogy to your statement.

“Forcing fancy new Pentium processors down everyone’s throat isn’t going to make PCs less expensive. You’re still going to need that $2k rig to run cutting-edge games, and it just turns that $500 machine into a $600+ machine for no real reason. The point is the PC market is what it is, and PC game developers need to make products accordingly.”

So why isn’t Intel still putting out even cheaper computers with even older Pentiums? Because they don’t have to, as they’ve already reached a mass-market price point, and at some point due to overhead it becomes cheaper for them to make only a limited number of models. So the high end buys the high end stuff at a premium and that actually helps push the prices lower on the older, lower end stuff. But the lower end keeps getting more powerful.

The problem is the integrated graphics hardware isn’t doing the same. There was a time when integrated sound was pretty lame, too, but that eventually changed.

Bruce