The specifics of which hardware is required are irrelevant; the point is that both the PC and console game markets require specific hardware and are therefore opt-in markets. You can’t force this hardware onto consumers.
Yes, it’s true, many grandmas would rather play on Pogo. That doesn’t mean there isn’t an underlying hardware issue that hurts the PC gaming market.
I agree that having more of it out there would only help the PC games market (as it would any market with similar dependencies), but the market clearly isn’t interested in 3D hardware becoming a standard.
The problem is people don’t want to pay for what they don’t need/want/intend to use. Most people who buy a PC aren’t buying it with the intention of playing games, if they were they’d get the damn video hardware.
Isn’t that in part because they can get a console that would do the same thing for cheaper? Especially with HD, the graphics difference between consoles and PCs just isn’t that huge anymore (to the average consumer…yes, I know the difference, as a PC/console gamer).
If PCs could play the same types of games at the same price, it might help. But I think TrunkDR is right that it may not be an achievable goal. I disagree that it will always cost $2,000 to run good games, but the main point still stands.
BTW, what really bothers me is that these integrated machines can’t even run games from a few years ago. They are utter crap, as far as hardware is concerned. I could put together a $500 PC right now that would run Oblivion just fine (and it would look good). But if I buy a $500 PC from Best Buy, I doubt it could do the same.
Consumers are already paying a premium for a faster processor than they probably need, an onboard 3D graphics accelerator they probably don’t need, and onboard 5.1 audio they probably don’t need.
But because of proper leadership by hardware manufacturers, those things have gotten much cheaper than they used to be, and so the premium is relatively small. In some cases, making a slower processor would actually cost the hardware manufacturer more.
Again, the point you’re missing is that one of those components – the onboard 3D graphics – simply hasn’t kept up. Throwing up your hands and claiming it can’t keep up because it’s too expensive is not seeing the forest for the trees.
If we go back 10 years, the difference between the high end and the lowest end may have been a factor of 10, and we could scale games between those two extremes. For example, with the first version of Unreal, a resolution of 320x200 was good for software rendering, and we were able to scale that up to 1024x768 if you had the GPU power. There is no way we can scale a game down by a factor of 100; we would just have to design two completely different games, one for the low end and one for the high end.
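The back-of-the-envelope math behind that "factor of 10" is just a pixel count comparison between the two resolutions mentioned. A quick sketch (the resolutions are from the quote above; the "factor of 100" figure for today's gap is the poster's estimate, not derived here):

```python
# Pixel-count ratio between the software-rendering and GPU targets
# named in the Unreal example above.
low_res = 320 * 200      # software-rendering target
high_res = 1024 * 768    # high-end GPU target

ratio = high_res / low_res
print(f"{low_res} -> {high_res} pixels: ~{ratio:.1f}x")
# roughly a 12x spread in pixel count, i.e. the same order of
# magnitude as the "factor of 10" hardware gap described
```

That order-of-magnitude spread is what made one engine scalable across the whole market back then; a 100x spread does not fit inside one design.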
That is actually happening on PCs: you have really low-end games with minimal hardware requirements, like Maple Story. That is a $100 million-a-year business. Kids are addicted to those games; they pay real money to buy [virtual] items within the game.
CPU makers are learning more and more how to take advantage of GPU-like architectures: internally, CPUs accept wider data and have wider vector units, and they have gone from single-threaded designs to multiple cores. And who knows, we might find a way to bring software rendering back into fashion.
Then every PC, even the lowest-performing ones, will have excellent CPUs. If we could get software rendering going again, that might be just the solution we all need. Intel’s integrated graphics just don’t work, and I don’t think they ever will.
Software rendering is slower, but sure, it would still “work” today. It would probably work better than before, with multicore processors that offer more power than most games need. But most games stopped shipping with software renderers a long time ago, because the renderers were expensive to maintain and developers wanted to target a faster minimum performance spec for graphics.
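The multicore angle mentioned above is easy to illustrate: a software renderer can split the screen into horizontal bands and fill each band's scanlines on a separate core. A minimal sketch of that work partitioning (the `shade` function is a hypothetical placeholder; a real software renderer would do this in native SIMD code, not pure Python, but the splitting scheme is the same):

```python
# Sketch of scanline-band parallelism in a software renderer:
# each worker independently fills one horizontal band of the framebuffer.
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 64, 48

def shade(x, y):
    # Placeholder "shader": a simple two-axis color gradient.
    return (x * 255 // (WIDTH - 1), y * 255 // (HEIGHT - 1), 0)

def render_rows(y_start, y_end):
    # Fill the scanlines [y_start, y_end); no shared state, so bands
    # can run on any core without locking.
    return [[shade(x, y) for x in range(WIDTH)] for y in range(y_start, y_end)]

def render(workers=4):
    band = HEIGHT // workers
    ranges = [(i * band, HEIGHT if i == workers - 1 else (i + 1) * band)
              for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        bands = pool.map(lambda r: render_rows(*r), ranges)
    # pool.map preserves order, so concatenating bands rebuilds the frame.
    return [row for b in bands for row in b]

fb = render()
print(len(fb), len(fb[0]))  # 48 64
```

Because each scanline band is independent, this kind of renderer scales almost linearly with core count, which is exactly why multicore CPUs revive the idea.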
Because Intel decided to make that hardware cheaper for consumers (with some help from AMD and other PC hardware manufacturers). This was partly a side effect of the constant pursuit of higher performance at the high end, which drove down the cost of older components. Again, it’s just that one of those components has been lacking. (Hard drive speed is another component that’s been lagging behind.)
I still think you are missing the big picture. There’s mainstream market demand to advance the components of PCs. Hardware manufacturers have invented newer processors, faster memory, bigger hard drives, and so forth because that’s what the market rewards. Better specs sell (at the right price). But 3D hardware hasn’t sold outside of the high-end hardcore niche. It hasn’t gone through the normal cycle of trickling down from high-end consumers to low-end consumers, because low-end consumers don’t seem to be willing to pay even a small premium for this stuff.
If it isn’t a selling point, then there’s no reason to pour tons of R&D dollars into feverishly improving integrated 3D chipsets. So nobody has.
Sweeney and Rein have been railing against integrated Intel graphics for a few years, and I see their point. If mainstream PCs shipped with a better integrated graphics solution, a host of gaming issues on the platform would go away. Not everyone needs an 8800GTX, but even if new PCs had an onboard 6800GT equivalent, the potential audience would skyrocket.
This is a classic issue in economics. The fact is that lowering prices can create market demand. VCRs were around for a long time before they became consumer electronics. Why? Because consumers wouldn’t have liked VCRs in the 1960s? No, because they were expensive. Microwave ovens had the same problem. But once you had cheap VCRs and cheap microwave ovens, there was suddenly demand for video rentals and microwave dinners.