Nintendo finally reveals the Switch console

Serious question: how many iPad games have the visual quality that’s been shown in BOTW and run for more than 3 hours of gametime on battery?

My wife’s iPhone can barely handle Candy Crush and Tsum Tsum for that long.

Several games, though I’m not up on the very latest graphical iPad games. Nova 3 ran for 3+ hours on the iPad Air, and performance/battery life has improved with newer iPads.

Well, they did it with the Wii and with the DS/3DS just fine.

Fair question, but part of that is programming too. Some games suck battery life like oxygen even though they aren’t graphically demanding, compared to others that are more efficient.

They’re not following the Wii; they’re following the Wii U… the good will or buzz they got for the Wii is long past and the Wii U is not regarded as very successful.

The machine can reach 1080p at 30fps, and Mario Kart 8 Deluxe will be running at that resolution and framerate. It certainly speaks to the graphical quality and depth of the game world that it won’t be able to run at 1080p at 30fps, though.

There are a million different factors that determine the resolution and framerate of a game. It’s not like “1080p and 30fps” are just two checkboxes that you can tick. You can run Pac-Man at 1080p, but that doesn’t mean it will look better than Breath of the Wild at 900p. What’s the texture resolution? Draw distance? Shadow complexity? Lighting model? Everything is a push and pull, and obviously Nintendo didn’t want to sacrifice graphical quality for a bump in resolution.

And by the way, a lot of Xbox One games only run at 900p resolution, and I don’t think anyone would accuse that system of having “the power of a mobile device.”

I agree with what you’re saying here, although I’d get nitpicky with “…Nintendo didn’t want to sacrifice graphical quality for a bump in resolution”.

Resolution is a part of graphical quality; it’s just not all of it. Again, I get what you’re saying. Nintendo (and everyone else) makes these trade-offs all the time. But I can’t not argue with Andy Bates in a Nintendo thread, right?

I think that was the 5200. They even made the first dual analog controller, if you count the Space Dungeon pack-in dual controller holder (which at the time was fabulous).

/nerd

Looking at 1080p instead of 4K is already generous. At some point, you have to wonder if Nintendo is asking consumers to sacrifice too much: the resolution, battery life, extra controller costs, paid online from a company that has been pretty awful with online… I mean, first-party titles will only take you so far.

I thought I was pretty clear about distinguishing between graphics quality in general (texture resolution, draw distance, shadow complexity, etc.) and resolution specifically. If you look online, there are a lot of discussions about “Which is more important: resolution, framerate, or graphics quality?”, so clearly people think about resolution as something separate from graphics quality.

That’s why I think that saying, “X doesn’t do 1080p resolution” doesn’t tell the whole story. I can take a low-end PC graphics card and get it to run at 1080p at 60fps (or even higher) by cranking all the settings down to the bare minimum, but that doesn’t mean you would necessarily want to play that way. And as others have said, a lot of Xbox One games run at 900p instead of 1080p.

Good point about the Atari 5200, although it didn’t auto-center so it was a little more difficult to use. Would you accept, “Nintendo had the first auto-centering thumbstick” instead?

In any case, after Atari stopped making home consoles (and abandoned the analog joystick before that), no one had an analog stick on their controller for several generations. Then Nintendo came along with their analog stick, and it was so successful that Sony revamped their PSX controller in the middle of the generation to add an analog stick, and then later, rumble. And now, everyone has an analog stick.

Not really. 4K adoption is still a relatively minor percentage of the installed base of televisions, and you can argue that most people who have 4K displays don’t even sit close enough that they would notice the difference between 4K and 1080p. Most people aren’t sitting 6’ away from their 60" screens.
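If you want to put rough numbers on the “sitting close enough” point, here’s a quick back-of-the-envelope sketch in Python. It assumes roughly 20/20 acuity (about 1 arcminute of resolving power) and a 16:9 panel, so treat the output as an illustration rather than a measurement.

import math

# Back-of-the-envelope: how close do you have to sit for individual pixels
# on a screen to still be resolvable to roughly 20/20 vision (~1 arcminute
# of angular resolution)? All numbers are assumptions for illustration only.

def resolvable_distance_ft(diagonal_in, horizontal_pixels, aspect=(16, 9)):
    """Distance (feet) inside which adjacent pixels can still be distinguished."""
    aw, ah = aspect
    width_in = diagonal_in * aw / math.hypot(aw, ah)  # screen width in inches
    pixel_in = width_in / horizontal_pixels           # width of a single pixel
    one_arcminute = math.radians(1 / 60)              # ~20/20 acuity
    return (pixel_in / math.tan(one_arcminute)) / 12

for label, px in [("1080p", 1920), ("4K", 3840)]:
    print(f'{label} on a 60" set: pixel detail visible within ~{resolvable_distance_ft(60, px):.1f} ft')

# Prints roughly 7.8 ft for 1080p and 3.9 ft for 4K, which is why the extra
# resolution is hard to appreciate from a typical couch distance.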

The BBC Micro had an analogue stick.

Meh.

Most people wouldn’t be able to tell if a game was running at 900p or 1080p (and probably not even 720p; I know I can’t) if it weren’t pointed out to them by the internet.

Then why should I get Zelda for Switch over Wii U if resolution doesn’t matter?

Dude, the environmental sound!

I’m getting it on the Switch so I can explore in handheld mode while watching TV with my wife/shitting/traveling, etc. Also, as stated above, there is more to graphics than resolution. Draw distance, for example, would be really important for this game.

If you already have a Wii U, then you have to decide if you want to buy into the Switch ecosystem. I think the idea is that most people don’t have a Wii U, so if they’re going to buy a new system, they’d want the brand new one and not the one that’s pretty much EOL.

Hopefully not all at once.

Agreed, though I can tell the difference when it’s 4K. I can also really tell whether something is running at 60 vs. 30 fps.

I’m buying Zelda for the Wii U since I don’t have a good reason to buy a Switch for at least a year (6 months if E3 has some surprises).

The only advantage I really see to the Switch version of BOTW is being able to play it portably, which, while a really good advantage for me, isn’t worth $360.