Yikes!

Except that there’s this thing called science and math, and you can actually tell that certain things are literally not perceivable.

Except… ya can’t. At 8 ft, on a 55 inch TV, your eyes literally can’t perceive the pixels. Unless you’ve got vision that’s significantly better than normal people’s, which is possible… but really, a lot of us here are old men wearing glasses, not fighter pilots in their prime.

That link you posted even acknowledges that while, in their test, people were able to perceive that there was some difference… that’s not really the same as perceiving a difference in resolution. And they admit this.

At that range, on that size screen, you’re basically just getting to the point where you’re getting the full benefit of 1080p alone, much less 4 times that resolution. And again, some of this stuff is just simple math. Your physical retina simply cannot perceive detail below a certain level. Your eye can only resolve so many pixels per degree of your field of vision.
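For what it’s worth, that math is easy to sketch. A back-of-envelope calculation (assuming a 16:9 panel and the commonly cited ~60 pixels-per-degree resolving limit for 20/20 vision; the numbers are illustrative, not a vision-science result):

```python
import math

def pixels_per_degree(diag_in, res_h, distance_in):
    """Angular pixel density a viewer sees, for a 16:9 panel."""
    width = diag_in * 16 / math.hypot(16, 9)   # panel width in inches
    pixel = width / res_h                      # physical pixel pitch in inches
    deg_per_pixel = math.degrees(math.atan(pixel / distance_in))
    return 1 / deg_per_pixel

# 55" TV viewed from 8 ft (96 inches)
ppd_1080 = pixels_per_degree(55, 1920, 96)
ppd_4k   = pixels_per_degree(55, 3840, 96)
print(round(ppd_1080), round(ppd_4k))  # -> 67 134
```

By that rule of thumb, 1080p at 8 ft already delivers slightly more than the ~60 pixels per degree a 20/20 eye can resolve, which is the “full benefit of 1080p alone” point above.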

The bigger difference is that a 4k TV is likely going to have other things that make the picture nicer. And that’s what you’ll notice… but the resolution? There are physical limits to what your eye can perceive.

And again, this is even stated clearly by the companies who make these TVs, like Sony, and by THX.
From THX:

On a 50-inch 1080p HD display, most consumers can begin to distinguish individual pixels only when standing within six feet of the screen. Therefore, if your viewing distance is 10 feet or greater, an Ultra HD 50-inch display will likely have little perceived benefit in terms of image clarity and sharpness.
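That six-foot figure lines up with the standard 1-arcminute visual-acuity rule. A quick sketch (assuming a 16:9 panel; the 1-arcminute threshold is the conventional 20/20 figure, not something from the THX text itself):

```python
import math

def acuity_distance_ft(diag_in, res_h, arcmin=1.0):
    """Distance at which one pixel subtends `arcmin` minutes of arc (16:9 panel)."""
    width = diag_in * 16 / math.hypot(16, 9)   # panel width in inches
    pixel = width / res_h                      # pixel pitch in inches
    d_in = pixel / math.tan(math.radians(arcmin / 60))
    return d_in / 12

# 50" 1080p display: closer than this and a 20/20 eye starts resolving pixels
print(round(acuity_distance_ft(50, 1920), 1))  # -> 6.5
```

So “within six feet” is just the acuity chart worked backwards, which is also the criticism leveled at it below.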

Also, I just noticed that Dave is gonna leave QT3 because of this? That’s nuts man.

I can. It’s not difficult. I’m sorry you’ve decided to believe dated mathematical extrapolations over actual experience. In my link, the vast majority of people could see that the 4K image was better on a 55 inch screen at 9 feet. THX is trying to sell theater screens, so they may not have much interest in actual research; their numbers are straight from an old visual acuity chart. And it should be noted that in both cases they are talking about video content, not real-time 3D graphics, which feature aliasing and numerous other artifacts that are reduced by higher resolutions and that are not a concern when filming live action. This amplifies the resolution difference by comparison.
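The aliasing point is easy to demonstrate with a toy example: sampling a slanted edge once per pixel gives hard stair-steps (“jaggies”), while rendering at a higher internal resolution and averaging down (supersampling) gives a graded edge. This is purely illustrative pure Python, not anything from an actual renderer:

```python
def coverage(px, samples):
    """Fraction of sub-samples of pixel (px, 0) inside the half-plane y < x/4."""
    hits = 0
    for i in range(samples):
        for j in range(samples):
            x = px + (i + 0.5) / samples   # sub-sample positions inside the pixel
            y = (j + 0.5) / samples
            if y < x / 4:                  # a shallow diagonal edge
                hits += 1
    return hits / (samples * samples)

hard = [coverage(x, 1) for x in range(4)]  # one sample per pixel
soft = [coverage(x, 4) for x in range(4)]  # 4x4 supersampling
print(hard)  # [0.0, 0.0, 1.0, 1.0]  -> abrupt step: the edge looks jagged
print(soft)  # [0.125, 0.375, 0.625, 0.875]  -> graded coverage: smooth edge
```

Filmed content gets this averaging for free, because each camera pixel already integrates light over its area; rendered graphics only get it by spending resolution.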

Seems hard to believe there’s not a Linus video testing this very thing.

I knew about most of those. I didn’t know Immortal Fenyx was expected in December. It’s closer than I thought!

Dude, there’s a scientific limitation to the resolution that’s perceivable by your retina.

It’s not magic.

You might think you are able to perceive some difference in those conditions… But there’s just not really any way you could be. Your retina literally cannot see the pixels.

Now, there are some people who might be able to… Is your vision significantly better than 20/20? That could potentially explain it, although even in the best possible case, the difference would be so tiny that it’s going to be lost in other, far more dramatic differences in visual fidelity.

And let’s be real here, you THINK you can see the difference in resolution… But some large part of this is that you want to have gotten something for that money. But you never tested two identical TVs at different resolutions, right? Because it’s almost impossible to find such things. What you have is a 4k TV that unquestionably looks better than your old 1080p TV… But it’s almost certainly better in a bunch of different ways.

Sony said the same thing. And they are actually selling 4k TVs.

And really… This is stuff that has established science behind it from before there were 4k TVs.

He does, but it was more focused on 4k gaming on a monitor. Basically he said it’s not worth it, as the difference is minor and you get more benefit from things like refresh rate than from going 4K. The video is a couple years old and there weren’t 4k monitors that could go higher than a 60 Hz refresh rate, so maybe that has changed.

The science you’re talking about only applies to black and white images on a piece of paper in a well lit room. It tells us almost nothing about what we can actually perceive in other circumstances.

It’s 20/20 in one eye, 20/30 in the other.

Incorrect. In my post above I was talking about flipping back and forth between the 1080p “performance” mode and the “4K” mode in Avengers on the same TV. It changes the rendering resolution in real time, so it’s very easy to do instant A/B tests on the same screen.

Curious, why do people buy an Xbox if they have a gaming pc? Is it just to play on a big ass tv?

Depends on how you define ‘gaming pc’, I guess. I play games on my PC, but it’s seriously underpowered, mostly for work. So my Xbox is where I do my AAA gaming, or stuff that I feel would benefit from a big screen, surround sound, that sort of thing.

But that’s going to be the EASIEST thing to perceive. If you can’t see that, you aren’t going to see resolution differences in even less ideal situations.

Yeah, I don’t know what happens under the hood when you flip that switch, whether something other than resolution is also being changed.

But to be clear, when you are viewing it on 1080p, you are able to see individual pixels when sitting 8 feet away? That seems amazing to me, and very much at odds with what the scientific community seems to think the human eye can achieve.

I definitely cannot see the pixels on my tv at that distance.

I can’t believe you guys were mean to Dave Long.

You do realize being able to see a difference in detail and being able to “see individual pixels” are not the same thing, right?

I really didn’t think I was that mean and I tried apologizing to him too. I feel bad about it.

I think whether you can notice 4k at a certain distance or not is beside the point. It’s about whether you care. There are diminishing returns. I would never want to play new games at 480p, but I don’t feel the same animosity toward 1080p. I have a 52" TV. It’s about 7 feet or so away. I like the picture it outputs. Now, I would absolutely upgrade if my TV died, but from what I’ve seen it’s not enough for me to chuck it in the dumpster just yet.

There’s a difference between a game’s rendering resolution and TV resolution. I wonder how much of a difference you see with video content. The reason I say this is that all video content is basically a downscaling of a higher-quality image (reality). Games are creating an image from scratch, so with Avengers you aren’t comparing apples to apples. Avengers is generating an image at 1080p and then one at 4k. It’s not presenting a 4k image downscaled to 1080p.

There is a benefit to consoles being able to render at a higher res for supersampling reasons, but this is where newer tech like DLSS comes into play to reduce that need.

Eh… they are the same thing.
Like, at some point, if I get closer to my TV, I get to a point where I can start to see issues with the edges of curves and stuff due to the resolution, where I’m noticing the actual pixels. At that point, a higher resolution would matter, because it would remove that problem.

But when I’m on my couch, I cannot see that. The resolution of the image my retina is sending to my visual cortex is such that I can’t see that, because the pixels on the TV blur together and are recognized by my eye as a single “pixel”.

There’s certainly a continuum where at some point you may not be able to perceive exactly the pixels on the TV but can still see that “something is weird”. But at 8 feet, I’m not seeing that from my 1080p 55 inch TV. Certainly nothing remotely close to the kind of stuff you see from the compression of video feeds that comes with basically every media source I watch other than my video game console.

But hey, I’m sure my next TV will be a 4k one, so maybe then it’ll be a total revelation. I’m skeptical that will be the case, though. I feel like the OLED contrast difference will be of infinitely more importance.

I literally have a 4k and 1440p monitor side by side and I can’t tell the difference unless I make a huge effort to see the difference. In fact I constantly feel like my 1440p looks better but I think that’s because the framerate is higher.

The same goes for my 55" TV: swapping modes on the X1X between performance and quality, or swapping between 1080p games and 4k games (like Forza), I just can’t tell the difference.

I’m sure some can notice the detail without even trying, but I’m not one of them.

Anecdotally, I can see a difference in 4K video content on a 55" screen from about six to seven feet, but I believe that’s due to a couple of things: higher bitrate on the streams and discs resulting in a better overall picture, and better mastering. Even when working with a 2k source, they’re bringing out a lot of those details; again, probably related to the bitrate. The HDR certainly helps, too.

For gaming content, I’ve only an XB1S, so I couldn’t say either way. I guess I’ll find out in November, with either XBSX or PS5 (I plan to get both this gen!).