The Xbox One X - Project Scorpio lives and I am a dumbass for thinking it would be the Xbox 10 S

And yet, not currently fixed, and also apparently present in the Xbox One S.

Another thing I ran into while doing some TV/AV research over the weekend - no Dolby Vision HDR support. Posters there have a pretty legitimate beef that a $500 4K/HDR console doesn’t support it.

The lack of Dolby Vision on the Samsung QLED currently has me thinking I’ll end up with the 65" LG OLED instead. Still keeping an eye on where the Black Friday price drops land. I really did want to go from a 50" 1080i plasma to a 75" 4K set, though. Ah well, 65" is still a pretty decent leap, and with minimal bezels it should still look pretty fantastic…

Not currently fixed, but also not around for very long - it’s a recent regression that was introduced a couple of weeks ago, and it’ll be fixed again pretty quickly.

I’m just pointing out that since it does affect all consoles, it’s not a hardware problem with the drive used in the X, so it’ll be much easier to fix than something like hardware bugs or failures.

It should be something pretty easy to fix, like reading the metadata from the sources differently. It seems like it’s defaulting to a value that the data isn’t requesting. Over at HDTVtest, Vincent Teoh grabbed the metadata from an Apple TV HDR10 stream and found that the Apple TV was receiving Dolby Vision metadata, and when set to output HDR10 rather than Dolby Vision it was botching the metadata, stamping the same values onto every source file - i.e., setting the gray point on the histogram wrong. It’s highly likely something similar is happening here.
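
To make that concrete, here’s a rough sketch of the kind of bug being described: a player that ignores each title’s HDR10 static metadata (the SMPTE ST 2086 mastering values plus MaxCLL/MaxFALL) and stamps the same hardcoded defaults onto everything. The field names and values here are made up for illustration, not any real player’s API:

```python
# Illustrative only: how "defaulting to a value the data isn't requesting"
# plays out with HDR10 static metadata.

from dataclasses import dataclass

@dataclass
class Hdr10Metadata:
    max_mastering_luminance: float  # nits, from the mastering display
    min_mastering_luminance: float  # nits
    max_cll: int   # Maximum Content Light Level, nits
    max_fall: int  # Maximum Frame-Average Light Level, nits

def passthrough(source: Hdr10Metadata) -> Hdr10Metadata:
    # Correct behavior: hand the source's own metadata to the display,
    # so its tone mapping matches how the title was actually mastered.
    return source

def broken_passthrough(_source: Hdr10Metadata) -> Hdr10Metadata:
    # The suspected failure mode: every title gets the same default,
    # so the TV tone-maps a 1,000-nit master and a 4,000-nit master identically.
    return Hdr10Metadata(4000.0, 0.005, 4000, 1000)

disc = Hdr10Metadata(1000.0, 0.0001, 950, 320)
print(passthrough(disc))         # TV sees the real mastering values
print(broken_passthrough(disc))  # TV sees bogus values -> wrong gray point
```

The TV’s tone mapping keys off those numbers, so feeding it the same values for every source is exactly how you end up with the gray point in the wrong place.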

I think this issue is going to come up more and more with HDR content (like I’ve described above). There’s a big difference between an HDR world and the previous one. In the past, content just told the screen “go to max” and didn’t care whether the screen was a monitor or an LED the size of the sun. HDR content is actually mastered on displays that far exceed any consumer-level display available, often reaching 4000 or more nits of brightness, and that level of detail is then baked into the metadata of the digital content. Your content has to be processed, and the white point for your display - which is likely to be almost half an order of magnitude less bright than the mastering display - has to be set by software and calibration so that it doesn’t “overexpose” the whole scene, almost like a camera ISO value. If you do this wrong, you either “blow out” the highlight detail or crush shadow detail under overwhelming black levels.
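
If it helps, here’s a toy sketch of that mapping problem. The numbers and the rolloff curve are made up for illustration (real pipelines work in PQ/ST 2084 signal space, not straight nits), but it shows why just clipping at the panel’s peak loses the detail the master actually contains:

```python
# Toy example: mapping content graded on a 4000-nit mastering display
# down to a ~700-nit consumer panel.

MASTER_PEAK = 4000.0   # what the content was graded on
DISPLAY_PEAK = 700.0   # what the consumer panel can actually do

def hard_clip(nits: float) -> float:
    # "Do nothing" approach: everything above the panel's peak is lost,
    # which is the blown-out highlight detail described above.
    return min(nits, DISPLAY_PEAK)

def soft_rolloff(nits: float, knee: float = 0.75) -> float:
    # Pass the lower range through untouched, then compress the rest of the
    # master's range into the remaining headroom so highlight detail survives.
    knee_point = DISPLAY_PEAK * knee
    if nits <= knee_point:
        return nits
    excess = (nits - knee_point) / (MASTER_PEAK - knee_point)
    return knee_point + excess * (DISPLAY_PEAK - knee_point)

for level in (100, 500, 1000, 2000, 4000):
    print(level, round(hard_clip(level)), round(soft_rolloff(level)))
```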

MS said they’d fix it, and it is clearly a software problem. I wouldn’t let this hold me back from buying an XboneX.

Thanks for the detail. A nice learning post.

I saw a post from him this morning saying users are reporting that the fix is already being tested in a preview build right now.

QLED: I hate that name. It reeks of marketing desperation in the face of a technological revolution they continue to be inexplicably absent from on the TV side, despite leading the world in the mobile application of it.

I’m waiting for an OLED with HDMI 2.1 and variable/adaptive refresh rates (FreeSync) to upgrade from my current perfectly-fine 60" LED.

Wait, what? So if I pop in a 4K UHD Blu-ray with Dolby Vision HDR, the XB1 will discard that data and only send HDR10 to my TV? That… can’t be right. Wow.

I think this is bullshit at the moment:

I agree with @stusser that HDR will (and should!) become table stakes, but right now it’s still in the formative/experimental stage. When there are more PS4 Pros and Xbox One Xs out there than the old models, maybe three years from now? But right now, today? Fuck no, man.

This is about as dumb as saying “I only want REALLY LOUD music. Give me the LOUDEST music you have.”

In my extended review at Reference Home Theater, I call it “the best looking TV I’ve ever reviewed.” But we aren’t alone in loving the E6. Vincent Teoh at HDTVtest writes, “We’re not even going to qualify the following endorsement: if you can afford it, this is the TV to buy.” Rtings.com gave the E6 OLED the highest score of any TV the site has ever tested. Reviewed.com awarded it a 9.9 out of 10, with only the LG G6 OLED (which offers the same image but better styling and sound for $2,000 more) coming out ahead.

Hardcore TV review sites had to recalibrate their review score systems once the 2016-era OLED sets hit the market. That’s how far off the charts better they were. It is not a subtle or in any way close thing; no reviewer is saying, oh, now that LED is sooo much brighter, don’t waste your time with OLED!

Why can’t it be right? Plenty of UHD Blu-ray players are HDR10-only. As far as I’m aware, only a handful of models do Dolby Vision. They’d have to pay to license the Dolby Vision tech, and most haven’t. It sucks, but it’s not that surprising…

We’re getting a lot of mileage out of “table stakes” in this thread.


You’re right, I guess it’s not too surprising. You know what it is? I have an LG OLED being delivered tomorrow and I’m really excited about it, but it’s making me become one of those annoying Dolby Vision snobs. :)

That’s usually what happens when companies rush to put out competing standards, right? I’m hoping the dust will have settled by the time we are in the market for another TV.

The Xbox issue from the man himself.

Then we need to stop here. I’m not reading about HDR differences, I’m seeing them.

That video you posted has almost nothing to do with HDR TV quality, for one thing.

For another, that video showed how badly Nvidia was botching that test - in fact, they were probably cheating with a regular G-Sync panel. Holding up some random YouTuber against the thousands of other videos showing differences with HDR is a bit disingenuous. But HDR really isn’t a good fit for PC games anyway - it’s a much better fit for closed systems like consoles and TVs, sort of like how it’s easier to get Dolby on TV content than on PC content.

Computer nerds like us think in terms of TN vs. IPS, where we assume IPS is always better, but that logic is dated for actual TV displays, which almost always use VA panels - and those have 4-6x the contrast of IPS. Another problem with the Nvidia display shown was that it clearly wasn’t very bright, and brightness is literally one of the base requirements for HDR certification.
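
For reference, the UHD Alliance “Ultra HD Premium” spec defines two luminance paths, a bright LCD-style one and a very-dark OLED-style one. This is from memory, so treat the exact numbers as approximate, but the point is that a dim monitor doesn’t qualify on either path:

```python
# Approximate Ultra HD Premium luminance tiers (numbers are my best recollection
# of the UHD Alliance spec, not an authoritative reference).

def meets_uhd_premium(peak_nits: float, black_nits: float) -> bool:
    lcd_path = peak_nits >= 1000 and black_nits <= 0.05     # bright VA LCD route
    oled_path = peak_nits >= 540 and black_nits <= 0.0005   # deep-black OLED route
    return lcd_path or oled_path

print(meets_uhd_premium(1400, 0.02))   # bright VA LCD: True
print(meets_uhd_premium(650, 0.0))     # OLED: True (black level is its strength)
print(meets_uhd_premium(350, 0.1))     # dim desktop monitor: False
```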

I know where you’re coming from; for years, floor-model TVs have hyped up saturation and brightness levels to pop out from the crowd. But HDR content really isn’t that, and if you haven’t seen it in person, I’d be reluctant to hand-wave it away. Sony and Microsoft wouldn’t be pushing it as hard as they are if it weren’t noticeable.

I’m not anti-HDR in any way. But if you’re viewing HDR content on a non-OLED display, you literally have no idea what you’re talking about.

LG’s flagship OLED predictably took home the gold when it came to black quality, perceived contrast, off-axis performance, screen uniformity, and overall night viewing picture quality. Given a calibration, a high-quality sample, and enough break-in time, we’d expect OLED to beat LED almost every time in these categories and the G6—our top overall TV—didn’t disappoint.

The LG also won for color accuracy and HDR/wide color gamut, though I have my doubts about these results since there was some difficulty in actually getting all four TVs to display HDR10 and Dolby Vision content. And though the LG G6 does have highly saturated colors, the Helmholtz-Kohlrausch effect does mean that the brighter Sony, Samsung, and Vizio sets all look just as vibrant to my eyes.

Ultimately, the shootout does confirm what we’ve seen with our own eyes: OLED is still king.

But in the brightness wars:

Where the LED TVs did pull into the lead was in the bright light viewing category, where Sony’s massive X940D took home the prize. Daytime viewing is especially advantageous for LED TVs because you need to get bright enough to overpower ambient light.

So yeah, if you like watching shows in bright sunlight, like some kind of fucking animal, then sure, ramping LED display brightness up to retina-scorching levels might be useful… I guess?

Not everybody lives like a basement dweller. Some of us have windows and ambient light.

Sure, but image quality is usually a bigger concern for people talking about HDR in the first place. Focusing on this one weird “bright room” criterion smacks of apologism.