The Xbox One X - Project Scorpio lives and I am a dumbass for thinking it would be the Xbox 10 S

Personally, I’m waiting for a Black Friday sale on an LG OLED 65B7V.

Bingo! Though I really wanted 75" and may end up with a 75" Samsung QLED instead.

The Digital Foundry video on Halo 5 does make a good case for the Xbox One X finally providing the best Halo 5 experience.

It still has dynamic resolution, so you get true 4K most of the time, with the resolution dropping during busy firefights. The cutscenes use the in-game engine, so they look much better now too. And even on 1080p TVs, the 4K image gets downsampled, making the 1080p output look way better.

I’m kind of jealous of Xbox One X owners for the first time.

As it turns out, so do I. And the noise was really from my Gears 4 special edition Xbox, not the drive, so I gotta get that squared away. The external drive is dead silent when connected to my Scorpio.

I bought the 940E 75" FALD panel earlier this year and was pretty set on not doing any further display upgrades until 2019, when I figure I’ll finally be able to afford a PJ capable of matching my 5-year-old Epson 5020 while adding 4K and at least HDR10 (no chance of DV). And then one of the forum sponsors for a site I write for made me an offer on an LG 55" E model, which has the same PQ as the B and C panels. Into my office that went, and now I’m hooked on OLED. OLED all the things. And prices have been steadily falling since then. If you follow Slick Deals, they have found ways to get a 65" B for $1600.

That
Is
INSANE.

It takes a bit of work and patience to get the rebates, but it seems legit. Crazy times.

I mentioned this up-thread, but you shouldn’t take that as gospel.

Many games get code updates to render at higher resolutions without actually having “Scorpio assets” installed, so it could still be misleading and make you think you don’t have an update when you actually do.

I believe Halo 5 already had the higher-resolution assets that the X is now using, because even on the base Xbox One, Halo 5 would use those assets in certain cases where it could handle the rendering load.

There was a deal on Halo 5 and Gears of War 4 for $30 Canadian, so I picked that up. Amazingly, given my minimal console gaming, I’ve actually played the four core Halo games through on Legendary, so I like the series enough to nab it. I played the first two Gears games but not the third, although I bought it - I’ll probably play that first when the enhanced patch is out for it. Both the Gears and Halo exclusives seem to be significantly enhanced for the X, so it seemed like a worthwhile purchase.

Does anyone buy Sony TVs anymore? I say this as someone who will probably buy a 55" Sony OLED if it’s on sale for Black Friday, even though it’s badly overpriced compared to the LG C7 series. Sony’s TV division seems to be hurting.

The E is actually an upgrade over the B and C, although the panel itself is essentially the same.

The last Sony TV I bought was a 60" WEGA projection set. So, not in the current era.

Just this year Sony seems to have made the foresighted decision to push HDR as their TVs’ main selling point. Their high-end LED TVs can hit 1500 nits of peak brightness now, almost twice that of OLED. OLEDs also use automatic brightness limiters, which means that in an especially bright scene the OLED will automatically reduce brightness to cut current draw and heat output; so in snowboarding scenes, white web pages, etc., an ultra-bright LED might be 3x brighter than an OLED. It seems to be the more or less universal opinion that, at least in 2017, the LED brand to beat is Sony. The low-end market has been seized by TCL moving into western markets in force.

The sleeper feature, however, is Dolby Vision. Even with an Apple TV 4K streaming a 4K movie, there are visible differences in quality between HDR10 and Dolby Vision, and if you’re a cinephile you’ll need to be sure to get a TV compatible with Dolby Vision along with a compatible player - which neither the Xbox One X nor the PS4 Pro is.

Yeah, HDR isn’t a fad and will stick around. It will very quickly become table stakes, of course.

Hopefully 2018/2019 TVs will come with HDMI 2.1 and support variable refresh rates. That’s a big deal for videogames.

I didn’t realize neither console supported DV. I wonder if they can add it via a firmware update? I do like Dolby Vision on Netflix shows, which is the only use I’ve seen of the standard, but I just use my TV’s built-in app.

I honestly don’t understand the “brightness” talking point. I’ve never seen a good TV not look bright enough, let alone an OLED, even though that’s a bullet point LED manufacturers use against OLED. I’d much rather have Sony’s OLED than its LED. If I were going to buy an LCD/LED, I’d probably go with a big Vizio or Sony, though. I skipped the LCD/LED generation of TVs entirely, going from CRT to plasma to OLED.

I think the idea is that the sun should be too bright to look at, etc. - that there’s an “experience” of watching that’s different from merely representing the scene. HDR is supposed to make highlights brighter, not just lighter. So there’s a minimum amount of brightness a TV needs to support in order to be considered HDR.

More technically there’s a big difference between showing a single spot that’s bright and a whole scene that’s bright, and brightness levels differ depending on the % of the screen that’s being used up at max brightness.

Even more technically, Dolby Vision and HDR10 are mastered at different peak brightness levels, and their “electro-optical transfer functions” have to interact with the TV hardware, which essentially has to choose what part of the histogram to preserve and what to wash out. I.e., let’s say the content tells the TV “max 9000-nit brightness” but the TV can only do 1000 nits; the TV software has to decide where to put the gray point on the display, which might blow out highlights (essentially overexposing them) or underexpose them to preserve black levels.
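To make that concrete, here’s a minimal sketch of the trade-off in Python. This is a hypothetical illustration of a tone-mapping roll-off, not any real TV’s algorithm; the function name, the knee point, and the quadratic roll-off curve are all my own assumptions:

```python
def tone_map(nits_in, display_peak=1000.0, mastered_peak=9000.0, knee=0.75):
    """Map mastered luminance (nits) onto a dimmer display.

    Below the knee we pass luminance through unchanged; above it we
    compress the remaining mastered range into the display's headroom,
    trading softened highlights for preserved shadows and mid-tones.
    (Hypothetical curve for illustration only.)
    """
    knee_nits = display_peak * knee          # e.g. 750 nits: start of roll-off
    if nits_in <= knee_nits:
        return nits_in                       # shadows/mid-tones preserved 1:1
    # Fraction of the mastered range above the knee (0..1)
    t = (nits_in - knee_nits) / (mastered_peak - knee_nits)
    # Compress that range into the display's remaining headroom
    return knee_nits + (display_peak - knee_nits) * (1 - (1 - t) ** 2)

# A 9000-nit specular highlight lands at the display's 1000-nit peak,
# while a 500-nit face stays exactly where the colorist put it.
```

A TV that instead clipped everything above 1000 nits would keep mid-tones perfectly accurate but flatten every highlight to the same white - that’s the “blow out” option described above.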

It’s all pretty rarefied stuff, however, and unless you’re dropping $5k-$10k on a home theatre setup you’re unlikely to notice or care. For consumers it really comes down to “do you want a super bright TV” or “do you want an OLED”? Lower-end LED TVs do have HDR support but not the brightness of the higher-end models, so they’re just inferior aside from price.

Sadly, HDR support has become so widespread, and the bar to clear for it so low, that there’s a big range between barely squeaking by and blow-you-out-of-the-water HDR, and consumers are stuck without a clear guideline. Then there’s another HDR Premium certification on top of plain HDR certification, and it isn’t really clear what it means in practice, since a lot of higher-end TVs never bothered to get it while TVs inferior to the top end do have it; nor is there a list you can easily look up.

Same PQ. Different features including glass bezel and a built in sound bar that I don’t use.

Yep. It’s just that the way your previous comment was stated, it might have been interpreted like you were rationalizing a compromise with the E. You obviously know this stuff, but I thought others might not, and wouldn’t recognize what an amazing deal you got.

This means exactly jack and shit when the blacks are also 3x brighter. It’s the dynamic range that matters, not amping the overall brightness of everything to retina scorching levels because showroom floor. That’s why OLED is valuable because the blacks are infinitely black, aka OFF.

It’s telling that all TVs have showroom modes where brightness is jacked to “this one goes to eleven” and they look like emulsified ass at home until you turn that off. Same thing with computer monitors. They are stupidly bright at defaults out of the box, and you end up turning them down to about 25% brightness immediately.

It’s so so so profoundly dumb. The brightness war is kind of like the loudness war for CD mastering. Only morons buy into that, and we all suffer as a result. So please don’t promote that bullshit here.


That math can be misleading, because an OLED TV with 1 nit of peak brightness would also have infinite blacks.

It really depends on your use cases. In this thread - HDR gaming - it’s worth pointing out that most games just aren’t designed around “perfect blacks”. I’ve lived with a plasma and its perfect blacks for almost 10 years, and modern HDR games absolutely look better on modern LED or OLED TVs, in almost every way including color accuracy, despite the plasma’s otherwise good performance. The plasma is absolutely dimmer in real terms, and HDR really brings out many of the complaints I had about the plasma in the transition from day to night scenes.

I’m not really trying to convince people not to buy OLED here. It was just a surprise to me that OLEDs have these brightness limiters that actually dim the display when showing a very bright scene. If - and only if - a super bright display is what you want, OLED isn’t necessarily the best choice. And that basically means HDR gaming. For everything else, OLED is almost certainly better.

LED manufacturers aren’t saying this stuff; I’m saying this stuff. It’s not like this is some kind of multilevel marketing earworm that I’m parroting. You’re talking about the loudness wars, and I’m talking about why paddle shifters aren’t the end of the world. It’s just more information for the informed buyer.

They are stupidly bright at defaults out of the box, and you end up turning them down to about 25% brightness immediately.

Just FYI but have you played modern HDR games on modern HDR capable TVs? Because if you’re turning your TV down to 25% brightness, you’re not really the target consumer for HDR gaming content.

Looks like there are some significant problems with the 4k blu-ray player as of now:

as it stands right now, the Xbox One X’s 4K Blu-ray drive is by far the worst 4K Blu-ray player around. Except for the Xbox One S, which is only marginally better - something I’ll come back to later.

We’re not just talking about a small difference here, either; the Xbox One X’s 4K Blu-ray playback is off enough to be considered more or less broken if you’re any sort of home cinema enthusiast.

The problem is raised black levels when playing high dynamic range images (which are the star attraction of pretty much all 4K Blu-ray discs) into HDR-capable TVs. Watch a dark 4K Blu-ray scene, such as the one in Spider-Man: Homecoming where Spidey is rescued from a lake at night by Iron Man, and the HDR night sky looks much brighter, much more infused with blue and much noisier than it does if I watch the same scene on other 4K Blu-ray players such as the Panasonic UB900 or Oppo 203. Even the black bars above and below wide aspect ratio films look markedly greyer from the Xbox One X than they do via any other 4K Blu-ray player.

This is because something about the way the Xbox One X is outputting its video seems to be raising the base luminance level of the entire high dynamic range image.

The issue also affects color and detail, with some tones looking washed out compared with how they look on other 4K Blu-ray decks. Some very bright shots look flared and bleached, too, and noise levels are consistently elevated rather than this issue only affecting dark scenes. Which is all exactly what you would expect to see if an image isn’t being presented with the correct luminance information.

Per the edit in the article, those are software bugs and they will be fixed very soon.