Nvidia DLSS, this thread has been visually enhanced.

The videos “pop” more, but they went a bit overboard with how much they brightened them.

I have an HDR10 monitor and the HDR is rubbish in both Win10 and 11 and in every game, but I bet with a decent HDR display it’s another matter.

Yeah, I have to wonder what monitors some people use, because the HDR I see in some games is very impactful.

I wonder if it’s also a matter of everyone having different eyes. Many, for instance, can barely tell the difference between 144Hz and 240Hz, while for others it can be fairly significant.

I don’t think it’s eyes in the case of HDR (within reasonable bounds, of course). When it comes to PC monitors, HDR has been a ways behind TVs and there is a huge variance in quality. Windows 10 also had several problems which were improved in 11, but it still requires calibration and that sort of thing. And then you have varying implementation depending on the game itself!

HDR on my big screen TV on Win10 was terrible, even after fiddling with the calibration a hundred times. HDR (and auto HDR) on Win11 on the OLED display below has been chef’s kiss.

https://www.dell.com/en-us/shop/alienware-34-curved-qd-oled-gaming-monitor-aw3423dw/apd/210-bcye/monitors-monitor-accessories

The best way to experience HDR is with a console on a TV built within the last five years that can run games at 120Hz. That will show you what you should expect from it on your PC monitor.

… and almost certainly won’t get.

It isn’t likely to be as good, yeah. It does give you the right reference though for what you want. I think a lot of people don’t even know what it should look like to begin with.

I have been pretty happy with it on the Dell S2721DGF I purchased a few years ago. With Windows 11 on the new PC, it definitely has its moments in Cyberpunk and Warzone. I can easily compare to the PS5 with Warzone too. I should play some Apex Legends again. That was a showcase game for me with HDR when I first got the TV (Samsung Q70R). That game really pops.

Great that you’re enjoying it, but that’s a “fake HDR” monitor with very few local dimming zones and 400 nits max brightness. It probably makes a noticeable difference when HDR is turned on, like the “HDR400” monitor I’m typing this on right now, but it isn’t comparable to what you’d get out of a mini-LED or OLED.

I’d expect your QLED TV to have vastly better HDR to an immediately noticeable degree, and that only has 50 local dimming zones so isn’t comparable to a mini-LED or OLED either-- but it is real HDR.

Out of curiosity (as I probably can’t afford them anyhow) what are the good “true” HDR monitors (not TVs, specifically for PC use)?

You can start here…

I think @kevinc’s is on the list.

Mind you, I use HDR on Windows because I have the PC connected to the big, expensive TV…

RTings is absolutely reliable. On the video side I like Hardware Unboxed and Monitors Unboxed.

The real story is that vanishingly few monitors are “real” HDR at all. Only OLEDs and a couple expensive mini-LEDs really count-- and I would not feel comfortable with an OLED computer monitor due to burn-in unless I solely used it to game or watch media.

I can only speak from personal experience, but seriously, the Alienware monitor I linked has been amazing. You certainly pay for it, but the combo of OLED + HDR on a PC display really makes things pop, especially in particular games. One example that comes to mind is Monster Hunter Rise, which was natively a Switch game. The HDR in that game was especially noticeable on this display; I actually said “Oh, wow” out loud when I first launched it, since I had been playing it on a TN display previously.

Here’s the monitor I’m talking about, but I’m sure it’s not the only one with good HDR support. Just the one I have personal experience with.

https://www.bestbuy.com/site/alienware-aw3423dwf-34-quantum-dot-oled-curved-ultrawide-gaming-monitor-165hz-amd-freesync-premium-pro-vesa-hdmiusb-dark-side-of-the-moon/6536990.p?skuId=6536990

Sorry for rambling and junking up the DLSS thread. I just see people frequently dumping on HDR and the like on PC, and I know where they’re coming from because that was my previous experience with it too, but it can be so much better with the right display. I’m guessing good HDR on desktop monitors will become more the norm as time goes on, but for the past few years it’s really been in its early growing pains, and that’s understandably left a bad impression on a lot of people.

That really depends on how quickly mini-LED prices drop and whether modern OLEDs can really avoid burn-in over at least three years of desktop computer use. The first will eventually happen, and microLED will arrive at some point too with zero compromises. The second remains to be seen.

I recall the DF guys (at least the regulars on DF Direct) talking recently about how they have pretty much exclusively used OLED monitors on PC for a while now, and none of them have seen any burn-in at all. Obviously OLED monitors haven’t been available for all that long, but they do inherit all the years of improvements on that front from the TV side.

Even without the improvements from the years since, it took my 2018 LG OLED years to develop any burn-in. It shows up now in reds, but you usually have to be looking for it unless you’re watching something very red/orange, like Dune. Obviously Windows has the added static-UI concern, but looking at my dark mode Taskbar in Windows 11, I have to imagine that would take way longer to burn in than something like the bright blue XP bar. Although now I’m imagining the QT3 eyeball burned into my monitor and smiling at the thought.

Yeah, that’s nice. A grand, though… I think I’ll stick with my piddly faux-HDR 16:9 for now!

It hasn’t been long enough to tell.

TV use cases are fine for OLED since video and games are moving content. I leave my PC monitors on for hours at a time with static elements on screen. Completely different.

You can attempt to address that by auto-hiding your taskbar, using pure black wallpapers, setting the screen to sleep ultra quickly, etc., and that’s exactly the sort of crap I don’t want to do.
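For anyone who does want to bother, the sleep and wallpaper parts can at least be scripted. Here’s a rough sketch, assuming Windows with the standard powercfg command and the usual Control Panel registry values; tweak to taste, it’s illustrative rather than a polished tool:

```python
# Rough sketch of the OLED "babysitting" steps mentioned above, Windows only.
import ctypes
import subprocess
import winreg

# 1. Put the display to sleep quickly (2 minutes on AC power).
subprocess.run(["powercfg", "/change", "monitor-timeout-ac", "2"], check=True)

# 2. Pure black desktop: set the fallback background color to black...
with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Colors", 0,
                    winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "Background", 0, winreg.REG_SZ, "0 0 0")

# ...and clear the wallpaper immediately (SPI_SETDESKWALLPAPER = 20,
# flags = update the user profile and broadcast the change).
ctypes.windll.user32.SystemParametersInfoW(20, 0, "", 3)

# 3. Taskbar auto-hide is easiest to flip in Settings > Personalization >
#    Taskbar; it isn't a simple registry string, so it's left manual here.
```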

And here comes the connector for all upscalers? DirectSR by Microsoft.
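Conceptually it’s supposed to be one common hook the game targets, with DLSS/FSR/XeSS plugged in underneath by the runtime. Purely as a toy illustration of that idea (made-up names, not the actual DirectSR interface), it amounts to something like:

```python
# Toy sketch of the "one connector, many upscalers" idea.
# All names are made up for illustration; this is NOT the real DirectSR API.
from abc import ABC, abstractmethod


class Upscaler(ABC):
    """The single interface a game engine would target, regardless of vendor."""

    @abstractmethod
    def upscale(self, low_res_frame: bytes, motion_vectors: bytes) -> bytes:
        ...


class DlssBackend(Upscaler):
    def upscale(self, low_res_frame, motion_vectors):
        # Would hand off to NVIDIA's implementation on RTX hardware.
        return low_res_frame


class FsrBackend(Upscaler):
    def upscale(self, low_res_frame, motion_vectors):
        # Would hand off to AMD's implementation, which runs on most GPUs.
        return low_res_frame


def pick_backend(gpu_vendor: str) -> Upscaler:
    # The runtime, not the game, decides which vendor path gets used --
    # which is exactly where the "who gets held back" question comes from.
    return DlssBackend() if gpu_vendor == "nvidia" else FsrBackend()
```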

Is it a connector, or is it something that will hold back NVIDIA, letting AMD and Intel sit back while NVIDIA is hamstrung by Microsoft’s implementation?

Not at all convinced this is something anyone wants other than the companies that trail in performance…