It’s a very cool concept: basically, lowering the resolution in ways the gamer hopefully won’t notice. It needs per-game implementation as mentioned above (so no magical panacea for all games), but more and more new games are supporting it. Of note, this is a software technique, and DLSS is Nvidia’s version.

Super Resolution (as opposed to Virtual Super Resolution, because apparently people need a thesaurus) is the name currently being given to AMD’s “answer,” but it’s not yet released and is rumored to be coming within a month or two after their graphics cards drop. Nobody knows if it’s going to be any good compared to DLSS, but rumor has it that it’s very fast and just doesn’t look quite as good (so more like DLSS 1).

We really don’t know anything at all about AMD’s answer other than its name. It may be amazing, and cross-platform, and open our eyes to the blissful pure light of riding a gentle unicorn bareback through golden fields of grain on a lovely summer’s day after your dad says he’s proud of you.

But I mean, probably not.

LOL - yep

If anyone had told me DLSS2 existed and does what it does, without proof I would have called shenanigans. It seems impossible. It can’t be trivial to replicate Nvidia’s magic beans.

The thing that boggles my mind is some of the shots where the DLSS version looked BETTER, especially with stuff like text on signs. That’s like some special kind of black magic

Do you have good recommended 1440 settings for 3080?

The 3080 is ridiculously overpowered for 1440p, it absolutely wtfpwnz everything but Watch Dogs 3, pretty much. That’s just a poorly optimized game which I suspect is also somewhat CPU bound at 1440p with only 4 physical cores. It’s why I’m upgrading my CPU finally.

But to answer your question-- turn everything to ultra. Except in WD3.

Super Resolution is a dumb marketing name given that it’s the opposite of super sampling.

Radeon’s marketing group has never failed to astound me with their bizarre decision making.

The problem with AMD’s implementation is that even if it’s amazing, it’s at a severe disadvantage. DLSS2 has been out for, what, a year now, and the number of games that support it can be counted on one hand, despite Nvidia’s plastering of game logos on their 20xx announcement.

So even if AMD’s system is great we are looking at 2+ years before we can really utilize it, at which point many people will be on the next generation cards anyway.

DLSS doesn’t exactly roll off the tongue when it comes to marketing speak.

If AMD’s system is comparable technically, just as easy to implement, and an open standard or based on Microsoft’s DirectML used in XSX|S and Windows 10, I can see it being adopted very quickly much like Freesync was. Now how long Nvidia takes to capitulate and support it themselves, who knows. Hopefully not 5 years.

That’s really the only way I see AMD competing here, if they had already been working with Microsoft on this stuff for years.

I guess that’s a good point. If their system is usable by XS* games then it might have a wider chance at adoption.

I think the problems are more to do with producing a model that produces good outcomes, and actually running that model on the gpu without compromising performance (meaning a much, much less complex model than DLSS). The API challenges are real, and the strategic relationship with MS might help a lot there, but I don’t think that’s the biggest problem AMD have, and besides it’s not like NVidia don’t also have a strategic relationship with the directML folks.

I’m running everything in Ultra (even RTX) and it’s giving me an average frame rate of 79 with the 1% low being 62. I frame capped it at 60 and it’s smooth as silk. This is at 1440p with DLSS set to “balanced”. Even poorly optimized, the 3080 is eating Watch Dogs Legion for lunch.

I suspect you have a much faster CPU.

I didn’t realize you were saying the 3080 with a slower CPU was the bottleneck - rereading it I see what you mean a bit more clearly now. Yeah, I realized last year that even my 1080 Ti was bottlenecked in games such as Total Warhammer II - I had a decent i7 4770K, but upgrading gave me, for example, 15-20 more frames in Warhammer II. So I spent my upgrade funds going with an i9 9900K, knowing I’d want whatever the next generation cards were. When I run the Watch Dogs bench I show my GPU at 99% load and the CPU at 79%. I’m still surprised to see the CPU so heavily utilized, but there is a LOT of simulation going on here, so I suppose that makes sense.

Yes, there are reports that WD3 will scale all the way up to 12 CPUs.

will scale all the way up to 12 CPUs

These words, they please me.

Thanks. I was planning on moving to a 48" LG OLED CX, but it just didn’t work for me, so I pushed that over to the Xbox Series X / PS5 rig and went back to my ASUS 27" 1440p guy until I can get a 32" or smaller OLED or other HDR-capable desktop display. i9 9900, 16GB, SSD, and this is what I pulled with mostly default settings except RTX on. You can see the wall mount I had for the OLED in the background; it was seriously overwhelming to try to sit that close to it.