It will also help that, again, their DLSS equivalent will have the widest support, courtesy of PS5 and XB Series.

This seems like it’ll be G-Sync/FreeSync all over again. Nvidia will get there first with a proprietary standard that is better, but AMD will come up with a good-enough standard that is more easily supported and sees wider adoption.

(emphasis mine)

This is why I think AMD’s price shenanigans are important. I think the reviews would have had a very different complexion if it was understood that the vast majority of 6800XTs would cost more than similar 3080s, rather than $50 less.

Given how good the AMD technology is likely to be (in my opinion not very; they aren’t a deep-learning company in the way Nvidia is, and they lack the custom silicon on the GPUs), I think OpenCL/CUDA might be a better parallel… We shall see.

I would be surprised if many PS5 / XB games use any kind of AI upscaling, simply because every model computation you do on AMD reduces your rasterization bandwidth - I expect they will rely more on AMD’s great rasterization performance and then tweak the detail levels to hit 60fps@4K(*). Similarly I think they will use RT very sparingly, saving it for places where it will have the most impact given the high cost on AMD.

(*: And for all I like to bash AMD, the fact that the consoles will be able to do this without much of a compromise in terms of settings is a phenomenal achievement on their part).

The real problem might be that AMD GPUs lack any hardware dedicated to AI so it would need to be done via programmable shaders. We just don’t know if that’s a blocker or not. DirectML is certainly a more open technology. I want it to win, just like open VRR did.

It depends on what games you prioritize, but the games already supporting DLSS constitute the tech having ‘taken off’. The benefits in Control, WD: Legion, Metro Exodus, BFV, and of course Cyberpunk already make AMD an inferior proposition. That doesn’t even touch on the RTX benefits. If you want to eke out the highest possible performance at the greatest fidelity, Radeon isn’t there yet. I wouldn’t be surprised if they continue to make gains in 2021, and I hope they do! I don’t like Nvidia having a stranglehold, and I’m hopeful for AMD to build on their current success. But for the games either out, or imminent, Radeon can only compete if you sacrifice Ultra/RTX visuals.

Except for Godfall which only supports raytracing on AMD cards atm and not Nvidia :P

:) I think I addressed that in my first sentence. Never heard of Godfall until you mentioned it.

The forums here have a Godfall thread with well over 3 dozen posts! ;)

3 dozen posts on Qt3 is two or three dudes saying good morning over a cup of coffee.

Godfall has seen abysmal reviews. Skip it.

Lol, the awful reviews are why I specifically pointed it out.

Yeah didn’t AMD already have their Radeon super sharpening thing that operated on an upscaled image? At the time sites were saying stuff like ‘DLSS is dead’, and Nvidia even did their own sharpening filter… :P

But that was before DLSS 2 of course…

I posted about image sharpening here last year somewhere, it’s really neat technology, effectively adds detail to the image. But upscaling it ain’t.
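To make the distinction concrete: sharpening filters like the one described above boost existing high-frequency detail rather than inventing new pixels the way an upscaler does. A minimal sketch of the classic unsharp-mask approach (the function name, box-blur kernel, and parameter values here are illustrative assumptions, not any vendor’s actual implementation):

```python
import numpy as np

def unsharp_mask(image, amount=0.8, radius=1):
    """Sharpen by adding back the difference between the image and a blurred copy.

    Illustrative sketch only: uses a simple box blur as the low-pass filter,
    where real sharpeners (Radeon CAS, Nvidia's filter) are far more adaptive.
    """
    padded = np.pad(image, radius, mode="edge")
    blurred = np.zeros_like(image, dtype=float)
    k = 2 * radius + 1
    # Sum the k*k neighborhood around each pixel, then average.
    for dy in range(k):
        for dx in range(k):
            blurred += padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
    blurred /= k * k
    # The high-frequency residual (image - blurred) is the "detail";
    # scaling it up steepens edges without adding any new information.
    return np.clip(image + amount * (image - blurred), 0, 255)
```

Note that a flat region passes through unchanged, and an edge just gets higher local contrast — which is why it reads as “adding detail” but is no substitute for actual upscaling.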

Damn consoles.

Yeah, that’s basically what I thought was happening.

Could someone give me some advice on upgrading? I have a Dell XPS 8930 with an i7-8700 (3.2GHz), 16GB of RAM, and an Nvidia 1080 that I purchased a few years ago when @stusser pointed out some good deal on them. It looks like I could put a new power supply in (750 or 850 watts?), which would give me enough power to put in a new GPU.

I’m unsure if the case that I have is big enough to fit that class of GPU - can anyone make an educated guess? Or is the new AMD video card offering going to be any smaller, so that might be my choice? Maybe this class of card generates too much heat and I wait for something better/smaller/less performant like a 3060 with a rear exhaust?

You would need to google it or just measure the clearance. Some of the 6800XT/3080s are really long.

A quick google suggests the 1080 had a TDP of 180W. Given the 3070 is supposed to be 220W, you might be ok with the PSU you have? The AMD cards draw a little less, and there are also new models incoming from both Nvidia and AMD, around the 3060 kind of range. Like everything else, availability is an issue though.
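The back-of-the-envelope math here is simple enough to sketch. A common rule of thumb is to keep sustained load under roughly 80% of the PSU’s rating; the rest-of-system estimate (~150W) and the 460W stock PSU figure below are illustrative assumptions, not specs for this particular XPS:

```python
def fits_psu(psu_watts, gpu_tdp, rest_of_system=150, margin=0.8):
    """Rough check: does GPU TDP plus an estimated rest-of-system draw
    stay within ~80% of the PSU's rating? All figures are estimates."""
    return gpu_tdp + rest_of_system <= psu_watts * margin

# A 3070 (~220W per the post above) on a hypothetical 460W stock supply:
print(fits_psu(460, 220))   # 370W needed vs. 368W budget -> False, just over
# The same card on a 750W replacement has plenty of headroom:
print(fits_psu(750, 220))   # True
```

Which is why the existing supply is borderline for a 3070 but a 750W unit would be comfortable even for a 3080-class card.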

You may also run into proprietary case/motherboard nonsense - I’ve hit that before upgrading pre-built systems, where parts are very specific and can’t always be swapped. I wouldn’t expect the video card to be one of those issues, but you may want to crack the case open and take a look, or post some pics so we can give better advice.

I’ve found pictures on the Dell forums of someone claiming they fit one inside. They did mention they did a bit of cutting to add at least one more case fan to the front of the case, as airflow/heat was a problem. A “midrange” card like the 3070 sounds like it’d be fine for me, and I can wait until they are available - my backlog is big enough that I have plenty of games to play before I get to Cyberpunk 2077.
