I wonder what the 3070 would be like. I should pay more attention to this since air flow to my computer room is pretty bad in the summer. Looks like my 1070ti is already up at 180W though.

Next-gen consoles have 16GB RAM and it’s shared with the GPU and CPU so no, I don’t think it will be necessary to have >16GB for PC gaming.

Yes they will have PCIe 4.0 but no it won’t matter for gaming.

AMD drivers were deeply problematic for Navi, but it was a completely new architecture for them. RDNA2 should be better, but who knows, really?

The RT cores themselves are “3-4x as fast” so even the lowest-end Ampere GPU should absolutely demolish the 2080ti in ray-tracing performance.

3070 wasn’t leaked, only the top-tier big Ampere, the Titan/3080ti.

Keep in mind: these are all rumors from Nvidia’s competitor, and nothing is confirmed.

That is the real question. Twice in my life, many years apart, I tried an AMD card (ATI, back before AMD bought them), and both times the drivers were crap. From time to time AMD hardware does look really good, but their drivers were always crap.

It’s looking more and more likely I’ll sit this generation out. I’m still on my GTX 1060, which I’ve been wanting to upgrade for a while. Maybe if the 2080S gets a big discount I’ll buy one.

If AMD meaningfully competes with Nvidia this will not be the generation you want to sit out. Competition drives prices down, and it looks like both sides are reaching for very high performance.

Same. The last time I tried AMD will be the last; I like to play games at launch.

The architectures of the PS5, Xbox Series X, and a high-end Ryzen PC are basically identical (save for I/O). I can’t think of any time we’ve seen this kind of alignment before. We’re going to be talking about potentially hundreds of millions of gaming devices with basically the same top-tier CPU and GPU capabilities. It’ll be interesting to see how that helps AMD on the PC GPU front.

Well, it will indirectly help them, because Sony and MS funded R&D of new technology like hardware texture decompression and treating very fast SSDs like level 2 system RAM. That’s assuming their agreements don’t restrict that IP to Sony/MS only; otherwise they could have been forced into clean rooms for each semi-custom SoC.

Other than that stuff, both red and green have to support fully documented APIs on Windows, so there’s little room for shenanigans there: only performance differences, since both run DirectX and Vulkan.

I think AMD will have an answer to texture decompression and the SSD as L2 RAM stuff.

The real question is whether they have any way to respond to DLSS. It sounds like Nvidia is fully aware of how important garnering DLSS support will be in the coming generation. If most games support it and consumers accept that it offers a huge performance gain with zero image quality impact, Nvidia will simply be unbeatable unless AMD supports something very similar.

Imagine reviews benching the 3080ti against the 6900XT. DLSS off, they’re nearly identical. But you turn DLSS on and the 3080ti wins by, like, fifty percent. And it looks identical, or even better, with DLSS on. Is that a valid benchmark? Well, it isn’t apples-to-apples, but you would turn it on, right?

And then Nvidia shows off 8k gaming (with DLSS), and w00t there it is.

Very interesting. Do we know anything about incremental power use from DLSS? Is a lot of calculation being done in real-time or is heavy lifting done in pre-analysis somewhere?

DLSS 2.0 (and presumably 3.0) is supposedly pre-trained on a generic dataset, though game-ready drivers are still required for best results, so it’s unclear if Nvidia is being completely truthful about that. Either way, the training corpus only matters offline; the upscaling calculations themselves are done in real time on the Turing/Ampere tensor cores.

I have a 2080 Ti FE in my primary desktop, and a 1080 Ti FE in a mini build connected to my TV.

I’m sorely tempted to sell the 2080 Ti before these cards are officially announced and earmark the cash for either RDNA2 or the Nvidia 3000 series. Considering online prices for the RTX 2080 Ti, I might be able to recoup my original outlay.

I’d have to ‘slum’ w/ the 1080 Ti for a while, and if supplies of new cards are limited, play Cyberpunk on the 1080 Ti, but I expect it’ll be fine. Especially since my 34" ultrawide only sports 60% of the pixels needed for 4k. Hmm.
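For anyone curious, that 60% figure checks out, assuming the usual 3440×1440 ultrawide resolution against 3840×2160 UHD:

```python
# Pixel-count comparison: 34" ultrawide vs. 4K UHD
# (assumes 3440x1440 for the ultrawide, the common 34" resolution).
ultrawide = 3440 * 1440   # 4,953,600 pixels
uhd_4k = 3840 * 2160      # 8,294,400 pixels

ratio = ultrawide / uhd_4k
print(f"Ultrawide has {ratio:.0%} of 4K's pixels")  # prints "Ultrawide has 60% of 4K's pixels"
```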

I’m “slumming” it with a GTX 1080 on a 34" ultrawide as well. I think you’ll be able to manage; most things still run great for me even with the card getting a little long in the tooth. I’d probably opt to sell the card if I were in your position.

If anyone wants to pass along their 1080 Ti slum after upgrading while I’m stuck in my 970 cardboard box, feel free. /s

Right? :D How does the 970 hold up these days, anyway? That was the card I was on before I got the 1080.

While playing at 1080p and sticking to medium settings in newer games, it’s okay. Not silky smooth, but okay. Really, it was a good investment considering it’s a 6-year-old card.

If you can get a good price for your Turing and have another card to game on in the meantime I would absolutely sell right now.

If you have a 2080ti even more so.

As Stusser said, if you can get a truckload of cash, do it. Old cards are not going to age well once this gen hits.

I would assume you could get at least $600 for a gently used 2080ti, today.

More like $800 minimum.

https://www.ebay.com/sch/i.html?_nkw=2080ti

I was just offered $900, leaning towards making the deal.

nods