I have played it! I don’t really get the reference though, I’m afraid. 144Hz is the most my monitor can do without overclocking, and for some reason when I set the OC on it everything feels a little off somehow, so I leave it at the manufacturer default.

I just mean it’s one of the most graphically demanding games I’ve played, I needed to crank settings down to get it anywhere near 90 at 1440p. :)

Oh, haha, I see. Yeah, I don’t remember what I was getting but I don’t recall it being an issue - but I almost always cap my frames at 60 on PC to keep my card running quiet/cool, I just don’t see much difference above 60* so I’d rather not hear a jet engine at my feet while I game. :)

*Not no difference, mind, just not much of one. 30 to 60 is a different tale, of course.

A must play!

I’m tired of rumor and speculation (not a complaint about the link you posted, Scott). Has Nvidia given any guidance on when they plan to reveal some details?

Nope, but probably August.

My body (and GTX 1080) are ready.

Was just reading that same article. Suckers are going to use some power.

My city’s power grid is ready.

So this power obviously isn’t going to translate to laptops.

I was thinking of buying an Asus G14 to have a laptop that could double as a PC gaming rig for my living room TV. But now I’m thinking it might make more sense to stick to my old laptop and just put together an ITX system using whatever the new mid-range card is come fall.

Almost 2 years since the release of the first RTX cards, and barely a handful of games support it. Let’s hope that with the next-gen consoles coming soon, game devs get on the ray tracing train.

I’m sure with the consoles and AMD supporting it, it will be all over the place.

I think, from the Nvidia presentation, that with the new version it is going to be much easier to implement.

Plus, once it’s handled by engines: Unreal, Ubiengine, etc.

New rumors suggest the 3070 will have the same number of CUDA cores as the 2080. That’d be good, right?

Depends on how they improve per-core performance. If it’s the same, and they don’t raise prices, it would be middling to unimpressive. If it’s the same and they do raise prices so the 3070 costs the same as the 2080 did, like they did with the 20-series, it’s another sidegrade generation and screw Nvidia.

Yeah, the 2080 wasn’t a very impressive card. I’d be pretty disappointed if, yet another generation later, the 3070 is only matching what a 1080ti was doing ages ago. Especially at Nvidia’s price points. Will have to wait and see!

Turing performance has improved substantially since release with driver optimizations. These days the 2070S would slightly beat a 1080ti. At release:

2060 ~= 1070ti
2070 ~= 1080
2080 ~= 1080ti

But today:

2060 ~= 1080 + 4%
2060S ~= 1080 + 10%
2070 ~= 1080 + 10%
2070S ~= 1080 + 30% (~=1080ti + 5%)
2080 ~= 1080 + 35% (~=1080ti + 10%)
2080S ~= 1080 + 45% (~=1080ti + 20%)

Think I’ve said this before, but my hope is that we’ll see a bigger improvement in raw performance this time around, since the 20-series was mainly focused on introducing raytracing rather than on huge general performance gains. Or, perhaps, a major reduction in the overhead raytracing currently brings.