When do the next generation GPUs drop?


Nice! Looks like you’ve got a pretty functional VRR monitor there with Nvidia.


Yeah, I’m pretty happy with these LGs (even without this). I’m assuming the reason it wasn’t marked as compatible was mostly because the FreeSync range is only 48–60 Hz.


I tried it on my AOC G2460PF and it looks to be working with no issues. I didn’t have time to really run it through a bunch of trials, but it looks promising. I ran 3DMark and didn’t see anything go wrong, and also played a couple of rounds of PUBG. Everything seemed a lot smoother. I did get a stutter here and there, but that might just be because it’s PUBG. :)


This rumour of a new, lower-end, RTX-free card seems more plausible to me, and less interesting than the rumoured 1180 :(


The RTX 2060 can barely handle ray tracing at lower quality settings and 1080p. So what would an RTX 2050 target, 720p? C’mon.

I do wish they’d called it a GTX 2050, though. The GTX says no ray tracing, the 20 says it’s Turing, and the 50 indicates its performance level, which in this case would be a bit lower than a GTX 1070.


If I hadn’t gotten such a good deal on the 1070 Ti I would have been super pissed at myself for not waiting for the 2060. Thankfully I got the card for $360, so I still feel like it was a good deal. Bought it last month. I have zero worries that it doesn’t do ray tracing.


My 1070 Ti was $40 more and 5% slower. Totally worth it to have been playing on this awesome PC for the last three months.


I think the 1070 Ti / 2060 are going to be a good purchase regardless, because they sit at the current front edge of mainstream performance.


Err, is there any reason to get a 1070 Ti over an RTX 2060? Is the extra 2 GB much of a factor?

The 2060s are priced at the bottom end of the remaining 1070 Tis in our market… mind you, by the time I make up my mind there may be no 1070 Tis left anyway.


Nope, get the 2060 :)


Definitely get the RTX 2060 over a GTX 1070 Ti.


That is what I figured. Thanks to you both.


Regarding VRR on Nvidia GPUs, they do actually implement low framerate compensation (LFC) correctly. The Nvidia VRR implementation looks to be perfectly fine, and every VRR monitor that works on AMD should work similarly on Nvidia. Their CES demo was deceptive in using a particularly crappy/broken monitor.
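For anyone curious how LFC works: when the game’s frame rate drops below the monitor’s VRR floor, the driver just shows each frame an integer number of times so the panel keeps refreshing inside its supported range. A rough sketch (the 48–144 Hz range is a made-up example, not any particular monitor):

```python
def lfc_refresh(fps, vrr_min=48, vrr_max=144):
    """Pick an effective refresh rate for a given frame rate under LFC.

    If fps is inside the VRR window, the panel refreshes once per frame.
    If fps falls below the floor, repeat each frame enough times that
    the resulting refresh rate lands back inside the window.
    Returns (effective_refresh_hz, frames_shown_per_game_frame).
    """
    if fps >= vrr_min:
        return min(fps, vrr_max), 1  # in range: one refresh per frame
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier, multiplier

# A 30 fps game on a 48-144 Hz panel gets each frame shown twice,
# for an effective 60 Hz refresh.
print(lfc_refresh(30))   # (60, 2)
print(lfc_refresh(20))   # (60, 3)
print(lfc_refresh(100))  # (100, 1)
```

This is also why a narrow window like 48–60 Hz can’t do LFC at all: doubling 47 fps gives 94 Hz, which overshoots the ceiling, so LFC generally needs the max to be at least roughly 2× the min.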


If they actually release a series of non-RT cards, then Nvidia is itself admitting that ray tracing is nothing more than a quaint gimmick that is still many years away from mainstream… which it is.


Not if it’s limited to the x50 series and lower.


I wonder to what extent DLSS and ray tracing were envisaged as complementary. Was fast AA supposed to compensate for slow ray guns?


DLSS is much more than fast AA. It actually renders the game at a lower resolution and then, through the use of magical AI fairy dust, upscales it to the target resolution with limited quality loss, presenting an image that looks pretty good. It’s a trick like checkerboard rendering, just with much better quality output.
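For a back-of-the-envelope sense of why that trick pays off, here’s a sketch. The half-resolution-per-axis scale factor is an assumption for illustration, not a documented DLSS internal resolution:

```python
def internal_resolution(target_w, target_h, scale=0.5):
    """Internal render size for an upscaler that renders low and upscales.

    'scale' is the per-axis render scale (0.5 here is a hypothetical
    example: half resolution per axis means a quarter of the pixels
    get shaded before the upscale to the target resolution).
    """
    return int(target_w * scale), int(target_h * scale)

w, h = internal_resolution(3840, 2160, scale=0.5)
saved = 1 - (w * h) / (3840 * 2160)
print(f"render {w}x{h}, shading {saved:.0%} fewer pixels than native 4K")
# render 1920x1080, shading 75% fewer pixels than native 4K
```

The whole bet is that the quality lost by shading 75% fewer pixels costs less than the quality gained back by the upscaler, which is exactly the part that still has to be proven in real games.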


The quality of DLSS has yet to be demonstrated in the wild. There are lots of drawbacks, and so far it has only been evaluated on fixed-path demos and benchmarking tools, which are a best-case scenario for an AI-trained technique.


You’re 100% correct. It hasn’t shown up in the wild yet.


Seems like regardless of whether these things work well or not, by the time the options have been explored the next gen will be along.