When do the next generation GPUs drop?

I feel like if I want to max settings and keep 60-80 FPS for games over the next couple years, I need at least a 1080 and perhaps a 1080ti.

That’s why I’d love for the 2070 to do that for me for around $400 but it doesn’t seem like that’s happening.

Amazon emailed me saying my RTX 2080 will be late. I'm OK with this; in case reviews aren't great, I have an excuse to cancel. ;)

You can always resell it at a profit. I wouldn’t stress.

That video has a terrible framerate; it's astonishing that they chose it as a showcase. And it's only 1080p. Pretty, though.

I dunno, half the scenes look better to me with RTX off, especially the building exteriors they show off. I understand it may be more realistic, but I don't care about realism in my games if the result feels worse. In practice I'd never notice any of these details, so the premium isn't worth it to me.

Better is subjective, but I do certainly find them more immersive.

It'll take a while for graphics programmers and artists to master, like all shiny new tech.

Maybe ray tracing will be a throwback to how graphics hardware worked a couple of decades ago: it used to be pretty common for new hardware to ship with new features but not be fast enough to use them. That just let devs start adding support for the new functions. Nothing was fast enough to actually use the features in-game until the next generation of cards… which usually arrived within a year.

Looks like the base level cards (ones without factory OC) are getting the lesser chips, and may not be able to overclock very well.

Has anyone seen benchmarks on how the performance of the 2000 series compares with the 1000 series on CUDA workloads, more specifically TensorFlow / deep learning / etc.?

I am putting together the specs for my office to order a PC for training deep neural nets. If you compare the 1080 Ti with the 2080, it's 3584 CUDA cores vs. 2944, with the same memory capacity and similar clock speeds, suggesting the 1080 Ti is the better buy. Still, I wonder whether something in the Turing architecture will help the 2080 somehow.
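You can sanity-check that intuition with a back-of-the-envelope FP32 estimate. This is a rough sketch, not a benchmark: the boost clocks below are approximate reference-card figures (assumptions on my part), and it ignores Turing's Tensor Cores, which can matter a lot for deep learning at reduced precision.

```python
# Rough theoretical peak FP32 throughput from published specs.
# Boost clocks are approximate reference-card values (assumed).

def fp32_tflops(cuda_cores: int, boost_clock_ghz: float) -> float:
    """Peak FP32 TFLOPS = cores x 2 FLOPs/cycle (FMA) x clock (GHz) / 1000."""
    return cuda_cores * 2 * boost_clock_ghz / 1000.0

gtx_1080_ti = fp32_tflops(3584, 1.582)  # ~11.3 TFLOPS
rtx_2080 = fp32_tflops(2944, 1.710)     # ~10.1 TFLOPS

print(f"1080 Ti: {gtx_1080_ti:.1f} TFLOPS, 2080: {rtx_2080:.1f} TFLOPS")
```

By this crude measure the 1080 Ti comes out ahead on raw FP32, which matches the core-count comparison above; whether Tensor Cores flip that for FP16 training is exactly the kind of thing real TensorFlow benchmarks would need to show.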

No Titan V with the tensor cores? :)

haha good one, if only!

In games where everything is basically imaginary (zombies, demons, aliens, etc.), we simply have no reference for what a "real" zombie looks like under "real" lighting. Unless something is horribly wrong, we can't tell the difference.

Even in the abandoned school in the demo, there is only one scene where ray-tracing is discernibly correct: a pitch black room with a window opening to let in sunlight. The non-ray traced version is obviously too bright.

Even in movies, where ray tracing is correct by default, scenes are artificially lit all the time. If you're in a church in the daytime, the architecture sometimes leaves it so dark the camera can't see shit, so you still have to light it somehow.

All the RTX reviews are hitting. I haven’t had a chance for a deep look, but quick impressions…

2080 Ti is a decent bump for conventional rasterized games.
2080 is comparable to a 1080 Ti, and in a few instances… slower!

Since ray tracing is more of a novelty this gen, it seems like you should either go big with the 2080 Ti or stay with the 1000 series.

What about price, though? Shouldn't the 2080 be cheaper than the 1080 Ti if you're getting pretty much the same performance?

Ideally. Whatever ‘should’ happen, the 1080 Ti is available new for about $100 to $150 less than the $800 RTX 2080.

The 2080 has all the RTX real estate to factor in as well, though. And no competition from AMD, so…

I have a 1080 right now and have been wanting to move to 4K, so I was interested in this generation of GPU. Looks like I’ll be waiting for the 30xx series though.

OR you can upgrade this gen and next, and pack your lunch for work for the next 2 months. :p

I agree, and that makes me very hesitant. I play at 3440x1440, and that video appears to show a 2080 Ti choking at 1080p due to the "computational budget" the narrator mentions.

That video also showed me that ray tracing makes a huge visual difference. I definitely want to take advantage of it. But I wonder if that’ll take a 3080ti. I think I’ll stick with my current 1080ti for now.

Hopefully this release will mean a healthy market for 1080p monitors for years to come and slow the onslaught of 1440p and 4K.