I dunno, I find half the scenes look better with RTX off, especially the exteriors of the buildings they show off. I understand it may be more realistic, but I don’t really care about that in my games if it ends up looking worse to me. In practice I would just never notice any of these things, so the premium isn’t worth it.
It’ll take a while for graphics programmers and artists to master, like all shiny new tech.
Maybe ray tracing is a throwback to how graphics hardware worked a couple of decades ago: it used to be pretty common for new hardware to support new features without being fast enough to actually use them. It just let devs start adding support. Nothing could run those features at playable speeds until the next generation of cards… which usually arrived within the next year.
Anyone seen any benchmarks on how the performance of the 2000 series compares with the 1000 series with CUDA, more specifically TensorFlow / deep learning / etc.?
I am putting together the specs for a PC my office is ordering for training deep neural nets. If you compare the 1080 Ti with the 2080, it’s 3584 CUDA cores vs. 2944, more memory (11 GB vs. 8 GB), and similar clock speeds, suggesting the 1080 Ti is the better buy. Still, I wonder if something in the Turing architecture, like the tensor cores, will close the gap for deep learning.
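For what it’s worth, a quick way to sanity-check two cards yourself is to time a big matrix multiply on each. Here’s a minimal sketch (`bench_matmul` is a made-up name; it wraps NumPy on CPU here just so it runs anywhere, but the same callable shape works around a framework’s GPU matmul, e.g. `tf.matmul`):

```python
import time
import numpy as np

def bench_matmul(matmul, n=1024, iters=10):
    """Time a square matmul and return achieved GFLOP/s.

    `matmul` is any callable taking two (n, n) float32 arrays, so the
    same harness can wrap np.matmul on CPU or a framework's GPU matmul.
    """
    a = np.random.rand(n, n).astype(np.float32)
    b = np.random.rand(n, n).astype(np.float32)
    matmul(a, b)                      # warm-up (allocation, lazy init)
    start = time.perf_counter()
    for _ in range(iters):
        matmul(a, b)
    elapsed = time.perf_counter() - start
    flops = 2.0 * n ** 3 * iters      # ~2n^3 FLOPs per (n, n) matmul
    return flops / elapsed / 1e9

# CPU baseline; on each GPU you'd pass the framework's matmul instead.
print(f"{bench_matmul(np.matmul):.1f} GFLOP/s")
```

It’s crude (real training is bound by more than matmul throughput), but if Turing’s tensor cores help, it should show up here when run in FP16.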
In games where everything is basically imaginary, e.g. zombies, demons, aliens, etc., we simply have no reference for what “real” zombies look like under “real” lighting. Unless something is horribly wrong, we won’t notice the difference.
Even in the abandoned school in the demo, there is only one scene where ray-tracing is discernibly correct: a pitch black room with a window opening to let in sunlight. The non-ray traced version is obviously too bright.
Even in movies, where ray tracing is by default correct all the time, we get artificially lit scenes constantly (e.g. in a church in the daytime, the architecture can make it so dark the camera can’t see shit, so you still have to light it somehow).
The 2080 has all the RTX real estate to factor in as well, though. And no competition from AMD, so…
I have a 1080 right now and have been wanting to move to 4K, so I was interested in this generation of GPU. Looks like I’ll be waiting for the 30xx series though.
I agree, and that makes me very hesitant. I play at 3440x1440, and that video appears to show a 2080 Ti choking at 1080p, due to the “computational budget” the narrator mentions.
That video also showed me that ray tracing makes a huge visual difference. I definitely want to take advantage of it. But I wonder if that’ll take a 3080ti. I think I’ll stick with my current 1080ti for now.