I dread to imagine what the performance would be like on GTX cards, given the hit even RTX cards take.
Games with Ray Tracing are perfectly playable on older GTX hardware at 480p.
:)
It just depends on how realistic Crytek's demo was. If that legitimately ran that well on a Vega 56 at 1080p (the resolution of the video), then a GTX 1080 should be able to do similar.
The Vega 56 is closer to a 1070 than a 1080. But yeah.
My first thought when seeing the Crytek demo was "There's no way that's real time!"
But they say it was 30 fps at 1080p. The Beyond3D people have examined the video and noticed a bit of latency when reflections update, and also that some reflections use fewer polygons (RTX real-time demos have used similar tricks, such as making shadows blurrier than they should be). That's not something you'd expect to be left in if they were faking a real-time claim.
Even so, we're probably another hardware generation away from getting such visuals in real games rather than just tech demos.
So Shadow of the Tomb Raider came out with its RTX patch. DLSS actually seems to work really well and offsets a lot of the performance loss from turning ray tracing on, but the only ray-traced feature is better shadows, and while they do look better, the game already had decent faked shadows before.
Also, I guess Nvidia has been working on its own RTX version of Quake 2's renderer, one that goes beyond what that researcher did previously.
They appear to be comparing the CPU rasterizer (i.e., not 3D accelerated) with their version, which isn't exactly fair.
Yeah, I don't know why they did that, but you can still see the reflection and lighting differences, which aren't trivial even compared with OpenGL.
Especially later when they just refer to it as "before" and "after". Unnecessarily disingenuous.
Especially when it makes your game run like shit.
"You can turn on this thing that you probably won't notice, at the low, low cost of 25% of your frames."
So I tried the new Tomb Raider patch on my 2080 Ti. Running at 3440x1440 on "high" settings plus ultra RTX shadows got me an average frame rate of 55 in the benchmark. Turning on DLSS pushed it up to 67. DLSS was impressive in that I could barely notice it. Nice trade-off to have that option with FPS-eating RTX.
I think I could live just fine with 55fps in a single player game like TR.
You could turn DLSS on at ultrawide 1440p? I wonder how they accomplished that; there are several methods, but all with trade-offs, unless they just provided a full DLSS profile for that aspect ratio and resolution. Will have to give it a whirl. That's good news for my 2080.
With adaptive sync, sure. Otherwise no.
I've never understood this. I've been playing PC games for over 20 years, and FPS dips and changes and the like have never been that big of a deal for me, except the screen tearing, especially once I started turning on vsync when that became a thing. I even bought a G-SYNC display a few years back that does 144 fps, and I honest to god can't really (at all?) tell when I'm in the 90s or the low 100s versus being at like 60 or even 40. I mean, I'm sure there must be a difference, but without seeing it side by side I'm not noticing it, which to me means it's not very useful (since I rarely play games side by side).
It used to be that anything over 30 was amazing, and I still don't have a problem with 30 (not less than 30 though, yikes). If you're one of those who are extra sensitive about that, though, that's a bummer and I empathize.
Locked 30 actually isn't bad; many console games run like that due to their slow GPUs. When the framerate isn't locked but the monitor refresh is, movement doesn't feel smooth, it feels stuttery.
Vsync is perfectly smooth if your GPU is fast enough; that's why you liked it.
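To illustrate why locked 30 can feel smoother than an unlocked ~40–60: with double-buffered vsync on a 60 Hz display, each frame can only be shown on a vblank, so render times hovering near the 16.7 ms budget alternate between one-refresh and two-refresh intervals, while a 30 fps cap gives a steady cadence. Here's a minimal, idealized sketch (the `present_intervals` helper is hypothetical, not from any real API):

```python
import math

REFRESH = 1000 / 60  # ms per vblank on a 60 Hz display

def present_intervals(render_times_ms):
    """Quantize each frame's completion to the next vblank
    (idealized double-buffered vsync) and return the intervals
    between successive presented frames, in ms."""
    intervals, last_present = [], 0.0
    for rt in render_times_ms:
        done = last_present + rt                      # frame finishes rendering
        vblank = math.ceil(done / REFRESH) * REFRESH  # wait for the next refresh
        intervals.append(vblank - last_present)
        last_present = vblank
    return intervals

# Render times jittering around the 16.7 ms budget:
# presents alternate between ~16.7 and ~33.3 ms -> visible stutter.
print(present_intervals([15, 18, 15, 18, 15, 18]))

# Locked 30 fps (a frame every other vblank):
# every interval is ~33.3 ms -> even pacing, which reads as smooth.
print(present_intervals([30, 30, 30, 30, 30, 30]))
```

Adaptive sync (G-SYNC/FreeSync) sidesteps this by letting the display refresh when the frame is ready instead of on a fixed clock, which is why unlocked framerates feel fine there.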
Locked 30 is an abomination! What is wrong with you people! :)
Yeah. I've been leaving Fraps open as I play Anthem and I've learned… I don't really care about framerates. Upgrade averted :)
Locked 30 isn't that bad? wth is this blasphemy?