Yes that is my hope as well. I just severely lack optimism when it comes to Nvidia after they bent us over dry and raw with the 20-series.
Not me they didn't. I just went "meh" and skipped an upgrade cycle. Being a patient adult rocks at times.
I skipped the entire generation (and the super refresh) too. Still feel poorly used.
And I bought a 2070 for ~$500.
It does exactly what I want it to do and is a huge upgrade over my 970.
I upgrade infrequently, so I guess I'm just less price sensitive than some. Heck, if it weren't for Eagle Dynamics releasing their Hornet, I'd probably be just fine with the 970 still.
I just don't want the space heater rumours to be true.
I'm running under the assumption that Nvidia will try and raise prices again, and thus I won't be able to get a worthwhile upgrade to my GTX 1080 anytime soon (granted, I bought my 1080 used a few years ago for $350, so that kind of skews what I mean by a reasonable amount).
With that assumption I went ahead and bought a 1440p G-Sync-compatible (officially) 144Hz monitor to replace one of my 4K60 FreeSync monitors. 4K performance really sucked in a number of games (even with things turned down), tearing would happen despite FreeSync, not being able to use borderless windowed mode was annoying, fullscreen at lower resolution always messed up my window layout on my 2nd monitor, etc…
Dying Light finally made me give up, as I was getting 50-ish fps (with massive tearing) at 4K at the middle preset, and trying to lower the resolution caused it to only use a portion of my screen (and not scale up, despite fullscreen being on). On my 1440p at best quality I am always between 100-120fps with no tearing. It's grand.
And with one monitor being 4K and one being 1440p, I can definitely say my eyes can't tell a difference (at 27" at least). Putting applications side by side I can barely tell. Although having two monitors at different resolutions is a bit annoying for Windows reasons…
I'm still sitting on my 970, but it's starting to feel age creeping up on it along with my ancient 2500K. My hope is to be able to afford a new build with this next wave. I was sorely tempted to grab a 2070, but given the CPU would be such an enormous bottleneck, it made sense for me to wait a little more. Now I get to sit here in envy of your awesomeness, but my day will come, dammit! ;)
I saved up a bunch because I want to build a whole new machine. But I might still be short of a 3080 :/
Yup, 2500k and 1070 here.
Really excited to pair zen3 & rtx 3070/80 with a b550 itx.
And a HP reverb G2. Aww yiss!
You are going to need that 3080 to drive that sucker.
3070 will manage; people are happily driving G1 Reverbs using a 2080 or even a 1080, so I don't see why a 3070 shouldn't manage. I'd love to splurge for a 3080, but it's going to be prohibitively expensive.
Recent leaks are saying AMD's next GPU ("Big Navi", or RDNA2) will be 40-50% faster than the 2080 Ti. As usual, grain of salt. The most interesting part is that the leaks came from Nvidia; those are the numbers Nvidia is looking at with Ampere. This is (again, rumors) done via very high power GPUs, >300 watts, from both sides. So they're eking out as much performance as possible by pumping the clocks, voltage, and cooling.
AMD being competitive this cycle should keep Nvidia from raising prices, too.
I think we are approaching a point where raytracing performance is going to be what matters most. It will be interesting to see how the performance overhead compares between the two platforms.
The rumors say Ampere RT performance is monstrously better than Turing too.
I don't think that'll be used for higher framerates; with DLSS 2 a 2080 Ti is perfectly capable of running "4K" with RT active. My feeling is RT image quality will increase, and it'll be used for more effects, perhaps even global illumination.
Yeah, exactly. Assuming for the moment that the Ampere rumors are true, I wonder if there will be larger overhead on the AMD side, since Nvidia has been focused on ray tracing for longer. Then again, AMD seems to be killing it recently.
Most importantly, if AMD is more competitive, it might make prices more reasonable this gen.
Just saw the DLSS video you posted in the Death Stranding thread. That's incredible.
Oh really… goes to look…
Goodness :/ That's quite something. Surely a best case, though impressive in any case.