KevinC
4131
If you’re referring to raytracing I don’t think this is the case at all, it’s just in early adopter territory which I think everyone recognized would be an issue when it launched. Given that both consoles are going to launch with raytracing support I don’t think it is a bust at all, there’s just not a lot of games that use it yet.
Control looks amazing with RTX-on.
Can’t wait to see the feature get across the board adoption once the new consoles come out.
rei
4133
By “bust” I’m referring to the low framerates and the minuscule number of titles with support after ~2 years.
stusser
4134
Raytracing is obviously the way graphics will be rendered in the future, for two reasons. First, it’s correct rather than an approximation, so it looks “right”. And second, the product is less work to produce: there’s no need to worry about trickery or smoke and mirrors when you can just simulate the real thing.
Right now we’re in a transition period, like the move from Duke3D’s 2.5D fakery to Quake’s real 3D. The first-gen RT hardware is far too slow to really do it, even on the 2080ti, so rather than full-scene lighting they’re making all sorts of compromises: limiting the number of rays, resolution, and framerates, or only using it for reflections or shadows, etc. So it still looks better than outright faked or prerendered/static lighting, but it requires a fair amount of optimization and work to produce that result. But that won’t be the case for long! Soon enough every game will work like Quake II RTX, with everything ray-traced.
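To make the “simulate the real thing” point concrete: at its heart, a ray tracer is just an intersection test fired per ray, with shading derived from what the ray hits. Here’s a toy sketch of the core test (not from any shipping engine — names and the scene are made up for illustration):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t, assuming
    `direction` is normalized (so the quadratic's leading coefficient is 1).
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2  # nearer of the two roots
    return t if t > 0 else None

# A ray fired straight down the z-axis at a unit sphere 5 units away.
hit = ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
print(hit)  # nearest intersection at t = 4.0
```

A real engine does this (via BVH-accelerated hardware units) millions of times per frame, plus bounces for reflections and shadows — which is exactly why current cards have to ration rays so aggressively.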
That’s simply how games will work, soon. On high-end PC in probably 3-4 years, on consoles maybe 6-9.
I’ve been planning a spring build for months, and while I know there’s always something better around the corner, this has me seriously considering waiting.
stusser
4136
Probably looking at fall for Nvidia 30-series, so that’s a bit of a wait there.
vyshka
4137
Yeah, it will be interesting to see how close they get to the upper end of “up to 50%”.
Re: naming convention. Are they seriously going to the 30XX names now? I thought they went to 20XX to indicate that they have ray tracing, making it a generational leap. So the next cards in that generation should be 21XX, right? And then 30XX should come with the next generational leap?
stusser
4139
Well who knows? But probably, yes. 21xx would signify a smaller generational change, which would logically be worth less money, right? Nvidia is almost certainly going to raise prices, given their obscene greed with the 20-series.
I know, but this feels like a more significant upgrade than last year. My rig is already going on 6 years old. I hoped to have it in time to play Cyberpunk, but I am patient. If I’d managed to get some good deals on components I had picked out during the holidays, I’d feel differently about it.
vyshka
4141
It’s getting close to time to upgrade my system (minus my RTX 2080 GPU, anyway). I just read a report yesterday that NAND flash prices are going to spike due to a power outage at Samsung, so I’m thinking of grabbing an SSD now. I think my next system will be using an AMD CPU.
Fully ray traced modern games will take much longer than that to happen. It will be multiple more generations of GPUs before engines can be done entirely with RT, which requires obscene amounts of calculation.
stusser
4143
Nah, I think 4 years, 2 full generations, will pretty much do it at the high-end. Which, by then, may mean a $2k GPU.
My guess is that we will see approximate/lower-res ray tracing coupled with AI-driven image enhancement (a much more advanced version of DLSS).
I can see something like that being viable in 4 years. Full ray tracing at 4K maybe too, but I dunno — the performance hit might not be worth it.
Also, in 4-6 years I think we will see GPUs close to the $4k price point, but not for consumer purchase — for premium streaming services.
What do the soothsayers say about 1080p RTX, then? Will that be priced low enough for mere mortals, or at least not cost multiples of my monthly salary? :D
schurem
4146
All I want out of my 30x0 is better VR performance.
Had major FPS problems recently (on Nvidia).
Turns out that clocking my mouse down from a 1000 Hz to a 500 Hz polling rate fixed the FPS issue.
Basically went from 13-25 fps in Destiny 2 (at 1000 Hz) to 144 fps (at 500 Hz).
Just an FYI — had the same FPS issue in Warframe.
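For anyone who wants to try the same experiment: on Windows the polling rate is usually changed in the mouse vendor’s own software (Logitech G HUB, Razer Synapse, etc.), but on Linux it can be done with the kernel’s `usbhid` module. A sketch, assuming a USB mouse handled by `usbhid` (the value is the polling interval in milliseconds, so 1 = 1000 Hz and 2 = 500 Hz):

```shell
# Check the current USB mouse polling interval (0 means "device default"):
cat /sys/module/usbhid/parameters/mousepoll

# Set 500 Hz persistently by adding a kernel boot parameter
# (e.g. in your bootloader config): usbhid.mousepoll=2
```

If `usbhid` is built as a loadable module rather than into the kernel, reloading it with `modprobe usbhid mousepoll=2` also works for the current session.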
For the GPU discussion:
I do want an RTX card, but my impression is that unless you get the 2080, there’s not enough performance for it to work as it should in any game? Wonder when Nvidia will roll out 2nd-gen RTX cards; hopefully they will be better suited for this.
rei
4148
Outside of the RT features, the 2000 series has better 2D scaling (potentially useful for emulators) and better hardware video transcoding performance, I believe. Not worth it for me personally at the high cost. Maybe when the 4000 series comes out and the 3000 series gets a price cut.
Daagar
4149
Holy cow, how’d you even make that connection? I have apparently always had my mouse set to 1000hz, but now I’m going to have to play around with 500hz just to see if it does anything (granted, my frame rate isn’t in the toilet like yours was).
FWIW, I have a 2060 in my laptop and I had no issues playing Control (at 1080p admittedly) with most RTX settings on. I’m not a framerate purist though, so it was mostly around 45-50.