When do the next generation GPUs drop?

I think it was AMD’s epic failures that allowed Nvidia to get away with it. They are so far ahead of AMD at the moment they could afford to make the jump now.

That could only be true if Nvidia had insight into AMD’s dev process, if they knew that AMD wouldn’t get Navi out in 2018/2019, if they knew that Vega wouldn’t be competitive with Pascal, all in advance. Short of corporate skullduggery, that isn’t the case.

Nvidia made a severe miscalculation and were simply lucky that AMD couldn’t take advantage.

Gotta remember too the massive difference in company size. Nvidia is what, like 5x AMD and then Intel is another 10x off that?

Did AMD not publicly choose not to compete on performance, instead focusing on low cost and low power? Why would skullduggery be needed? AMD still hasn’t beaten the 1080, have they?

Back at the RX 480 release in 2016 they were set on owning the $200 price point, and did so very well. That was a hell of a GPU. Until cryptocurrency mining took over, anyway.

Vega was supposed to be their enthusiast flagship, and Vega 64 competed with the GTX 1080 at the same $499 price point while consuming a ton more power. It was another loser in a long line of AMD GPU losers, where they push power draw and heat to the limit to compete with Nvidia’s cool-running GPUs.

Navi is their next generation flagship GPU, and it was supposed to release in 2018 but 7nm production issues broke that. It’ll come out, well, probably sometime in 2019. Whenever TSMC has enough capacity after servicing Apple.

I would love to see AMD get back up there. I used them for a number of years until the 1070 came out.

Edit: oops, this wasn’t meant to be a direct reply.

Nvidia still needs to move units, regardless of whether AMD can compete. I was all set to upgrade, but I’m not going to bother: the performance isn’t much of an improvement and the pricing is absurd. And ray tracing? Who gives a shit? Maybe it’ll be something cool five years from now.

I can’t imagine I’m the only Nvidia owner who opted out of an upgrade but I am curious if it impacts their bottom line at all.

I imagine a lot of people opted out. It is very expensive. If I’d bought a 1080 last time instead of the 1070 I probably wouldn’t have upgraded, but that, plus wanting to play around with the tensor cores, made me jump. I don’t know if I’ll ever get to see the ray tracing in action. Windows still doesn’t want to install the 1809 update that will enable DXR.

I think part of what is happening as well is that Moore’s Law is running into economic laws: the cost of developing for these modern cards may be so high that it’s hard to scale up, since so few things need all that power.

Especially when games of that budget would also have to have console ports.

Something doesn’t add up with those BF V 2060 “leaked” numbers or the settings they are using. The 2080 barely holds a solid 60 FPS in BF V at 1080p with RT on and high settings; how can this thing do it too?

They’re playing a BS marketing game here.

I think recent BFV patches and driver updates gave quite large ray tracing performance gains.

They did, yes. They can simply generate fewer rays, lowering the resolution of reflections and occlusion and greatly improving performance.
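If I had to sketch the idea in code, it’d be something like this: trace reflection rays at a reduced resolution and fill in the skipped pixels afterwards. This is just my own minimal C++ illustration, nothing from DICE’s actual renderer, and all the names and numbers are made up:

```cpp
#include <cstdio>
#include <vector>

struct Color { float r, g, b; };

// Stand-in for an expensive per-pixel reflection trace.
Color traceReflectionRay(int x, int y) {
    return { x * 0.001f, y * 0.001f, 0.5f };  // placeholder result
}

int main() {
    const int width = 16, height = 8;
    const int stride = 2;  // trace every 2nd pixel in x and y => 1/4 the rays

    std::vector<Color> reflections(width * height);

    // Pass 1: shoot rays only at the reduced resolution.
    for (int y = 0; y < height; y += stride)
        for (int x = 0; x < width; x += stride)
            reflections[y * width + x] = traceReflectionRay(x, y);

    // Pass 2: fill skipped pixels from the nearest traced neighbor.
    // (A real renderer would do a smarter, edge-aware upsample.)
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
            reflections[y * width + x] =
                reflections[(y / stride) * stride * width + (x / stride) * stride];

    std::printf("traced %d of %d pixels\n",
                (width / stride) * (height / stride), width * height);
}
```

Quartering the ray count like that is why the quality hit shows up as lower-resolution reflections rather than missing ones.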

So what does that do to quality? I personally haven’t been that impressed by how much RT adds in a game like BF V, at least relative to the cost of doing it.

I haven’t seen any articles exploring it deeply, but I assume it reduces the resolution of shadows, reflections, occlusion, etc.

Here’s a review of the impact of the update. I think the promised Tomb Raider update never happened, but Metro Exodus is supposed to have a more fully fledged RTX implementation in February, with lighting rather than just reflections, I think.

I’m sure that’s part of it, but if they didn’t spend so much die space on the dedicated ray tracing hardware, I think we’d see noticeably higher performance, cheaper prices, or both.

A videocard manufacturer? I’m shocked, shocked I tell you.

Is raytracing something likely to be in next-gen consoles? If not, I’d prob just grab a 1080 for my next PC. I tend to base what I get around console games.

Not while it’s still a high-cost, Nvidia-only solution. Though DLSS might conceivably be enough of a draw to get the platform holders to switch from AMD.

My understanding of the information I saw is that they only use ray tracing for materials actually set as reflective. They fixed performance by optimizing the algorithm that detects which pixels on screen need a ray shot out from them. The newer algorithm is better at detecting which areas of the screen are guaranteed not to need ray tracing, so fewer rays are shot and performance improves without a loss in quality.
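Something like this is how I picture that culling pass working, purely as an illustration; the struct, thresholds, and function here are all made up by me, not anything from BF V:

```cpp
#include <cstdio>
#include <vector>

// A fake G-buffer entry; real engines store much more per pixel.
struct GBufferPixel {
    float roughness;  // 0 = mirror-like, 1 = fully diffuse
    float specular;   // reflectance strength
};

// Heuristic cull: if a surface is too rough or too non-specular to show a
// visible reflection, don't shoot a ray for it at all.
bool needsReflectionRay(const GBufferPixel& p) {
    const float kMaxRoughness = 0.6f;   // illustrative tuning values
    const float kMinSpecular  = 0.05f;
    return p.roughness < kMaxRoughness && p.specular > kMinSpecular;
}

int main() {
    // A handful of fake pixels: matte wall, polished floor, and so on.
    std::vector<GBufferPixel> gbuffer = {
        {0.90f, 0.02f}, {0.10f, 0.80f}, {0.50f, 0.30f}, {0.95f, 0.50f},
    };

    int rays = 0;
    for (const auto& p : gbuffer)
        if (needsReflectionRay(p)) ++rays;

    std::printf("shooting reflection rays for %d of %zu pixels\n",
                rays, gbuffer.size());
}
```

The point being: rays skipped this way were never going to contribute a visible reflection anyway, which is why it’s close to a free performance win.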