When do the next generation GPUs drop?

Could be true about ray tracing, but from what I recall from some reviews, DLSS takes almost no effort to implement. Not sure what that means in practice though; maybe just a few person-weeks of software engineer time for a major AAA release? So it could be OK.

You still need the RTX hardware to use it, though. So if the user base is negligible, even if it’s relatively cost-free for developers, they still may not bother.

Last I read, Nvidia has pretty much cornered the gaming GPU market, with AMD’s adoption rate hovering near 15% vs. Nvidia’s ~80%. Nvidia would need to lose BIG in the next gen before AMD could catch up.

The question is how much developer integration ray tracing actually needs. For example, if Unity and Unreal Engine implement ray tracing at the proper level of the engine, individual developers may not have to do much integration work at all; they’d get it for free by hooking into existing light sources.
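
To make that concrete, here’s a toy sketch of what “implementing it at the proper level” could look like; every name here is hypothetical, not Unity’s or Unreal’s actual API. The game only declares lights, and the engine picks a ray-traced or rasterized path per light depending on the hardware:

```python
# Hypothetical sketch of engine-level ray tracing integration.
# Game code only declares lights; the engine decides how to render them.
from dataclasses import dataclass

@dataclass
class Light:
    position: tuple[float, float, float]
    intensity: float

class Renderer:
    def __init__(self, hardware_supports_rt: bool):
        self.rt_enabled = hardware_supports_rt

    def render_lighting(self, lights: list[Light]) -> None:
        for light in lights:
            if self.rt_enabled:
                self._trace_rays(light)   # hardware RT path
            else:
                self._rasterize(light)    # classic shadow-map fallback

    def _trace_rays(self, light: Light) -> None:
        print(f"ray-traced shadows from light at {light.position}")

    def _rasterize(self, light: Light) -> None:
        print(f"shadow-mapped light at {light.position}")

# The game never mentions RTX; the same scene works on any hardware.
scene = [Light((0.0, 5.0, 0.0), 1.0), Light((3.0, 2.0, -1.0), 0.5)]
Renderer(hardware_supports_rt=True).render_lighting(scene)
```

If engines ship something like that, the per-game cost of RTX support really could be close to zero.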

DLSS will (presumably) require some effort from every individual dev to deliver data to Nvidia for processing, though.

That is the question. But it’s hard to believe it truly will be “free.” And if that’s the case, any dev who goes to their exec producer and asks if they can spend time wiring up RTX effects for the 3,500 people with 2080 cards will be promptly told to go back to their desk and fix bugs. If their EP is smart, that is.

This possibly will change over time, but I’d say that is a 5-year horizon, at least.

I’m not sure this is even comparable to DX12. Perhaps the best comparison is PhysX. It hurts to think about, but proprietary standards that require specific hardware to work are bad ideas.

My guess is that this generation will be fairly successful despite the high prices, just because Nvidia is pretty far ahead, plus it’s been a while since the 10-series release. There are a lot of 9-series owners like me who skipped the 10 series and are ready to upgrade.

I checked the Steam hardware survey, and Nvidia holds roughly the top 15 spots; even Intel has two entries above the top AMD card (the R7). The market is also dominated by mid-range cards: the 1060, the 1050 Ti, the 1050, and the 960.

So I guess the answer to “will DLSS be adopted?” is the same as the answer to “will the 2060 and 2050 have DLSS?”

DX12 is great technology. All that stuff Brad wrote is pretty much spot-on.

The problem is that 10-series and earlier Nvidia GPUs don’t perform well in DX12, and Nvidia has a monster market-share advantage over AMD: Nvidia 74%, AMD 15%, Intel 11%. AMD barely beats Intel. For gaming.

Additionally, Steam shows only a 60% share for DX12-capable GPUs on Windows 10. So if you target DX12, 40% of your users can’t turn it on. And of the 60% who can, 74% are on Nvidia GPUs, where it performs like crap. Do the numbers: target DX12 right now and only about 16% of gamers (the non-Nvidia quarter of that 60%) will actually benefit from it.
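
Spelled out, the arithmetic behind that 16% (a quick sketch using the survey shares quoted above):

```python
# The "16%" back-of-envelope from the Steam survey figures above.
dx12_capable = 0.60   # share of users with a DX12-capable GPU on Windows 10
nvidia_share = 0.74   # of those, fraction on Nvidia (weak DX12 performance)

real_beneficiaries = dx12_capable * (1 - nvidia_share)
print(f"{real_beneficiaries:.0%} of gamers actually benefit from DX12")  # 16%
```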

https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam

Does the 20-series do any better with DX12, or is the data just not available yet?

Nope. Ashes is the classic DX12 benchmark, and the 20-series performs about where you’d expect relative to the 10s; the 2080 actually comes in a little below the 1080 Ti. Pay special attention to the 1080p graphs: the game should be CPU-bound at that resolution, which is exactly where DX12 is supposed to offer the biggest improvement. It doesn’t.

https://www.anandtech.com/show/13346/the-nvidia-geforce-rtx-2080-ti-and-2080-founders-edition-review/8

Now, that doesn’t hold for other games, where the 20-series performs much better relative to the 10s: Vulkan with the id Tech engine, for example, and other games that are technically DX12 but don’t use it to the extent that Ashes does.

Oh, I’m not saying Brad was wrong about the technical side of things. But if games don’t use DirectX 12, then my shiny DirectX 12 card (or a better-performing DirectX 12 card, like AMD’s over Nvidia’s) isn’t going to be used to its full potential often, if at all. So why bother with a better-performing DirectX 12 card, especially if I’m in the mainstream, where price matters as much as performance?

That said, I do see DirectX 12 adoption growing gradually. Last I checked, FIFA 19 supports DirectX 12, but I don’t know whether there’s a noticeable performance gain over DirectX 11, when in theory there should be plenty. After, what, 3+ years?

Replace “DirectX 12” with “RTX” and you get my argument.

Once RTX is in people’s hardware, it isn’t going to go away. Sure. But how long do we have to wait for game support after the initial hype dies down? Hardware sales figures will come into play if we want to see a large volume of second-wave titles, and at RTX price points it’s hard to see wide adoption, even with pent-up demand after Nvidia’s last refresh. So games will have to be lit the old-fashioned way for some time to come anyway, given that there are soooo many older/AMD cards out there. There is absolutely no hurry for publishers to throw money at RTX titles, other than as a low-cost experiment to hedge their bets.

Nvidia has also sold us quite a few hardware white elephants over the years. @Menzo mentioned PhysX, and Havok can do similar things without dedicated Nvidia hardware. And 3D Vision.

Yep, you’re correct in every particular. Ray tracing won’t take off until GPUs that can do it are reasonably priced and have a reasonable market share. Until then, only games that Nvidia pays off will support it.

Pic of RTX 2080 (grey) and GTX 1080 (orange)

So the RTX 2080 is installed and… glowy pixels all over my 4K screen.

I remember this happening once before! My HDMI cable must be going bad!

I grab an old one from a box in the basement to test the theory, and it’s even worse, lol: glowy pixels everywhere, like stars twinkling. I drop my 4K screen down to 2K and they go away.

Time for a new HDMI cable capable of 18 Gbps or more.
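
For the curious, that 18 Gbps number falls straight out of the standard 4K60 signal timing; a quick back-of-the-envelope, assuming 8-bit RGB over HDMI 2.0’s TMDS link:

```python
# Why 4K60 needs an "18 Gbps" HDMI cable: the standard CTA-861 timing for
# 3840x2160 @ 60 Hz has a total raster of 4400x2250 including blanking.
h_total, v_total, refresh_hz = 4400, 2250, 60
pixel_clock_hz = h_total * v_total * refresh_hz   # 594 MHz

# TMDS sends 3 channels (RGB) at 10 bits each per pixel
# (8 data bits encoded into 10 bits on the wire).
bits_per_pixel_on_wire = 3 * 10
bandwidth_gbps = pixel_clock_hz * bits_per_pixel_on_wire / 1e9

print(f"pixel clock: {pixel_clock_hz / 1e6:.0f} MHz")   # 594 MHz
print(f"TMDS bandwidth: {bandwidth_gbps:.2f} Gbps")     # ~17.82 Gbps
```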

@Editer did you use a new cable? Or rather, what cable are you using?

I have to use HDMI to my 4K TV.

4K/60 Hz-capable HDMI cables can be a crapshoot. I’ve bought multiple brands specifically advertised for that which still didn’t work. There are lots of fancy-looking ones on Amazon, so it’s hard to know what will actually be reliable. I had the best luck with Monoprice’s 18 Gbps options.

CNET seems to really like Amazon cables.

Sorry for going off GPU topic. ;)

I went with a 3-pack of Amazon cables.

I’ve had a few of those fail. The biggest problem is… Amazon changes manufacturers. They have the same problem with their batteries: one year you get batteries that are right up there with Eneloops, and the next year you buy them and they don’t last past a couple of years.

Yeah, that’s why I ordered the 3-pack, to be safe!

I’m actually using a DisplayPort cable. But you definitely need a high-speed cable for 4K. When I upgraded my TV I had to replace a bunch of otherwise perfectly good HDMI cables.

World of Warcraft lets you choose DX11 or 12. DX12 is a ~15% performance hit.

Be a bro and give your gf a GPU upgrade to play with while you’re out.

At the prices cards are going for now, it’s cheaper to hire an actual artist to paint your game as you play.