When do the next generation GPUs drop?

Well you are probably right, seeing as how selling a dedicated RTX card for what like 2 games that currently support its features 4 months later isn’t very impressive.

Someone should ask him in an interview how shareholders feel about the rollout of the rtx 20 series. :)

Yeah, there’s always that catch 22 with new tech. People won’t spend the resources to adopt it unless there’s enough of a market, but the market won’t grow until there’s enough of a reason to spend the money. Like you say, I’m not going to drop money on a dedicated card when there’s like 2 games that use it and no idea if there will be more.

I think by forcing it onto their video cards, all the people that are looking to upgrade or buy a new computer are going to be part of that market/install base. Now developers can look at all the people with 20-series cards and say yeah, let's do some ray tracing.

If AMD had a competitive product they would be eating Nvidia’s lunch, because the performance gains were minimal and the prices outrageous. But alas.

Well it is only 1 more year until Intel unveils their gpu, right? RIGHT?

RTX is killing Apple too!!! (my point is the whole market is down, tech in particular, so I don’t know if that drop in stock price can be attributed to RTX as opposed to coming off of the crypto high and overall bearish market)

Red Hat suffered no such fate.

Of course it was on that trend as well until IBM announced they were buying us.

Haha, yeah. I mean, I hope the RTX series is a dud for Nvidia, because good lord is it a stinker at those prices. I just don’t know if that’s bearing out in the chart or if it’s primarily other factors right now.

Just a thought… Is it possible that unit sales in general are down significantly because the current console generation slowed graphical advances to such an extent that a 4-year-old 970 is still holding its own?

Perhaps desktop gaming PCs (& their gfx cards) are not as prevalent in the modern era of consoles, laptops and tablets, thus each gfx card needs to be pricier to recoup its development costs?

I don’t think so. I think for both Nvidia and Apple 2018 was a “cash in” year where they tried to leverage their leads, respectively. It didn’t work for Apple, and I have a feeling the higher prices are going to hurt Nvidia as well.

Yes, I agree. Nvidia saw that miners were willing to buy their GPUs at faaaarrrrrrrrr higher prices, and figured they could just increase MSRPs to match. But the miners have gone the way of all flesh, and gamers aren’t willing to kick in twelve hundred freaking dollars for the only card that’s actually faster than the previous generation’s $699 card.

I don’t know why an entirely new gen of video cards launches before a console refresh. The console refresh basically drives the bottom of what a desktop video card should be, because if you go multiplatform with your game, you have to at least run reasonably well on the lowest common denominator, which is console hardware. That in turn sets the bottom for the desktop-equivalent video card. If you get a better desktop video card, you’re guaranteed a better experience than on console.

It is games that drive hardware, not the other way around. If you’ve got lots of games for the hardware, the hardware will sell. Just look at VR and the Switch. The Switch has so many games people want, and that drives hardware sales. OTOH VR still doesn’t have any compelling games (porn may be one, but I kind of doubt they want to make it a selling point…), and hence VR still hasn’t gone mainstream.

Also look at how many multiplatform games still use DirectX 11 only, and play well on relatively modest hardware that performs the same as or better than the console equivalents, i.e. the 1060 and the 480/580s. That bottom was set by the console video card, and its modest price.

So as long as the new consoles from Sony/MS aren’t supporting ray tracing, then frankly the bottom in desktop will not move to ray tracing hardware (unless the price is so cheap that buying a ray tracing card is a no brainer).

Right now I’m on RX480, and I’m in no hurry to upgrade from 1080p gaming, given the next upgrade is a hefty 1440p or above monitor AND a RTX 2070 or above video card.

There’s going to be a divergence here. Next gen consoles will push 4K 60Hz gaming, not raytracing. Theoretically Nvidia would be out of position with their current offering, but they have no competition.

But don’t forget that next gen won’t launch until holiday 2020 at the earliest, so there’s time for Nvidia to extract full price before they have to react.

They could assume that they’ll launch the 3080 in late 2020 and the consumer-level, true-4K 3060 for $250 in spring 2021, just as next gen is ramping up.

Remember, AMD is making a custom chip for the PS5. If it doesn’t do raytracing, there’s going to be a massive install base that won’t support it.

I was reading an article the other day that there aren’t really any ray-tracing-specific chips on the card. Volta cards (the Titan V) are actually capable of running Battlefield V with ray tracing on (though at about half the performance of the RTX cards). Part of that is because BFV uses DXR and not full-on RTX, but mostly it’s because the RTX cards just have cores (I think the tensor cores?) that help with Bounding Volume Hierarchy calculations, which can be used to speed up ray tracing calculations.
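
For anyone curious what those “Bounding Volume Hierarchy calculations” actually look like, here’s a rough sketch in plain C++ (the names and structure are mine, nothing vendor-specific): a BVH is just a tree of bounding boxes, and tracing a ray means walking the tree and skipping every subtree whose box the ray misses. This is the traversal work the dedicated cores accelerate; without them the same thing runs as ordinary compute, just slower.

    #include <vector>
    #include <algorithm>

    struct AABB {
        float lo[3];   // min corner of the box
        float hi[3];   // max corner of the box
    };

    struct BVHNode {
        AABB box;
        int left  = -1;                  // child node indices, -1 = no child
        int right = -1;
        int triFirst = -1, triCount = 0; // leaf: a range of triangles to test
    };

    // Slab test: does the ray (origin o, direction d) cross the box within [0, tMax]?
    // (Naive version; doesn't special-case axis-parallel rays.)
    inline bool hitBox(const AABB& b, const float o[3], const float d[3], float tMax) {
        float t0 = 0.0f, t1 = tMax;
        for (int a = 0; a < 3; ++a) {
            float inv   = 1.0f / d[a];
            float tNear = (b.lo[a] - o[a]) * inv;
            float tFar  = (b.hi[a] - o[a]) * inv;
            if (tNear > tFar) std::swap(tNear, tFar);
            t0 = std::max(t0, tNear);
            t1 = std::min(t1, tFar);
            if (t0 > t1) return false;   // ray misses this box entirely
        }
        return true;
    }

    // Walk the tree depth-first, only descending into boxes the ray actually hits.
    // `testTriangles` stands in for whatever per-triangle intersection runs at a leaf.
    template <typename LeafFn>
    void traverseBVH(const std::vector<BVHNode>& nodes, const float o[3],
                     const float d[3], float tMax, LeafFn testTriangles) {
        if (nodes.empty()) return;
        std::vector<int> stack{0};       // node 0 is the root
        while (!stack.empty()) {
            int idx = stack.back(); stack.pop_back();
            const BVHNode& n = nodes[idx];
            if (!hitBox(n.box, o, d, tMax)) continue;    // prune the whole subtree
            if (n.triCount > 0) {
                testTriangles(n.triFirst, n.triCount);   // leaf node: test its triangles
            } else {
                if (n.left  >= 0) stack.push_back(n.left);
                if (n.right >= 0) stack.push_back(n.right);
            }
        }
    }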

They also make the custom chip for the Xbox. Neither will support ray tracing, I’d bet lots of money on it.

They are going to be focused on 4K @ 60Hz and they need all the silicon they can get.

“Support ray tracing” is a pretty nebulous statement. Devs can make ray tracing stuff on PS4 level hardware already. What RTX adds is fixed function accelerators for certain workloads like BVH (RT cores) and denoising (Tensor cores). You can build DXR drivers without hardware acceleration for either and there are presumably ways to improve your programmable compute hardware to a point where the fixed function hardware is not entirely desirable.
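
To make the “DXR is a driver capability, not a has-RT-cores flag” point concrete: from an app’s side you just ask the D3D12 device which ray tracing tier the driver reports. Rough C++ sketch (the helper name is mine); a driver can report TIER_1_0 whether the BVH and denoising work lands on dedicated units or on plain compute shaders:

    #include <windows.h>
    #include <d3d12.h>

    // Ask the driver whether it exposes DXR at all. This says nothing about
    // *how* it implements it (RT cores, tensor cores, compute shaders, ...).
    bool driverExposesDXR(ID3D12Device* device) {
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
        if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                               &opts5, sizeof(opts5)))) {
            return false;   // runtime/driver too old to even answer the question
        }
        return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
    }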

The 2070 may be a bad deal, but I don’t think the above is true. Based on the AnandTech benchmarks, the difference between the 1070 Ti and the 1080 is much smaller than the difference between the 2060 and the 2070. Also compare the 2070 against the 2080. High price aside, the cost difference ratio seems to match the frame rate difference ratio for the three 20xx cards.

The thing that’s got me scratching my head is how close the Vega 64 is to the 2070 in those tests. If that is true, the Vega 64 seems a better deal than expected at $399.

My main complaint with AMD/ATI cards is they’ve always run hotter, louder and consumed more power compared to NVIDIA who keep shrinking their PCB sizes and are nigh noiseless at idle.

The 1070 Ti was so close to the 1080 that Nvidia stopped AIB partners from selling pre-overclocked cards because they were afraid it would cannibalize 1080 sales. The Vega 64 has always been essentially a hotter-running 1080, and the Vega 56 is the same relative to the non-Ti 1070.

I’m looking to upgrade from my 970 now that I have a 1440p G-Sync monitor. My budget is under $400, so I’m thinking the 2060. I was looking at getting an EVGA 1070 Ti from their eBay store or Amazon because I have some gift cards, but the stock seems to be dwindling. The benchmarks between the two seem pretty close. My question is: will the 6 GB of VRAM on the 2060 be an issue?