When do the next generation GPUs drop?

Same. I don’t want to pay more than $300 for a video card. The 1060 I got was perfect price/performance for me.

Maybe their decisions on pricing here have a lot to do with the use of 10 series cards for mining? Perhaps they overestimated demand/willingness to pay after watching cards selling for well above retail?

Well, Apple did it because they were selling (relatively) fewer phones than before. At this point, everyone in their markets who wanted a smartphone had a smartphone. Even China was at saturation. So how do you grow revenues when you’re moving fewer phones?

Raise prices.

I think that’s probably true, yes.

If you want to buy a video card for $300, Nvidia will sell you one. The GTX1660 will probably come in at $299 and offer 1070 non-ti performance levels.

They just got caught with their pants down by the collapse of crypto. The 2080-series cards are monster crypto cards, ~50%-200% faster than the 1080s at mining, compared to their very modest increases in gaming performance. That’s why they launched at these prices: the intent was to “say” they were making gaming cards but really sell them to the crypto guys at a huge increase in profit. Why, exactly, they seem afraid to market their cards as mining cards first must be some kind of branding issue.

I can’t prove this (I don’t have the technical chops to understand the underlying architecture that well), but I very strongly suspect that ray tracing was a ‘pivot’ technology they turned to in order to squeak “something” out of the new cards: the underlying architecture for mining and for ray tracing is similar enough that they quickly shifted marketing gears to ray tracing once crypto collapsed and the architecture they actually had on hand wasn’t really an improvement in gaming.

It’s also why I would guess that ray tracing is going to end up like PhysX cards: a “marketing” technology for a couple of years that then fades away, either absorbed into the background by computationally cheaper, similar techniques that are platform-agnostic, or just fizzling out once the need to push ray tracing diminishes a couple of generations from now, when the cards aren’t being optimized for crypto anymore.

They use shaders for cryptomining, not dedicated ray-tracing cores, so I don’t see how that could be true.

Yeah, if they wanted to make a card primarily for mining they would have just loaded it up with CUDA cores, right? That would have had a direct impact on raw gaming performance, too.
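
Just to put a picture on “loaded it up with CUDA cores”: mining is an embarrassingly parallel hash search, so it scales with plain shader/CUDA core count and never touches RT or tensor hardware. A toy sketch of the shape of the workload (the hash here is a made-up mixer, not any real mining algorithm):

```cuda
#include <cstdint>
#include <cstdio>

// Made-up stand-in for a real mining hash (Ethash, SHA-256, ...).
__device__ uint64_t toy_hash(uint64_t nonce) {
    uint64_t h = nonce * 0x9E3779B97F4A7C15ULL;  // splitmix64-style mixing
    h ^= h >> 31;
    h *= 0xBF58476D1CE4E5B9ULL;
    h ^= h >> 27;
    return h;
}

// Each thread tests one nonce; the lowest qualifying nonce wins. It's all
// plain integer work on the general-purpose CUDA cores.
__global__ void search(uint64_t start, uint64_t target, uint64_t* found) {
    uint64_t nonce = start + (uint64_t)blockIdx.x * blockDim.x + threadIdx.x;
    if (toy_hash(nonce) < target)
        atomicMin((unsigned long long*)found, (unsigned long long)nonce);
}

int main() {
    uint64_t* found;
    cudaMallocManaged(&found, sizeof(uint64_t));
    *found = ~0ULL;  // sentinel: unchanged if no nonce qualifies
    // ~1M nonces per launch; the target (2^46 of 2^64) gives ~4 expected hits.
    search<<<4096, 256>>>(0, 1ULL << 46, found);
    cudaDeviceSynchronize();
    printf("winning nonce: %llu\n", (unsigned long long)*found);
    cudaFree(found);
    return 0;
}
```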

I don’t think you can come up with RTX on a whim just to try to squeak something out of a launch. They must have been working on that for years. I think they were/are looking for something where they have a demonstrable advantage and RTX makes sense for that, but they failed on the pricing.

At $200 less at every tier of the 2XXX launch, I think they’d have sold way more product and thus have a better story to tell developers about supporting RTX.

Right, the ray-tracing stuff wasn’t part of a last-minute plan to capitalize on cryptomining, Nvidia really was working on it for a long time.

$200 lower would still be a $999 MSRP for the RTX2080ti. That’s way, way too expensive for a gaming card. Nvidia can get away with calling it a Titan, because that’s not really targeting gamers.

From my understanding, the added circuitry is less RTX-specific and more machine-learning-specific, so I think they’re trying to capitalize more on the AI hype than the RTX hype.

Yeah, you’re right. If I were them, I’d aim for price points at:

$199 - low-end card (XX50)
$299 - average gamer card - push for volume (XX60)
$399 - high-end card (XX70)
$499 - luxury gaming card (XX80)
$799 - ludicrous card to milk money from rich gamers (XX80ti)
$999 - ultimate performance (Titan)

That would still be a bit high at the 80ti tier, about 14% above the 10-series ($799 vs. the GTX1080ti’s $699 launch MSRP), but it wouldn’t be complete insanity. Here’s how the 20-series should have been positioned to be in line with previous generations:

$999 - Titan T, 30% better performance than the RTX2080
$699 - RTX2080ti, 30% better performance than the RTX2080 (but released ~12 months later)
$499 - RTX2080, 20% better performance than the RTX2070
$399 - RTX2070, same performance as a GTX1080ti
$299 - RTX2060, same performance as a GTX1070 or GTX1070ti
$199 - GTX2050ti, same performance as a GTX1060
$149 - GTX2050, same performance as a GTX1050ti

If they had released this stack at those prices, which are completely in line with previous generations, it would have been very successful.

With no console launch driving software to generate hardware demand, it’s very possible this gen was always going to be a soft one for both Nvidia and AMD. RTX may well be performing pretty much as expected while accomplishing other strategic goals, like demonstrating future console tech and reinforcing partnerships with the DICEs of the world.

That, and Microsoft obviously was working towards it as well with DXR support.

Edit: Man, they took a beating on revenue. A $500M shortfall from what they projected.

I think they are different bits: they have the tensor cores in there for machine learning and the DLSS stuff, but I think the ray-tracing cores are their own thing.
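
For what it’s worth, a tensor core is essentially a fused matrix-multiply-accumulate unit: one op computes D = A×B + C on 16×16 half-precision tiles. Here’s a minimal sketch using CUDA’s actual WMMA intrinsics (needs -arch=sm_70 or newer); DLSS-style inference is, at bottom, enormous piles of exactly this operation:

```cuda
#include <mma.h>
#include <cuda_fp16.h>
#include <cstdio>
using namespace nvcuda;

// One warp computes one 16x16 output tile entirely on the tensor cores.
__global__ void tile_mma(const half* a, const half* b, float* c) {
    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> a_frag;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::row_major> b_frag;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float> c_frag;

    wmma::fill_fragment(c_frag, 0.0f);               // C = 0
    wmma::load_matrix_sync(a_frag, a, 16);           // leading dimension 16
    wmma::load_matrix_sync(b_frag, b, 16);
    wmma::mma_sync(c_frag, a_frag, b_frag, c_frag);  // D = A*B + C, one op
    wmma::store_matrix_sync(c, c_frag, 16, wmma::mem_row_major);
}

int main() {
    half *a, *b; float *c;
    cudaMallocManaged(&a, 256 * sizeof(half));
    cudaMallocManaged(&b, 256 * sizeof(half));
    cudaMallocManaged(&c, 256 * sizeof(float));
    for (int i = 0; i < 256; ++i) {
        a[i] = __float2half(1.0f);
        b[i] = __float2half(1.0f);
    }
    tile_mma<<<1, 32>>>(a, b, c);  // exactly one warp drives the unit
    cudaDeviceSynchronize();
    printf("c[0] = %.1f (expect 16.0)\n", c[0]);  // row of 1s . column of 1s
    return 0;
}
```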

Nvidia lost half their market value over the past 4 months. It’s downright catastrophic.

Not to imply that overpriced, underperforming Turing cards are solely to blame; we’re also in a trade war with China, and cryptocurrency crashed. But they were certainly a major contributing factor.

Everything I’ve read (and I’d have to dig up my sources for this) says there’s no hardware dedicated to ray tracing; the RTX (and DXR) APIs are just able to use the tensor cores to speed up the BVH calculations that ray tracing depends on. The tensor cores are also used for denoising the ray-traced result, but at the core it’s all tensor cores (and that’s why they’ve gotten RTX working on Volta cards, just a bit slower).

Edit: I guess I was partially wrong; the RT cores are different, and they are dedicated to BVH calculations.

https://devblogs.nvidia.com/nvidia-turing-architecture-in-depth/ and their whitepaper look like they go over the architecture pretty well.
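
To make “cores dedicated to BVH calculations” concrete: the traversal the RT cores bake into fixed-function hardware looks roughly like the loop below, which on pre-Turing cards has to run on the shaders. A hypothetical sketch (the tiny hardcoded tree and all names are just illustration, not Nvidia’s implementation):

```cuda
#include <cstdio>
#include <cfloat>

struct AABB { float3 lo, hi; };
struct Node { AABB box; int left, right; };  // leaf when left < 0

// Classic slab test: does the ray hit the box within [0, tmax]?
__device__ bool hit_box(const AABB& b, const float3& o, const float3& inv_d,
                        float tmax) {
    float lo[3] = {b.lo.x, b.lo.y, b.lo.z};
    float hi[3] = {b.hi.x, b.hi.y, b.hi.z};
    float oo[3] = {o.x, o.y, o.z};
    float id[3] = {inv_d.x, inv_d.y, inv_d.z};
    float t0 = 0.0f, t1 = tmax;
    for (int a = 0; a < 3; ++a) {
        float ta = (lo[a] - oo[a]) * id[a];
        float tb = (hi[a] - oo[a]) * id[a];
        if (ta > tb) { float tmp = ta; ta = tb; tb = tmp; }
        t0 = ta > t0 ? ta : t0;
        t1 = tb < t1 ? tb : t1;
        if (t1 < t0) return false;  // slabs don't overlap: miss
    }
    return true;
}

// Iterative BVH walk with an explicit stack, one ray per thread. This is
// the inner loop Turing moves into dedicated RT-core hardware.
__global__ void trace(const Node* nodes, int* hit_leaf) {
    float3 o = make_float3(0.0f, 0.0f, -5.0f);
    float3 inv_d = make_float3(1e30f, 1e30f, 1.0f);  // +z ray; 1e30 ~ 1/0
    int stack[32], sp = 0;
    stack[sp++] = 0;  // start at the root
    while (sp > 0) {
        const Node& n = nodes[stack[--sp]];
        if (!hit_box(n.box, o, inv_d, FLT_MAX)) continue;  // prune subtree
        if (n.left < 0) { *hit_leaf = -n.left; continue; } // leaf: record it
        stack[sp++] = n.left;   // interior: descend into both children
        stack[sp++] = n.right;
    }
}

int main() {
    Node* nodes; int* hit;
    cudaMallocManaged(&nodes, 3 * sizeof(Node));
    cudaMallocManaged(&hit, sizeof(int));
    *hit = 0;
    // Root spans both children; leaf 1 sits on the ray, leaf 2 is off to
    // the side and gets culled by the slab test.
    nodes[0] = {{make_float3(-2, -2, 0), make_float3(2, 2, 2)}, 1, 2};
    nodes[1] = {{make_float3(-1, -1, 0), make_float3(1, 1, 1)}, -1, -1};
    nodes[2] = {{make_float3(1.5f, 1.5f, 0), make_float3(2, 2, 1)}, -2, -2};
    trace<<<1, 1>>>(nodes, hit);
    cudaDeviceSynchronize();
    printf("ray hit leaf %d (expect 1)\n", *hit);
    return 0;
}
```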

All I know is I was willing to spend around $800 for what I thought would be a top-tier video card. Since I wasn’t willing to spend $1,200, I wound up spending closer to $300 for a perfectly fine card, and now I am sitting this generation out, and maybe the next one too.

I probably wouldn’t have upgraded anyway, because I have a 1440p monitor and a GTX1080. But if Nvidia offered something sexy like 1080ti+20% performance for $599 and raytracing, I might have bitten. Maybe.

And before everybody starts scoffing and saying stuff like “you’re dreaming” or “you’re in fantasyland”: the GTX1080 had a launch MSRP of $599 and was 20% faster than the 980ti.

And yeah, before someone interjects: you couldn’t buy them for that price for a couple of months. But that was the launch MSRP.

You’re ruining this discussion thread by telling us our counterpoints before we get to them.