Neither that $840 MSI 12 GB nor the $870 EVGA 10 GB (the one that’s been on the EVGA website a while) has sold out. Changed days. Soon it’ll be buy one, get one free.
A few eBay prices are higher for some weird reason.
Prices are coming down because supply exceeds demand, that’s all, and the next generation of GPUs is going to drop shortly.
AMD just launched the Navi refresh this morning. About 5-15% more performance due to slightly faster GDDR6 and increased power consumption. The main thing is they’re also resetting MSRP at a higher level, which probably tells us everything about how the next gen is going to be priced.
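As a rough back-of-the-envelope illustration of where that uplift comes from: peak memory bandwidth scales linearly with the GDDR6 data rate, so a modest module bump lands right in that 5–15% range. The data rates and bus width below are assumptions for illustration, not the actual specs of the refresh.

```python
# Hypothetical numbers, not the refreshed cards' real specs.

def bandwidth_gbps(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s = per-pin data rate * bus width / 8."""
    return data_rate_gbps * bus_width_bits / 8

old = bandwidth_gbps(16, 256)   # e.g. 16 Gbps GDDR6 on a 256-bit bus
new = bandwidth_gbps(18, 256)   # refreshed to assumed 18 Gbps modules
uplift = (new / old - 1) * 100
print(f"{old:.0f} GB/s -> {new:.0f} GB/s ({uplift:.1f}% more bandwidth)")
# -> 512 GB/s -> 576 GB/s (12.5% more bandwidth)
```

Bandwidth alone doesn’t translate one-for-one into frames, which is why the quoted uplift is a range rather than a single number.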
The days of a decent midlevel card at $300-$400 seem to be over.
Yeah, those days are likely done. I’m not expecting the 3000 series to drop significantly below MSRP; they’ll just get replaced by the 4000 series instead of sharing the marketplace for long. We may see slightly better drops on the handful of cards still holding limited shelf space, and of course used cards will hit the market, but I can’t imagine prices will ever be THAT low again. Kinda sucks.
We don’t know that’s true for certain… The last years have been weird in a bunch of ways. Maybe plentiful consoles will make expensive GPUs less successful. Maybe Intel will try and buy market share. Maybe another weird and wacky event will roll up.
Or Taiwan will get invaded :/
Intel could possibly, except they’re relying on TSMC for their GPUs as well.
All those Intel fabs, and yet…
I wouldn’t be so sure. Free money from the government to convince people to stay home during the pandemic is over. Inflation is rampant. The fed is raising interest rates. We’re in a bear market. TL;DR, we’re about to enter a recession.
The atmosphere where people flippantly quit jobs because they can’t WFH, knowing they can trivially get another one that allows it, is about to change. It’s a recession; even people still gainfully employed will be less likely to blow over a grand on a graphics card.
AMD (and certainly Nvidia too) are working off 2021, where they could charge whatever they wanted and it would sell. Already, today, supply exceeds demand for everything we were obsessing over in 2021 other than the PS5. It isn’t 2021. Fall 2022 will be a very, very different place. AMD and Nvidia should be shoring up their $200 and $400 price-points.
morlac
That would be nice. I’d like a PS5 at some point, already having the expensive GPU (albeit bought at launch pricing).
Too bad Sony just reported they only sold 2 million PS5s in the most recent quarter compared to 3.3 million for the same period last year. “Plentiful” is not a word they’re using.
Well, it’s pretty much impossible to get one unless you want to troll Twitter feeds, so I’m not surprised. It’s insane that they still haven’t caught up to demand. I know I personally would have bought one if I could have found one. Now? I don’t think I would, as the Xbox has been great. Shrug.
I guess. But from what I can tell, they’ve been over for a while, no? Perhaps since nVidia was in the 10xx series? The 2070 was a $500 card.
But I will say this: the local Microcenter here has a stack like Smaug’s Hoard of 6600 XT cards at $420. A 3070 will set you back about $650. That’s not great, but compared to where things were during most of the 20xx cycle, it’s really a nice price.
stusser
The 3060ti MSRP is $399. That is obviously an inflated price as the GTX970 launched at $329, but it’s an excellent performer at 1440p, only ~15% below the 3070. That’s what I would get in its price range. The 6600XT is not competitive if you find them at similar prices. Of course you still won’t find a 3060ti for $400 today, but it’ll happen.
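For what it’s worth, a quick perf-per-dollar sketch using the figures floating around this thread. The 3070 and 3060 Ti numbers come from the posts above; the 6600 XT’s relative performance is my own rough assumption, purely for illustration.

```python
# Relative performance normalized to the 3070 = 1.0; prices are the
# MSRPs / street prices mentioned in the thread, purely illustrative.
cards = {
    "RTX 3070":    {"rel_perf": 1.00, "price": 650},
    "RTX 3060 Ti": {"rel_perf": 0.85, "price": 399},  # "~15% below the 3070"
    "RX 6600 XT":  {"rel_perf": 0.70, "price": 420},  # assumed figure
}
for name, c in cards.items():
    ppd = c["rel_perf"] / c["price"] * 1000  # relative perf per $1000
    print(f"{name:12s} {ppd:.2f} perf/$1000")
```

On these (assumed) numbers the 3060 Ti comes out comfortably ahead on value, which matches the post’s conclusion — if you can actually find one at MSRP.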
I hope FSR 2 adds up to something. DLSS makes Nvidia kind of a no-brainer IMO. Deathloop will be the first game to support FSR 2, for some reason, in a couple of days, with MSFS to come too.
I don’t know how universal these results are, but I happened to watch a video yesterday which found FSR to be better than DLSS on image quality, with fairly comparable performance gains. I expect ray tracing performance will remain Nvidia’s biggest advantage.
Those videos are all bullshit. FSR1 is much better than nothing, but is not comparable to DLSS2 in the vast majority of games and scenes.
Like I said, I don’t know how universal those results are, but unless he’s lying about the settings & framerates, it’s just side-by-side comparison of data & image quality in the game he plays. That’s no different from when CGM would run side-by-side comparisons of Nvidia & ATI cards, considering things like texture filtering & anti-aliasing quality in addition to framerates.
I couldn’t tell you which version of each algorithm is being run there, but Nvidia only got substantially faster results when it also introduced more visual artifacts, and both seem to improve performance moderately (by roughly similar amounts) when minimizing visual artifacts.
Ray tracing, on the other hand, is something that massively improves image quality, and Nvidia is a clear performance leader on that front. DLSS vs FSR seems more reminiscent of gsync vs freesync, but I could be convinced otherwise by actual data from other games or an understanding of the algorithms themselves.
FSR has not until now used data from previous frames (unless I’m misremembering). That is part of how DLSS works and I think will be a part of FSR 2. As of tomorrow :)
Aceris
I suspect from a couple of his comments that he just likes oversharpened images.
Also if the TAA implementation in that game is broken that will have a huge impact on DLSS, so this could be something specific to that game.
DLSS uses a machine learning algorithm (a famously powerful technique in image processing), running on custom hardware on the Nvidia card so it doesn’t impact performance too much (although given power limits, it’s not like you can run everything on the card at full blast). While the initial implementation was not great, DLSS 2 has had generally good reviews, and this is the only time I’ve heard someone say FSR has the same or better quality.
FSR uses traditional image enhancement techniques, and needs to run them on the same compute units doing the rasterization.
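To make the distinction concrete, here’s a toy stand-in for a purely spatial upscaler: it only ever sees the current low-res frame, so all it can do is interpolate up and then sharpen. This is a deliberately simplified 1D sketch, in no way AMD’s actual algorithm.

```python
# Toy 1D "scanline" version of upscale-then-sharpen. Not FSR's real math.

def upscale_2x(row):
    """Linearly interpolate a 1D scanline to roughly twice its length."""
    out = []
    for a, b in zip(row, row[1:]):
        out += [a, (a + b) / 2]
    out.append(row[-1])
    return out

def sharpen(row, amount=0.5):
    """Simple unsharp mask: push each sample away from its neighbors' mean."""
    out = [row[0]]
    for i in range(1, len(row) - 1):
        local_mean = (row[i - 1] + row[i + 1]) / 2
        out.append(row[i] + amount * (row[i] - local_mean))
    out.append(row[-1])
    return out

low_res = [0.0, 0.0, 1.0, 1.0]  # a hard edge at low resolution
# Note the over/undershoot the sharpen pass adds around the edge --
# the kind of artifact a learned or temporal method can avoid.
print(sharpen(upscale_2x(low_res)))
```

The interpolated edge stays soft, and sharpening adds ringing around it rather than recovering real detail — there is simply no extra information in a single low-res frame to recover.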
XeSS is the “freesync” version of DLSS, but even that will probably run badly on AMD cards because, unlike the Nvidia and Intel cards, they don’t have dedicated tensor arithmetic hardware.
The history of DLSS is interesting. It really feels like someone in nvidia dictated that they would put 2nd gen tensor cores on the RTX consumer cards and then the gaming team had to find a use for them. I don’t know if that’s because the consumer cards use the same dies as the prosumer cards and they wanted the tensor cores on those?
Right. FSR1 isn’t freesync, because freesync was good and FSR1 is charitably “better than nothing”.
FSR2 looks promising, though. I don’t love their use of post-process sharpening, but the end picture is what matters. A couple FPS worse than DLSS2 since it runs on shaders not tensor cores, but it works on every GPU. Main downside seems to be low resolutions, since it just doesn’t get enough data there and doesn’t have magical AI to invent it out of thin air.
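A toy sketch of the temporal-accumulation idea (hugely simplified — real implementations also reproject with motion vectors and reject stale history): each frame is rendered with a sub-pixel jitter, and blending those jittered samples over time converges on detail no single low-res frame contains. With very few input pixels, there are simply fewer samples to accumulate.

```python
# Illustrative only; jitter is modeled as alternating error, not a real sequence.

def accumulate(history, new_sample, alpha=0.1):
    """Exponential moving average: blend the new jittered sample into history."""
    return (1 - alpha) * history + alpha * new_sample

true_value = 1.0
# Jittered low-res samples of a true pixel value of 1.0:
samples = [true_value + e for e in (0.3, -0.3) * 20]

history = samples[0]
for s in samples[1:]:
    history = accumulate(history, s)

print(abs(history - true_value))  # small residual error after accumulation
```

Any single sample is off by 0.3, but the accumulated history lands within a few hundredths of the true value — that averaging-over-time is what the “magical AI” in DLSS supplements rather than replaces.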
That sounds like no big deal initially, but the most obviously useful application for FSR2 is on the Steam Deck.
Maybe XeSS will work better at low resolutions. My feeling is it will, but only on Intel GPUs.