schurem
7125
I have a position in a Dutch store’s queue for an MSI 3080 something something. Fell 4 places today, so yay! Perhaps sometime this century… I missed the boat on the first load of G2 headsets as well, so my wait for the raison d’être for the 3080 is delayed too.
Flying like it’s 1991 now, but at least it’s full VR! My bros from Mudspike are already gearing up for their annual Christmas flight. I had hoped to finally join them this year with MSFS 2020, but that ain’t happening, unless I get my stuff in a week or two and someone releases a fast jet with good long legs for that sim. I’m not flying it in 2D, and I’m not interested in the default machines with their glass cockpits. Steam gauges or bust!
They’re flying to Cape Town this year, and trekking all the way from Europe down to the tip of Africa sounds like fun, but not in a single-engine bush plane. I had hoped to do it in a C-47, a Spitfire, or perhaps even a 70s fighter jet of some sort. Something interesting to fly and navigate, and with a great view.
No idea, I figured it was 2021, like most games afraid to release against Cyberpunk. :P
It’s out tomorrow! Get excited!
(Not really, nothing about that game looks appealing to me).
EDIT: To be fair, you never know. The important thing about an ARPG isn’t how it looks, but how it plays. So even though I don’t like the look of it, maybe it’s addictive.
KevinC
7128
My 3080 is out of date and hasn’t even arrived yet! :)
That’s how PC hardware works!
Time to upgrade already!
jpinard
7130
See, stuff like this makes me wonder if Nvidia gimped the 3080’s RAM on purpose. Like, I bet there will be some people who buy it, discover they can’t go full ultra features, and decide they have to buy an Nvidia 3080 Ti or the 3090. Thus Nvidia gets two sales. It doesn’t seem like an extra 2-6 GB on the base 3080 would have hurt the price (or Nvidia) that much, but it sure would have helped the card’s legs.
I think so; they sure as heck don’t want another 1080 Ti situation where people don’t want to upgrade.
jpinard
7132
That kinda makes me not like Nvidia.
This is why I’ve stuck to my guns on skipping 4K for now - it’s just not the resolution to play Ultra on.
Houngan
7135
I came here just to figure that out; my GTX 1080 is still doing everything so well 3 years later. Not a Ti, but I still feel I got lucky. Same with my Inspiron work laptop: other than buying a new battery, I can’t discern any degradation in performance vs. when I bought it 3 1/2 years ago.
Tortilla
7136
Yeah, with the 20XX series not being much of a jump over the 10XX series and 30XXs being rarer than hen’s teeth, I’m feeling pretty good about continuing to game on a 1080 Ti for another year or two. I won’t be on 4K with Ultra settings, but as someone who doesn’t have a 4K monitor I’m not terribly concerned about that.
Aceris
7137
Unless they do something incredibly novel - literally groundbreaking - I don’t see how it can possibly be competitive with DLSS 2.0, simply because NVidia can use the tensor cores (which are super optimized for this) to do the DLSS compute, whereas AMD will be stuck trying to run whatever CNN model they come up with on the same FP units they need for the shaders.
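For anyone wondering what “super optimized for this” means concretely: the tensor cores chew through whole matrix tiles per instruction, which is exactly the shape of the math inside a CNN. A minimal sketch using CUDA’s WMMA API - one warp doing a single 16x16x16 half-precision tile. The kernel name is mine and this is just the primitive, not anything out of actual DLSS code:

```cpp
#include <cuda_fp16.h>
#include <mma.h>

using namespace nvcuda;

// One warp computes D = A*B + C for a single 16x16x16 tile on the
// tensor cores. Illustrative only; names are made up for the sketch.
__global__ void tile_mma(const half *a, const half *b, float *d) {
    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> a_frag;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::col_major> b_frag;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float> acc;

    wmma::fill_fragment(acc, 0.0f);            // start with C = 0
    wmma::load_matrix_sync(a_frag, a, 16);     // leading dimension 16
    wmma::load_matrix_sync(b_frag, b, 16);
    wmma::mma_sync(acc, a_frag, b_frag, acc);  // whole tile product in hardware
    wmma::store_matrix_sync(d, acc, 16, wmma::mem_row_major);
}

// Launch with exactly one warp, e.g. tile_mma<<<1, 32>>>(dev_a, dev_b, dev_d);
// needs sm_70 or newer.
```

On AMD, the equivalent matmul has to be assembled out of regular shader ALU ops, which is the gap I’m talking about.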
RAM bandwidth is a big bottleneck on the cards, so they went for GDDR6X. Which is super expensive. So they went with the smallest amount they thought they could get away with.
AMD’s smarter memory architecture is a much better solution, but probably less good for compute workloads, which is where I suspect a lot of NVidia’s engineering talent is focused.
I do suspect the 3090 was supposed to be a prosumer compute-focused card until some marketoid came up with the loltastic 8K GAMING marketing spiel. Hence the huge amount of RAM and unreasonable price.
stusser
7138
It’s very possible that AMD could do a lot of the DirectML stuff using their FP16 cores. Don’t count them out just yet.
KevinC
7139
Could be. I don’t think it’ll be too big of an issue, though. I can probably just run at a lower resolution with DLSS and get the best of all worlds. I’d likely be doing that anyway especially for games that have raytracing features. If I have a choice between resolution and FPS I take the latter.
Aceris
7140
I dug into this a bit. It looks like every RDNA2 compute core supports packed mixed-precision math (NVidia does not seem to do this), which is a very elegant way of doing things and exactly the kind of thing I’d expect from AMD’s engineering teams. This does give Big Navi moderately impressive ML capabilities. But it’s still using the same shader cores as everything else, and it’s not as fast at tensor operations as the custom tensor silicon NVidia is using.
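To make “packed” concrete: one 32-bit register carries two FP16 values, and a single instruction does the math on both lanes at once. Rough sketch of the idea, written with CUDA’s half2 intrinsics as a stand-in (that’s the toolchain I can actually test; RDNA2 expresses the same trick through its own intrinsics and shader compiler, and the kernel name here is made up):

```cpp
#include <cuda_fp16.h>

// Packed FP16: each __half2 is two half values in one 32-bit register.
__global__ void packed_axpy(const __half2 *x, const __half2 *y,
                            __half2 a, __half2 *out, int n_pairs) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n_pairs) {
        // One fused multiply-add computes a*x + y for BOTH packed
        // halves, doubling FP16 throughput over unpacked scalar math.
        out[i] = __hfma2(a, x[i], y[i]);
    }
}
```

Great for doubling FLOPS on the existing ALUs, but it’s still the same ALUs - the shaders and any upscaler model would be fighting over them.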
So hypothetically AMD could take the DLSS 2.0 model (if it weren’t NVidia IP), implement it on their architecture, and run it. But I’m pretty sure it would be too slow. It might well be slower than just rendering at the higher resolution in the first place! (Even on the NVidia cards with their custom silicon, DLSS is compute intensive.) The math is simple: 4K is 2.25x the pixels of 1440p, so the inference pass has to cost less than the shading time it saves or you’ve gained nothing. So AMD would need a substantially simpler model that gives comparable image improvement.
Woohoo, managed to order a 3080! Considerably more expensive than I was willing to pay a month ago, but fuck it, it’s a decent model and I (hopefully) don’t have to worry about it any more. Now I just have to get my CPU… Haven’t had any stock alerts on that at all yet.
You can see your queue data here – but it doesn’t tell you how close you are:
https://www.evga.com/community/myNotifies.asp
I’m feeling a little bit of angst, as I’m TRYING to build two systems; one for my nephew and the other for me. His gets priority, but it also means I lose out on the “one per household” stuff. My hope of a new PC build by year’s end is starting to feel like the memory of a dream, slipping away as my eyes open to the stark reality of supply shortages everywhere. Well, at least he should be gaming by the holidays.