Funny how when the time remaining gets shorter, the wait becomes more agonising. I feel the same about both the G2 and the card to drive it.

Apparently the first Australian shipments arrive at HP here around end of October/early November.

Yep. It’s challenging.

If the upgraded 3080 were, for example, a 16GB card and only cost a little more (say $100 - $200 tops), then you might do it just to be safe. But 20GB is likely above the memory sweet spot for the next generation, and nobody will assume that much as a baseline. The big risk is that 10GB is also below that safe baseline configuration!

Diego

Yep, so let's hope for a 3080 Ti with 12GB or 16GB. :)

Stusser keeps saying 8 is plenty.

Stusser’s hardware recommendations are based on his own usage case, though. ;-) (See “Core i5 is all you need for gaming.”)

For us fringe flight sim wackos, people with super ultra-wide monitors, and people who want to play at high frame rate and stupid high resolutions, we’re edge cases who can use the extra oomph.

Cyberpunk 2077 at 4k will let us know if 8GB of VRAM is enough or not. :P

Right, getting questions like that answered before buying might be smart :)

According to the speculative Moore’s Law is Dead video that Woolen Horde linked…

The 10GB of VRAM on the initial 3080 is a frustratingly small step back from the 2080 Ti, but no games I played used all that memory anyway. And 10GB is still a jump from the 8GB on the vanilla 2080 or 2070.

The latest Flight Sim seems to be designed to make use of as much memory as your card can provide. So for that specific use case, more is certainly better.

** and people with VR kit

Does raytracing have much of an impact on memory usage?

As I understand it, ray tracing will gobble up a LOT of spare VRAM if it’s available.

No current games use >8GB of video RAM even at 4k. Nobody can predict the future, it’s possible that will change.
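For a sense of scale, here's a rough back-of-envelope on what the render targets themselves cost at 4k. The buffer layout and byte sizes below are purely illustrative assumptions, not measurements from any real engine; the point is just that framebuffers are a small slice of VRAM, and it's textures and geometry that push toward the 8GB question:

```python
# Back-of-envelope VRAM estimate for 4K render targets.
# The G-buffer layout below is a hypothetical example, not
# taken from any actual game engine.

WIDTH, HEIGHT = 3840, 2160          # 4K resolution
pixels = WIDTH * HEIGHT             # ~8.3 million pixels

# (name, bytes per pixel) -- illustrative formats
render_targets = [
    ("albedo (RGBA8)", 4),
    ("normals (RGBA16F)", 8),
    ("depth/stencil (D32S8)", 5),
    ("motion vectors (RG16F)", 4),
    ("HDR lighting (RGBA16F)", 8),
]

total_bytes = sum(bpp * pixels for _, bpp in render_targets)
total_mb = total_bytes / (1024 ** 2)
print(f"Render targets at 4K: ~{total_mb:.0f} MB")
```

Even with a generous layout this lands in the low hundreds of megabytes, which is why the 8GB-at-4k debate is really about texture and asset budgets rather than resolution itself.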

Next-gen consoles effectively have 16GB, but that’s shared with the system. They both also have mechanisms to treat very fast NVME storage like RAM, with new hardware texture decompression. I predict both Nvidia and AMD will announce something similar in their next GPUs too.

I have 8GB on my GTX 1080 and with ray tracing + next gen consoles coming out, I’m a little wary of dropping a bunch of money on a 10GB card. Maybe it will end up being enough, but I worry about shelling out for a new card and then a year or so down the line when next gen console games start rolling out I won’t have enough VRAM to enable certain ray tracing features and that sort of thing.

There’s a slight possibility that if devs make use of UE5’s Nanite system, we may see a lot of vertex data stored in VRAM. That could potentially increase VRAM requirements, though with UE5’s timeline it probably won’t be a factor until the next generation.

I wouldn’t hesitate to buy an 8GB card today. I would not buy one with less than 8GB unless it was very cheap and being used on a living room computer or something.

This RAM thing bugs me. We’ve seen it on the low-end cards (where you typically did not want the lower of the two), but if this is true and they’re bifurcating the penultimate card on RAM alone (and staggering the release dates on top of that) it’s just annoying.

Well, it’s conceivable that if some of the stuff Moore’s Law said is true, the cards that wind up getting more memory could end up being better samples (not sure that’s the right word). For instance, if 3090s are going to be super-limited, maybe some boards that could have been 3090s wind up as 3080s with extra memory. That’s pure conjecture, mind you.

Yield is the issue on very large chips; 3080s are just binned 3090s, where parts of the chip are defective. If yields are high enough that 3080s are getting what would have been fully functional chips then perhaps they may be better samples, but I wouldn’t bet on that happening soon.

I found some stuff over here which was interesting but perhaps not germane as it’s about ongoing projects and not a released game (or at least it wasn’t released in January).

(NC3D is an animation studio, btw)

So perhaps nothing’s using it now, but the current state of VRAM seems to be applying some constraints.