Gonna need a muscle car-style hood scoop on the side of your case for some of these big bois.

You said there was some rumour of a process shrink to come next year? That might do it for me.

Yeah. 320w is not reasonable. It just isn’t. I wouldn’t buy a 320w 3080 even if it beat the 2080ti by a solid 40% and cost $699.

Planning to see what AMD has coming. This isn’t a day 1 purchase for me.

If it really is 320w, I hope the reviews don’t just look at raw performance and value. They need to roast it on the coals of its own making.

Wait what? Are we waiting for a process shrink early next year? :)

I admit, a 3080 Ti on a confirmed 7nm process at under 300 watts would be ideal.

I feel the heat of this burn. ;)

It’ll be interesting to see how good the idle power savings is. If it’s 320W/350W when gaming or rendering videos, but drops to just a few watts during idle, desktop, productivity usage, it’s not that big a deal to me. (My case is well ventilated, and the electric cost won’t be that much in that situation. Particularly compared to the investment in the card itself!)
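For a rough sense of scale, here's a back-of-envelope yearly cost sketch. Every input is an assumption (hours of use, idle draw, electricity rate); only the 320W load figure comes from the rumours in this thread:

```python
# Rough yearly electricity cost of the card alone.
# All inputs below are assumptions for illustration, not thread data.
rate_per_kwh = 0.13          # assumed electricity price, USD/kWh
gaming_hours_per_week = 20   # assumed heavy-use hours
gaming_draw_w = 320          # rumoured load power from this thread
idle_hours_per_week = 40     # assumed desktop/productivity hours
idle_draw_w = 10             # assumed near-idle draw

weekly_kwh = (gaming_hours_per_week * gaming_draw_w
              + idle_hours_per_week * idle_draw_w) / 1000
yearly_cost = weekly_kwh * 52 * rate_per_kwh
print(f"~${yearly_cost:.0f}/year")  # ~$46/year with these assumptions
```

With those made-up numbers it's around $46 a year, which really is pocket change next to a $700+ card, so the "idle draw matters more than peak draw" point holds up.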

On brands… What are people liking nowadays, as I prepare to camp on a preorder? I’ve always preferred EVGA, but went XFX on the 2080 Ti because it was the only brand I could find in-stock when I bought that card. Card was fine and had a three-year warranty. (So was able to sell with one year remaining.) Any downsides to the Nvidia-branded cards as far as service/support?

I was using Gigabyte for a good 6+ years for GTX cards (460/670/970), but my last card from them, a GTX 1080, had a wonky HDMI port that they couldn’t reproduce after a return under warranty, so they sent it back to me. The card is fine on DP, but HDMI still randomly doesn’t work.

So I got an EVGA 2080 FTW3 almost a year ago to this day at a decent discount, and it’s been wonderful and quiet, and they have great customer service, as I did have warranty questions. I like that I can log into their site and see I have 700 or so days remaining, and that I can transfer that warranty if I ever sell it.

What does the 1080/2080 use by comparison?

The 2080S is 250 I think.

That doesn’t seem like a huge jump over the current gen, then. I know it’s 25-33% or whatever, but given that’s likely under max load, I don’t see the problem here, I guess. It’s like 50-80W. Did we really think newer, more capable cards wouldn’t eventually start drawing more power? (I thought lower power draw was the benefit of smaller chips, but I could be mistaken.)
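A quick sanity check of that jump, using the rumoured figures from upthread (250W for the 2080S, 320W for the 3080; neither is an official spec):

```python
# Back-of-envelope check of the rumoured TDP jump.
# Both figures are rumoured numbers from this thread, not confirmed specs.
old_tdp_w = 250  # rumoured RTX 2080 Super board power
new_tdp_w = 320  # rumoured RTX 3080 board power

delta_w = new_tdp_w - old_tdp_w
increase_pct = 100 * delta_w / old_tdp_w

print(f"Delta: {delta_w} W ({increase_pct:.0f}% increase)")
# Delta: 70 W (28% increase)
```

So the delta is 70W, a 28% increase, which lands right inside that "25-33% or whatever" guess.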

(and power consumption) – mine as well. Meanwhile AMD is about to release their second generation of 7nm CPUs, with the second generation of 7nm GPUs not far behind.

If correct, then it seems like NVidia spent too much time dickering on prices and not enough time securing modern fab capacity.

The feeling is that AMD is a full process node ahead of Nvidia, even though they’re all labeled 7 or 8nm. TSMC is that good. Samsung’s process is not as good, and a lot less power efficient.

AMD looks like they’ll move to TSMC 5nm next year, too.

Assuming announcements are as expected on 1 September, how long are we looking at before it’s possible to get these cards in hand? Is there going to be a significant shortage/hoarding/price gouging by hoarders? Of course we don’t know; I’m just wondering if that is typical early in the release cycle.

Max load is probably where your GPU runs, right? Otherwise you’d have bought a model down :) Whether it matters, well each to their own. For me it’s already a bit toasty in here.

Smaller processes allow less power draw in general… We just spend the savings on more cool stuff. Hopefully these cards will use that power to be awesome but you would imagine there’s an upper limit for a home PC’s power use. If the total system is using 500W… you can get space heaters that are 500W. They’re pretty small, but still.

Enthusiasts are their own breed of course.

Not necessarily. A lot of games I play don’t use nearly all the power of my 1080 Ti, and I have a cap of 60fps set because I honestly don’t see much difference between 60 and 80, or even 100. I hear it when the card fans kick in, and I don’t see the benefit in making the card work at max. Partly that’s a bid to extend the life of the card, but I also like my PC running quiet, and again, a steady 60fps is great, imo. Playing Pillars of Eternity 2 at 144fps is… not necessary.

Obviously, there are some exceptions, some games (like Warhammer 2) I want the card at full load for, and that’s where I’m very glad to have such a power house card. But it’s not something that happens all the time, or even really very often, given the kinds of games I tend to gravitate towards.

That all looks plausible, thanks.

So the big question (for my fall system build plan): will it be worth the extra cash and wait for a 20GB 3080 vs the 10GB model (assume same speed and specs otherwise)? Past experience says no way, but then you’d also hate to be the guy with a gimped card that Flight Simulator (or whatever) doesn’t run well on in a year because everybody else bought the 20GB model and games now assume that 3080s have 20GB.

Diego

Great question, and of course the answer has too many variables to remotely pin down right now. Whatever the reality of RDNA2 winds up being along with the final pricing landscape will impact what’s within the budget for gamers over the next couple years, and I’d imagine this in turn will impact game development studios to some extent; nobody’s going to want to throw millions toward a game only 1% of their potential market could play at enjoyable levels, after all.

My 2080ti is using 300w under heavy stress (according to GPU-Z).

Aargh, late September? Here I was frustrated already that the original rumors had shipping on 9/9. Oh well, maybe the Reverb G2 will ship at the same time…