Gonna need a muscle car-style hood scoop on the side of your case for some of these big bois.
You said there was some rumour of a process shrink to come next year? That might do it for me.
stusser
5114
Yeah. 320w is not reasonable. It just isn't. I wouldn't buy a 320w 3080 even if it beat the 2080ti by a solid 40% and cost $699.
Planning to see what AMD has coming. This isn't a day 1 purchase for me.
If it really is 320w, I hope the reviews don't just look at raw performance and value. They need to roast it on the coals of its own making.
Wait what? Are we waiting for a process shrink early next year? :)
I admit, a 3080ti on confirmed 7nm, at under 300 watts, would be ideal.
I feel the heat of this burn. ;)
Editer
5116
It'll be interesting to see how good the idle power savings are. If it's 320W/350W when gaming or rendering videos, but drops to just a few watts during idle, desktop, productivity usage, it's not that big a deal to me. (My case is well ventilated, and the electric cost won't be that much in that situation. Particularly compared to the investment in the card itself!)
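For a rough sense of scale, here's a back-of-the-envelope sketch in Python. The 4 hours/day of gaming and $0.13/kWh are my own assumed numbers, not anything measured, so plug in your own:

    # Hypothetical figures: 320 W under load, 4 h/day of gaming, $0.13/kWh
    load_watts = 320
    baseline_watts = 250          # roughly a 2080S, per the discussion below
    hours_per_day = 4
    price_per_kwh = 0.13

    kwh_per_month = load_watts / 1000 * hours_per_day * 30
    cost = kwh_per_month * price_per_kwh
    extra = (load_watts - baseline_watts) / 1000 * hours_per_day * 30 * price_per_kwh
    print(f"~${cost:.2f}/month total, ~${extra:.2f}/month over a 250 W card")
    # ~$4.99/month total, ~$1.09/month over a 250 W card

So even under those load hours, the extra draw is a dollar or so a month, which is noise next to the price of the card.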
On brands… What are people liking nowadays, as I prepare to camp on a preorder? I've always preferred EVGA, but went XFX on the 2080 Ti because it was the only brand I could find in stock when I bought that card. Card was fine and had a three-year warranty. (So I was able to sell with one year remaining.) Any downsides to the Nvidia-branded cards as far as service/support?
I was using Gigabyte for a good 6+ years for GTX cards (460/670/970), but my last card from them, a GTX 1080, had a wonky HDMI port that they couldn't reproduce after a return while under warranty, and they sent it back to me. The card is fine on DP, but HDMI still randomly doesn't work.
So I got an EVGA 2080 FTW3 almost a year ago to this day at a decent discount, and it's been wonderful and quiet, and they have great customer service, as I did have warranty questions. I like that I can log into their site and see I have 700 or so days remaining, and that I can transfer that warranty if I ever sell it.
What does the 1080/2080 use by comparison?
The 2080S is 250W, I think.
That doesn't seem like a huge jump over the current gen, then. I know it's 25-33% or whatever (320W over 250W works out to 28%, about 70W), but given that's likely under max load, I don't see the problem here, I guess. Did we think newer cards, able to draw more power (I thought that was the benefit of smaller chips, but I could be mistaken), wouldn't eventually start drawing more?
(and power consumption), mine as well. Meanwhile AMD is about to release their second generation of 7nm CPUs, with the second generation of 7nm GPUs not far behind.
If correct, then it seems like Nvidia spent too much time dickering on prices and not enough time securing modern fab capacity.
The feeling is that AMD is a full process ahead of Nvidia, even though they're all labeled 7 or 8nm. TSMC is that good. Samsung's process is not as good, and a lot less power efficient.
AMD looks like they'll move to TSMC 5nm next year, too.
Assuming announcements are as expected on 1 September, how long are we looking at before it's possible to get these cards in hand? Is there going to be a significant shortage/hoarding/price gouging by hoarders? Of course we don't know; I'm just wondering if that is typical early in the release cycle.
Max load is probably where your GPU runs, right? Otherwise you'd have bought a model down :) Whether it matters, well, each to their own. For me it's already a bit toasty in here.
Smaller processes allow less power draw in general… We just spend the savings on more cool stuff. Hopefully these cards will use that power to be awesome, but you would imagine there's an upper limit for a home PC's power use. If the total system is using 500W… you can get space heaters that are 500W. They're pretty small, but still.
Enthusiasts are their own breed, of course.
Not necessarily. A lot of games I play don't use nearly all the power of my 1080Ti, and I have a 60fps cap set because I honestly don't see much difference between 60 and 80 or even 100, but I do hear it when the card fans kick in. I don't see the benefit in making the card work at max (partly in a bid to extend the life of the card, but also I like my PC running quiet, and again, a steady 60fps is great, imo). Playing Pillars of Eternity 2 at 144fps is… not necessary.
Obviously, there are some exceptions. Some games (like Warhammer 2) I want the card at full load for, and that's where I'm very glad to have such a powerhouse card. But it's not something that happens all the time, or even really very often, given the kinds of games I tend to gravitate towards.
That all looks plausible, thanks.
So the big question (for my fall system build plan): will it be worth the extra cash and wait for a 20GB 3080 vs the 10GB model (assuming the same speed and specs otherwise)? Past experience says no way, but then you'd also hate to be the guy with a gimped card that Flight Simulator (or whatever) doesn't run well on in a year because everybody bought the 20GB version and assumes that 3080s have 20GB.
Diego
Great question, and of course the answer has too many variables to remotely pin down right now. Whatever the reality of RDNA2 winds up being, along with the final pricing landscape, will impact what's within the budget for gamers over the next couple years, and I'd imagine this in turn will impact game development studios to some extent; nobody's going to want to throw millions toward a game only 1% of their potential market could play at enjoyable levels, after all.
My 2080ti is using 300w under heavy stress (according to GPU-Z).
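If you'd rather log that over time than eyeball GPU-Z, here's a minimal sketch using the NVML Python bindings (pip install pynvml; assumes an NVIDIA card and recent drivers, and reads essentially the same board power sensor GPU-Z does):

    import time
    import pynvml  # Python bindings for NVIDIA's Management Library

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    try:
        while True:
            milliwatts = pynvml.nvmlDeviceGetPowerUsage(handle)  # board power in mW
            print(f"board power: {milliwatts / 1000:.1f} W")
            time.sleep(1)
    finally:
        pynvml.nvmlShutdown()

Run it alongside a game or a stress test and you can see exactly where your card sits against that 300w figure.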
Editer
5131
Aargh, late September? Here I was frustrated already that the original rumors had shipping on 9/9. Oh well, maybe the Reverb G2 will ship at the same time…