With the increased power draw, does this mean everyone’s UPS units will need upgrading too?
Lamalo
4933
Heh, last upgrade I went with 1500VA because the 850VA started beeping when I was gaming.
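For anyone wondering how the VA numbers shake out, here’s a back-of-the-envelope check; every wattage in it is an assumed ballpark figure, not a measurement:

```python
# Rough UPS sizing check: can a 1500 VA unit carry a high-end gaming PC?
# All wattages below are assumed ballpark figures, not measurements.

gpu_w = 320          # rumored RTX 3080-class board power
cpu_w = 150          # high-end desktop CPU under gaming load
rest_w = 100         # motherboard, drives, fans, monitor, etc.
load_w = gpu_w + cpu_w + rest_w

power_factor = 0.9   # typical for a modern PSU with active PFC
load_va = load_w / power_factor

print(f"Estimated load: {load_w} W ({load_va:.0f} VA)")
# A typical 1500 VA unit is rated around 900 W, so a ~570 W load fits
# with headroom; a typical 850 VA unit (~510 W) would be overloaded
# under gaming load -- hence the beeping.
```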
I’m thinking I will pick up an eGPU and an RTX 3080 at some point next year. My TV does 4K@120 via HDMI 2.1 after all!
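The 4K@120 claim checks out on paper, for what it’s worth. A quick bandwidth calculation, ignoring blanking intervals and link-encoding overhead (so real requirements run somewhat higher):

```python
# Uncompressed video bandwidth for 4K @ 120 Hz, ignoring blanking
# intervals and link-encoding overhead (real requirements are higher).

width, height, refresh_hz = 3840, 2160, 120

for bits_per_channel in (8, 10):
    bpp = bits_per_channel * 3          # RGB / 4:4:4
    gbps = width * height * refresh_hz * bpp / 1e9
    print(f"{bits_per_channel}-bit: {gbps:.1f} Gbit/s")

# 8-bit: ~23.9 Gbit/s, 10-bit: ~29.9 Gbit/s -- both within HDMI 2.1's
# 48 Gbit/s ceiling, but well beyond HDMI 2.0's 18 Gbit/s, which is
# why 4K@120 needs the new spec.
```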
Tim_N
4935
Just what the environment needs, PC components moving back in the direction of insane power draws, thanks Intel and Nvidia.
I think it was inevitable though, right? As Moore’s Law reaches its limits, the only way to keep scaling performance is to stay on roughly the same process and just add more silicon (bigger dies or more chips), which means more power and heat?
stusser
4938
Hopefully true! If so, that brings Nvidia’s performance relative to previous generations back to where it should be.
Of course price is still in question.
I was quite looking forward to the Megawatt future. PCs could go back to having big valves and Frankenstein switches everywhere.
Waiting to see if the 3000 series brings ray tracing into mainstream pricing, or whether it will just be left to the new consoles.
If the rumors of dramatically improved RT performance are true, it should, especially with DLSS.
Maybe I don’t know what I’m missing, but I still don’t really care about ray tracing.
stusser
4943
Next-gen consoles all have it, so if you want a console-equivalent experience, you will need a GPU that supports it.
Tim_N
4944
I am squarely in the “maybe, hope to” demographic for a major GPU upgrade later this year from my already-good 1080 Ti. But I have a number of needs that I hope either the 3070 or 3080 meets:
- I don’t want it to draw more power than a 1080 Ti (or if so, only marginally more). Sydney summers are getting hot enough as it is without having a heater next to my legs that sounds like a huge fan. I also don’t want to replace my aging power supply, which I think is either 700 or 750 W (rough headroom math sketched below).
- It can’t be physically larger than an aftermarket 1080 Ti. I have a micro ATX case and the 1080 Ti is as large as I can hope to fit inside it.
- It can’t have less than 11 GB of memory (i.e. I don’t want to downgrade my memory). As part of my work I need to be able to estimate deep neural nets, and extra memory really comes in handy in some of those applications (rough estimate below, too).
- Regarding speed, anything equal to or better than a 2080 Ti is good enough for me. With a 4k monitor and being a VR dabbler having a bit more oomph than a 1080 Ti will really help in some games.
If it is true that the 3070 is around the same speed as a 2080 Ti, then that would be fantastic, but I really doubt it’s going to have 11+ GB of memory onboard. For the 3080 I would hope it has at least 11 GB, but I am now worried about the power draw and size considering all the crazy rumours about the 3090.
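A rough sanity-check of the power and memory points above; the 3080’s board power, the rest-of-system draw, and the example model size are all assumptions, not spec:

```python
# Rough PSU headroom check for swapping a 1080 Ti for a rumored 3080.
# The 1080 Ti's 250 W is its spec board power; 320 W for the 3080 is
# a rumor at this point, and the rest-of-system figure is a guess.

psu_w = 750
rest_of_system_w = 250      # CPU + board + drives under load (assumed)
headroom = 0.20             # keep ~20% spare capacity for spikes/aging

for gpu, gpu_w in [("1080 Ti", 250), ("3080 (rumored)", 320)]:
    total = gpu_w + rest_of_system_w
    ok = total <= psu_w * (1 - headroom)
    print(f"{gpu}: {total} W total -> {'fits' if ok else 'tight'} on a {psu_w} W PSU")

# And a crude VRAM estimate for the deep-net use case: weights plus
# gradients plus two Adam moments, fp32, activations not counted.
params = 100e6                              # assumed 100M-parameter model
bytes_needed = params * 4 * (1 + 1 + 2)
print(f"~{bytes_needed / 2**30:.1f} GiB before activations")
```

On these numbers a 750 W unit still holds up, but the rumored extra ~70 W of GPU draw eats most of the spare headroom; and once activations and batch size enter the VRAM picture, 11 GB stops looking like a luxury.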
Thraeg
4945
Pretty sure I’ll end up going with the 3080 this time around. Sounds like it should be a nice bump up from the 2080 Ti I sold, with the addition of HDMI 2.1 to enable 4K/120Hz.
Err, hmm. That’s an odd way to put it. My desired experience is independent of what the latest game consoles are doing. It probably aligns with that experience 80-90% of the time, but it’s not an end in itself.
I’m a simple man. All I’ve ever wanted is consistently high framerates at 1440p. That resolution will probably keep increasing until some sort of ultrawide retina level.
Of course that’s already achievable at the lowest possible graphical fidelity, so now you’re making me think that I must have some standard of graphics quality in mind. Perhaps it’s a question of balance. The games look good enough to me right now, but not smooth enough for me to enjoy them without becoming distracted. Ray tracing tips the scale further in an unnecessary direction for me.
Again, I could change my tune once I experience more ray tracing in games.
Do we think the 12-pin connector is only required for the 3090, or all the 30x0 cards? Because if they all require it that is some epic levels of bullshit and may push me to an AMD card.
It would be a crazy business decision to require it for all cards - they’d be cutting off a huge part of the potential market - so I can’t imagine they will. Also, presumably they’re planning to make some laptop variants of these cards, which can’t possibly have that sort of power draw.
We don’t know if it’s required for anything yet. It will likely be on the Founders Edition of the 3090, but that doesn’t necessarily mean it will be on every 3090. It’ll probably be on the 3080 Ti too, I reckon, since they already have the dual 8-pin.
It looks set to replace the dual 8-pin connectors we already have on the high-end RTX cards, and it seems far preferable to me (if you already have a modular PSU).
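For context, the PCIe CEM spec caps what the existing connectors can deliver; the 12-pin line in the comment below is an assumption, since its official rating hadn’t been published:

```python
# Maximum power deliverable to a GPU under PCIe CEM spec limits.
PCIE_SLOT_W = 75     # through the motherboard slot
EIGHT_PIN_W = 150    # per 8-pin auxiliary connector

dual_8pin_card = PCIE_SLOT_W + 2 * EIGHT_PIN_W
print(f"Slot + dual 8-pin ceiling: {dual_8pin_card} W")   # 375 W

# The new 12-pin's rating hadn't been published at this point; if it
# merely matched dual 8-pin (300 W) the ceiling would stay at 375 W,
# so any rumored draw beyond that implies it is rated higher.
```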
Here’s a size comparison next to the dual 8-pin (16 pins) it replaces: [image]
Where is cold fusion when we really need it?!?