No benchmarks have leaked; how is that possible?

I’ll let you in on that secret. The AIB partners have all been prepping their cards for months now; they’ve had the products and engineering boards for a while. NVIDIA, however, has not released a driver that works with anything other than the test software they supply. So get this: I am writing this article on September 1st, hours before the presentation, and still the board partners have no idea what the performance is going to be like. It goes further than that: the board partners do not even know the thermal behavior of their own products. NVIDIA has provided them with test software that works with the driver. Basically, these are DOS-like applications that run stress tests; no output is given other than PASS or FAIL. We know the names of these test applications: the NVfulcrum test and the NVUberstress test. For thermals, there is another, unnamed stress test, but here again the board partners can only see PASS or FAIL. Well, we assume they have tested with thermal probes. The point of this paragraph is to show you the secrecy that NVIDIA applied to this Ampere project.

Digital Foundry seems to have one.

Yep, and they were only permitted to show comparative performance, not absolute. Nvidia is playing this pretty close to their chest.

Their BF5 benchmarks were run without ray tracing, per Nvidia’s rules.

[image: benchmark comparison chart]

A 35% increase? It seems the hand-picked demos with all the RTX goodness enabled showed a 70-80% improvement, so non-RTX gains may only be around 35%, which would make the 3080 equivalent to the 2080 Ti for non-RTX games.

?

134 vs 87 is a 54% increase, and it’s over a 2080Ti.

That said, undoubtedly the performance gains will be biggest for RTX heavy titles.
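The fps numbers being argued over above can be checked with a one-liner (the 134 and 87 figures are the ones quoted from the chart in this thread):

```python
def pct_increase(old, new):
    """Percentage uplift going from `old` to `new`."""
    return (new - old) / old * 100

# 87 fps (2080 Ti) -> 134 fps (3080), per the chart quoted above.
print(round(pct_increase(87, 134)))  # ~54
```

So 54%, not 35% — the 35% figure would only follow from a different baseline.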

I don’t think the power draw was mentioned once in all the comments I read. It doesn’t seem to be a concern for the PC gamer.

you are right, good job I am not an accountant!

Indeed :)

Seems to me that a current mid-range gaming PC, say a 3600X / 2060 build, will draw about 300 W in total; for 350 W you could throw in the monitor :) So all the heat that’s currently dissipated across that system via the CPU, GPU, PSU and case fans now has to be dissipated from the GPU alone. Interested to see what that leads to for overall system builds.

That Founders Edition design, with the rear fan cooling heatpipes inside the case, seems a bit like a poor man’s water cooling. Surely moving that heat to the edge of the case and exhausting it directly would be more efficient. None of the AIB designs have water blocks as far as I’ve seen, though, so maybe I’m wrong.

I suppose if the CPU is cooled with a radiator in the side or top of the case, then you’d have fairly clear airflow from front to back above the CPU. That would probably work OK.

350 W at 8 hours a day works out to about $10 a month in electricity, according to this:

The 2080 Ti was 250 W, so it will cost you $3 a month more in electricity if you switch up to this, or $10 a month more if you use your PC 24 hours a day.

If you use your PC at full GPU load the whole time.

Who doesn’t? :D

Plus 50% for New York :)

you have your PC on 36 hours a day? I envy you.
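The cost figures tossed around above can be reproduced quickly. This sketch assumes a rate of roughly $0.12/kWh, which is an assumption on my part; actual rates vary a lot by region (hence the New York joke):

```python
RATE_USD_PER_KWH = 0.12  # assumed average rate; varies by region

def monthly_cost(watts, hours_per_day, days=30):
    """Monthly electricity cost in USD for a constant power draw."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * RATE_USD_PER_KWH

# 350 W card at full load, 8 h/day:
print(round(monthly_cost(350, 8), 2))        # ~10 USD/month

# Extra cost over a 250 W 2080 Ti (a 100 W difference), 8 h/day:
print(round(monthly_cost(350 - 250, 8), 2))  # ~3 USD/month
```

At 24 h/day the 100 W difference comes to roughly $9 a month, which matches the "$10 a month" figure quoted above, and all of this only holds if the GPU actually sits at full load the whole time.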

3080s and 3090s are showing up as an option for new PC builds at a couple of the bigger shops. They show as just a new 3080 with no other details. Could just be placeholders, but I was able to “place” an order for a 9/30 shipment date for a new PC with a 3080 in it. Didn’t pull the trigger, as I want to see how things shake out a little, but patience is waning rapidly.

I’m going to mod mine with a 90 degree bend so that rear fan points at the back case fan. Lovely jubbly :)

It might actually make sense to install a little plastic baffle to redirect the air behind the CPU HSF, now that you mention it. I won’t bother though.

I think I would definitely want two fresh air intake fans on the front of my case at minimum.

Sure, you’re dumping an extra 100 W or more of heat inside the case, at least if you get an AIB card without the reference hybrid blower. It remains to be seen how that reference cooling actually performs.