Rumor has it the SUPER cards come with a tiny cape to attach to them. :)

This allows you to see the direction of exhaust airflow.

Oh, is that what they’re for? I just cut them off since I thought they were impeding airflow.

What does everybody use for CPU + GPU temp monitoring? I used to use Real Temp, which showed both Intel and NVIDIA temps in the system tray, but I've switched over to Core Temp, which puts CPU temps in the tray and on my Logitech G13 game keyboard. I don't know what to use to show NVIDIA GPU temps these days, except opening GPU-Z and keeping it running.

Anyone else?
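Not a tray app, but if you just want the numbers: the NVIDIA driver ships with the `nvidia-smi` command-line tool, and its `--query-gpu` mode can be polled from a script. A minimal sketch (the helper names here are mine, not part of any tool):

```python
import subprocess

def parse_temps(output: str) -> list:
    """Parse nvidia-smi's plain-number output into per-GPU temps (deg C)."""
    return [int(line) for line in output.splitlines() if line.strip()]

def gpu_temps() -> list:
    """Query the current core temperature of each NVIDIA GPU."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_temps(out)

if __name__ == "__main__":
    print(gpu_temps())  # e.g. [45] on a single-card system
```

Wrap it in a loop with a sleep and you have a crude poor-man's monitor; tools like GPU-Z are ultimately reading the same sensor.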

I used EVGA Precision X1 to overclock my card, and it also does temperature and fan monitoring.

Is your GPU made by EVGA? I suppose it shouldn’t matter.

I loaded old reliable SpeedFan, but it doesn't support/detect Ryzen CPU sensors (it hasn't been updated since 2016-2017); it only shows HDD/SSD and GPU temps.

It is, but that shouldn’t matter.

In an August driver update, all Gen 11 Intel graphics chips will support integer scaling!

Maybe this will convince AMD and NVIDIA to add support for it as well.

I have a 55" 4K screen that I do almost all of my PC gaming on, but because 4K is so hardware intensive I often find myself having to use 1440p, which does actually look substantially worse because of the scaling. Witcher 3 and Doom 2016 have a sharpening option which does go a long way towards remedying the issue, but maybe integer scaling with anti-aliasing on top would be a better answer.
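For anyone wondering what integer scaling actually does: each source pixel is duplicated by a whole-number factor with no filtering, so the image stays pixel-sharp instead of being blurred by bilinear interpolation. That's why it only helps at exact ratios (1080p to 4K is a clean 2x; 1440p to 4K is a non-integer 1.5x, so it wouldn't apply there). A toy sketch of the idea in plain Python:

```python
def integer_scale(frame, factor):
    """Nearest-neighbor integer upscale: each pixel becomes a factor x factor block."""
    return [
        [px for px in row for _ in range(factor)]  # repeat each pixel across
        for row in frame
        for _ in range(factor)                     # repeat each row down
    ]

# A 2x2 "image" scaled 2x becomes 4x4 with crisp pixel blocks.
tiny = [[0, 255],
        [255, 0]]
print(integer_scale(tiny, 2))
# -> [[0, 0, 255, 255], [0, 0, 255, 255], [255, 255, 0, 0], [255, 255, 0, 0]]
```

No new pixel values are ever invented, which is the whole appeal for pixel art and for sharp non-native resolutions.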

Is this just in the video scaler?

Just ordered an EVGA 2070 Ultra from Amazon, based on my theory that their Warehouse Deals prices reflect the incoming price cuts. If I can get a card with an MSRP of $600 for $438, I think the wholesale value of the item has dropped. Maybe I'm wrong, but this was close enough to the price of my 970 that it felt fair to me anyway.

That seems pretty darn good to me. I haven't been watching GPU prices, but that's a heck of a lot closer to what these cards should have been released at than to what they actually were.

The 2070 should have been as fast as a 1080ti and should have launched at $349.

But in truth, the 2070 was only as fast as a 1080 (much, much slower than a 1080 Ti) and launched at $499, with the Founders Edition at $599. Overclocked cards came in at $530-$600.

See, this is how Nvidia moved the goalposts. You think a 2070 at $438 is a good deal 8 months after release, when in fact it sucks because the card was so insanely overpriced and underperforming in the first place.

This is why we need AMD to actually compete. Without competition, consumers get screwed.

That’s not the standard card… it’s the Ultra and it’s EVGA which always has a premium.

That just means it’s factory overclocked a bit more and comes with better cooling. It’s still a 2070, nowhere remotely near as fast as the cheapest 1080ti or 2080.

That’s not the bar I set, but EVGA OC cards… they don’t come in at base MSRP even during more reasonable years.

Since we can’t go back in time and grab the neckties of execs at NVIDIA… this is a fine price for someone in the market right now, and that’s a premium OC card. If someone wants a base model… they can look at something like Zotac or maybe PNY. EVGA… not so much.

Indeed, it’s a fine price for the reality in which we are sadly forced to live. But it isn’t “what the card should have cost in the first place”. Far from it.

I wholeheartedly agree.

I’m pretty sure the 2070 lands between a 1080 and 1080ti, so maybe a bit better than what you are saying. From what I recall when researching, the below is pretty representative of a lot of games.

Can’t believe it took me this long to figure out how to disable the GeForce Experience splash when games start: “Press Alt-Z to…”