WTF NVidia????? Login now required in GeForce Experience

Heh, the troll posting feature was kinda funny.

This seems pretty exciting.

I wonder if the integer scaling is only for fullscreen? Can individual windows be scaled too? Either way, it is a long-awaited feature.

Seems optimization for older GTX cards is no longer a thing. :(

The new 436.02 driver doesn’t even come up in their search tool when I select my GTX 1080.

Is 436.02 the version talked about in that article? It doesn’t come out for another 30 minutes (8am CDT).

EDIT: Not that all these features will work with anything but the newest-generation cards, mind you; just that the driver will definitely replace the existing one for each GTX product, not just the newest ones.

Hmmm, I wonder if integer scaling will help with some of the issues running games like Starsector on 2K or 4K screens (alongside their UI scaling)?

Well, not totally sure. But - it seems like it would make the text less blurry, so in theory…?

I don’t know too much about integer scaling, so I’m reading up a little on it now. For my couch gaming, I play on a 4K TV. I don’t like running at 4K, though, because I vastly prefer more frames to higher resolution, so I typically play at 2560x1440. I’m wondering if integer scaling helps in this type of situation.

I mean, for the most part, things look fantastic like that, but I know the scaling isn’t a perfect multiple between 1440p and 4K. In a game like Starsector, there’s also the factor of the game not having UI scaling (yet).
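Doing the quick math myself (just my own numbers, nothing from Nvidia’s docs): integer scaling can only kick in when the panel is an exact whole-number multiple of the render resolution on both axes, so 1440p-to-4K is out, but 1080p-to-4K would qualify.

```python
# Quick check: integer scaling needs the panel to be an exact
# whole-number multiple of the render resolution on both axes.
panel_w, panel_h = 3840, 2160  # my 4K TV
for w, h in [(2560, 1440), (1920, 1080), (1280, 720)]:
    fx, fy = panel_w / w, panel_h / h
    ok = fx == fy and fx == int(fx)
    print(f"{w}x{h}: {fx:g}x per axis -> {'eligible' if ok else 'not eligible'}")
```

So at 2560x1440 the TV is stuck interpolating at 1.5x no matter what.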

Anyway, this piqued my interest. Thankfully (or not, it’s really not worth the cost), my couch gaming laptop has a desktop RTX 2080 in it, so I should be able to experiment with the feature. My desktop is on a GTX 1080, but I don’t need the feature there so much, since I’m always playing at native resolution anyway.

Integer scaling would obviate the need for UI scaling for e.g. Starsector—set it to run at 1920x1080, set integer scaling to 2, and you have 1080p-size UI etc. with 4 screen pixels for each game pixel.
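For anyone wondering what’s actually happening under the hood there: integer scaling is just nearest-neighbor pixel duplication with filtering turned off. A minimal sketch of the idea in Python/numpy (my own illustration, obviously not Nvidia’s driver code):

```python
import numpy as np

def integer_scale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbor upscale: each pixel becomes a factor x factor block."""
    # Repeat rows, then columns. No filtering, so edges stay razor sharp.
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

# A 1080p frame doubled to 4K: 4 screen pixels per game pixel.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
print(integer_scale(frame, 2).shape)  # (2160, 3840, 3)
```

Which is also why it only makes sense at exact multiples: anything fractional and you’re back to filtering and the smeary look.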

It’s completely ridiculous that integer scaling, a feature meant to make ancient games look better on modern hardware by DISABLING bilinear filtering, is only available on Turing. Bad showing, Nvidia.

So am I to understand that if I don’t have one of the swoopy new cards, I don’t need to worry about installing this driver?

Now that I have a 1920x1200 monitor, integer scaling could be really useful. (For old DOS games.)
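Did the arithmetic for my panel (my own numbers, and I’m assuming leftover space just becomes black borders):

```python
# Biggest whole-number scale for some classic modes on a 1920x1200 panel.
panel_w, panel_h = 1920, 1200
for w, h in [(320, 200), (640, 400), (640, 480), (800, 600)]:
    s = min(panel_w // w, panel_h // h)
    print(f"{w}x{h} at {s}x -> {w*s}x{h*s}")
```

320x200 and 640x400 both land on 1920x1200 exactly, which is a nice perk of a 16:10 panel.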

It’s still a newer driver with various optimizations; just the integer stuff is Turing-only.

Man, the title of this thread is so prescient.


Update, 2:10pm ET: Nvidia has temporarily taken the standalone version of the 436.02 update down without any explanation. Users began complaining that the latest driver update would automatically install the associated GeForce Experience app, even when users elected not to install it in the installation process. Use of GeForce Experience requires agreeing to a terms of service document and (for many of its features) an online login, while the drivers on their own do not. We’ll update this report when Nvidia cuts its installer back down to the bone.

Carp. I don’t care one way or another about the GeForce Experience thing.

It was just a bug on their part; they stopped requiring GFE a long time ago.

Step 1: https://www.majorgeeks.com/files/details/nvslimmer.html
Step 2: https://www.geforce.com/drivers/beta-legacy
Step 3: Use NvSlimmer on the file from Step 2.
Step 4: Profit.

I would actually have a use case for NvSlimmer if it works for other stuff besides Nvidia’s driver package.

I was just wondering why everyone wasn’t excited to talk about integer scaling. Apparently I have this thread on mute. (With a title like this, it’s no surprise!)

Anyway, we did it. Pixel doubling!

Only if you have the newest GPUs, though.