Aceris
7665
I think I’ll wait until the 19th, when my vacation starts, to try out Cyberpunk; that will give Nvidia a chance to sort their drivers out, and maybe CDPR will fix some bugs too.
It will be interesting to see what performance I get with my slightly lamed 3080(*). I might try actually using the undervolt/overclock and maybe pushing it a little further. Failing that, I’m sure dropping some of the more OTT RTX options will bring the framerate up.
(*: I don’t think it’s defective; it just gets bad power efficiency, either due to the silicon lottery or the crappy Palit power delivery, and since the 3080 is typically power limited anyway it does have an impact.)
Thought I would ask this question in here as you are all smart cookies. When Nvidia update their graphics drivers for optimizations to specific games, what analysis are they doing specifically for this that wouldn’t be covered by the old drivers anyway?
stusser
7668
What you’d expect, they optimize drivers specifically for that game.
I am asking what that involves exactly.
stusser
7670
Oh, that’s proprietary but I’m sure they have tools to profile each application.
Tortilla
7671
I have no inside info, but as a general development statement it’s usually quite possible to do workload-specific optimizations by just shutting off certain features and devoting more resources to others. If the driver team can profile a game and say it uses features X, Y, and Z but not A, B, or C, and it’s using Z a little wrong, then they can improve things. Behind the scenes they can have the driver suite detect that game, shut off A, B, and C, and correct the calls to Z. Boom, instant performance enhancement.
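To make that concrete, here’s a purely illustrative sketch of the “detect the game, apply a per-app profile” idea. None of these names are real Nvidia/AMD driver internals; they’re hypothetical stand-ins for the X/Y/Z and A/B/C features mentioned above.

```python
# Illustrative sketch only -- invented names, not real driver code.
# The idea: keyed on the detected executable, a profile disables unused
# feature paths and reroutes a misused call to a corrected fast path.

GAME_PROFILES = {
    "cyberpunk2077.exe": {
        "disable_features": {"feature_a", "feature_b"},   # unused paths: skip their overhead
        "fixups": {"call_z": "call_z_fastpath"},          # correct a slightly-wrong call
    },
}

def load_profile(exe_name):
    """Return the tweaks to apply for a detected executable, if any."""
    return GAME_PROFILES.get(exe_name.lower(), {})

def apply_profile(active_features, call_table, exe_name):
    """Apply a game's profile to the driver's feature set and call table."""
    profile = load_profile(exe_name)
    features = active_features - profile.get("disable_features", set())
    calls = dict(call_table)
    for old, new in profile.get("fixups", {}).items():
        if old in calls:
            calls[old] = new
    return features, calls
```

The real mechanism is obviously far more involved (shader replacement, memory-management tweaks, etc.), but the shape is the same: detect the app, then swap in app-specific behavior.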
Chappers
7672
Thanks to both of you, that was interesting.
Skipper
7673
I’ve always thought “game ready” just means it allows GeForce Experience to pass info about the game to the drivers, so they can automatically enable/disable/optimize the settings that are best for that game. Nothing groundbreaking, just ensuring the game runs well.
In the past though, I know I’ve read of them specifically patching things within the driver that were causing issues or did not allow the game to fully support the cards in some way.
stusser
7674
No, definitely not; the optimizations apply without GFE even being installed.
Yeah, as I understand it, GFE will simply choose the best recommendations for the various graphics settings. I avoid it these days, however, so I may be wrong.
I remember asking this somewhere many years ago, and the explanation I was given was that despite most games using “off the shelf” engines, most games (especially AAA) heavily customize their rendering pipelines. In a lot of cases they end up doing things not quite right (like a draw call without an end call, which sometimes isn’t required), or just do things that aren’t the most efficient.
So Nvidia (and to a lesser extent AMD; I think Nvidia has a bigger team for this work) actually ships drivers with game-specific modifications that get invoked on that specific game’s rendering engine to smooth things out and make them efficient (and sometimes less buggy).
Outside of that, I also came across this comment the other day on Reddit (so take it with a grain of salt):
Not that this really matters to the end user but - the problem is really not that AMD is exceptionally bad, it’s that NVIDIA is exceptionally good. DX11’s capacity for multithreading is not very good, and NVIDIA did an enormous amount of work at the driver level to inject multithreading where it is not supposed to go. NVIDIA’s driver will actually rewrite the draw calls into multiple command queues in order to spread the load across multiple threads, which is a really crazy approach. They have put an enormous amount of work into making that work correctly and making it perform well, and it pays off. AMD has never done that work and just more or less passes the draw calls onwards, which means they are much more single-thread limited. This is one of those NVIDIA Software Advantages ™ that helps keep them dominant. Software matters, you aren’t just buying a graphics card, you are buying drivers, and ongoing driver support.
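The draw-call rewriting described in that quote can be modeled with a toy sketch: take a single-threaded stream of draw calls, partition it into chunks (“command queues”), record each chunk on a worker thread in parallel, then submit the chunks in their original order. This is a conceptual illustration of the idea, not how any real driver is implemented.

```python
# Toy model of spreading draw-call recording across threads.
# Conceptual only -- no resemblance to actual DX11 driver internals.
from concurrent.futures import ThreadPoolExecutor

def record_chunk(chunk):
    """Stand-in for the expensive per-call validation/translation work."""
    return [("translated", call) for call in chunk]

def split_and_record(draw_calls, n_queues=4):
    """Record a serial draw-call stream on n_queues workers in parallel."""
    size = -(-len(draw_calls) // n_queues)  # ceiling division
    chunks = [draw_calls[i:i + size] for i in range(0, len(draw_calls), size)]
    with ThreadPoolExecutor(max_workers=n_queues) as pool:
        recorded = list(pool.map(record_chunk, chunks))
    # Submit chunks in their original order, so the final command stream
    # the GPU sees is identical to the single-threaded one.
    return [cmd for chunk in recorded for cmd in chunk]
```

The hard part in practice (and presumably where the “enormous amount of work” went) is that real draw calls have ordering dependencies and shared state, so you can’t just chop the stream up naively like this sketch does.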
fdsaion
7677
The linked Polish benchmark of some older (unoptimized?) DX11 games (Watch Dogs 2, Kingdom Come) is really interesting.
Crazy: so one of the bigger-name review sites isn’t getting RTX cards from Nvidia anymore because they keep mentioning that AMD’s newer cards are better when RTX isn’t being used in a game.

Yeah, several techtubers have been ranting about this and for good reason. Apparently they’re all familiar with occasionally getting ghosted by companies when they don’t provide positive enough reviews, but this email goes beyond the pale (and frankly the provided rationale is BS).
mono
7680
I’ve seen some rants, and it is typical evil-corp BS behavior on the part of Nvidia.
However… if you’re talking about the top gamer cards, RTX and DLSS are such advantages (and IMHO indispensable for next gen, including the few games that have already hit) that it’s just dumb to recommend an ATI card. If you’ve got a super old card and you want to max out current-gen games, then sure, go for it. But if you’re spec’ing a system to get all the neat next-gen stuff and push it at 1440 or 4K, RTX makes a world of difference in atmospherics, and ‘Quality’ DLSS is essentially a free 30-40% boost over the best ATI can offer. Those Radeons are also-rans this gen. I greatly hope they bake in RT and a DLSS equivalent in short order so we can have real competition, but you’re missing out on the high end without RTX and DLSS. ATI just doesn’t have a place at the table.
Still waiting for these next gen GPUs to drop
Was thinking about this yesterday. It seems like if one of the consoles had rolled with Nvidia, it might have had a huge advantage in being able to produce higher framerates with DLSS. If one console had Cyberpunk (or other big games) playable at 60 FPS/4K while the other was at 30, that would have been huge news.
I think the 3xxx series are mythical beasts, like unicorns and manticores.
morlac
7683
Mine’s a beast all right; the blood ritual to capture it was totally worth it.
stusser
7684
DLSS is magical, but games need to support it. CP2077 is a huge deal, as it runs like crap without it. It’s the first truly next-gen game; nothing else looks like CP.