Shiny! As the Firefly guy would say. Maybe we’re due a generation of games with no non-reflective surfaces at all. That’ll be nice.
Agree on the ugliness. Why can’t they render something like an open field or a normal living room?
If raytracing is supposed to render realistic looking stuff, then render something we have an everyday reference for. I look at that video and think, “I don’t see anything realistic at all, what’s the big deal with raytracing again?”
stusser
4315
This is the same tech in Wolfenstein Youngblood which was very well reviewed. Magical, even.
vyshka
4316
It is the next iteration of it, isn’t it? Did DLSS 2.0 exist last summer when that shipped? It will be interesting to see how well it works when they aren’t using specifically trained nets.
stusser
4317
No, they added it recently. Wolfenstein Youngblood has DLSS 2.0. Pity it isn’t a better game.
vyshka
4318
Sweet. It does look nice:
rei
4319
DLSS was supported in Anthem too, not to be mistaken for DISS. That was its technical and critical (lack of) acclaim.
stusser
4320
DLSS 1.0 sucked and was worse than a sharpening filter. 2.0 is nearly magical.
Yep, as I said a while ago, I think this is the future.
As they iterate on this and the networks get more powerful, native render resolution before DLSS is going to matter less and less. It’s also a way of improving raytracing performance by allowing a lower render res (I think DLSS scales better with resolution than raytracing does?).
You’d think it would benefit game streaming, since you could render at a lower resolution, maybe?
If you mean streaming services being able to stream higher quality with less horsepower, then yes. You lose some fidelity in streaming already, so it’s a no-brainer to render below native res, upres, compress, and send.
The other application is to stream a lower-res image to the client and have the client graphics card do the upres. That’s weird because it would imply graphics cards geared towards this specific use, but I could see a cheapish streaming box with this technology receiving a lower-res image and upresing it to 4k.
Yeah, I was thinking the latter.
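As a toy illustration of that client-side upres idea (pure Python, with a trivial nearest-neighbour upscale standing in for the neural network — this is not how DLSS itself works, just the bandwidth argument):

```python
# Toy sketch: the "server" renders and transmits a low-res frame,
# the "client" upscales it for display. The point is the pixel-count
# savings on the wire, not the quality of this naive upscaler.

def upscale_nearest(frame, factor):
    """Nearest-neighbour upscale of a 2D frame (list of rows)."""
    out = []
    for row in frame:
        wide = [px for px in row for _ in range(factor)]
        out.extend([wide[:] for _ in range(factor)])
    return out

# Server renders 2x2 instead of 4x4: 4 pixels sent instead of 16.
low_res = [[10, 20],
           [30, 40]]
restored = upscale_nearest(low_res, 2)

print(len(low_res) * len(low_res[0]))    # pixels transmitted: 4
print(len(restored) * len(restored[0]))  # pixels displayed: 16
```

A real DLSS-style upscaler replaces `upscale_nearest` with a trained network (plus motion vectors and history), which is why it needs tensor hardware on whichever end runs it.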
stusser
4325
I’d expect everything to happen serverside.
Didn’t want to start a new thread, so how can I force my desktop to use my graphics card instead of defaulting to the crappy onboard Intel card? I noticed this the other day when messing with the Modern Warfare client, and now that I’m messing around with my PC it appears to affect all games, not just MW. I’ve looked up a few things, and my Nvidia Control Panel won’t come up because I’m 'not connected to an Nvidia GPU'. My computer does see my 1060 in Device Manager, and I just reconfirmed my drivers are up to date. I’m trying not to disable my onboard graphics and possibly totally screw up my screen, so before I do I wanted to check with the hivemind here.
Not sure when it happened, but I can’t believe it’s been like this for very long. My last driver update was on March 4, so maybe then.
vyshka
4327
Is there a way to disable it in the bios?
Edit: I missed the trying to not disable :)
That’s one of the suggestions; I just wanted to go through any possible fixes on the 1060 first before disabling stuff, in case it does something weird and can’t find the 1060 at all, even though it shows up in Device Manager.
I did go to the Nvidia site, and it had a newer driver from a couple of days ago, so I installed it. Still can’t get the Nvidia Control Panel to come up, though.
vyshka
4329
Both are plugged into the monitor?
So, I was able to figure out exactly when this started happening. It was Feb 8th when we moved into our new house…
and I plugged the monitor into the motherboard hdmi instead of the GPU hdmi. Genius!
vyshka
4331
So solved? :) This is a bad time to have PC issues. Amazon is pushing deliveries out to about a month now, since they have run into covid issues at some warehouses. After getting us hooked on same-day delivery, it’s going to be rough :)