Be careful with New World (possibly bricking video cards)

I’m more likely to be careful taking anything said on Reddit seriously.

Very true. But can’t hurt to watch those temps.

Seems it’s a specific model that’s having the issue. Still frightful. I didn’t even know this could happen.

Yeah, not having wasted over a thousand bucks on a 3090 and having played over a hundred hours already in Old World with no issues, I’m feeling ok about this. ;)

Edit: Looks like Bluesnews picked the story up as well: New World Beta Bricking 3090 GPUs? [Updated] - Blue's News Story

OH and also, I clearly need to drink some coffee this afternoon, I read New World as Old World. Carry on.

Could this be a PSU issue? I know some PSUs have been listed as flaking out when the draw is too high from 3080/3090 class GPUs. But then I suppose that would brick the PSU and not just the GPU…

Same here. :D Soren needs to sue Amazon for trademark infringement.

Holy shitoli, Blue’s News is still around?

Hell yes. When Blues goes I go.

No, it’s from people that have frame rates uncapped in their card settings.

I may be misunderstanding what brick means. I thought it was a deliberate shutdown of the device at the discretion of the maker.

Frying is probably the more appropriate term here.

Oh, then that’s something I can relate to a lot more, having disabled GPUs in my laptops over the years to prevent the frying.

Aye, read a bit more about it. Limited to mostly EVGA cards.

No, “bricking” is when a device is broken to such a degree that it’s only useful for its physical properties, for example propping open a window.

Wildly guessing it’s not heat; the card would shut down. Maybe the uncapped frame rate on the main menu is smacking the cards with voltage. I also read everyone screaming that this is all Nvidia or this is all EVGA. If so, why have the cards been out this long with no news story until today?

It’s EVGA’s fault for sure. It’s because their hardware can’t handle the power their firmware tells the GPU it can use. The FTW3 cards use a shitton of power out of the box; my card drew nearly 400W before I put in a voltage curve to tame it.

If people are uncapping frame rate in the settings, isn’t some measure of responsibility for this on them?

If you’re thinking “I have a $2,500 video card that runs anything on any setting I throw at it”, I’m not sure I blame the card owners for going into the New World options and uncapping frame rate. Especially if they’ve got G-Sync monitors. Heck, there are games where I turn off the frame rate limit on my 3080.

In fact, I was sort of under the impression that if you’ve got the card with the horses and a G-Sync monitor, the first thing you should do in the settings or options of a game is either set the frame rate high or disable frame rate limiting altogether and turn V-Sync off as a standard practice.

My question still stands.

If EVGA 3090s have been popping off since September, how was that not news until today? And what happened to EVGA then? They used to be fabulous.