Or a ribwich.

But not as good as incubation pants.

Looks like 50% more than the first to me.

Digital Foundry looked at the latest FSR update in God of War, vs DLSS.

Spoilers

The main challenge facing AMD is in addressing disocclusion issues - quickly revealing previously hidden imagery causes a noticeable fizzling effect that DLSS doesn’t suffer from. Transparent elements, especially water, also see a smearing of detail that isn’t quite right. Sub-pixel detail from foliage and hair also has trouble achieving an effective resolve.

…if you’re using an RTX card, Nvidia’s technique is still the way to go: it runs a touch faster than FSR 2.0 and addresses many of the issues AMD has still to address, providing an image that’s generally of a higher quality level - and can even give native resolution rendering a run for its money in some scenarios. However, for non-RTX cards (remember, there’s still many GTX GPUs out there) and for AMD cards, FSR 2.0 works well and can only get better.

Does it make sense to get a 3090 Ti with the 4080/4090 right around the corner?

The 3090 Ti is an absolutely ridiculous, $2,000 card, and you do not need it unless you need 24GB of VRAM for creative work.

Hell, I refuse to buy anything above a 3070 due to the ridiculous power draw. I’m waiting to see what will come first - a price I’m willing to pay for the 3070, or new cards that might be more efficient in FPS/watt than the previous ones. (or my 1060 could die)
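The FPS/watt comparison above is easy to make concrete. A minimal sketch in Python, where all the numbers are hypothetical placeholders rather than real benchmarks:

```python
def fps_per_watt(fps: float, board_power_w: float) -> float:
    """Frames per second delivered per watt of board power."""
    return fps / board_power_w

# Hypothetical numbers only, not measured results:
# an older card doing 120 FPS at 220 W vs a newer one doing 150 FPS at 350 W.
old_card = fps_per_watt(120, 220)
new_card = fps_per_watt(150, 350)
print(old_card > new_card)  # True: the faster card can still be less efficient
```

The point being that raw FPS and FPS/watt can rank the same two cards in opposite order.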

Once I get the fusion bottle installed (friend had an old BattleMech he didn’t need, so I bought the core), I might be able to buy a new video card!

IMHO, the 3090 Ti never made sense; it’s a new flagship from a generation that’s getting ready to be put into drydock.

Debatably it makes sense as an alternative to an A5000 :)

Just wanted to say I wasn’t asking for myself; it just seemed crazy to me that anyone would even consider that card with the 4000 series right around the corner, even for creative work. So I was wondering whether there was something I was missing, because I don’t even know why Nvidia is releasing it.

The 3090 Ti is a whaling expedition. And probably fairly successful at that.

I guess maybe? I usually think of the A-series as being in farms, and no farm would want a 3090’s heat and power draw. But you’re right; could be perfectly cromulent for individual applications for someone who NEEDS that extra 5% or whatever.

Huh; Chrome no longer flags “cromulent” as a typo. The Simpsons, what have you done?

Nvidia internal top secret image.

No doubt, but there are a lot of them in workstations, and you get more performance for less money going with a consumer card.

But yeah, it’s totally a whaling expedition. Kind of my point was that the only people this makes sense for are not actually the target market at all.

I suspect that you’re going to be disappointed. Cores DO get more efficient at smaller process nodes, but those efficiency gains are being used to squash more cores on a card or die. And so the overall power envelope of the whole package keeps growing by generation.
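That trade-off is simple to sketch with arithmetic. The numbers below are made up purely for illustration (not real per-core power figures or core counts): even if each core gets 20% more efficient at a new node, adding 50% more cores still grows the total package power.

```python
# Illustrative-only sketch with hypothetical numbers.
per_core_power_w = 2.0        # assumed power per core, old node
cores = 5000                  # assumed core count, old generation

new_per_core_power_w = per_core_power_w * 0.8   # 20% per-core efficiency gain
new_cores = int(cores * 1.5)                    # 50% more cores on the die

old_total = per_core_power_w * cores            # old package power
new_total = new_per_core_power_w * new_cores    # new package power
print(new_total > old_total)  # True: the power envelope still grew
```

So the per-core efficiency gain gets spent on more cores, and the card as a whole draws more than its predecessor.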

Isn’t the 3090 Ti the fully enabled version of the top chip from the 3000 series? If so, it makes sense as something they could only release once yields for the generation improved enough to have a supply of fully working chips.

Hopefully we can look further down the product stack.

Holy shit, this is 90€ cheaper than what I had seen before.

And a bold blower design too :)