Nvidia is going to take advantage of those government checks.

3070 Q3 w/ near 2080Ti performance would be great, if that happens.

That article shows the 3080 Ti as having 256 RT cores vs the 2080 Ti’s 68 RT cores. Should be a hefty bump in ray-tracing performance if true.

At some point their marketing is just going to say ā€œWe just kept cramming cores onto the card until it wouldn’t fit in the case any more, then we removed that last core. We have NO FUCKING IDEA how many cores are actually on that thing!ā€

Seems pretty clear they’re using all the additional area gained by the process drop on cores, so the 3080ti will be just as expensive to produce as the 2080ti was. I don’t expect any price decreases, although you will get much faster cards for your money-- unlike the 20-series. So that’s OK, if not great. If that 3080 comes in at $699 I’ll probably buy one.

Also note the RT core counts-- even the 3060 has more RT cores than the 2080ti. So flipping ray-tracing on will have a lot less impact on framerates.

Looking forward to Turing being a footnote in history where it belongs.

I wonder if the RT cores are the same design as 2000 series cores or if individual cores will be faster and not just more numerous.

There was an annoying lag period about a decade ago, but one big positive of the console revolution is that devs learned that making a tight, good game was better than just cramming more pixels into the same old mold. My current PC with a GTX 1080 is still considered a medium-high spec three years later.

Because hardware hasn’t gotten appreciably faster since then.

I kind of want to get a new gaming laptop because Microsoft Flight Simulator and living room TV, but I’m guessing even if mobile chips don’t have stupid numbers of cores, the 30x0 series will still be dramatically faster than the 2060/2070 I’d end up with now in a smaller gaming notebook.

Patience… Something like a Zephyrus G14 with performance close to a 2080 would be awesome.

And it hasn’t needed to; that’s what I was pointing out. In the 90s you had to upgrade every couple of years to play the latest games. Suits me fine.

In the late 80s, my brother and I talked my mom into buying a ridiculously expensive 16k memory expansion for our Apple ][+. We told her it was for educational purposes. (Narrator: It wasn’t)

In 1990, I remember prying off a 512k chip from a friend’s mobo so I could smush a 640 onto it. My career was pretty well laid out from there.

On laptops, there was about a year between the normal 20 series GPUs and the Super 20s. I’m wondering if the gap between the Super and the 30 series will be similar, or dramatically shorter.

Same thing for the ā€œ10th genā€ Intel laptop chips. If it’s dramatically shorter this cycle, a lot of current buyers are going to feel burned. Those laptops are just now hitting the street.

Dang, I’m gonna need a bigger case for the 3080ti I guess.

After seeing the Unreal Engine 5 demo (running on a PS5), I think I need a 30xx in my life for future games. The graphics are amazing!

I’d love to have some kind of open-world adventure game with those graphics.

How about a Crimson Skies-esque flight sim?

Sisually vtunning!

Sold.

And it does not even use Ray Tracing!