When do the next generation GPUs drop?

Eh, it can usually do 60 if you want ultra, but not 100. Depends on the game of course.

I loooOOoooooove my new ultrawide 1440p and am very glad I didn’t wait any longer. 1200p was getting old. But paying for the 2080? That just hurt from an “I am going to regret this purchase when the next gen comes out even more than normal” perspective.

That’s what I said, you need a 2080 to get to 100Hz comfortably.

Yeah, I’ve been waiting this long, and can hold out a while longer. I’ve just been holding off on a few games that are playable but could be a lot smoother (Monster Hunter World, AC Odyssey, and Subnautica, plus DMC5 and Sekiro about to come out).

Maybe I’ll see how things are looking once the Ryzen 3 series shows up.

I’m at the point where I’m waiting for PS5 specs. I try to build around what the consoles are offering, because that’s the point most PC games won’t try to go beyond for the next few years, and PC exclusives generally sit below it.

Then again I’m generally more concerned with speed than graphics quality.

In case it wasn’t clear I am running a 2080. You need a 2080ti if you want to max your graphical settings and have a hope of hitting 100Hz in most modern games. With ‘most modern games’ being an almost nonsensical statement of course.

At 1440p? Maybe if you have everything pumped up to Ultra and/or ray-tracing on.

Wait, so the 2080 can’t even max almost everything on most of today’s games at 1440p?

3440x1440 and 100Hz.

I am not sure if you are replying to me or not.

The 2080ti is the card you’re supposed to get for 4k, in addition to ray-tracing. This sounds like the second-tier card, the 2080, can’t even handle the second-tier resolution properly, and of course it should be at ultra.

Looking back, as of today none of these cards should have had ray-tracing. Six months after release there are two games that support the feature, and the 2080ti can barely manage it at 4k 60fps. Nvidia sold a bag of promises, and the next gen of cards will be out before you know it.

Right from PC Gamer regarding Metro Exodus at 4k with RTX and DLSS:

4k 60fps at high quality isn’t really within reach of most GPUs right now—even a Titan RTX comes up just shy of that mark, though there are plenty of areas in the game where it would break 60fps as well.

And that isn’t even the EXTREME graphics setting; I wish they had done a run of the game with everything maxed.

[PC Gamer benchmark chart: Metro Exodus at 4k with RTX and DLSS]

On a practical basis you are right, but think back to the introduction of the first 3D cards in the late 90s. Then too there was a dearth of content; I think only a couple of games supported them at first - MechWarrior 2 and TIE Fighter, maybe Interstate 76. But it was the same situation, so I guess maybe the lesson is that someone’s gotta kick-start the market.

Anything can happen though. VR hasn’t caught on even though both Oculus and HTC have tried their best. There’s no guarantee that ray tracing will get huge support, but I think no matter when they launched it there wouldn’t be a lot of games at first.

The 2080ti handles 4k at 60Hz, not 4k at 100Hz. Likewise the 2080 handles 1440p at 60Hz but not 1440p at 100Hz.

Though of course you can probably run games at High at 1440p @ 100Hz with a 2080 just fine and not notice any difference.

Sure, but these are their top two gaming cards. I mean we can throw the Titan in there, but there’s a reason it often only shows up in benchmarks when the other two are struggling. It’s not really in the same ballpark as the others.

Remember the 2080 is basically as fast as a 1080ti, and that wasn’t a great 4k gaming card. It could do it, but not at high quality or framerates. The 2080ti, which has an MSRP of $1199, is the first real 4k card available to consumers.

There’s that little difference between a 1070 and a 2080? A 1070 ought to be getting 30s-40s at 1440p on ultra. Guess Stusser already answered this.

Yes, this. I am talking about ultrawide in response to @Thraeg, aka 3440x1440. It’s closer to 4k than regular 16:9 1440p. I think we all got confused because some of us were talking about UW and some about plain 16:9 1440p.

So my 2080 can usually hit 60fps with UW if I don’t insist on turning absolutely everything up to Ultra/Insane. Which to be fair doesn’t look appreciably better in most cases.

This has been an amazing compromise for me vs 4k. 1440p is usually more than enough resolution in games, while the extra real estate to the sides is what I always wanted from my rarely used 1200p x 3 Eyefinity/Nvidia Surround setup.

We’re not really talking about 4k here, we’re talking about 1440, or I guess another way to put it is: not 3840x2160.

It isn’t the standard 1440p; 2560x1440 is about 3.7 million pixels. 3440x1440 is about 5.0 million pixels, a hefty ~34% increase. And 4k is about 8.3 million pixels, roughly 67% more than that.
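Napkin math if anyone wants to double-check those numbers (a quick Python sketch, nothing rigorous):

```python
# Back-of-the-envelope pixel counts for the three resolutions in question.
resolutions = {
    "2560x1440 (16:9 1440p)": (2560, 1440),
    "3440x1440 (ultrawide)": (3440, 1440),
    "3840x2160 (4k)": (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count / 1e6:.1f} million pixels")

qhd = pixels["2560x1440 (16:9 1440p)"]
uw = pixels["3440x1440 (ultrawide)"]
uhd = pixels["3840x2160 (4k)"]
print(f"ultrawide vs 16:9 1440p: +{(uw / qhd - 1) * 100:.0f}%")  # ~34%
print(f"4k vs ultrawide:         +{(uhd / uw - 1) * 100:.0f}%")  # ~67%
```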

Anyway, both the 1080ti and 2080 are great 1440p cards, just not all the way up to 100Hz at ultra quality.

I was theorizing that it would take similar GPU horsepower to drive a 1440p 21:9 monitor at 100 FPS as it would for a 4K 16:9 monitor at 60 FPS.
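The raw pixel throughput works out surprisingly close, for what it’s worth. Rough sketch below, assuming (as a simplification) that GPU load scales with pixels pushed per second:

```python
# Pixels pushed per second at each resolution/refresh combo. GPU load doesn't
# scale perfectly linearly with pixel throughput, but it's a decent first cut.
ultrawide_100 = 3440 * 1440 * 100  # 21:9 1440p at 100 FPS
uhd_60 = 3840 * 2160 * 60          # 16:9 4k at 60 FPS

print(f"3440x1440 @ 100 FPS: {ultrawide_100 / 1e6:.0f} Mpix/s")  # ~495
print(f"3840x2160 @  60 FPS: {uhd_60 / 1e6:.0f} Mpix/s")         # ~498
```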

FWIW I’m running 3440x1440 on a 1070ti that I picked up cheap back on Black Friday. I wasn’t aiming for 100FPS at max settings. Just wanted something that would last for a couple of years, and didn’t cost an arm and a leg.

Hey, me too!

Yeah I am passing on this new gen. It’s ridiculous. I can wait a year or two for them to become reasonable again.