When do the next generation GPUs drop?

This has been a real improvement - I had a somewhat bleeding edge rig 4 years ago and it’s still able to run nearly anything I want at high or better. I imagine I’ll be able to squeeze another few years out of it before the video card is only good enough for medium settings. (Note that FPS is a secondary genre for me, which gives me some more breathing room.)

I was playing on a Dell XPS laptop with a C2Duo and an 8600M GT from circa 2008 until only a year or so ago. It was basically a productivity machine and I played games on it whenever I could. I even played XCOM 1 on it, no problem. It was the same 8600M GT that was deemed defective because of overheating issues. I guess I won the silicon lottery that time.

Now I have an upper-mid-range Skylake and a Radeon. As long as I stick with 1080p or thereabouts, I should be set until the next console refresh.

Yeah, this PC upgrade treadmill is sooo expensive, and most of the time it isn’t worth it. For example, a 1070 costs about twice as much as a 1060, but it DOES NOT offer double the performance of a 1060, only around 10-50% more. The marginal gain at the very top is even more expensive.

On a tangent here, but I find reviews in general overestimate the performance of video cards. They always report average FPS, but if the variance in framerate is high, e.g. swinging between 20 and 100 FPS, then even if the average is 60 the card still can’t deliver a smooth experience. When the game hits 20 fps, it is practically unplayable, especially for a shooter. So a card that pumps out anywhere from 20 to 100 fps, but averages 60, gets rated as adequate when it really isn’t.

IMO a steady 30 fps should be considered the basic acceptable framerate; it’s more or less the standard on consoles already. A steady 60 fps is good, that’s the CoD gold standard. If the card can go beyond that, even better. And benchmarks should be realistic gameplay sessions of games we can buy right now. Maybe even stream the session and the tweaking to show that reviewers aren’t cherry-picking results.
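To make that concrete, here’s a rough sketch in Python, with completely made-up frame times (so treat the numbers as purely illustrative, not anyone’s actual benchmark data), of how a healthy-looking average can hide exactly the kind of 20 fps troughs I’m complaining about:

```python
# Purely illustrative: 60 made-up frame times (in ms) over one second of "gameplay".
# 50 quick frames (100 fps pace) plus 10 slow ones (20 fps pace).
frame_times_ms = [10.0] * 50 + [50.0] * 10

total_ms = sum(frame_times_ms)                      # 1000 ms
avg_fps = 1000.0 * len(frame_times_ms) / total_ms   # 60.0 -- looks fine on paper

def time_on_slow_frames(budget_ms):
    """Wall-clock time spent on frames that blew a given frame-time budget."""
    return sum(t for t in frame_times_ms if t > budget_ms)

print(f"average: {avg_fps:.0f} fps")
print(f"time on sub-60-fps frames (>16.7 ms): {time_on_slow_frames(16.7):.0f} ms")  # 500 ms
print(f"time on sub-30-fps frames (>33.3 ms): {time_on_slow_frames(33.3):.0f} ms")  # 500 ms
```

In that toy run the card “averages” 60 fps while literally spending half the second rendering 20 fps frames, which is the sort of thing a frame-time breakdown surfaces and a bare average hides.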

So my RX 480 is considered “adequate” at 1080p by most reviewers. In real-life gaming situations it bloody isn’t. Playing Ubisoft games like Steep, the framerate sometimes drops as low as 20. It’s a twitch game like a shooter, so 20 fps just doesn’t cut it. I have to turn down the resolution to get a steady 30 FPS. (Quality settings change very little, for Steep at least.)

Now that I think about it, maybe I should do an “honest” review of my RX 480 and post it on YouTube. I’d like to see how AMD likes them apples.

This thread made me so envious that I upgraded from a GTX 660 to a second-hand GTX 980. Yes, I’m cheap :)

Going from a 660 to a 980 is one hell of a jump. Nice. :)

Many reviews do frame-time / jitter / % time above target FPS analysis now exactly for this reason.

The Tech Report was the first larger site that really started pushing this kind of data, I think going back a few years now.

E: Has it really been 7 years?

The Core i7 860 I bought in 2009 is still totally viable today, in 2018, with a GTX 970 installed in it. I now use it as a second PC alongside the Alienware Core i7-6700 with a GTX 1070 that I bought two Christmases ago.

That GTX970 BTW only cost me $293.98 in November of 2015. They are totally gouging you guys three years later IMO. I love Newegg’s ability to search all my past orders.

Apparently this is why I never got a revised ship date email. Per a Reddit thread:

We had a problem on our supply chain visibility due to our product build coming along slower than we anticipated on the launch. We sent notes to all customers prior to the 9/20 ship date notifying about a shipment delay and followed up on 9/24 with a new ship date. Unfortunately, our customers with system profiles opting out of marketing communication were overridden by the system and these notifications were never sent and this omission most likely included yourself.

Now I’m in order processing limbo.

Thanks. That will now be the first site I go to for realistic GPU reviews.

Unfortunately, Scott Wasson, the founder, was the workhorse behind The Tech Report and their stellar reviews. He’s since taken a job at AMD/ATI, and the site has declined quite a bit.

Yeah, I find the conclusion of their 2080 Ti review a bit timid. The raw stats are there: even the 2080 Ti cannot guarantee a minimum of 30 FPS in the games they tested (i.e. zero frames taking more than 33 ms to render) at 4K Ultra/highest. To me that sounds like even the 2080 Ti isn’t good enough for top-of-the-line 4K gaming, but they don’t outright say it. At least I have their data and can interpret it myself.

I’m guessing it’s because they depend on hardware vendors gifting them hardware to test, so they don’t want to bite the hand that feeds them.

Eh, out of 1 minute of testing time only 2.9 seconds was spent below 60 fps. If you look at the graph, it’s at 60 fps 95% of the time. That 35 fps number is the 1% mark. So I’d say the 2080 Ti is just fine for 4K Ultra/highest as long as you aren’t looking for > 60 fps.

Well, my old 750 has finally started giving up the ghost: constant restarts and freezes with the good ol’ brown screen. I can’t really afford any new GPU, as I’m broke from medical bills. Was seeing if anyone on here knew of a good deal on a 970 or something, or if they had one lying around they would let go cheap. Or even an old 750 to replace what I have now.

I don’t know about you, but I don’t really care about 60 FPS. A minimum of 30 FPS is the standard for me. Anything below 30 FPS I notice, and it’s the troughs that are annoying. The 2080 Ti can’t hold at least 30 FPS in Shadow of the Tomb Raider or Deus Ex: Mankind Divided at 4K Ultra/highest. That is an instant fail in my book. (Just for reference, at 1440p Ultra even a lowly RX 480 will guarantee at least 30 FPS in Fallout 4, a pretty undemanding game.)

You can probably get away with lowering some settings, like turning off MSAA (at 4K I kind of doubt you will notice), but IMO those benchmarks show you still can’t get a smooth 4K Ultra/highest experience out of the box, no matter how much money you have. Not yet. Tweaking is still required.

I’m jealous; 30 fps is an aggravating, stuttery mess to me. It’s one of the things that kept me away from consoles for so long. I can’t even handle it in regular OS stuff, let alone something with a lot of motion like games!

Here you go Soma. I wondered who the market was for these.

So much this. My old boss insisted that I couldn’t possibly tell the difference between 30 and 60 and that I was just fooling myself.

That was frustrating, writing for a gaming pub…

I still think you are misreading those graphs. Their stats show that in 1 minute they spent 180 milliseconds under 30 FPS in Deus Ex (so about 0.3% of the time), and only 3.3 seconds under 60 fps. So only about 5% of the time are you between 30 and 60 fps (and you could be at 50-55 fps for most of that); quick arithmetic below. You are not going to notice those dips, especially if you have G-Sync on. 94.5% of the time (literally) you are going to be at 60 fps or better. Both of those graphs bear that out. The likelihood of anyone actually thinking that’s a stuttering mess while actually playing the game (and not analyzing performance with a fine-tooth comb) is just low.

Edit: Just to clarify, I’m not saying 30 fps vs 60 fps isn’t noticeable. I’m arguing that those graphs clearly show that with all settings maxed at 4K you’ll be at 60 fps enough of the time that you won’t notice the small amount of time you dip to 35 (the minimum on those graphs).
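The back-of-the-envelope I’m doing there, using the figures as I read them off those graphs (so treat them as approximate), is just this:

```python
# Rough arithmetic with the numbers quoted above (read off the graphs, so approximate).
run_s = 60.0            # roughly a 1-minute test run
under_30fps_s = 0.180   # time spent on frames slower than 33.3 ms
under_60fps_s = 3.3     # time spent on frames slower than 16.7 ms

print(f"under 30 fps: {100 * under_30fps_s / run_s:.1f}% of the run")                  # ~0.3%
print(f"between 30 and 60 fps: {100 * (under_60fps_s - under_30fps_s) / run_s:.1f}%")  # ~5.2%
print(f"at 60 fps or better: {100 * (1 - under_60fps_s / run_s):.1f}%")                # ~94.5%
```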

Yeah, I hear a lot of that, and it’s complete bollocks. You can easily tell the difference between 60 and 120 FPS, let alone between 30 and 60.

Once I got a 144 Hz monitor, I found an easy demonstration. One of my monitors is standard 60 Hz, the other is the 144. In Windows, I’d just grab a small window (like File Explorer or Notepad) and quickly move it in circles, first on one monitor, then the other. I got a lot of “ohhh”s when it looked like smooth motion on one screen versus jumping/stuttering from place to place on the other.

At 30 FPS? That results in serious eye strain for me, followed by a splitting headache if I keep at it.

If 120 Hz were impossible to see, I guarantee Apple wouldn’t have made 120 Hz panels for their iPads.

One of their selling examples was the smoothness of scrolling text in a book, not gaming. I’m not sure they had a gaming example (although, being Apple, I’m sure they did).