When do the next generation GPUs drop?

Yeah, and considering the 780ti was something like $600 when I bought it a few years ago and the 1050ti was about $120 when I snagged one earlier this year, not to mention that it’s quieter and sips power in comparison, it’s pretty impressive that it gets so close in my experience. Granted, the wife’s computer with the 1050ti also has faster DDR4 RAM and a slightly newer CPU than mine, and she plays games at 1080p vs my 1920x1200 screen, so it’s not exactly an apples-to-apples comparison. The 780ti is likely a handful of FPS faster on average at 1080p with everything else being equal. The 3GB of video memory lets the 780ti down on occasion, but I’m a 60fps snob, so I’ll admit to having somewhat biased standards in this regard.

If 60fps isn’t a strict requirement, especially with G-Sync, then a 1050ti is probably a great choice for gaming on a modest budget.

Their monitor will be my old one, which is a 1440p with no adaptive refresh rate. I’m getting a new one with g-sync. Hearing that about g-sync makes me feel even better about keeping my 1070 and planning on using it for quite a bit longer.

Ahh, OK. No reason to upgrade from a 1070 unless you want to game at 4k.

Speaking of G-Sync vs Freesync - do any TVs support either for use with Steam Link (and would that make a difference)?

Planning the 4K home theater setup, and Steam Link is a factor (along with the Xbox One X). Looking at the LG OLEDs if they come down enough on or by Black Friday.

Steam Link doesn’t support it, no. Yes it would make a difference.

Some Korean grey-market TVs support Freesync, but my guess is HDMI 2.1 with its variable refresh will win out in the end over AMD’s VESA-based adaptive sync in the home theater world. TVs with HDMI 2.1 will be widely available in 2018.

Hmmm… so if I wanted a stopgap upgrade for my video card so I could play things like Witcher 3, Fallout 4, and Prey at 1080p (I just have a plain old Acer monitor, no fancy sync modes) while having them look good and play smoothly, the 1050ti would be the way to go?

I don’t mind paying $150-$170 now for something that will play everything I own, and everything I’m likely to purchase over the next year, at decent resolutions and framerates while I wait out the miners on a higher-end card. Is a 1050ti with 4GB of GDDR5 going to run most stuff at high quality with acceptable framerates?

1070 + 1440 g-sync is an awesome combo.

Yes, the 1050ti will handle 1080p just fine, just not at maxed-out quality and locked 60fps.

We had people on these forums saying the 750ti played Witcher 3 just fine, which… I guess if you temper your expectations, maybe.

The cheapest G-Sync monitor over here is $490, and that’s a 24" 1080p TN screen.

The cheapest 1440p IPS/AMVA screen is the $780 AOC AG271QG.

My 32" 1440p AMVA Freesync HP is $390.

Screw that.

Yeah, they are so damn expensive. I’m seeing prices of around $650 for a 1440 27" ASUS monitor with g-sync.

I don’t mind expensive monitors; the most money I have ever spent on anything to do with a PC was on a NEC LCD20WGX2 11 years ago (and it was still going strong when I sold it after 8 years of use), but paying over twice the price for G-Sync and a higher refresh rate goes beyond the pale.

Awesome, thanks. Looks like I have my affordable upgrade path moving forward, at least until the miner monkey business fades away and 1060/70/80 prices return to sane levels.

Did you see the motherboard ASUS announced for miners?

I’m not sure how GPUs are supposed to fit in all those slots.

You use a ribbon cable; the card isn’t actually in the slot.

God forbid people buy graphics cards for what they’re meant for. >:-(

Heh, all that money spent and it’s sitting in a $5 plastic crate.

It’s a fucking Rubbermaid. Don’t be dissin’ on it. That’s gotta be $7 at least.

The crates are actually good because you can just stick a $20 full-size box fan next to it, and it cools a whole bunch of cards down a lot better than the tiny little fans on them.

Reading this out of context, but - I kid you not, my son is currently playing Witcher 3 on a GTX 660. I won’t even tell you what CPU he’s using. I realize the 660 is slightly ahead of the 750, but still… I wouldn’t have believed it myself if I hadn’t seen it. Is it max quality? Of course not. Is it playable? Much, much more so than I would have expected.

Sure. The Witcher 3 is a current-gen title, intended to play on launch Xbone and PS4, which had GPUs roughly equivalent to the 750ti. So it’ll play, but not well.