When do the next generation GPUs drop?


You’re right, I saw they dropped the price but misread what it was dropped to. I bought my 1080 when they were $600 (and unavailable at that price).

So the Vega 64 launched at $100 cheaper than the 1080, even though it was pretty close on performance. How the hell did I miss that? I read like 5 separate reviews!


Yeah, last I priced the GTX 1080s, they were going for $550+, especially if you wanted one with a couple of fans and a backplate. These bleeping cryptocurrency miners have us all over a barrel.


Massdrop has a 1080 Ti for $659; cheaper than what I paid for mine, although this is an Aorus – I’m unfamiliar with them:


It’s Gigabyte’s gamer vanity brand, like ASUS’s ROG.


I recently ordered an HP Omen 32 (32" 1440p, 48-75 Hz FreeSync range).

I have a 290x which struggles at 1080p sometimes, so I had a look at anandtech’s GPU bench to see how much of an upgrade a 480 or 580 would be. Turns out, they wouldn’t be. Only a Vega 56 or Fury X would be.

I had no idea AMD were so uncompetitive.

How is Crossfire these days? Seems like the best option is to get a second 290x and bang it in.


As persnickety as any two-GPU setup. For some games (here’s looking at you, DCS), it reduces performance. For others (less cobbled-together things), it’ll buy you something less than twice the frame rate of one card.

Having run a Crossfire setup for more than six years now, I can say it’s what has kept my 2011-era computer with its Radeon HD 6970s an acceptable choice even today, but I don’t know if I’d recommend going that way unless you like fiddling with things.


I’d be happy with 40-50% extra at 1440p, but I had hoped compatibility might have improved.

Although, looking at frametime graphs of two 480s in Crossfire, I’m not sure it wouldn’t still be choppy when it dips above and below the FreeSync range.
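That worry can be sketched roughly like this (a toy check, assuming the Omen 32’s 48-75 Hz window; the frametime numbers are made up for illustration, not from any real benchmark):

```python
# Hypothetical sketch: given per-frame render times (ms), estimate what
# fraction of frames land outside a 48-75 Hz FreeSync window -- i.e. the
# moments where adaptive sync can no longer hide the choppiness.

FREESYNC_MIN_HZ = 48  # assumed lower bound of the panel's range
FREESYNC_MAX_HZ = 75  # assumed upper bound

def frames_outside_window(frametimes_ms):
    """Return the fraction of frames whose instantaneous rate leaves the window."""
    outside = 0
    for ft in frametimes_ms:
        fps = 1000.0 / ft  # convert ms-per-frame to instantaneous fps
        if not (FREESYNC_MIN_HZ <= fps <= FREESYNC_MAX_HZ):
            outside += 1
    return outside / len(frametimes_ms)

# Made-up trace: mostly ~60 fps (16.7 ms) with ~35 fps spikes (28.6 ms).
# The average frame rate looks fine, but the spikes fall below the window.
trace = [16.7, 16.7, 28.6, 16.7, 28.6, 16.7]
print(round(frames_outside_window(trace), 2))  # → 0.33
```

The point being that an average fps inside the FreeSync range doesn’t help if individual frametime spikes drop below it, which is exactly the pattern Crossfire frametime graphs tend to show.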


What are you playing on a 290x that’s slowing it down?

That board screams at 1080p.


Dishonored 2 & Prey were the most recent. Fallout 4 was almost unplayable, but I think that was a general thing with AMD cards at launch! The Witcher 3 also wasn’t the smoothest at 1080p in a lot of areas.

I’ve checked around (e.g. Digital Foundry for The Witcher 3) and it doesn’t seem to be just my subjective experience. ~30-40 fps was what I generally got in TW3.


The 290, 480, and 580 are all in the same tier on Tom’s GPU hierarchy, along with the 1060. That’s not going to make for a great experience at 1440p, imo. You could get a 1070 for a lot less money than a Fury X. That’s what I use on my 1440p monitor.


Tearing annoys me more than outright FPS, and I still find VSync problematic in games, hence why I bought a FreeSync monitor. So there isn’t much point in anything other than waiting for the miners to piss off.

I’ve stuck with ATI since my XFX 6800GT. (x1900xt, years later a 7870, and then the 290x).


Preaching to the choir. That’s why I spent extra on gsync. When I got the 1070 it was actually a move away from amd, which I’d been using for some time. Looking at what’s out there, as you have, I’m sure you can understand why.


That would be great if the cryptocurrency miners hadn’t run up the prices of 1070s to bonkers levels.
I’m still rocking a 1920x1200 monitor, so my 970 is “good enough” for a bit longer, but I’d love to upgrade to at least a 27-inch 1440p one, for which I’d need something better in the graphics card department. But I can’t touch anything that would be an improvement for less than 500 bucks, and that’s a lot of cheese for the card alone.


In retrospect, I didn’t do so badly buying a founder edition…


My 290x OC was €300 two years ago, secondhand they’re still going for around €220-€250. It’s bonkers.

Though RAM is just as bad; my 8GB of DDR3 cost less in 2012 than it does now.


Likewise. For me, it’s been the perfect balance of cost/framerate/1440p.

Um, yeah, there’s that now. I bought my FE last year, and felt guilty about the cost until I saw what an improvement it was over my 970.


I’m handing my old desktop down to the kids and putting together a new one. I need to decide whether to give them the 1070 and buy something new for my machine, or buy something cheap and carry the 1070 forward. On one hand, it seems excessive to buy a new high end video card when I already have the 1070.

On the other hand, I have no idea what the midrange/low end market looks like, and it seems full of questionable values. Does anyone have experience with the 1050 ti (or equivalent)? I’d like for them to be able to run some 3D games.


The 1050 Ti handles 1080p great. It doesn’t wtfcrush it like the 1060 does, but it will play games at high quality and framerates, just not ultra/max quality at a locked 60fps. Go for it.


My wife’s computer has a 1050ti. It’s a great little card for games if all you need is 60fps at 1080p resolution and don’t mind adjusting a few settings down to medium on occasion. It gets some bonus points in my book for not needing a separate power cable from the power supply.

By my estimates, it actually handles most games about as well as my old 780ti that I’m trying to keep for another hardware generation or so.


The 780 Ti should be a good bit faster and capable of 1440p gaming at non-maxed settings and below 60fps. It’s comparable to a GTX 970, whereas the 1050 Ti is comparable to a GTX 960.