When do the next generation GPUs drop?

Hmmm. I am almost ready to do a new system build. I wonder if I should wait for this.

I would, but I don't have much hope that Vega will compete with the 1080, much less the 1080 Ti.

As far as I'm aware, various rumors and leaks still put Vega well behind the 1080 (not even the Ti). Obviously these are rumors and leaks (unless they've been confirmed), and the leaked numbers, even if true, could be for a lower-end Vega model.

Even if true, we don't know Vega's pricing or how it might affect Nvidia's pricing.

At 1080p resolutions, that is, though they anticipate better 4K performance.

Latest leaks guesstimate 1070-level performance for Vega, but that's just based on cores and specs, not any real testing.

I'm likely picking up a 1080 Ti, but I figure I'll wait till June to see if Vega affects Nvidia pricing at all.

I really wouldn't believe any leaks. I've been following AMD releases fairly closely since last year, when I was waiting for the new Radeon series, and pretty much every AMD release since then has been way off, performance-wise, from any of the leaks that came out, even the ones from days before the announcement.

Funny.

Here's the article, and it makes sense with HBM2:

Thoughts on these rumors:

I'm taking these results with a huge grain of salt, but let's just play with what we have here, shall we? If AMD's new Radeon RX Vega graphics card loses to the GTX 1080 at 1080p, that would make sense, given it's not a card for 1080p but for 4K and beyond.

HBM2 will scale incredibly well with high resolutions, but it will be the new High Bandwidth Cache technology inside of Vega that will be the most surprising thing (at least IMO) about the Radeon RX Vega. HBC will allow the 4GB HBM2-based Radeon RX Vega to perform closer to a high-end GeForce 10 series graphics card with 8GB+ of GDDR5X, something you can read more about here…

AMD needs to absolutely crush NVIDIA at 4K and beyond, where I'm expecting at least 15-20% additional performance over the GTX 1080 Ti at 4K/Ultra graphics in games like Quake Champions, DOOM, Rise of the Tomb Raider, etc. If AMD comes out with a Radeon RX Vega that can't beat NVIDIA's new $699 graphics card, then it needs to be priced ultra-competitively… which is going to be very hard for AMD thanks to the exorbitant cost of HBM2.

There's no indication that Pascal is bandwidth-constrained with GDDR5X memory. I basically think that's bullshit, and fully expect Vega to perform comparably to a GTX 1070 in most games. It'll probably equal a GTX 1080 in DirectX 12 games, where AMD's architecture has a bit of an advantage.

Hopefully we get to find out sooner rather than later… but I can wait. We're in the middle of a remodel, so high-end PC gaming has been on the back burner and I'm mostly stuck with a laptop…

Hmm… Personally, I'm not interested in 1080p or 4K performance. I want optimal performance in VR.

Though I understand that's more important for the news cycle than for actual sales, and thus for the manufacturer's actual success. VR is still a niche, even if it gets a lot of press focus.

An RX 480 or GTX 1060 will handle current-gen VR perfectly well. Next-gen, probably not -- but nobody is talking about next-gen VR yet.

I'm doing hardcore advanced VR stuff: running flight sims, supersampled, with VR retrofit utilities. I need all the power I can get, Scotty. A 1060 ain't gonna cut it.

Then buy a 1070 or blow your wallet out with a 1080/Ti. If you want crazy power, it's there for cash.

Yeah, it depends how much power you actually need. The 1080 is reasonably priced now. The 1080 Ti is quite expensive but provides a worthwhile power bump for the money.

The idea that Vega would be roughly 1070-equivalent makes no sense. This is a Fury X replacement with years of R&D, probably 80% more transistors (roughly the same die size after moving from 28nm to 14nm), and higher clocks. Those advantages have to mean something. A symbolic 5% speed improvement over the Fury X would mean a huge failure in engineering planning. If an increase like that in the transistor budget doesn't translate to any performance increase, they would have been better off just doing a quick die shrink of Fiji instead.
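For what it's worth, that "80% more transistors at the same die size" claim can be sanity-checked with a back-of-envelope calculation. The Fiji transistor count is a public figure, but the density derating factor below is purely an assumption (real processes fall well short of ideal feature-size-squared scaling), picked to illustrate how you'd land near an ~80% gain:

```python
# Hedged back-of-envelope: extra transistor budget a 28nm -> 14nm shrink
# could buy at roughly the same die size. The derating factor is an
# assumption for illustration, not a confirmed process spec.

fiji_transistors_b = 8.9      # Fury X (Fiji, 28nm), billions -- public figure
node_old, node_new = 28, 14   # nm, marketing node names

# Idealized density scales with the square of the feature-size ratio;
# real-world density gains are much smaller, so apply a derating factor.
ideal_scaling = (node_old / node_new) ** 2   # = 4.0x in theory
derating = 0.45                              # assumed fraction of ideal realized
density_gain = ideal_scaling * derating      # ~1.8x => ~80% more transistors

vega_estimate_b = fiji_transistors_b * density_gain
print(f"ideal {ideal_scaling:.1f}x, assumed real gain {density_gain:.2f}x")
print(f"same-die-size budget estimate: ~{vega_estimate_b:.1f}B transistors")
```

Obviously the derating is the whole ballgame here; pick a different factor and the estimate moves accordingly.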

If Vega 10 is really just a 1070 competitor, AMD would not launch it. There's no way they could price it even remotely competitively with a 1070, so it'd be a total flop. (Vega's die is 75% larger than the 1070's, and there are other extra component costs like the interposer.)

And yet, that's what the leaks said. Being unsubstantiated, obviously they could be wrong.

So Nvidia unveiled Volta, at first a "compute" card aimed at businesses, at a mere $149,000 for a system with eight GPUs on it.

Given the history with Pascal and Maxwell, when are we likely to see consumer chips? Holiday, or not till sometime in 2018?

The GP100 was announced in April 2016, and the consumer 1080 was released the next month, in May. So the GTX 1180 could come as soon as next month.

But it won't, because AMD has nothing to compete with the 1080 Ti.

Also, the GV100 is an insanely expensive chip to fabricate, and an entirely new architecture to boot -- unlike Pascal. So it might not be so easy to translate to consumers.

Vega "Frontier Edition" (I see what you did there, AMD) announced. Other than 26 TFLOPS at half precision, not much new is disclosed. Price and game performance are still missing.

http://www.anandtech.com/show/11403/amd-unveils-the-radeon-vega-frontier-edition
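That 26 TFLOPS half-precision number lines up with the rumored shader configuration, if you assume packed FP16 at double rate. The shader count and clock below are rumored/assumed figures, not anything AMD confirmed in the announcement:

```python
# Rough sanity check of the quoted ~26 TFLOPS half-precision figure.
# Shader count and boost clock are assumptions from rumored Vega specs.

shaders = 4096          # assumed stream-processor count
clock_ghz = 1.6         # assumed boost clock, GHz
flops_per_clock = 2     # one fused multiply-add = 2 FLOPs per shader

fp32_tflops = shaders * clock_ghz * flops_per_clock / 1000
fp16_tflops = fp32_tflops * 2   # packed FP16 doubles throughput on Vega

print(f"FP32: ~{fp32_tflops:.1f} TFLOPS, FP16: ~{fp16_tflops:.1f} TFLOPS")
```

So ~13 TFLOPS single precision, doubled to ~26 TFLOPS for FP16 -- consistent with the announced figure under those assumed specs.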