When do the next-generation GPUs drop?

FreeSync is free; the standard exists and is fine. Nvidia just decided not to support it.

Because it’s ATI & they have a competing standard. I am, perhaps wrongly, slightly hopeful they may support a standard built into the HDMI specifications, because every TV will.

Actually it’s a VESA standard: http://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/

Because AMD actually opened up and gave away their standard so others could adopt it, while nVidia said “Naw, we’ll just chill with our proprietary stuff” and then sold the hardware (or licensed the tech, can’t remember) to enable G-Sync at higher costs to monitor manufacturers.

I realize it’s an open standard, but ATI created it. No way Nvidia is going to lose face and use it unless held at gunpoint.

Yeah, pretty much.

So regarding Vega, it’s still 6 months out, and performs between a 1070 and 1080. Nvidia appears to have nothing to worry about.

nVidia created G-Sync for the express purpose of selling their hardware modules to monitor makers as a premium upsell feature. Supporting the VESA standard is basically admitting there is no actual need for their expensive hardware solution.
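
To make that concrete, here’s a toy sketch (plain Python, all frame times invented for illustration, and ignoring vsync back-pressure on the render loop) of what variable refresh actually buys you: with fixed-interval vsync a finished frame sits and waits for the next refresh tick, while with Adaptive-Sync the display just refreshes when the frame is ready.

```python
# Toy model: presentation delay under fixed 60 Hz vsync vs. variable
# refresh. Frame times below are made up for illustration.
import math

REFRESH_MS = 1000 / 60            # fixed 60 Hz refresh interval
frame_times_ms = [12.0, 17.5, 20.0, 14.0, 25.0]

clock = 0.0
for ft in frame_times_ms:
    done = clock + ft             # when the GPU finishes the frame
    # Fixed vsync: hold the frame until the next refresh boundary.
    next_tick = math.ceil(done / REFRESH_MS) * REFRESH_MS
    print(f"frame ready at {done:5.1f} ms: "
          f"vsync adds {next_tick - done:4.1f} ms, adaptive sync adds ~0 ms")
    clock = done
```

Which is the point: it’s a timing change in the display protocol, not something that obviously requires a proprietary scaler module.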

Sign up for the Newegg newsletter. Over the break they sent me a 30% off coupon which I almost used to buy a 1070 for about $300. Then I realized that all of the games I’d recently bought that could take advantage of it were for the Xbox, and that while it would be a great deal, I didn’t need it.
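
(Back-of-the-envelope on that deal; the list price is my inference, not something stated above:)

```python
# If 30% off landed a GTX 1070 at roughly $300, the pre-coupon price
# was about $430 -- plausible for 1070 street prices at the time.
# The $300 figure is from the post above; the rest is inference.
sale_price = 300
list_price = sale_price / (1 - 0.30)
print(f"Implied pre-coupon price: ${list_price:.0f}")  # -> $429
```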

What’s the source for “between 1070 and 1080”?

Yeah, ‘significantly north of a 1080’ has just as much support.

Adaptive-Sync is an awkward term, since adaptive vsync has been in use for a few years now. Oh well.

A very recent TechReport article on the unveiling, where they got to crank Doom up to the Ultra Nightmare preset:

“By my rough estimate, that puts the early Vega card somewhere between the performance of a GTX 1070 and a GTX 1080 in Doom.”

http://techreport.com/review/31224/the-curtain-comes-up-on-amd-vega-architecture/4

PC Gamer looks at the same demo and presentation and estimates >1080, and the leaked Ashes of the Singularity benchmark score is even with the 1080.

It’s too early to say either way with any real certainty, although the trend with RX480/1070/1080 was that AMD outperformed expectations with DX12 and Vulkan renderers, and underperformed expectations with DX11. I wouldn’t be surprised to see that continue. (Especially since the benchmark they leaked and the demo they gave both come from games where AMD’s most recent offerings overperformed.)

Has AMD caught up on efficiency, PCB length, and (lack of) noise? Their cards have always run hotter and louder, and been much longer.

Not with the RX4x0 series, at any rate. We’ll see about Vega, but past trends suggest it’ll probably also be hot and loud comparatively.

(For myself, I have a pair of old 6970s in my desktop, so pretty much anything would sound less like a vacuum cleaner. <.<)

Yeah, I last had a 79xx CrossFire setup, and the Team Green offerings at the time were faster, quieter, and shorter.

AMD performs extremely well in that specific game, Ashes, due to vendor-specific optimizations and dramatically superior asynchronous compute performance. So far that hasn’t translated to better performance in actual games more broadly. AMD also performs very well in DOOM with Vulkan, for the same reasons (rough sketch of the overlap arithmetic below).

The RX 480 can’t come close to the GTX 1080 even in those two games, but I can easily see Vega beating it, albeit only in those specific games.
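
For anyone who hasn’t followed the async compute arguments: the claimed benefit is that compute work (post-processing, lighting) can run concurrently with graphics work instead of after it. A toy sketch of the arithmetic, with all timings invented for illustration:

```python
# Toy model of async compute: if graphics and compute passes overlap
# perfectly, a frame costs max(g, c) instead of g + c. Real GPUs land
# somewhere in between, depending on how many execution units sit idle.
# All numbers here are hypothetical.
graphics_ms = 10.0   # geometry/shading passes
compute_ms = 4.0     # e.g. post-processing done on compute queues

serial = graphics_ms + compute_ms           # queues serialized
best_case = max(graphics_ms, compute_ms)    # perfect overlap

print(f"serial: {serial:.1f} ms/frame, overlapped: {best_case:.1f} ms/frame")
print(f"best-case speedup: {serial / best_case:.2f}x")
```

Whether a given renderer gets anywhere near the best case is exactly the Ashes/DOOM question above.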

The PC Gamer article estimated performance beating a 1080 based purely on AMD’s own numbers, which are usually bullshit (AMD exaggerates regularly), while the TechReport article used actual performance in a video game.

AMD usually doesn’t get their drivers optimized until months after release. Any leaked benchmarks 6 months before release are 100% ignorable.

It’s interesting that in the TechReport post, the author mentions that all of the vents were taped over (including the GPU card’s exhaust).

Not 100% ignorable, no. Performance will of course improve over time.

Yes, they didn’t want any leaked pictures of pre-production hardware, another indication that Vega is still many months from release.

It also indicates that the performance shown was representative not only of early silicon & driver state but also of sub-optimal thermal conditions. I’m cautiously optimistic about Nvidia finally having some competition once again.