When do the next generation GPUs drop?


No, but you might already have a freesync monitor. And like the Fish said, freesync monitors are around $200 cheaper than g-sync. That’s really AMD’s only competitive advantage.

Pity Nvidia is too stubborn to enable freesync support in their drivers, which they could trivially do.


Yeah, I skipped paying the price for a gsync monitor last year when I built this system and grabbed the 1070. It was enough work getting my wife to not veto spending what I already was. I would love it if they enabled freesync support, or opened up g-sync. I will possibly spring for a monitor when the next gen nvidia cards come out.


Just so I understand right:

  • I know gsync is only for NVIDIA, but how much does gsync normally add to the price of a monitor?

  • While freesync is an open platform, today only AMD uses it so it’s effectively AMD-exclusive. If that’s the case, how much does freesync usually add to the price of a monitor?


GSync usually adds $200 to the price of the monitor because it requires special chips to be integrated. My understanding is that Freesync is heavily based on adaptive sync standards and thus does not require any extra hardware, just software.


If anyone’s in my neck of the woods I’m selling my extra 4K 27" Gsync monitor!


I don’t buy nVidia GPUs because of their many ethical lapses as a company.


What if we don’t all remember where that is?

I’m… asking for a friend.


Edmonton, Canada.


I’m a little surprised questions are still coming up about this, so for anyone who doesn’t know, here’s a quick summary.

Freesync isn’t just an “open platform”: AMD developed the underlying technology, opened it up for anyone to use, and proposed it to VESA, which adopted it into DisplayPort as the Adaptive-Sync standard. FreeSync is AMD’s implementation of that standard.

FreeSync is royalty-free, free to use, and has no performance penalty. [Wikipedia, sourced]

It also doesn’t raise the cost of monitors. Monitor manufacturers bake the VESA spec into the hardware of their monitors themselves.

G-Sync is proprietary and requires hardware that typically adds $200 to the price of a monitor. Nvidia manufactures chips that monitor manufacturers have to use in order to support G-Sync.

Since FreeSync is built on a freely available standard, Nvidia could make all of their future GPUs compatible with all of the FreeSync monitors on the market. Whether they could do it with just a driver change for existing cards hasn’t been directly answered by Nvidia, and they’re probably the only ones who would know for sure, but maybe. AMD, on the other hand, can never make G-Sync-capable cards, because Nvidia didn’t share its tech the way AMD did. No judgment there; that was Nvidia’s business decision to make. Not supporting FreeSync, though, I do judge a little. :P


Right, it’s not accurate to say freesync is just software: it does require hardware support, but that support costs manufacturers next to nothing, so a ton of newer monitors support it. Even some cheap grey-market Korean monitors support freesync these days.

Gsync on laptops doesn’t require the module. It is essentially freesync, just restricted to those specific laptop screens. So yes, Nvidia could allow it with a trivial driver update.


In unfortunate-if-true rumor news, there are growing rumblings that the $399/$499 pricing for Vega is launch-only and will go up later in the year, which would be terrible news for AMD. Performance parity at a deficit in both power and price is a bad spot to be in. (Not that AMD hasn’t been there before.) Maybe that’s why they rushed out the mining driver so soon after release.


I don’t see how that’s possible, unless they give up on gaming entirely and price based on coin mining performance.


HBM2 pricing could cause it. I have no idea if that’s the case however.


Doesn’t matter. If they take a card that basically performs like a 1080 in games and increase the price over a 1080, they’re giving up on gamers.


There’s a thread on the AMD subreddit where a poster estimated the cost of a functional Vega 56/64 chip at $160 or so, based on RX 480/580 costs, relative die sizes, and fault rates per unit area. Combined with pricey HBM and all the other required components, and the need to pay off a long R&D period, I could see the pricing rumors being true.
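The kind of estimate that poster made can be sketched with a standard Poisson yield model: bigger dies yield fewer good chips per wafer, both because fewer fit and because each one is more likely to catch a defect. A minimal illustration, where every number (wafer cost, defect density, die areas) is an assumption for illustration, not AMD data:

```python
import math

# All figures below are illustrative assumptions, not confirmed AMD numbers.
WAFER_COST_USD = 6000.0   # assumed cost of one processed 14nm wafer
WAFER_DIAMETER_MM = 300   # standard wafer size
DEFECT_DENSITY = 0.2      # assumed defects per cm^2

def dies_per_wafer(die_area_mm2: float) -> int:
    """Rough gross die count; ignores edge-loss and scribe-line corrections."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area / die_area_mm2)

def yield_fraction(die_area_mm2: float) -> float:
    """Poisson yield model: P(zero defects) = exp(-defect_density * area)."""
    area_cm2 = die_area_mm2 / 100.0
    return math.exp(-DEFECT_DENSITY * area_cm2)

def cost_per_good_die(die_area_mm2: float) -> float:
    good_dies = dies_per_wafer(die_area_mm2) * yield_fraction(die_area_mm2)
    return WAFER_COST_USD / good_dies

# Approximate published die areas: Polaris 10 (RX 480/580) ~232 mm^2,
# Vega 10 ~486 mm^2.
for name, area in [("Polaris 10", 232), ("Vega 10", 486)]:
    print(f"{name}: ~${cost_per_good_die(area):.0f} per good die")
```

This ignores partial-die salvage (Vega 56 is a cut-down Vega 10, so many “defective” dies are still sellable), which is why real per-chip cost estimates like the one in that thread land somewhat differently; the point is only that die cost grows faster than die area.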


Supposedly the Xbox One X supports FreeSync, but I’m not sure if that’s officially confirmed, or whether any TV sets support it yet. That would be a big win: on the same hardware you get both higher average performance, since you can leave vsync off, and a higher maximum frame rate.


Appears to be confirmed. SMH.

If only supply were the only issue. Not long after cards hit retail, prices jumped a cool $100 USD. In the UK, an even bigger punch to the stomach was delivered through a £100 increase. Our friends at Gamers Nexus reached out to a couple of AIB partners and were able to confirm that launch pricing for the Vega 64 was $499 USD for only a limited time, which was thanks to vouchers provided to etailers. Once those vouchers ran out, Vega 64 jumped $100 in price.

I don’t think enough of a stink is being raised about these shenanigans. AMD told a ballroom full of press and analysts that Vega 64 carried an SEP of $499. As we understand it today, that price was never meant to be permanent. Launch reviews were based on that $499 pricing, and AMD had to have known that within 15 minutes of the card going on sale, that price would no longer be relevant. Fortunately for us, we took a compute-focused look at the card first, where its hiked price can still be justified (it still beats out more expensive competition in some cases), but even so, there is some serious sketchiness going on.

You can expect much the same for Vega 56, too, because otherwise AMD is surely looking at super-thin margins (if it can make any money off these cards at all). Both the Vega 56 and 64 are built like the $1,000 Vega Frontier Edition, with high-quality components, one of the best VRMs out there, and, from what I’ve been told (but haven’t been able to confirm yet), somewhere around $150 USD for the 8GB of HBM2. How can AMD make money on this card?


It’s been like, a week. The launch price only lasted a week? That’s insane!

So just to get this straight, AMD’s selling a card that performs roughly equivalent to a GTX 1080, which sells for $399, for $499. And it uses a shitton more power too.


Aren’t 1080s in the $500+ range?