AMD FreeSync: AMD takes a shot at Nvidia's G-Sync

I just saw this article on AMD FreeSync over at Anandtech:

Using two Toshiba Satellite Click notebooks purchased at retail, without any hardware modifications, AMD demonstrated variable refresh rate technology. According to AMD, there’s been a push to bring variable refresh rate display panels to mobile for a while now in hopes of reducing power consumption (refreshing a display before new content is available wastes power, sort of the same reason we have panel self refresh displays). There’s apparently already a VESA standard for controlling VBLANK intervals. The GPU’s display engine needs to support it, as do the panel and display hardware itself. If all of the components support this spec however, then you can get what appears to be the equivalent of G-Sync without any extra hardware.

AMD doesn’t want to charge for this technology since it’s already a part of a spec that it has implemented (and shouldn’t require a hardware change to those panels that support the spec), hence the current working name “FreeSync”.

So that has me curious. If there’s already a standard for this and if some displays already support it, why all the need for extra hardware on G-Sync monitors?

Cheddar. Dough. Scratch. Moolah. Money. Cash. Greenbacks. Bucks. Cabbage. Jack. Clams. Simoleons.

While there’s a standard, support’s more or less non-existent on stand-alone monitors (where Nvidia is aiming). That’s not true on a lot of the newer laptop panels, though (where AMD is aiming).

The question is, which will panel manufacturers prefer? Adopt existing open standard, or adopt Nvidia’s technology and embed licensed hardware components in each and every panel?

One is effectively free (or at least very low cost to implement), the other is not.

It is interesting as well that both these solutions are aimed at solving pretty much the same problem, for different reasons. Nvidia wants to reduce tearing to improve gaming and video playback; AMD wants to reduce screen refreshes to save power on mobile platforms.

Panel manufacturers that provide for the mobile markets will be more driven by the latter, since that is where the growth is. With that in mind, it is feasible that AMD’s solution gets implemented because there are clearer returns.

Because only some laptop screens support it.

I just feel like there’s something more, beyond a cash grab by Nvidia. For instance, the hardware module in G-Sync monitors has something like 768MB of RAM, which I assume is for some sort of frame buffering? If that’s the case, is there going to be a performance difference between the two? Is that the reason the current VESA standard isn’t utilized in desktop monitors?

Is it a core part of the VESA standards? (Also what do they call it in them…hm…)

Apparently, yes. The VESA standard includes a defined method for controlling VBLANK intervals, which is what AMD is leveraging here. At this stage there is just very little adoption of it, probably because there has not been a good reason.
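Conceptually the trick is simple: instead of ending the vertical blanking interval on a fixed clock, the display engine holds the panel in VBLANK until the next frame is actually ready, within whatever minimum/maximum refresh interval the panel tolerates. A toy sketch of the difference (made-up frame times and panel limits, not any real driver API):

```python
# Toy model of fixed-refresh vs. variable-VBLANK scanout.
# All numbers are made up for illustration.

frame_ready_times = [0.0, 21.0, 43.0, 70.0, 84.0, 118.0]  # ms at which new frames arrive

def fixed_refresh(frame_times, interval=16.7, end=120.0):
    """Panel refreshes on a fixed clock; repeats the old frame if nothing new arrived."""
    refreshes, wasted, shown, t = 0, 0, -1, 0.0
    while t <= end:
        newest = max((i for i, ft in enumerate(frame_times) if ft <= t), default=-1)
        refreshes += 1
        if newest == shown:
            wasted += 1          # refreshed the panel with no new content: wasted power
        shown = newest
        t += interval
    return refreshes, wasted

def variable_vblank(frame_times, min_interval=16.7, max_interval=33.3):
    """Panel stays in VBLANK until a new frame is ready (within the panel's limits)."""
    refreshes, last = 0, 0.0
    for ft in frame_times[1:]:
        # can't refresh faster than min_interval, nor hold VBLANK longer than max_interval
        t = min(max(ft, last + min_interval), last + max_interval)
        refreshes += 1
        last = t
    return refreshes

print(fixed_refresh(frame_ready_times))    # (8, 3): 8 refreshes, 3 of them wasted
print(variable_vblank(frame_ready_times))  # 5: every refresh shows new content
```

The same idea covers both marketing angles: fewer wasted refreshes is AMD’s power-saving pitch, and “refresh exactly when the frame is ready” is the tearing/judder fix.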

Yeah, now that AMD is making it a “Thing” monitors will start advertising support (if they have it) or implementing it in new models.

G-Sync sounds crazy in light of the VESA standard waiting in the wings, but some of the speculation out of CES is that none of Nvidia’s current GPUs supported the variable output required by the VESA-defined standard, while AMD has built it into everything for several hardware generations. So Nvidia not only had to create a hardware kludge to achieve similar (or even superior) results, but had to announce first to get out ahead of the issue lest they get snowed under. The happy side effect is they get to sell a premium feature at a premium profit to loyal enthusiasts who have already invested hundreds and hundreds of dollars in their product line.

I think AMD’s solution is more sensible, if it works as advertised. They need to find a better marketing name for it though: no laptop vendor will want to put “free-sync” on the box.

G-sync may be better optimized: it can transmit a new frame at the precise moment it’s ready, making it both a smooth and low-latency solution. AMD’s solution, with triple buffering, may have slightly higher latency, but it seems to solve the core issue (the smoothness - read the G-sync previews to see why this matters more than the lower latency) just as well.

And G-sync has two weaknesses: not only does it need a costly add-in board, it also requires a 120Hz screen. Using a 120Hz screen alone in combination with vsync will already drastically improve smoothness over a 60Hz screen.
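To put rough numbers on that, here’s a toy comparison of how long a finished frame waits before the screen starts drawing it, under 60Hz vsync, 120Hz vsync, and variable refresh. The frame times are invented and it ignores buffering and scanout time; it’s only meant to show the shape of the trade-off:

```python
# Toy comparison: how long does a finished frame wait before scanout begins?
# Invented frame times; ignores scanout duration, buffering, and panel limits.
import math

frame_ready = [5.0, 23.0, 47.0, 58.0, 91.0]  # ms at which the GPU finishes each frame

def avg_wait_with_vsync(ready_times, refresh_interval):
    """With vsync, a frame waits for the next fixed refresh boundary."""
    waits = [math.ceil(t / refresh_interval) * refresh_interval - t for t in ready_times]
    return sum(waits) / len(waits)

def avg_wait_with_variable_refresh(ready_times, min_interval=8.3):
    """With variable refresh, scanout starts as soon as the frame is ready,
    unless frames arrive faster than the panel's maximum refresh rate."""
    waits, last = [], float("-inf")
    for t in ready_times:
        start = max(t, last + min_interval)
        waits.append(start - t)
        last = start
    return sum(waits) / len(waits)

print(avg_wait_with_vsync(frame_ready, 16.7))       # 60 Hz vsync: ~8.6 ms average wait
print(avg_wait_with_vsync(frame_ready, 8.3))        # 120 Hz vsync: ~1.7 ms average wait
print(avg_wait_with_variable_refresh(frame_ready))  # variable refresh: 0 ms
```

A 120Hz screen roughly halves the typical wait (and the judder that comes with it) even with plain vsync, which is the point above; either variable-refresh scheme removes it almost entirely.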

So in theory, G-sync is slightly better, but you’ll need a $400 monitor (23" 1080p 120Hz + $100 add-in board), while free-sync could be made available in a $120 monitor (23" 1080p 60Hz).

And, of course, G-sync locks you into Nvidia hardware (which is, admittedly, a relatively rich ecosystem with Nvidia 3D support and the game-streaming Shield), while Nvidia is free to add free-sync support.

Interesting times.

Nvidia’s Tom Petersen responds to AMD FreeSync:

However, Petersen quickly pointed out an important detail about AMD’s “free sync” demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.

That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia’s own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, “we would know.”

Yep, that’s what I said halfway up the page.

Well, no shit:

DisplayPort is a digital display interface developed by the Video Electronics Standards Association (VESA).

VESA developed DisplayPort and, unsurprisingly, baked their own standards into it in the process - so saying DisplayPort supports everything required actually validates AMD’s position! In fact it is even better news, as it should mean current DisplayPort panels support AMD’s technology today. At no extra cost!

Of course the hurdle will be getting the standard adopted in non-DisplayPort panels, which may well be a challenge.

Arise!

Okay, actual hardware is now available, and three major scaler vendors have signed up to support the VESA standard - and hence FreeSync.

Reviews -
http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion
http://www.anandtech.com/show/9097/the-amd-freesync-review

So…the monitors coming out later this month with the wider refresh windows are a reasonably good bet. And yes, they’re cheaper than G-sync monitors.

If Nvidia doesn’t support Freesync, it doesn’t matter that it’s free and clear for monitor vendors to implement. It’ll still be Nvidia/G-Sync vs AMD/Freesync and never the twain shall meet. I’d assume that Nvidia will continue to make their product offerings price competitive.

Freesync is built into scalers and is thus free. G-Sync is a separate module, and while Nvidia may price it at cost, that cost will never be zero unless they subsidize it. G-Sync is obviously dead technology. Nvidia isn’t publicly committing to supporting Freesync because their partners are still currently selling G-Sync monitors. There probably won’t be a gen2 of G-Sync monitors, and certainly won’t be a gen3.

Clever move by AMD: rather than getting into direct competition with Nvidia, they simply made sure Nvidia ended up wasting a ton of money with no real hope of a return on that investment now.

That G-Sync chip currently adds about $200 to the price of a display. That’s substantial when we’re talking three-digit pricing. And to my knowledge it doesn’t add any benefits over Freesync.

Yes and no. If you read the articles, G-Sync actually handles frame rates below the bottom of the variable-refresh “window” better.

It’s really not a staggering advantage though - and it has other drawbacks.
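The difference below the window comes down to who owns a copy of the last frame. Because the G-Sync module keeps the most recent frame in its onboard memory, it can simply re-scan that frame so the effective refresh rate stays inside the panel’s range; the first FreeSync monitors instead drop back to plain vsync-on or vsync-off behaviour at the window’s floor, as the reviews above describe. A rough sketch of the frame-repeating idea (made-up panel limits, nobody’s actual firmware):

```python
# Rough sketch: keeping the panel inside its refresh window when the game
# runs slower than the panel's minimum refresh rate. Assumes a 40-144 Hz
# window, i.e. refresh intervals between ~6.9 ms and 25 ms.

def refresh_plan_for_slow_frame(frame_interval_ms, min_interval_ms=6.9, max_interval_ms=25.0):
    """Cover one slow game frame by re-scanning it, keeping every
    refresh interval inside [min_interval_ms, max_interval_ms]."""
    repeats = 1
    while frame_interval_ms / repeats > max_interval_ms:
        repeats += 1                      # show the same frame one more time
    interval = max(frame_interval_ms / repeats, min_interval_ms)
    return [round(interval, 1)] * repeats

# A 40 ms game frame (25 fps) is scanned twice, 20 ms apart, so the panel
# effectively runs at 50 Hz -- still inside its window, with no added tearing.
print(refresh_plan_for_slow_frame(40.0))  # [20.0, 20.0]
print(refresh_plan_for_slow_frame(70.0))  # [23.3, 23.3, 23.3]
```

Doing that repeat on the monitor side is exactly what the module’s local frame buffer makes possible; without one, the repeat has to come from the GPU or not at all.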

Freesync being license-free and built into scalers means it wins, period. The only question is how long Nvidia holds out; the conclusion is not in doubt.