AMD FreeSync: AMD takes a shot at Nvidia's G-Sync

I hope they don’t hold out long, as I’m in the market for some new monitors but want to stick with my Nvidia card.

RPS does a G-Sync vs Freesync piece. The author had some issues with ghosting on the FreeSync display.

Either way, I’m not entirely clear on the prospect for fixing the ghosting problem on these early FreeSync panels. As they currently stand, they’re impossible to recommend. At best, I’d characterise FreeSync in its existing state as a fun little extra I’d be happy to have for free. But I wouldn’t want to pay a premium or have it dictate my choice of monitor.

G-Sync, on the other hand, is a trickier question. Having seen the two technologies side by side, I’ve a clearer idea of why it’s undoubtedly the better choice right now. Whether I think it’s worth the price premium is another matter.

Not sure if that’s related to that specific monitor or what.

Anandtech said freesync worked perfectly fine. I would trust them over RPS on technical matters.

It does seem that one of the monitors has a ghosting problem; it was mentioned in what I linked earlier.

I’m strongly inclined to think it’s down to the maker’s design, as the other monitor using the same panel didn’t show it.
RPS seem to be getting it wrong, technically.

Well, the RPS piece does address the panel issue, but, yeah, I wouldn’t take their word for it:

Moreover, this seemingly response-related problem means the FreeSync panel tends to look that little bit blurrier with FreeSync enabled. Not good. This ghosting issue has been fairly widely reported (there’s a video here showing the issue fairly clearly on a BenQ FreeSync screen) to the extent that I’m fairly confident it’s a general FreeSync issue and not specific to the Acer monitor I saw.

Well, it took a couple of years.

This is good news for everyone. I love Gsync (I just bought a new Gsync monitor for Christmas, actually) but options are welcome. They’re especially welcome given that the upgraded Gsync modules required to power 4K/100+Hz/HDR cost $500. Something obviously had to give.

I’d still take Gsync over Freesync, but when we’re talking $500? Freesync will do.

Yeah, I just replaced my dying Acer Predator 27" with an AOC G-Sync monitor on clearout, and it was obvious G-Sync monitors weren’t worth it for manufacturers.

Well, they have to pay Nvidia for the G-sync module and licensing/validation, and they pass those costs on to consumers. Obviously you’ll sell fewer monitors if they cost $200 more.

I think at the $200 price point they could manage. But so much now is going 4K/100+Hz/HDR, and the base Gsync module can’t support that. You need the bigger Gsync module at that point, and that thing is $500!

What I noticed is that some manufacturers have come out with Freesync and Gsync versions of those high end models, but the Gsync monitor is $200-$300 more expensive and is missing stuff like HDR, or has a lower refresh rate, etc. That’s when I knew the tech was doomed unless something changed.

This is really great news for me! As an owner of a 4K monitor that is only FreeSync compatible, I often dip below 60fps in some games, and being able to use adaptive sync is such a boon. I don’t think it’s one of the ‘certified’ FreeSync monitors that Nvidia is talking about, but I also read you can enable the option for any FreeSync monitor and see how it goes.

About time. Looks like I’ll be in the market for a new monitor in a few months.