Ok, so your monitor is constantly refreshing. Even when you’re just looking at the background, it’s displaying a new image 60 times every second (60Hz).
When a game is running at 60FPS on a 60Hz panel, everything is great. The frames from the graphics card sync up with the monitor’s refreshes, and everything is super smooth.
When the graphics card puts out more than 60FPS, frames arrive faster than the monitor can show them. A new frame can show up while the monitor is still partway through drawing the previous one, so you end up seeing part of one frame and part of the next on screen at the same time. That’s screen tearing.
If your graphics card dips below 60FPS, to say 59FPS, the monitor doesn’t always get a new frame in time. Basically, it’s expecting exactly 60 frames over a period of one second. If it only gets 59 frames during that one second, it will display the last image it was given a second time. So if we take 1000 milliseconds (which is one second) and divide it by 60, we get 16.6ms. That means the monitor puts up a new frame once every 16.6ms.
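If you want to see that math spelled out, here’s a quick Python back-of-the-envelope (nothing monitor-specific, just the division from above):

```
# 1000 ms in a second, divided up among refreshes/frames
refresh_interval_ms = 1000 / 60   # fixed 60Hz panel: ~16.6ms between refreshes
frame_interval_ms   = 1000 / 59   # GPU running at 59FPS: ~16.9ms between frames

print(refresh_interval_ms)  # 16.666...
print(frame_interval_ms)    # 16.949...
```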
If we drop to 59FPS from the graphics card, the monitor has to show one of those frames twice. This means you’ll have 58 frames each shown for 16.6ms, and then ONE frame will be stuck on screen for 33.2ms. You’ll feel that in game as “stutter”.
Freesync allows the monitor to refresh outside the fixed 60Hz schedule. So when your graphics card produces 59FPS, and the monitor has Freesync, it will display each frame as it comes in. Instead of one frame getting stuck on screen for two refreshes, each of those 59 frames is simply shown for about 16.9ms (1000 ÷ 59).
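If it helps to see the two cases side by side, here’s a tiny Python sketch of that one-second window (it assumes the GPU delivers its 59 frames perfectly evenly, which real GPUs don’t quite do):

```
# One second of display time at 59FPS on a 60Hz panel
refresh_ms = 1000 / 60   # ~16.6ms, the panel's fixed refresh interval
frame_ms   = 1000 / 59   # ~16.9ms, how often the GPU actually finishes a frame

# Fixed refresh: frames can only change on the 16.6ms schedule, so the frame
# that misses its slot stays on screen for TWO refreshes in a row.
fixed_refresh = [refresh_ms] * 58 + [2 * refresh_ms]   # 58 normal frames + 1 doubled one

# Freesync: the monitor just waits for each frame, so every frame gets ~16.9ms.
freesync = [frame_ms] * 59

print(max(fixed_refresh))   # about 33ms -> the hitch you feel as stutter
print(max(freesync))        # about 16.9ms -> barely longer than a normal frame
```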
So with standard refresh, that one late frame sits on screen for 33.2ms. With Freesync, no frame is shown for longer than about 16.9ms, and 16.9 is MUCH closer to 16.6 (which is exactly 60FPS).