AMD announces "free-sync"
posted in Hardware
#1
2 Frags +

CES is more than I expected already and the first day isn't even over.

http://www.anandtech.com/show/7641/amd-demonstrates-freesync-free-gsync-alternative-at-ces-2014

#2
-5 Frags +

I don't understand the point of slowing your refresh rate down to match the fps.

#3
7 Frags +

Removes stuttering, tearing, and other shit.

#4
-9 Frags +

Not really, by lowering your refresh rate you're making the screen tear more. It's just a silly idea.

#5
4 Frags +
pine_beetle: Not really, by lowering your refresh rate you're making the screen tear more. It's just a silly idea.

Normally, yes, but not in this graphics pipeline, which is kind of the whole point.

#6
0 Frags +

Unless you have AMD's new technology called freesync™

#7
0 Frags +

Screen tearing happens when your GPU produces frames out of sync with the refresh rate. That leads to having multiple frames on your screen at one time.

Stuttering happens when your GPU doesn't produce enough frames, so your monitor grabs the same frame twice.

Similar to G-sync, free-sync syncs your monitor's refresh rate to the fps being produced.
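
If it helps to picture the tearing case, here's a toy model with no sync at all (the 144 Hz refresh and 9 ms render time are made-up numbers, and real scan-out is more complicated than this):

# Toy model: scan-out takes the whole refresh period, so a frame that shows up
# partway through a refresh splits the screen at that point (a tear line).

refresh_hz = 144
refresh_ms = 1000 / refresh_hz        # ~6.94 ms to scan out one refresh
render_ms = 9.0                       # pretend the GPU finishes a frame every 9 ms

for i in range(1, 6):
    t = i * render_ms                 # when the new frame becomes available
    offset = t % refresh_ms           # how far into the current scan-out it lands
    print(f"frame ready at {t:5.1f} ms -> tear ~{offset / refresh_ms:.0%} down the screen")

Because the frame times and refresh times drift in and out of phase, the tear line moves around the screen, which is part of what makes it so visible.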

#8
5 Frags +
pine_beetle: Not really, by lowering your refresh rate you're making the screen tear more. It's just a silly idea.

yes i fully agree with you, the dozens of professional hardware engineers at both nvidia and AMD are spending their time and money working on something that doesn't actually work! :p

#9
-5 Frags +
wareya (quoting pine_beetle's "Not really, by lowering your refresh rate you're making the screen tear more. It's just a silly idea."): Normally, yes, but not in this graphics pipeline, which is kind of the whole point.

Can you explain why?

MrYoke: Screen tearing happens when your GPU produces frames out of sync with the refresh rate. That leads to having multiple frames on your screen at one time.

Stuttering happens when your GPU doesn't produce enough frames, so your monitor grabs the same frame twice.

Similar to G-sync, free-sync syncs your monitor's refresh rate to the fps being produced.

Slowing the refresh rate makes your screen tear in itself. FPS and refresh rate do not scale 1:1. Ideally you get 2 frames every time your screen refreshes, but even getting 1 frame per refresh at 144 Hz is better than 144 fps on a 72 Hz screen.

turtsmcgurts (quoting pine_beetle's "Not really, by lowering your refresh rate you're making the screen tear more. It's just a silly idea."): yes i fully agree with you, the dozens of professional hardware engineers at both nvidia and AMD are spending their time and money working on something that doesn't actually work! :p

People spend their money on worthless junk all the time. If you buy a good system, this sync garbage will literally do nothing, as it cannot increase the Hz of the monitor.

#10
3 Frags +

Just want to clarify what g-sync and free-sync actually do. I'm going to assume 60 Hz for everything since that's the worst-case scenario and where this technology helps the most.
Currently: Your screen refreshes every ~16 ms. At 60 fps, you get a new frame every ~16 ms. However, these frames don't necessarily arrive at the same time as the refreshes. In the worst case, your screen refreshes and the monitor receives the new frame in the very next millisecond; that frame then waits ~15 ms before your new input is shown on screen, and the game feels very laggy. 15 ms is the worst case, but the same applies (even if the effect isn't as bad) all the way down to 1 ms. The only time you don't have any stuttering or lag is when your screen refreshes the exact instant it receives a new frame, which is very rare. A constant 60 fps can be hard to maintain even for a lot of TF2 players, and when the frame rate drops below 60 fps this delay can become much worse, because sometimes multiple screen refreshes go by without a new frame, which manifests as a large stutter.
The above all assumes your frame rate is less than or equal to your refresh rate. If the frame rate is higher than your refresh rate, e.g. 90 fps, the screen ends up drawing parts of two different frames at the same time, resulting in tearing like in the picture:

http://upload.wikimedia.org/wikipedia/commons/0/03/Tearing_%28simulated%29.jpg

Some games tend to tear more than others, and in certain areas. E.g. when I had a 60 Hz monitor and ran TF2 at 133 fps I had to look hard to see the tearing, but in Assassin's Creed IV, in the Abstergo centre at ~40 fps on a 120 Hz screen, I get massive tearing. However, tearing is something most people have gotten used to over time.
With (G|Free)-sync: Your screen refreshes whenever it receives a new frame. This means every laggy situation I mentioned in the paragraph above is gone: every new frame is shown on screen the instant it's drawn, improving input lag in all cases except that rare perfectly-synchronised one. Tearing is no longer a problem, because the screen never refreshes in the middle of receiving a frame; each refresh shows exactly one complete frame. Literally the only situation where this technology is not useful is the incredibly rare scenario where your frame rate is absolutely constant and perfectly synchronised with your refresh rate. All this results in a perfectly smooth game, even at lower FPS.
What (G|Free)-sync can't do: The definition of "smooth" in the previous sentence isn't quite accurate, however. Even though each frame at ~30 fps will be drawn perfectly with no stuttering, you will still have ~33 ms between frames, which means your inputs will feel delayed and the game will still feel like a slideshow, just a slideshow running perfectly smoothly without stuttering or tearing.
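
To put rough numbers on the "Currently" paragraph, here's a quick sketch of how long a finished frame waits before it appears, fixed 60 Hz versus refreshing the moment the frame is ready (the render times are invented, and this models the vsync-style case where a finished frame has to wait for the next scheduled refresh):

import random

# Fixed 60 Hz: a finished frame sits in the buffer until the next scheduled
# refresh. With g-sync/free-sync the panel refreshes as soon as the frame lands.

random.seed(0)
refresh_ms = 1000 / 60                       # ~16.7 ms between fixed refreshes

t = 0.0
waits = []
for _ in range(10_000):
    t += random.uniform(14.0, 22.0)          # pretend render times hovering around 60 fps
    next_refresh = (t // refresh_ms + 1) * refresh_ms
    waits.append(next_refresh - t)           # time the frame waits before being shown

print(f"fixed 60 Hz: average wait {sum(waits) / len(waits):.1f} ms, worst {max(waits):.1f} ms")
print("with (G|Free)-sync: ~0 ms wait, the panel refreshes when the frame is done")

The waiting-for-refresh penalty goes away, but the gap between the frames themselves (~33 ms at 30 fps) doesn't, which is the "can't do" part above.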

#11
4 Frags +
pine_beetle: [stuff]

You don't know how g-sync or free-sync work; see my reply above.

#12
2 Frags +
pine_beetle (quoting wareya's "Normally, yes, but not in this graphics pipeline, which is kind of the whole point."): Can you explain why?

Because the monitor refreshes if and only if the GPU has rendered the next frame (unless you are below 30 fps, I believe). Therefore every monitor refresh shows exactly one frame, so there is no tearing.
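
In other words, something like this (just a sketch of the rule as I understand it; the exact behaviour below 30 fps is my assumption, not anything AMD has published):

# "Refresh if and only if a new frame is ready", with a floor so the panel never
# holds an image longer than ~33 ms (the ~30 fps limit mentioned above).

MAX_WAIT_MS = 1000 / 30          # assumed lower bound; below this the old frame is redrawn

def time_until_refresh(next_frame_in_ms: float) -> float:
    """How long the panel holds the current image before refreshing."""
    if next_frame_in_ms <= MAX_WAIT_MS:
        return next_frame_in_ms  # refresh exactly when the new frame arrives
    return MAX_WAIT_MS           # frame is too slow: repeat the old one

for render in (5.0, 12.0, 40.0):
    print(f"next frame in {render:.0f} ms -> panel refreshes after {time_until_refresh(render):.1f} ms")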

But you're right, this is targeted more towards low/mid end systems than high end. If your fps is double your refresh rate or higher, you don't really need it. However, not everybody can afford such a system.

edit: the guy above me is like 9 million times more in depth

#13
0 Frags +

http://www.pcper.com/reviews/Graphics-Cards/AMD-Variable-Refresh-FreeSync-Could-Be-Alternative-NVIDIA-G-Sync

An update to the free-sync business: it looks like you will almost definitely need to upgrade your monitor, and maybe your graphics card, to use it, unless you have a notebook that already uses eDP. Still a better solution than g-sync in my mind, since it isn't proprietary and supported monitors shouldn't need to cost much more: they don't need a specialised Nvidia-branded g-sync controller, and controllers that support variable refresh rates should be fairly common.
