AMD’s Chill vs. RTSS & In-Engine Limiters / Input Lag Test
02 February
By Adem Lewis


Hi, my name is Chris, and this is Battle(non)sense. In my last video I tested NVIDIA's new frame rate limiter, which was introduced in the 441.87 driver. As many of you correctly pointed out in the comments of that video, AMD also provides a frame rate limiter in their driver. Since it has been a long time since I last tested that feature, let's have a look at it again today. Please forgive me for repeating a few things in today's video, as many gamers whose graphics card uses an AMD GPU will probably not have seen my last video, since it was about an NVIDIA-specific feature.

In the past, AMD actually provided two different features to limit the frame rate. There was FRTC, or Frame Rate Target Control, a conventional frame rate limiter where you entered a single value for the frame rate cap. And then there is Chill, a dynamic frame rate limiter where you enter a minimum and a maximum value for the frame rate, and the feature then adjusts the frame rate limit on its own within that range. In the current driver, however, you will only find Chill, as FRTC has been removed to simplify the configuration. But you can get Chill to work like FRTC.

You can use the desktop application to enable Chill in the Global Profile, which should then affect all applications. Currently, however, this is not the case, as all the games I tested simply ignore the Chill setting in the Global Profile. I have reached out to AMD and reported this behaviour, but I have not received an answer yet. So, to get Chill to limit the frame rate, you currently must use the game-specific profiles. And while you can do that inside the desktop application, I prefer to use the pretty sweet in-game overlay, which you can bring up by pressing ALT and R on your keyboard. Here you can enable or disable Chill and change the frame rate values, which then take effect immediately, unlike NVIDIA's frame rate limiter, which you can only configure inside the NVIDIA control panel and which requires a restart of the game for the changes to take effect.
Now, when you use Chill to dynamically cap the frame rate between, let's say, 60 and 120 FPS, you will see that the frame rate drops to 60 when no input is detected, and it increases when you start to move around. But even when you move around, the frame rate does not reach the maximum that you set in the configuration. It is also constantly changing while you are playing, which means that the frame times, and so the input delay, are also changing. I honestly do not know why you would want that. If you can think of a use case where you want the frame rate, and so the input delay, to be inconsistent, then please let me know in the comments down below.
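For illustration, here is a minimal sketch of the behaviour I am describing: a cap that ramps up while input is detected and falls back when you stop moving. AMD has not published how Chill actually chooses the limit, so the heuristic and the step size below are my assumptions, not AMD's algorithm:

```
def update_frame_cap(current_cap, min_fps, max_fps, input_active, step=10):
    """Raise the cap towards max_fps while the player provides mouse or
    keyboard input; let it fall back towards min_fps when idle.
    The step of 10 FPS per frame is a made-up value."""
    if input_active:
        return min(max_fps, current_cap + step)
    return max(min_fps, current_cap - step)

# Example: idle at first, then three frames of mouse movement.
cap = 60
for active in (False, True, True, True):
    cap = update_frame_cap(cap, min_fps=60, max_fps=120, input_active=active)
    print(cap)  # 60, 70, 80, 90 -> the cap ramps up but may never hit 120
```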
If you care about a consistent and smooth gaming experience, then you want to play at a stable frame rate with stable frame times. To achieve that with Chill, you simply enter the same value for both the minimum and the maximum FPS settings. This then gives you perfectly stable frame times, like you would get when you use RTSS, the RivaTuner Statistics Server, to cap your frame rate.
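A frame rate cap is really just a fixed time budget per frame: 1000 / 60 ≈ 16.7 ms at 60 FPS and 1000 / 144 ≈ 6.9 ms at 144 FPS. As a rough sketch of the idea (not how RTSS or AMD's driver actually implement it), a pacing loop can sleep through most of the budget and busy-wait the last fraction, since plain sleeps are too coarse for perfectly flat frame times:

```
import time

def pace_frame(next_deadline, frame_time):
    """Wait until next_deadline, then return the deadline after it.
    The OS scheduler cannot sleep precisely, so the last ~2 ms are
    burned in a busy-wait to keep the frame times flat."""
    while True:
        remaining = next_deadline - time.perf_counter()
        if remaining <= 0:
            break
        if remaining > 0.002:
            time.sleep(remaining - 0.002)  # coarse sleep
        # otherwise: spin until the deadline passes
    return next_deadline + frame_time

frame_time = 1.0 / 60  # 60 FPS cap -> 16.7 ms budget per frame
deadline = time.perf_counter() + frame_time
for _ in range(3):
    # the game would render a frame here
    deadline = pace_frame(deadline, frame_time)
```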
It goes without saying that in order to achieve a stable frame rate and consistent frame times, your PC must be powerful enough to maintain the frame rate that you set for the frame rate limiter. So which frame rate you can set the limiter to depends on your PC and the graphics settings that you choose.
As I said, I cannot think of a use case where I would want Chill to dynamically adjust the frame rate limit, as this leads to inconsistent frame times and so an unpredictable amount of input delay. So whether you want to reduce the GPU load, the power consumption, and the noise produced by your graphics card; or you own a FreeSync monitor and want to keep the frame rate inside the variable refresh rate range of that display; or you just want to cap the frame rate at a value that your PC can always maintain, to enjoy games at a stable frame rate and consistent input delay: I recommend that you always set the minimum and maximum FPS settings to the same value, one that your PC can always maintain.
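As a concrete way to think about that recommendation, the cap you pick is simply the smallest of the constraints that apply to you. A tiny sketch; the numbers are example values, not measurements from this video:

```
def pick_chill_cap(sustainable_fps, freesync_max=None):
    """Choose one value for both Chill sliders: the frame rate your PC
    can always hold, kept inside the FreeSync range if you have one."""
    cap = sustainable_fps
    if freesync_max is not None:
        cap = min(cap, freesync_max)
    return cap

# Example: the PC holds 160 FPS at these settings, and the FreeSync
# range tops out at 144 Hz -> set Chill min and max both to 144.
print(pick_chill_cap(sustainable_fps=160, freesync_max=144))  # 144
```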
But why would you want to use Chill instead of RTSS or the FPS limiter that is part of the game? Generally speaking, the frame rate limiters that are built into the game engine provide the lowest delays. However, they do not provide perfectly stable frame times, as you can see here. To be fair, though, the frame time variation that you get with an in-engine limiter won't be the reason why you miss shots or have a bad kill/death ratio. So essentially, you always want to prefer the in-engine limiter when possible. But when a game does not provide a frame rate limiter, or when it only provides a few presets that don't work for you, then you are forced to use an external tool like RTSS or AMD's Chill.
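A common explanation for why in-engine limiters have lower delays is where the waiting happens relative to input sampling: the engine can wait before it reads your input, so the frame is simulated from fresh input, while an external tool can only delay around the present call, after the frame was already simulated from older input. The sketch below contrasts the two orderings; it is a simplified model built on that assumption, not the actual code of any game or of RTSS:

```
import time

FRAME_BUDGET = 1.0 / 60          # 60 FPS cap: one ~16.7 ms slot per frame
_next_slot = time.perf_counter()

def wait_for_next_slot():
    """Sleep until the start of the next frame slot."""
    global _next_slot
    _next_slot += FRAME_BUDGET
    delay = _next_slot - time.perf_counter()
    if delay > 0:
        time.sleep(delay)

def in_engine_limited_frame(read_input, simulate, render, present):
    # The engine waits BEFORE sampling input, so the input that
    # ends up on screen is as fresh as possible.
    wait_for_next_slot()
    present(render(simulate(read_input())))

def externally_limited_frame(read_input, simulate, render, present):
    # An external tool can only insert its wait around presentation,
    # AFTER the frame was simulated from (now older) input.
    state = simulate(read_input())
    wait_for_next_slot()
    present(render(state))

# Stub pipeline, just to show the call order:
in_engine_limited_frame(
    read_input=lambda: "mouse state",
    simulate=lambda inp: "world advanced with " + inp,
    render=lambda state: "image of " + state,
    present=print,
)
```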
So let's see how Chill stacks up against the in-engine frame rate limiters of Overwatch and Fortnite, as well as RTSS. To find out how these FPS limiters affect the responsiveness of these games, I measured the end-to-end delay that every player is affected by, which means the delay between the input, like pressing the left mouse button, and the gunfire showing up on the monitor. To measure that button-to-pixel delay, I use a high-speed camera and a gaming mouse which has an LED connected directly to the switch of the left mouse button. I then point the high-speed camera at the monitor and record what happens when I press the left mouse button. Inside the recorded high-speed footage, I look for the frame where the LED turns on, and then I count the frames until I see the action triggered by that input, which gives me the delay between the button and the pixel.
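Turning the counted frames into milliseconds only requires the camera's recording rate. A tiny sketch of that conversion; the 1000 fps camera speed and the frame numbers are made-up values for illustration, as the video does not state them:

```
def button_to_pixel_ms(led_on_frame, reaction_frame, camera_fps=1000):
    """Delay between the LED lighting up and the first frame in which
    the gunfire is visible, converted to milliseconds."""
    return (reaction_frame - led_on_frame) / camera_fps * 1000

# Example: LED first visible in frame 4120, muzzle flash in frame 4161
# -> 41 frames at 1000 fps = 41 ms button-to-pixel delay.
print(button_to_pixel_ms(4120, 4161))  # 41.0
```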
First, I used the built-in frame rate limiter in Overwatch to test the input delay at 60 and then at 144 frames per second. Next, I used RTSS to cap the frame rate, which, as expected, led to an input lag increase, as the in-engine frame rate limiters usually provide the lowest delays. Then I set the minimum and maximum values to 60 in Chill and measured the input delay, and then I repeated the same test with the minimum and maximum values set to 144. As you can see here, the input delays that you get with Chill in these cases pretty much match those of RTSS. I then repeated the same tests at 60 and 144 FPS in Fortnite, where the results from RTSS and Chill also show very similar input delays.
So Chill and RTSS provide pretty much the same input delay with very consistent frame times. If you want the lowest possible input delay and don't mind the very minor frame time inconsistency, then you should always use the frame rate limiter that is provided by the game. If the game does not have an FPS limiter, or if it only provides presets that don't work for you, then both Chill and RTSS are great options to cap your frame rate. I'm not trying to get you to stop using RTSS; it is an awesome application which I have used for many, many years. So if you are happy with RTSS, then please continue to use it. But if whatever application you use to limit your frame rate causes trouble with game capture or streaming applications like OBS, or if you simply don't want to use a separate application to cap your frame rate, then Chill is a great solution, and it is very easy to use thanks to the honestly quite incredible in-game overlay.

That said, AMD could still improve this feature.
It's quite painful to set the minimum and maximum values to a specific frame rate with these sliders. Sure, you can use the left and right arrow keys, but it would be much easier if I could just click on the value and then use the keyboard to enter the frame rate. I'd also like to see a toggle switch here, so that I can choose between entering minimum and maximum values or just entering a single value, as many gamers want a simple frame rate cap. I'd also like to see Chill appear inside the metrics overlay when it is used to cap the frame rate, or have some other form of notification that tells the player that Chill is currently limiting the frame rate. And I hope AMD will quickly fix the issue that the Chill option inside the Global Profile does not do anything at the moment.

And that's all for today.
Big shout-out to my patrons, as their support allows me to create videos like this one. If you enjoyed this video, then please give it a like, subscribe for more, and ring the bell to get notified when I upload my next video. I hope to see you next time! Until then, have a nice day and take care. My name is Chris, and this was Battle(non)sense.


100 thoughts on “AMD’s Chill vs. RTSS & In-Engine Limiters / Input Lag Test”

  1. That is NOT accurate. RTSS is very stable, while NVIDIA’s fps limiter is unstable and drops a lot more compared to RTSS. I’ve compared them in CSGO, and RTSS is much more stable over time. CSGO’s in-game fps limiter is crap!

    Still, I would like to see how responsive games are when you go by server fps, and whether the game is more responsive. For example, a 128 tick rate server updates your computer 128 times a second, so in reality you can only see optimally 128 fps at any given time. Which means everything above 128 fps, like 144 to 240 to 360 Hz monitors, is just false input because the server has its cap at 128. I have smooth 128 fps in CSGO with G-Sync on a 144 Hz monitor, BUT the game feels more responsive where you feel every bullet you shoot.

  2. The in-game limiter consistently had a shorter frame time than the set FPS would imply (~7 ms instead of 8.3 ms).
    With Chill’s dynamic FPS, you probably should have tried setting the MIN to 120 (the desired limit) and then the MAX to something higher, giving it the headroom to get frame times similar to the in-game limiter.
    The way you used the MIN/MAX limits was by setting the MAX to the desired limit. Try setting the MIN to the desired limit instead.
    Cheers.

  3. The only reason I would use the Chill feature is when I play strategy games where I don't care what my input is and I don't care as much what it looks like. Throttle my refresh rate so my computer doesn't fry. Maybe give my computer some leeway on how much it can and can't do. I don't know.

  4. hi battlenonsense.
    can you test the scaling options in the nvidia cp?
    do you get less input lag depending on whether you have it set to aspect ratio / fullscreen / no scaling / integer scaling mode in the manage 3d settings?
    does overriding the app by checking the box at the bottom of the scaling tab also vary the input delay with each different scaling mode?
    do you get less input lag scaling on your gpu or display, depending on the scaling option?
    does it matter at resolutions other than the monitor's native one, also with all those scaling settings?
    do any of these settings conflict or cause more input lag in combination with the windows setting, checking or unchecking the box that's "disabling high dpi scaling behavior" under the properties tab in the apps and games we play?
    lots of questions.

    sorry for so many questions, it's a relatively big topic to test and cover. i know you're the guy to do it if anyone is, also i'm sure you're very busy.
    i figured i'd at least try to ask. wish i could donate to this cause.

    me and my buds have been wondering this for YEARS! any help appreciated, as i can't find any concrete facts online about those settings in relation to input lag.

    Battle(non)sense, love all your content btw, you do the best and most sought-after tests out there by far. really getting rid of false claims and easing the minds of people like myself who are nitpicking the settings to perfection. at least it has for me, as now i know the real truth behind tons of settings thanks to you.

    your work is always appreciated.

  5. i remember that frame rate target control in radeon settings used to be super jittery to where it almost changed the way the game engine's pacing felt. good to see they've stabilized it… even if so many other radeon driver features and settings are so messed up right now

  6. On a 60 Hz monitor I got low input lag by limiting in-game to 59.94 fps, and on my 120 Hz monitor I get the lowest input lag with 119.94 fps, with v-sync on.

  7. Chill was literally created to save a little on power consumption by limiting FPS when nothing happens; using it as an FPS limiter is technically just a second function, which is why it's not perfect yet (it used to be just an on/off toggle, no sliders). I'm curious to see you review their anti-lag option though.

  8. One thing you may be interested in is that while those services that reduce lag are very unreliable at it, NoPing has an option that reduces user device input latency by prioritizing those packets to be sent to MMO and online game servers first. At first I thought, whatever, it can't be that great, but man, it's almost like a hack. In some games, like MMORPGs with PvP, it gives you an obvious edge.

  9. AMD Chill between min 47 and max 144 is nice if you play single-player games, for example, or games you play with a controller. It's even better for FreeSync users, because that is most of the time also the FreeSync range. The game still feels smooth and you use less power, which means less heat (and lower costs).

  10. Hi, could you please measure the input lag using NVIDIA's v-sync option "fast sync"? They claim that the option caps the fps the card sends to the monitor by dropping frames, but doesn't affect input lag because all frames are still rendered.

  11. My 5700 XT constantly underclocks; Prey runs at 1200 MHz and stutters everywhere I go. Is there a way to "prefer maximum performance" like in NVIDIA's control panel?

  12. 2:30 – The use for the min/max Chill is for people who want to save the most power while at the same time having minimal impact on a non-competitive game.
    Some people don't care if the framerate moves between 60 and 120.
    Furthermore, technically this is the same behaviour as if you set Min to 120 and Max to 120: the FPS could still go down by ~20 or more if you are in a heavy CPU or heavy GPU scene. So even if you try to force 120 FPS (via RTSS, via Chill, or in-game) … in modern AAA games on high/ultra you won't really have that constant upper limit. So the frame time will vary by a few ms… which is not really a problem unless you have V-Sync on; then mouse lag can get introduced (for that you should use AMD Enhanced Sync / NV Fast Sync).

  13. Really cool video as always! First of all, NVIDIA user here. Maybe NVIDIA cards are more powerful at this time, but I hadn't seen AMD's driver UI for many years, and I think it is really much more user-friendly than NVIDIA's UI, from what I can see in this video. I hope NVIDIA will do something about that. Also, I don't know if it's related to G-Sync or just my mind, but I tried to stop using RTSS in APEX LEGENDS, which is my main game at the moment, and to use the IN-GAME ENGINE FPS LIMITER provided by the Source engine (you can activate it using a launch command in ORIGIN). And the feeling, despite a theoretically better input lag, was terrible. Frame times are really unstable and my card doesn't maintain the FPS cap as efficiently as it does using RTSS! I know that you recommend switching to the in-game engine FPS limiter, but it didn't work for me in this specific case!

  14. Chill has 2 values because it dynamically decides your FPS depending on the scene's movement. This really helps to save some power while playing isometric games whose scenes are mostly static.

    One more thing: I remember reading why Chill doesn't work globally. Options like Radeon Sharpening, Chill and Enhanced Sync were disabled globally (even if one sets them on in the menus) because they can negatively affect desktop behaviour or videos playing fullscreen. Imagine your 60 FPS YouTube video suddenly becoming 30 FPS just because you are not touching the mouse and the scene is mostly static 🙂

    Correction: apparently Chill is not based on scene action. I knew it that way but could not find any article, so it looks like it is based purely on mouse and keyboard input.

  15. I'm sure you are already working hard to keep up with your consistent uploads, but you are impressively smart and incredibly pleasing to listen to! I realize that you already perform netcode analysis on most notable multiplayer games, but I feel like your channel would blow up in popularity if you managed to rebrand the netcode analysis series and generalize to appeal to more players (rather than just us folks that are more dedicated to the intricate details). The series would need to have a name that makes sense to any gamer hearing it for the first time and it would have to be something catchy that is easy to use in a casual (gaming-related) conversation. Every time a big multiplayer game comes out, every PC gamer out there should be thinking, "I can't wait for Battle(non)sense's ______ on this!" Perhaps drawing inspiration from the legendary TotalBiscuit (RIP) and his "Port Report" series, where you evaluate the major graphical settings available in a game and their impact on fps, but also with a strong consideration for input delay. In these videos you could also do a brief test of the game's in-game fps limiter and whether or not it works well. Being able to reliably reproduce this "report" for every big new game gives any and all gamers a reason to subscribe! You could even keep this separate from your Netcode Analysis series and manage to squeeze out two videos worth of content for every big game. This is all assuming that you have the time, but damn you really deserve to be widely renowned and it blew my mind when I referenced your channel to a couple friends the other day and they didn't know who you were 🙁

  16. Hello, Chris! As you may know, the most popular game of January on Twitch was Escape From Tarkov. I've seen both of your previous videos about the shocking netcode of this game. So, time has passed and the game is extremely popular; don't you feel interested in checking what has changed?

  17. Input lag which varies by a few milliseconds at most is not a real problem. Delta time ensures that your input is proportional to the frame time at all times. Only with massive frame drops is input lag at all noticeable.

  18. It's really hard to save a profile and then load it into every game profile you have. If I play a slow SP game I want different settings from when I play a PvP FPS, where I want the lowest latency, so having different profiles is good. If on the other hand you don't want this for some reason, just delete all game profiles.

  19. As always, a lot of valuable information in the video! Congratulations!
    I want to suggest that you do netcode analysis on non-FPS games that are popular too, like League of Legends. I think it would be very appealing to more people.

  20. Hello Chris,
    Yeah, very bad video. I've used Chill for quite some time, so let me give you some context.

    Chill is meant to be a power-saving feature, not a constant-fps thingy. The use case for it would be if you're e.g. waiting for a raid to start and you're basically just standing around, texting or maybe checking someone's gear; your GPU doesn't need 200-300 watts for that. So it renders fewer frames, ergo less power draw (maybe 20-40 W). When the raid starts you can turn it off by pressing F11 (it also makes a sound so you know it's turned on).

    A few side notes:

    If you want to see the higher fps limit with Chill, increase your mouse movement.
    FRTC is meant for 2D, so games like Tropico don't render 1k fps while loading.
    Some features in the Radeon settings are limited to certain APIs, so they will not work in some games. It's not a bug.
    IMO you should take this video down, read the documentation and then review it.

  21. Chill is great wtf? Low latency when you're actually inputting M/Kb and higher latency when you're not while still being crazy smooth.

  22. Battle(non)sense

    The purpose of Chill originally was to save power in scenes where you do not need stable frame rates/times. The prime example given by AMD was WoW, where they experimented with framerates between 31 and 60 FPS, as WoW is not a title which really needs more than 60 FPS, and they claim that the difference in WoW between 30 and 60 is not really noticeable. That is the only possible scenario that comes to mind for me.

  23. Chill is not an FPS capper… that is another feature. Chill is meant for when your PC is so powerful it runs the game beyond your monitor's cap when that isn't needed, for example WoW. Do you need 500 fps? No… so you can set the min to 40, 50, or how I do it, 70 fps, then set the cap to 120. Thus, when you're just typing away, standing around, and so on, your GPU is not using all that power, producing noise, and so on, e.g. for laptop use.

    "Chill" is the keyword. You should have done your research on this feature, because you use it for a completely different reason.
    Now, while watching the video, I am amazed that Chill is a great solution… so I guess this is why they removed the other one.

  24. As far as I know, Chill is not meant for fast shooter games.
    It is a different type of limiter that tries to reach the FPS target by downclocking the GPU.
    Instead of rendering at full speed… high boost and high voltage (inefficient)… followed by a break until the target frame time is reached… Chill tries to adjust the GPU so it hits the FPS target, but without the high voltages (more efficient).
    And the upper limit is not meant to be reached… that has the same function as in normal limiters, stopping the card from producing high FPS when there is no load.
    The lower limit is the target without user input… as soon as there is user input, the GPU does max boost at first and then reduces clocks to reach approximately 10-20 FPS above the lower limit.
    I guess that by setting min and max too close, or at the same value, you kind of disabled Chill… and tested FRTC.
    I like Chill for slower games like Tomb Raider or RPGs… I set it to 55 and I get about 70 FPS in motion… and the upper limit to 100… it saves energy… and I don't need the last bit of responsiveness in these games.

  25. Thanks for the video.
    P.S. Some in-game limiters are broken, for instance in "The Outer Worlds"; you have to use external limiters.

  26. Strategy games and low-paced RPGs can benefit hugely from Chill, lowering power consumption and thermals for the GPU. Fast-paced sims or FPS games need low latency, so Chill is a bad fit there.

  27. I noticed screen tearing at some points? That should not be the case, or should it? At 3:37, where you turn around near the trees, you can notice it. If you look closely it's also there at 3:44, 3:45. It also seems to appear when you use the built-in engine limiter, like at 4:26, 4:32, 4:58 🙁

  28. I would like to see the AMD/NVIDIA driver FPS limiters and the RTSS limiter vs. uncapped FPS. I wonder if these external limiters even decrease input lag compared to not limiting the fps at all.

  29. Chill is mostly best for laptops so that you don't waste battery where there isn't a lot going on in game, but then when the action picks up, so will your FPS. It tries to give you the best in both worlds with saving power when little is happening and increasing your framerate when a lot is happening.
    It's also good for reducing heat as your GPU will do less work when very little is happening in game.

  30. Hi Chris. On the question of why you would want inconsistent input lag (I'm not talking specifically about AMD's Chill feature): when it's consistent, it stays at a lower value than the best your system can achieve. While the best input delay is not consistent and happens rarely, you can still benefit from it in certain situations. For example, when you shoot at longer range and there is not much happening around you to stress the system, you have the highest possible FPS and the lowest input lag, your actions are registered sooner, and you see enemy movements earlier. Snipers are a good example. Competitive gamers don't care too much about smooth gameplay; consistency is good, but it also means it's not the lowest possible input lag. I'd say being able to achieve or control a bit of both is the best scenario. If you play close to the server and have a connection advantage, this is irrelevant though, since the internet connection is the biggest bottleneck in real-time games.

  31. Hi, I wanted to ask: when you use RTSS, is it the default frame limiter or the scanline frame limiter? The latter is said not to add additional input delay.

  32. hey, i know the Quake Champions videos weren't super popular
    but would you consider going back to that game one more time?
    would love to see how that evolved over the years since

  33. Chill is designed to be a power-saving feature rather than just a framerate limiter. The variable frame time is a byproduct of trying to achieve that. I think the idea is that you're supposed to set the minimum to the lowest framerate you will accept and the maximum to what you want to achieve, and it does its best to use the lowest amount of power to get there, which introduces the difference in frame time. It's trying to determine the least possible amount of effort to generate the image on screen, so the less you move, the lower it decides the framerate has to be.

    Makes you wonder why they got rid of the actual framerate limiter though.

  34. I completely disagree with frame time variation not impacting gameplay much: stutter is FAR harder to compensate for than the slightly longer input latency that comes with RTSS. With that said, I don't want you to misunderstand: I am very appreciative of the testing you do. 🙂

  35. the overlay where you mentioned that the tearing in the video was due to using a capture card got my interest…
    I would love to see a video about different screen capture methods. How are software and hardware capture methods different? Do I have to sacrifice my gaming experience to get the best capture? How does my gaming refresh rate affect the capture quality? Can I use a variable refresh rate monitor but still get a good capture at 60fps? What's the best method overall in your opinion?

  36. I need to know what input lag, if any, is caused by enabling things like ReShade's/Freestyle's Clarity effect!

    It causes a drop of a few FPS, but I want to know roughly what input lag it adds.

    Thanks

  37. Chill doesn't work properly for me when I set the min and max values to be the same. I get terribly inconsistent frametimes with it.

    When I use the real FRTC setting from 2019 drivers I get very consistent frametimes. A good amount of people also have this issue and are sticking with the old drivers because the newer ones do not have FRTC.

  38. Global options not working is from the new Adrenalin 2020 edition. Global settings always worked before this update; now they don't. I had that painful bug with the Chill/Radeon Anti-Lag beep sound when pressing Alt or Ctrl. I disabled both Chill and RAL globally, but it didn't do anything in any game. Per-game settings do work though, so basically almost the same problem as yours. Nevertheless, great video mate..

  39. Two minutes in, and the part about Chill not working unless you enable it specifically for that game is just a lie. I have tested it in 5 games (FIFA 20, Dishonored 2, The Division 2, BF4 and BF5), and there is no need to enable it for every game, because just enabling it in the graphics tab enables it for everything.

  40. Please make a video about CPU versus GPU usage in games. My hunch is that most games are badly optimized when it comes to CPU usage.

  41. RTSS has been causing my system to crash since I upgraded to a 5700 XT. And some games, like League of Legends, simply crash on launch if it's running.

    There may be some issues with Enhanced Sync and FreeSync causing the glitches with RTSS, but I'm not sure.

  42. They need to get this into game engines directly, as a universal driver framework, so that there's no difference anymore.

  43. I have a question.
    I can game at 144 fps, no problem. But in every game the fps can sometimes drop for like 1 second. It happens.
    What if I set Chill to min 144 and max 144? What happens when the game drops fps?

  44. I don't know if you are just dense, but with 60/120 min/max settings, the frame time changed in sync with mouse and keyboard input, so in the end the final input lag, delay, whatever, would be the same as with 120/120 min/max settings (maybe a tad larger, but let's be honest, only the top <10% of skilled players would care about it, and even fewer would notice it).

    I also hoped that you would test the delay with 60/120, but I guess that's too much to ask.

  45. There needs to be a small dynamic frame rate limiter. Let's say you set the GPU usage limit to something like 95%, so the GPU usage stays consistent and you'd have the lowest input lag at all times with relatively low frame rate inconsistency.

  46. Say what you will about AMD cards, they've got some smexxy-looking software. Looks like it works well too. Funnily enough, I moved from my R9 270X to a 1070 because I wanted GeForce Experience along with ShadowPlay, but I never use them. Now AMD provides tools as good if not better. I might switch back if they get ray tracing down.

  47. Thank you for this awesome video @Battle(non)sense

    But please, I have a question: I play with a bad internet connection from a bad ISP (8 Mb down, 1 Mb up). Which mode is better for online gaming, ADSL2+ or GDMT? And I have lower ping with GDMT.

  48. I just want to make people aware that Radeon Chill was meant for saving power and lowering temperature and noise. It works wonderfully in non-online settings. Also, there isn't much difference between 8 and 12 ms.

  49. Pro tip: Use both in-game and external frame rate limiter at the same time. This will minimize the input lag while also retaining more consistent frametimes.
    The input lag will still be higher compared to only using the in-game limiter, however it will be lower compared to just using an external frame rate limiter.

  50. The global profile settings work if you don't have a profile for that game. For example, I don't use any game specific profile (I deleted them all) so I use the global for every game.

  51. Your videos always contain top quality content with no nonsense or pointless filler. Keep up the great work!

  52. "I reached out to AMD and reported this behavior, but did not get an answer yet" -> As a Vega 56 owner I feel like having fantastic hardware with utterly missmanaged software. Features and driver updates are like russian roulette. It may work fine, it may have no sensible concept what so ever, or it may simply break something obvious or not so obvious.

    I simply do not understand how AMD can be so awesome on the cpu market while at the same time be super incompetent for years on the GPU software side.
