How do you detect latency?

I've set up my new PC with the Nvidia driver's V-Sync option on the "Adaptive" setting ("a good compromise between latency and quality"). Everything looks good, very fluid, and the GPU isn't stressed. But I understand V-Sync can cause latency, even though I don't notice it. So my question is: how can I detect the latency induced by the V-Sync option? Is there some kind of test, or is it only intuition/observation?

(I know I can limit the FPS in the Nvidia control panel. I've tested it with different values, but it's not as fluid as the adaptive V-Sync option.)
 
The only real option to test the latency is to record a high-fps video of your real wheel together with the virtual wheel on screen.
I did some testing last year by mapping my num-lock key (which lights up the LED on my keyboard) as upshift, then putting my keyboard in front of my monitor and recording a video.
You then measure the delay between the LED (which activates practically instantly, within 0-1 ms) and the upshift shown by the game.

A 120 fps video is "okay" for that, though I would recommend 240 fps. Most smartphones can do this when you lower the resolution/quality; there should be a "slow motion" option in your camera app.
Or if you have a GoPro/camcorder, check whether it can record high fps with that.

You then need a player that can slow down the playback until you can see single frames, or a video editor.
I use Magix Vegas and count the frames.
At 240 fps, 1000 ms / 240 = 4.167 ms from frame to frame.
You mark the frame where your input happens, then count the frames until the game shows the input on the monitor.
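
If you don't want to do that multiplication by hand, here's the same arithmetic as a minimal Python sketch (the frame count is a made-up example value, not a real measurement):

```python
# Convert a counted number of video frames into input latency.

def latency_ms(frames_counted: int, recording_fps: float) -> float:
    """Frames between the LED lighting up and the on-screen reaction,
    times the duration of one video frame."""
    return frames_counted * 1000.0 / recording_fps

# e.g. 12 frames counted in a 240 fps recording (hypothetical numbers):
print(latency_ms(12, 240))  # -> 50.0 ms (12 * 4.167 ms)
```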

Pretty complicated... :whistling:

Here's a test video comparing AC vs ACC. The input lag is very low though, since I'm using a G-Sync monitor, which has very low input lag in general.

 
Wow, great answer. I'll try that
Thanks!
Oh btw:
Can you tell us your PC components and your monitor model?

The description of adaptive vsync is misleading...

There are a few general things I guess you don't know yet:

- When using normal vsync, it will be smooth and fluid, but you'll have 2 frames "cached" for smoothing things out. That's the famous vsync input lag.

- When your PC drops below your vsync fps (usually 60, depending on the Hz of your monitor), one frame will be displayed a second time until your PC has the next frame ready.
That's the famous "stutter" that happens when you can't maintain stable fps with vsync (there's a little sketch of this after the list).

- When you don't use vsync, the frames from your PC will simply be thrown onto the monitor whenever they are ready.
Your monitor displays each frame from top to bottom (with a 240 fps camera you might see this).
If the next frame is pushed to the monitor in the middle of this "scan-out" process, you will see "tearing": a cut in the displayed image.
Often the cuts are very small, so instead of tearing you might only notice stuttering.

- With adaptive vsync, vsync is activated when you reach 60 fps (or whatever the Hz of your monitor is) and deactivated when your PC drops below that value.
That's to avoid dropping down to 30 fps (one frame being displayed twice). You'll get some tearing and some stutter, but not the "pause"-like stutter you'd have with normal vsync.
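
To make the "one frame displayed twice" behaviour concrete, here's a minimal Python sketch of vsync presentation (my simplification, not how any specific driver works: no frame queue, a finished frame just waits for the next refresh tick, and the render times are made-up values):

```python
import math

REFRESH_MS = 1000.0 / 60.0  # ~16.67 ms between refresh ticks at 60 Hz

# Hypothetical render times; the third frame takes longer than one refresh.
render_times_ms = [15.0, 16.0, 34.0, 15.0]

t = 0.0
last_tick = 0
for i, render in enumerate(render_times_ms):
    t += render                       # when this frame finishes rendering
    tick = math.ceil(t / REFRESH_MS)  # earliest refresh tick it can appear on
    if tick - last_tick > 1:
        print(f"  (previous frame repeated for {tick - last_tick - 1} tick(s))")
    print(f"frame {i}: ready at {t:6.2f} ms, displayed on tick {tick}")
    last_tick = tick
```

The slow frame misses its tick, so the previous frame stays on screen for an extra refresh: that's the vsync stutter from the second bullet.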

Adaptive vsync doesn't like fps limiters! At close to 60 fps, the limiter will cause vsync to activate and deactivate all the time instead of maintaining a stable state.

Also, adaptive vsync will have varying input lag: 0 cached frames below 60 fps, 2 cached frames at 60 fps.
Your body won't get used to either, so I don't like this...
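
Rough numbers for that jump, assuming each cached frame adds one full refresh interval of delay (a simplification on my part, real pipelines vary):

```python
# Back-of-the-envelope extra input lag from cached frames at 60 Hz.
# Assumes each queued frame adds one full refresh interval of delay.
refresh_ms = 1000.0 / 60.0  # ~16.67 ms per refresh

for cached_frames in (0, 2):
    print(f"{cached_frames} cached frames -> ~{cached_frames * refresh_ms:.1f} ms extra lag")
```

So adaptive vsync toggles you between roughly 0 and ~33 ms of extra lag, which is exactly the inconsistency I mean.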


Finally, there's "adaptive sync". Important to notice that it's sync, not vsync!
It's mostly called G-Sync for Nvidia and FreeSync for AMD; adaptive sync is the technological term.

With adaptive sync, the PC will push out frames whenever it has them ready, but instead of tearing or stuttering, the monitor will change its refresh rate in real time to fit the PC's output.
When it's changing too much, you'll see stuttering. But you won't have tearing!

Since there are no frames cached, you have a very low input lag.

That's why I bit the bullet and bought a 1000€ gsync monitor in 2018.
And I love every day with it...


Additional info: you can also limit the fps with RivaTuner to decimal numbers, like 59.97 for example.
This, in combination with normal vsync, will keep the frame cache more or less empty.
You'll have a little hiccup every few seconds, but it's not really noticeable.
You will gain way better input lag with this!

It's not usable with adaptive vsync though, as that would cause vsync to activate/deactivate all the time.

So if you can maintain a stable 60 fps, use RivaTuner to limit at 59.97 fps and use normal vsync for a smooth, fluid image and low input lag.
I used this method for years! :)
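
For what it's worth, here's the arithmetic behind that "little hiccup" as I understand it (a sketch, assuming the hiccup is simply one repeated frame whenever the 59.97 fps stream has drifted a full frame behind the 60 Hz scan-out; in practice limiter jitter can make it more frequent than this idealized number):

```python
# A limiter slightly below the refresh rate falls behind by
# (refresh_hz - limit_fps) frames per second, so the display has to
# repeat a frame every 1 / (refresh_hz - limit_fps) seconds.
refresh_hz = 60.0
limit_fps = 59.97

print(f"one repeated frame roughly every {1.0 / (refresh_hz - limit_fps):.1f} s")  # ~33.3 s
```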
 
Yes, I know the Riva option, but you can now limit the FPS in the Nvidia control panel, after people asked for it for years... I'm trying different FPS limits now.

Yes, adaptive V-Sync is an inferior solution compared to G-Sync, but I wanted to "see" the latency effects, because visually it looks very fluid to me. I'll try a test like yours.

My monitor is FreeSync, not G-Sync compatible (it seems some of them are).
 
Yes, I know the Riva option, but you can now limit the FPS in the Nvidia control panel, after people asked for it for years... I'm trying different FPS limits now.
I know, but Riva can do 59.97; the Nvidia control panel can only do 59 or 60, no decimals...
Yes, adaptive V-Sync is an inferior solution compared to G-Sync, but I wanted to "see" the latency effects, because visually it looks very fluid to me. I'll try a test like yours.
The input lag, when it's fluid, will be exactly the same as with vsync. It's not a different technology, just an on/off switch for when the fps drops below 60.
But you can't use the 59.97 limiter from Riva with adaptive vsync :(
My monitor is FreeSync, not G-Sync compatible (it seems some of them are).
Mostly only the "gaming" FreeSync monitors are truly compatible, because their display controllers can handle high-speed changes.
Almost every 144 Hz FreeSync monitor is G-Sync compatible.
The 60 or 75 Hz and some 100/120 Hz FreeSync monitors are hit and miss.
Some can run perfectly fine within a small fps range like 55-65, which eliminates the micro-stuttering from a little drop to 58 fps for example, where vsync would cause a small stutter.

Can you give us your PC specs? :) CPU, GPU, monitor model. Maybe someone knows how to get G-Sync compatible mode working for you!
 
Here are the specs:
LG 27MP59G-P (27") IPS, 1920 x 1080, 16:9, 1 ms with MBR, 75 Hz, 250 cd/m², 1000:1, sRGB >99%, D-SUB x1, HDMI x1, DP x1

GPU: GTX 1660 Super

With an HDMI cable, the G-Sync option doesn't appear in the Nvidia control panel. I'll have to try with a DP cable.

With a 70 fps limit it seems OK, but still not as fluid as vsync. I'll try decimals with the Riva option.
 
Here are the specs:
LG 27MP59G-P (27") IPS, 1920 x 1080, 16:9, 1 ms with MBR, 75 Hz, 250 cd/m², 1000:1, sRGB >99%, D-SUB x1, HDMI x1, DP x1

GPU: GTX 1660 Super

With an HDMI cable, the G-Sync option doesn't appear in the Nvidia control panel. I'll have to try with a DP cable.

With a 70 fps limit it seems OK, but still not as fluid as vsync. I'll try decimals with the Riva option.
Monitor:
I quickly found your monitor in a sheet where someone says it's working fine.
Of course you need a DisplayPort cable for it. The "G-Sync" range should be 40-75 Hz.
The only issues are some black screens for 1-5 seconds in loading screens (probably because loading screens cause the fps to drop to something like 4 or 15, which is outside the G-Sync range).

70 fps limit:
Yeah, that will always look bad.
Your monitor is 75 Hz, so vsync only works at 75 fps.
If you limit at 70 fps, two things can happen:

Normal vsync: the real fps will jump between 37.5 fps (half of 75) and 75 fps all the time, causing stuttering.

Adaptive vsync: vsync will simply shut off, so it's exactly the same as disabling vsync.

You should either use:

Normal vsync: limit fps to 74.97, which will keep vsync fluid but keep the buffers empty and reduce input lag. This will cause a very little microstutter every few seconds (see the sketch after these two options).

Adaptive vsync: no limiter! When you limit the fps, adaptive vsync will deactivate vsync.
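
Same arithmetic as my earlier RivaTuner sketch, applied to your 75 Hz panel (again assuming, as an idealization, one repeated frame per full frame of drift):

```python
# 74.97 fps limit under 75 Hz vsync: the display repeats one frame
# every 1 / (75 - 74.97) seconds of accumulated drift.
refresh_hz = 75.0
limit_fps = 74.97

print(f"one microstutter roughly every {1.0 / (refresh_hz - limit_fps):.1f} s")  # ~33.3 s
```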

But really: grab (or buy) a DisplayPort cable (at least the 1.2 standard) and enable G-Sync compatible mode!
 

Great info from RasmusP as always. I'd just add that Low Latency Mode (formerly Max Pre-rendered Frames) can also be utilized to control queue size and latency.
BTW, why buffer 2? The default is 3, and with Low Latency on it's 1.

OP, some bedtime reading if you get bored.
 
(If you don't notice it, be very glad and don't try to. Your life will be much easier.)
True.. I should've put that as a disclaimer at the top of my post :roflmao: :whistling:
Great info from RasmusP as always. I'd just add that Low Latency Mode (formerly Max Pre-rendered Frames) can also be utilized to control queue size and latency.
BTW, why buffer 2? The default is 3, and with Low Latency on it's 1.

OP, some bedtime reading if you get bored.
True!
Low Latency "On" should work well with adaptive and normal vsync. "Ultra" will probably cause stutter, since it limits the fps to 1 below the refresh rate afaik (3 below with G-Sync on).

And true, the default is 3... I meant it more figuratively, "there is stuff cached somewhere, somehow", not spot-on technical :D
 
