Input latency with vsync turned on?
First, my specs:
Intel Q9950 2.3GHz overclocked to 4.0 GHz
4 GB DDR2
2 x ATI 4870X2 CrossFired
Windows Vista Home Premium SP1
All software and drivers up to date
I play at 1920 x 1200 fullscreen. I notice with ZDoom that if I turn on vsync, there's a noticeable latency or delay in my input. When I move the mouse around, I can "feel" the movement lagging behind. Same with the movement keys. I'm pretty sure about this because the instant I type vid_vsync 0, input suddenly feels more responsive.
This doesn't happen in GZDoom with the OpenGL renderer and vertical syncing. It only happens in ZDoom.
I know I can just play without vsync in ZDoom, but the screen tearing is quite obvious. Any solutions?
Re: Input latency with vsync turned on?
Is cl_capfps true or false?
Re: Input latency with vsync turned on?
My FPS is uncapped. 35 FPS sucks.
EDIT: Just some clarification. I am getting perfectly fine frame rates with vsync. It stays at 60. It's just my input that feels like it's lagging. When I move the mouse around, it'll take like a few milliseconds before I actually see the screen move. It's not obvious, but as I said, it just doesn't feel right. Turning off vsync removes this latency, but then I get screen tearing because the game runs at 200 FPS. :P
Re: Input latency with vsync turned on?
Fixing this might require you to cap your framerate. Try that as well.
Re: Input latency with vsync turned on?
Ed: Maybe you didn't hear him say that he hates capping the FPS. Yes, I have experienced input lag as well when using VSYNC. I guess it's because the computer needs a little extra time to gather all of the information necessary to sync up the frames. The result is that your actions are delayed slightly (but noticeably!). I usually play Skulltag, so in order to not get owned all the time, I turned VSYNC off. I can deal with the tearing.
Re: Input latency with vsync turned on?
My only other option would be to increase ZDoom's allowed refresh rate to reduce the tearing. Figure out the highest refresh rate your monitor will handle, then set 'vid_refreshrate' to that number.
Example: vid_refreshrate 85
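Putting the suggestions from this thread together, one possible console setup looks like the following (these CVAR names are the ones mentioned in the thread; the 85 Hz value is just an example, and whether your monitor accepts it at 1920 x 1200 depends on the hardware):

```
vid_refreshrate 85
vid_vsync 0
cl_capfps false
```

With vsync off and the refresh rate raised, tears still occur but each one is on screen for less time, so they tend to be less noticeable.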
Re: Input latency with vsync turned on?
IMPORTANT UPDATE:
This problem was noted and posted to some extent by me and some other dude, and my first solution was:
r_forceddraw 1
and that fixed the lagging while vsync was turned on, but I had to use displaybits 16, since in DDraw displaybits 8 results in terrible skipping when screen flashes happen. And displaybits 16 results in lower performance, noticeable in KDIZD.
BUT RECENTLY:
Sorry to see you have ATI graphics, because in recent NVIDIA drivers, there is an option:
Maximum pre-rendered frames
The default is four, so I set it to zero, and the ZDoom lag went away. The driver notes that this option only affects DirectX games.
So see if your graphics drivers have an option for pre-rendered frames, or if that fails, switch to DirectDraw.
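For a rough sense of why the pre-rendered frame queue matters: under vsync, each queued frame adds roughly one refresh interval of delay on top of the frame currently being scanned out. A back-of-the-envelope estimate (illustrative only; actual latency also depends on the game loop, the driver, and the display):

```python
def vsync_latency_ms(refresh_hz, prerendered_frames):
    """Approximate worst-case input-to-display latency under vsync:
    each pre-rendered (queued) frame, plus the frame being scanned
    out, costs about one refresh interval."""
    frame_time_ms = 1000.0 / refresh_hz
    return (prerendered_frames + 1) * frame_time_ms

# A driver default queue of 4 frames at 60 Hz:
print(round(vsync_latency_ms(60, 4), 1))  # ~83.3 ms
# Queue reduced to 0:
print(round(vsync_latency_ms(60, 0), 1))  # ~16.7 ms
```

This is why dropping the queue from four to zero makes vsync feel dramatically more responsive even though the frame rate stays pinned at 60.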
Re: Input latency with vsync turned on?
Apparently, the setting is called "flip queue size" in ATI drivers. I tried to find a way to set this from within ZDoom with no luck.
Re: Input latency with vsync turned on?
Weird, it seems like an option that an application should be able to control.
I wonder if there should be a WIKI section for common problems and their fixes. Many people now have had this particular problem.
- leileilol
Re: Input latency with vsync turned on?
You could try terminating the ati2evxx processes (ATI's stupid hotkey poller that screws up priority).
Re: Input latency with vsync turned on?
phi108 wrote: I wonder if there should be a WIKI section for common problems and their fixes. Many people now have had this particular problem.
Actually, there's already a [wiki]FAQ[/wiki] page at the wiki, and one of the sections listed is for troubleshooting, but its content hasn't been updated in ages.
[edit] Fixed a few outdated entries and added one for this issue. Somebody who knows exactly what needs to happen to fix the issue (both for NVidia and ATI) needs to go touch it up, because the info given here is pretty bare-bones.