Mouse sensitivity is framerate dependent

LiquidLink

New member
Currently my game is very poorly optimized, so the framerate can jump from 200 fps down to 45 very quickly. This really highlights the issue of sensitivity being tied to framerate, and it is almost unbearable. When the framerate is high my sensitivity is very low, and when the framerate is low the sensitivity is very high. I tested this several times, and my current workaround is to lock the framerate to 60. Even though that temporarily makes the game playable, the changes in sensitivity are still very noticeable. I have experienced this with both mouse smoothing and raw input. Does anyone have any idea what may be going wrong? Or is this just an issue with the character controller?

Thanks to anyone who helps.
 
Do you have VSync disabled? Also, just to confirm, are you running the latest version?
 
Do you have VSync disabled? Also, just to confirm, are you running the latest version?

I have noticed the difference in sensitivity with VSync both on and off, and also in a fresh project from last week that only had UFPS 2 imported. So yes, I have had this issue on the latest version as well.

VSync shouldn't have anything to do with the issue anyway, correct? If you are on a 60 Hz monitor and the framerate drops to 40 you will notice it, the same as a 144 Hz monitor dropping lower. If this is a fundamental issue with mouse sensitivity then I think it is very important to get it fixed.

Also, when I made the new project I just ran the UFPS demo scene. All I did to notice the difference was add a simple script that locks the framerate to different values (30, 60, 120); see the sketch below.
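A minimal sketch of that kind of framerate-lock script, assuming the standard Application.targetFrameRate approach (the component name and exact values here are just illustrative):

Code:
using UnityEngine;

// Test-only helper: locks the framerate so the sensitivity difference is easy to reproduce.
public class FramerateLock : MonoBehaviour
{
    [Tooltip("Target framerate to test with (e.g. 30, 60, 120).")]
    [SerializeField] private int m_TargetFramerate = 60;

    private void Awake()
    {
        // VSync overrides targetFrameRate, so disable it for the test.
        QualitySettings.vSyncCount = 0;
        Application.targetFrameRate = m_TargetFramerate;
    }
}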

I hope that I am wrong about this and am just making a stupid mistake; however, I saw someone on the Discord complain that their sensitivity was very low when their game was running at very high framerates, which sounds like the exact same problem.

Thanks for any help and just let me know if you have any more questions.
 
Thanks for the details. When the look direction is retrieved it uses the frame count to determine if it is out of date, so it is framerate dependent. I have changed PlayerInput to expose a new variable that allows you to control the update rate and prevent it from updating too many times. You can make this change by replacing:

Code:
            if (!m_HasFocus || m_UpdateFrame == Time.frameCount) {
                return;
            }
            m_UpdateFrame = Time.frameCount;

With:
Code:
            if (!m_HasFocus || m_UpdateTime + m_LookUpdateInterval >= Time.unscaledTime) {
                return;
            }
            m_UpdateTime = Time.unscaledTime;

You'll need to rename m_UpdateFrame to m_UpdateTime and change it to a float. You should then add m_LookUpdateInterval to the PlayerInputInspector:

Code:
                EditorGUILayout.PropertyField(PropertyFromName("m_LookUpdateInterval"));

If, for example, you set the update interval to 0.02, it'll only update every 0.02 seconds.
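For reference, the supporting declarations described above might look something like this inside PlayerInput (the names come from the snippets in this post, but the attributes and default value are assumptions rather than the shipped code):

Code:
[Tooltip("The minimum amount of time (in seconds) between look vector updates.")]
[SerializeField] protected float m_LookUpdateInterval = 0.02f;

// Replaces the previous m_UpdateFrame int field.
private float m_UpdateTime;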
 
Hey, I just wanted to update this and say that I did implement this and it did not work. It made the game feel very choppy whenever the look update interval was longer than the frame time. For example, if the look update interval is set to 0.02 and I lock the game to 120 frames per second, it feels terrible. The other issue is that it does not actually fix the problem at hand: when the fps is lowered the sensitivity is still much higher.

I am not sure if you have any other ideas on how to fix it, but I would like to try.

This is the smoothest FPS controller I have found, except for this one issue, so out of curiosity I also wanted to ask what you did specifically to remove object jitter when moving while turning. Every time I try to make a first person controller, or I try to use someone else's (except yours), objects around the environment noticeably jitter when I move and turn at the same time. Is there any basic thing I am missing when it comes to this? This is the main reason that I want to make this one work so badly, because it certainly does feel smooth when locking the framerate.

Thanks for any help.
 
I found a temporary solution, but it is not great, and I am hoping you can improve it to account for all frames in between each fixed update. I just changed the look vector that is set in SetCameraLookVectorInternal in the DeterministicObjectManager to be dependent on the current delta time.

Code:
private void SetCameraLookVectorInternal(int cameraIndex, Vector2 lookVector)
{
    m_Cameras[cameraIndex].LookVector = lookVector;
}

I changed it to:

Code:
private void SetCameraLookVectorInternal(int cameraIndex, Vector2 lookVector)
{
    lookVector.x = lookVector.x * (int)(1 / DELTA_TIME) / 60;
    lookVector.y = lookVector.y * (int)(1 / DELTA_TIME) / 60;
    m_Cameras[cameraIndex].LookVector = lookVector;
}

Here DELTA_TIME is a variable that is set equal to Time.deltaTime in Update. The other numbers in there are more or less arbitrary values I picked: 1 / delta time gives the framerate, and I then divide by 60 since I tuned my sensitivity at 60 fps. I use the resulting value (which I had to cast to an int for reasons I do not understand) as a multiplier to standardize the look vector. This is clearly not perfect, as it only multiplies by the most recent delta time. I would assume something like this could be incorporated earlier in the pipeline, like when the look vector is calculated.
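A minimal sketch of how DELTA_TIME is cached, per the description above (the field name and the Update hook are from this post; the initial value is just there to avoid dividing by zero on the first frame):

Code:
// Cached every frame so SetCameraLookVectorInternal can scale by the current framerate.
private float DELTA_TIME = 1f / 60f;

private void Update()
{
    DELTA_TIME = Time.deltaTime;
}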

Basically I'm asking if you could either incorporate an actually good version of this code or could help me figure out how to. I do think fixing this issue would benefit everyone using the character controller though.
 
Hmm, when I tried that code it mostly returned a value of 0 because the value was being cast to an int. There is a TimeUtility.FramerateDeltaTime property that you can use instead, though. Within PlayerInput change:

Code:
                m_CurrentLookVector.x *= (m_LookSensitivity.x + lookAcceleration);
                m_CurrentLookVector.y *= (m_LookSensitivity.y + lookAcceleration);
to:
Code:
                m_CurrentLookVector.x *= (m_LookSensitivity.x + lookAcceleration) * TimeUtility.FramerateDeltaTime;
                m_CurrentLookVector.y *= (m_LookSensitivity.y + lookAcceleration) * TimeUtility.FramerateDeltaTime;
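The general idea behind a property like that is to scale the per-frame look vector by how long the frame actually took relative to a reference framerate. A rough illustration of the concept (this is an assumption about the idea only, not Opsive's actual TimeUtility code):

Code:
using UnityEngine;

// Illustration only: one way a framerate-normalized delta time can be defined.
public static class NormalizedTime
{
    private const float c_ReferenceFramerate = 60f; // assumed reference rate

    // Roughly 1 when a frame takes 1/60 s, larger on slower frames, smaller on faster ones.
    public static float FramerateDeltaTime
    {
        get { return Time.deltaTime * c_ReferenceFramerate; }
    }
}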
 