You could measure, with a ruler, how far you have to move the mouse to get from the left side of your desktop to the right, then go in-game and set the sensitivity so it takes the same mouse movement in centimeters to turn through whatever FOV you are using.
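If you want to put rough numbers on that, here is a quick sketch in Python (the DPI, desktop width and FOV below are example assumptions, not anything from this thread):

dpi = 800              # mouse DPI (example assumption)
desktop_width = 1920   # desktop width in pixels (example assumption)
fov = 90               # in-game horizontal FOV in degrees (example assumption)

# Assuming 1 mouse count moves the pointer 1 pixel (6/11 in Windows, no accel),
# the physical distance needed to cross the desktop once is:
cm_across_desktop = desktop_width / dpi * 2.54
print("cm to cross the desktop:", round(cm_across_desktop, 2))

# The ruler method: tune in-game sensitivity until turning by `fov` degrees
# takes that same distance, which implies a cm/360 of roughly:
print("implied cm/360:", round(cm_across_desktop * 360 / fov, 2))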
I didn't answer because by the time he was done typing that question he should have come up with what you said and realized it's stupid to want a 3D sens to match a 2D sens anyway.
I might be wrong, but I think "the same sensitivity" means 1 pixel = 1 DPI of movement.
Use this -> http://phoon.us/mouse/ to find what "estimated useful dpi =" is, and try to match it to your actual DPI by changing the sensitivity value.
Remember to always click Calculate for "real sensitivity (inches/360) =", because the result from that is used to calculate the above parameter.
For me it's 5.8116, with 800 DPI, resolution 1280, FOV 110.
I guess that you have 6/11 in Windows, so the "windows sensitivity multiplier" for this equals 1, not 6.
Are you sure your screen resolution is correct? A width of 1050 is strange.
I think you filled in your screen resolution (1600x1050) in the mouse resolution and screen width fields. :D
I've heard that you just have to set m_rawinput and sensitivity to 1 and it will be the same sensitivity as on the desktop, but in-game it feels way too slow for me.
The formula is (360*TAN((fov*pi)/180)/2)/(pi*horizontal resolution in windows*m_yaw)), assuming that you have 4:3 resolutions on both.
Though I'm not really sure, you'd better /summon injx if I'm wrong.
That would give a 1-pixel turn in the middle of the screen for a 1-mouse-count move if you have the same resolution in-game and in Windows, for example, or in general the same physical speed of the area around the crosshair projection as the physical Windows pointer speed at your Windows resolution.
But that's not really 1:1 sensitivity in-game and in Windows.
Well yeah, a slight bracket misprint there; it should be
(360*TAN((fov*pi/180)/2)/(pi*win_width*m_yaw))
As for your formula, it gives some enormously huge sens as far as I noticed. Also, mine was basically given by injx himself some time ago (well, not this one exactly, but the single-pixel turn angle formula with in-game resolution sensitivity, which is easily modified into this one), and unless he was mistaken back then I'm pretty sure it's correct: http://www.wolframalpha.com/input/?i=%28360*TAN%28%2890*pi%2F180%29%2F2%29%2F%28pi*1280*0.022%29+
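If anyone wants to check that locally instead of through the Wolfram Alpha link, here is a quick one-off in Python using the same numbers from the link (FOV 90, width 1280, m_yaw 0.022):

import math
# Same computation as the Wolfram Alpha query above: fov 90, width 1280, m_yaw 0.022
print(360 * math.tan(math.radians(90) / 2) / (math.pi * 1280 * 0.022))  # ~4.07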
A formula for setting the in-game sensitivity so that it is "the same" as the desktop.
Using phoon's variable names (and injx's formulas):
f = field of view (FOV, in degrees)
y = m_yaw (ConVar)
D = DESKTOP resolution width
pi = 3.141592654
s = sensitivity (ConVar)
Calculate the in-game sensitivity, s:
s = (360 / pi) * tan(f / 2) / (D * y)
Note: Because in-game sensitivity works on DEGREES of rotation, the in-game resolution width (g) is NOT used, but the DESKTOP resolution width is used.
That makes 1 mouse count move the in-game view by the same ANGLE as 1 DESKTOP pixel. That may or may not be 1 IN-GAME pixel, but I think it's more important that the ANGLE is the same.
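In code form, here is a minimal sketch of the same thing (the function and variable names are mine, and m_yaw defaults to the usual 0.022):

import math

def ingame_sensitivity(fov_deg, desktop_width, m_yaw=0.022):
    # s = (360 / pi) * tan(f / 2) / (D * y), with f given in degrees
    f = math.radians(fov_deg)  # tan() wants radians
    return (360 / math.pi) * math.tan(f / 2) / (desktop_width * m_yaw)

# Illustrative example: FOV 110 on a 1280-wide desktop
print(ingame_sensitivity(110, 1280))  # ~5.81 (compare the 5.8116 quoted earlier for FOV 110 / width 1280)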
I see this is very like h8m3's formula, but a little different...
Any ideas if this applies even if the in-game resolution is different from the desktop resolution? Or is it fine as long as the aspect ratio is the same?
This does apply even if the in-game resolution is different from the desktop, and is 100% fine when the in-game aspect ratio is the same as the desktop aspect ratio.
Some hypothetical examples:
- With both in-game and desktop at 1920×1080, a movement of 10 mouse counts would move the desktop pointer 10 pixels across the centre of the screen (10 pixels anywhere on the screen, in fact), and that same 10 mouse counts would swing the in-game viewport about 10 pixels left or right at the centre of the screen.
- With the desktop at 1920×1080 and in-game at 960×540, the 10 mouse counts would swing the in-game viewport about 5 pixels, but those pixels are twice as large as desktop pixels (because of the halved resolution), so it will look visually the same.
If the in-game aspect ratio is different, it might not quite work...
If the game handles the different aspect ratio by adding black bars to the left and right, then I think using the formula with a smaller, adjusted D value will work. If in-game is 4:3 and the desktop is 16:9, then use:
D = 4/3 × 9/16 × Actual_Desktop
If the game handles the different aspect ratio by stretching the image to fit, then the original formula is correct for horizontal movement, but the stretched/warped view gives vertical movement an effectively different sensitivity.
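A small sketch of that black-bars adjustment, in case it helps (the 1920 desktop width, 90 FOV and 0.022 m_yaw are example assumptions):

import math

def adjusted_desktop_width(desktop_width, game_aspect=4/3, desktop_aspect=16/9):
    # With black bars, only part of the desktop width is covered by the game image,
    # so scale D by the ratio of aspect ratios: (4/3) * (9/16) = 3/4
    return desktop_width * game_aspect / desktop_aspect

D = adjusted_desktop_width(1920)  # 1920 * 3/4 = 1440
print((360 / math.pi) * math.tan(math.radians(90) / 2) / (D * 0.022))  # sens using the adjusted D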
Basically, just to explain myself: I always find myself changing my sensitivity (in cpma, ql, cs, etc.) to somewhere between 2 and 3, and what I discovered is that it starts to feel 'weird' after I haven't been playing for a while and have been playing a lot of dota, etc.
In games where I don't care about FPS, such as dota, I just use the native res of 1920x1080, and by using your formula I have come to the conclusion that the magic sensitivity is 2.71. This actually makes it feel the same between the desktop, dota, and the likes of quake and CS!
What FOV are you using? Are you sticking to 90 in quake to make it feel the same in cs? (As cs doesn't have an adjustable FOV and the formula takes FOV into account.)
Still, I think the formula is crazy overcomplicated. Just measure left to right on the desktop vs. left to right in game (like the 360 thingy, just with one side of the screen instead of the crosshair). This gets me to sensitivity 2.7 in-game (game: Left 4 Dead 1; FOV should be around 90. Not sure if it's exactly the same as in L4D2 or CS:GO).
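For what it's worth, plugging a 1920-wide desktop, a 90 FOV and the default 0.022 m_yaw into the formula lands right around the numbers mentioned above (whether those were the exact inputs the posters used is an assumption on my part):

import math
# Assumed inputs: 1920-wide desktop, FOV 90, m_yaw 0.022
print((360 / math.pi) * math.tan(math.radians(90) / 2) / (1920 * 0.022))  # ~2.71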