Ok, I've read more than a few times that higher resolutions need a minimum DPI to achieve "pixel-perfect aiming". Can someone explain to me why this is?
Right now I'm thinking the basis of this idea is that a lot of people think DPI refers to how many pixels the cursor moves onscreen for every inch of movement on the mousepad, when it actually refers to the resolution of the images taken by the mouse's optical sensor. I.e., a mouse set to 1000 DPI takes images at a higher resolution, so each image of the mousepad (or whatever surface) is broken down into more segments. As a result, you have to physically move the mouse less for the sensor to detect a movement, hence the higher sensitivity you get at higher DPIs.
So is the whole pixel-perfect accuracy / higher DPI / higher resolution idea based on that first definition of DPI? Even if it is, it doesn't necessarily mean more DPI isn't needed at higher resolutions. I'm just looking for an explanation that fits with what DPI/CPI actually means.
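To make the pixel-perfect argument concrete, here's a rough sketch of the counts-to-pixels math. The 400 DPI and 5-inch swipe figures are just made-up example numbers, and pixels_per_count is a hypothetical helper, not anything from a real driver: if your DPI times the inches of mousepad travel that crosses the whole screen gives you fewer counts than the screen has pixels across, some pixels simply can't be landed on.

# Illustrative sketch with assumed numbers, not measured values.
# Models how many on-screen pixels the cursor moves per single
# mouse count, to show why low DPI can "skip" pixels at high res.

def pixels_per_count(dpi: int, inches_per_screen_width: float,
                     screen_width_px: int) -> float:
    """On-screen pixels moved per mouse count, given how many
    inches of mousepad travel cross the full screen width."""
    counts_per_screen = dpi * inches_per_screen_width
    return screen_width_px / counts_per_screen

# Example: 400 DPI mouse, 5 inches of travel to cross the screen.
for width in (1920, 2560, 3840):
    step = pixels_per_count(400, 5.0, width)
    print(f"{width}px wide: {step:.2f} px per count -> "
          f"{'can' if step <= 1.0 else 'cannot'} address every pixel")

With those numbers you get 2000 counts per screen, which covers every pixel at 1920 wide (0.96 px per count) but skips pixels at 2560 (1.28) and 3840 (1.92). That seems to be the whole basis of the claim: the count grid has to be at least as fine as the pixel grid.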