Generically speaking, pixel, dot, and point are used as if they were the same thing, but they refer to different physical elements depending on the medium you're working in. On computer monitors, pixels matter. In printing, dots are what count. "Point" is the more generic term and can refer to either pixels or dots. The terms are commonly interchanged and often confused.
"Resolution" is the total number of [pixels, points or dots] wide, by total number of [pixels, points or dots] high. So a printer could have a resolution of 1200x1200 dots per inch, while a monitor could have a resolution of 1280x1024.
DPI and PPI are simply ratios. DPI is "dots per inch," PPI is "points per inch" or "pixels per inch." Those ratios increase and decrease based on the resolution (width x height, in pixels) and size (in inches) of a given medium.
To calculate the DPI, you need to determine the actual physical widths and heights of the medium. A common example is the Apple iPhone 4 screen:
Physical Width = 1.94 inches
Physical Height = 2.91 inches
Width (in pixels) = 640
Height (in pixels) = 960
The assumption is that all pixels, dots, or points occupy a square space. Therefore, the simple equation to determine PPI / DPI is to divide the pixel height by the physical height: 960 / 2.91, or roughly 330 PPI.
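A minimal sketch of that calculation, using the iPhone 4 figures listed above (square pixels assumed, so either axis gives the same answer):

```python
def ppi(pixels: int, inches: float) -> float:
    """Pixels per inch along one axis, assuming square pixels."""
    return pixels / inches

print(ppi(960, 2.91))  # vertical axis: ~329.9 PPI
print(ppi(640, 1.94))  # horizontal axis: ~329.9 PPI
```

Both axes agree because the pixels are square; if they disagreed, the display would have non-square pixels and a single DPI number would not describe it.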
This information helps to answer your question. Windows has no idea what the DPI of your display is, because it has no concept of the display's physical dimensions. You can buy 20" monitors with 1920x1080 resolution, as well as 70" monitors with the same 1920x1080 resolution. The two have significantly different DPIs, yet Windows cannot tell them apart.
While Windows offers the option of increasing or decreasing the DPI, all it really does is adjust system font sizes and default icon / UI element sizes. Many other apps, graphics, websites and emails can render badly distorted if you change the DPI setting.
Apple Mac OS (especially iOS) has significantly better support for DPI, and knows, based on the devices it is installed on, which DPI setting to use.
As far as my understanding goes, screen size is not what matters here; it's the screen resolution that is important.
In the simplest (idealized) case, each dot of movement the mouse detects corresponds to one pixel of cursor movement on the screen.
So, for example, if your screen resolution was 1920x1200 and your mouse was capable of a maximum DPI of 600, you'd have to move your mouse two inches to get from the bottom of the screen to the top. If your mouse used a DPI of 1200, it would only take one inch to make the same movement on the screen.
Therefore a higher mouse DPI lets you cover more screen distance with less physical mouse movement.
Higher resolution displays may require higher sensitivity or higher mouse DPI to attain the same amount of on-screen movement, or one would need a ridiculously large mouse-pad.
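Under the one-dot-per-pixel assumption above, the required hand travel is simply the pixel distance divided by the mouse DPI; a quick sketch using the numbers from the example:

```python
def mouse_travel_inches(pixels: int, mouse_dpi: int) -> float:
    """Inches of physical mouse movement needed to move the cursor
    `pixels` pixels, assuming one mouse dot maps to one screen pixel."""
    return pixels / mouse_dpi

# Crossing a 1200-pixel-tall screen, as in the example above:
print(mouse_travel_inches(1200, 600))   # 2.0 inches at 600 DPI
print(mouse_travel_inches(1200, 1200))  # 1.0 inch at 1200 DPI
```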
Sensitivity is software-based: it is simply a multiplier applied to the movement input sent by the mouse, scaling the raw DPI up or down to produce the final dots-to-pixels ratio.
For example, if you had the mouse on 3600 DPI and then set the sensitivity to 2.5/10, it would function the same as 900 DPI on 10/10 sensitivity.
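That relationship can be sketched as follows; the 10-step sensitivity scale is taken from the example above:

```python
def effective_dpi(hardware_dpi: int, sensitivity: float,
                  scale_max: float = 10.0) -> float:
    """Effective DPI after the software sensitivity multiplier is applied."""
    return hardware_dpi * sensitivity / scale_max

print(effective_dpi(3600, 2.5))   # 900.0 -- same as 900 DPI at 10/10
print(effective_dpi(900, 10.0))   # 900.0
```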
There are cases where the sensitivity setting in the supplied mouse driver is applied on top of the Windows mouse sensitivity, which can end in an unpredictable mess.
So, for your questions:
Same DPI but lower resolution: Mouse will cover larger physical screen territory with the same hand movement. It will be harder to click exactly on a specific small area on the screen.
Same DPI but higher resolution: Mouse will cover smaller physical screen territory with the same hand movement. Working with small screen objects is easier as the mouse is "slower".
Same DPI, same resolution, but different screen size: Getting from one side of the display to the other takes the same mouse movement, since the pixel count is unchanged; the cursor simply travels a greater physical distance on the larger screen. And if the same windows are displayed at the same pixel size, working inside such a window will feel the same.
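The cases above can be put in numbers. In this sketch the physical pixel size is approximated from the screen width alone, and the monitor dimensions are illustrative, not taken from any particular product:

```python
def physical_cursor_travel(move_in: float, mouse_dpi: int,
                           res_width: int, screen_width_in: float) -> float:
    """Physical on-screen distance (inches) the cursor covers for
    `move_in` inches of hand movement, one mouse dot per pixel."""
    pixel_size = screen_width_in / res_width  # inches per pixel
    return move_in * mouse_dpi * pixel_size

# One inch of hand movement at 800 DPI (hypothetical screen sizes):
print(physical_cursor_travel(1, 800, 1280, 18.4))  # lower resolution: larger sweep
print(physical_cursor_travel(1, 800, 1920, 18.4))  # higher resolution: smaller sweep
print(physical_cursor_travel(1, 800, 1920, 27.0))  # same resolution, bigger screen
```

Note that in the third case the cursor sweeps a larger physical distance, yet the number of pixels crossed, and hence the hand movement needed to cross the screen, is unchanged.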
To get across a large screen faster, you could use mouse acceleration together with fast, large gestures. In gaming, however, mouse acceleration can make you overshoot your target.
Both higher and lower mouse DPI have their advantages. I am, for example, currently using a mouse that has a DPI switch button, so I can change the DPI to suit my current task.
Text size is normally expressed in points (of which there are a fixed 72 per inch on digital displays), not pixels. Therefore 12-point text will take 12/72*96=16 pixels on a 96 DPI display, but 12/72*110=18.3 pixels on a 110 DPI display. Unless your display is capable of changing its native resolution on the fly (which would be a very neat trick to see), your text will appear 110/96-1=14.6% larger.
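The conversion above is straightforward to sketch, using the fixed 72 points per inch just stated:

```python
POINTS_PER_INCH = 72

def points_to_pixels(points: float, dpi: float) -> float:
    """Pixel height of text of a given point size at a given display DPI."""
    return points / POINTS_PER_INCH * dpi

print(points_to_pixels(12, 96))   # 16.0 pixels on a 96 DPI display
print(points_to_pixels(12, 110))  # ~18.3 pixels on a 110 DPI display
size_increase = 110 / 96 - 1      # ~0.146, i.e. the text appears ~14.6% larger
```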