You have to understand the difference between a hardware pixel and a software pixel. The resolution you set in the display settings is just a conversion parameter.
A monitor of a given screen size (17", for example) has a fixed number of hardware pixels, which equals the maximum resolution of that specific monitor. One hardware pixel is one dot that can show any color. As technology evolves, the number of pixels that fit into one unit of measurement grows: smaller and smaller pixels, like bigger and bigger screen sizes, give more pixels and therefore a bigger maximum resolution. With more pixels you can see smaller details of an image on screen.
The monitor advertises its maximum resolution; your OS detects that and offers only the resolutions the monitor supports. If you set a lower software resolution than your monitor's native one, the video card scales up the image going to the monitor, for example by making a square of 4 hardware pixels show the same color that 1 pixel of your image contains. The lower the resolution, the more hardware pixels are "unused" - marked with the same color as their neighbors.
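The scaling described above can be sketched as nearest-neighbour upscaling: each software pixel is repeated into a block of hardware pixels. The 2x2 color grid and factor of 2 below are made-up values for illustration, not anything a real video card exposes.

```python
def upscale_nearest(image, factor):
    """Repeat each software pixel into a factor x factor block of
    hardware pixels - what the scaler does at sub-native resolutions."""
    out = []
    for row in image:
        # Stretch the row horizontally: each pixel repeated `factor` times.
        stretched = [p for p in row for _ in range(factor)]
        # Then repeat the stretched row vertically `factor` times.
        for _ in range(factor):
            out.append(list(stretched))
    return out

src = [["red", "blue"],
       ["green", "white"]]

for row in upscale_nearest(src, 2):
    print(row)
```

Each source pixel ends up as a 2x2 block of identical hardware pixels, which is why a low resolution on a sharp panel looks blocky.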
An image contains a fixed number of pixels; how large a pixel appears depends on the monitor and the resolution in use. So if you have a 1600x1200 screen but are running at 800x600 resolution, an 800x600 image will appear twice as large (in each dimension) as it would at 1600x1200.
If you view a 1600x1200 image on an 800x600 screen, you will see only 1/4 of it at a time. If you zoom it out to fit the screen, you will see all of it, but simplified: depending on the ratio (in this case 1/4), each square of 4 image pixels is reduced to 1 screen pixel whose color/brightness is the average of those 4 pixels.
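That averaging step can be sketched as a box filter over grey values. The 4x4 grid of sample brightness values below is invented for illustration; real viewers use fancier filters, but the averaging idea is the same.

```python
def downscale_average(image, factor):
    """Average each factor x factor block of grey values (0-255)
    into one output pixel - a simple box-filter downscale."""
    out = []
    for y in range(0, len(image), factor):
        row = []
        for x in range(0, len(image[0]), factor):
            block = [image[y + dy][x + dx]
                     for dy in range(factor)
                     for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

src = [[  0,   0, 255, 255],
       [  0,   0, 255, 255],
       [255, 255, 255, 255],
       [255, 255, 255, 255]]

print(downscale_average(src, 2))  # [[0.0, 255.0], [255.0, 255.0]]
```

Each output pixel carries the average of a 2x2 block, so fine detail smaller than one block is lost - exactly the "simplification" described above.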
There is a chain on which the maximum usable resolution depends: the monitor screen size, the panel's pixel density, the graphics card's output capabilities, and the operating system's output capabilities. If any link in that chain fails to support your monitor's maximum resolution, you will not be able to use it at its full potential.
Generically speaking: pixel = dot = point. They are different physical elements, depending on the medium you're working in. On computer monitors, pixels matter. In printing, dots are what count. Points are more generic and could refer to pixels or dots. The terms are commonly interchanged and often confused.
"Resolution" is the total number of [pixels, points or dots] wide, by total number of [pixels, points or dots] high. So a printer could have a resolution of 1200x1200 dots per inch, while a monitor could have a resolution of 1280x1024.
DPI and PPI are simply ratios. DPI is "dots per inch," PPI is "points per inch" or "pixels per inch." Those ratios increase and decrease based on the resolution (width x height, in pixels) and size (in inches) of a given medium.
To calculate the DPI, you need to determine the actual physical widths and heights of the medium. A common example is the Apple iPhone 4 screen:
Physical Width = 1.94 inches
Physical Height = 2.91 inches
Width (in pixels) = 640
Height (in pixels) = 960
The assumption is that all pixels, dots, or points occupy a square space. Therefore, the simple equation to determine PPI / DPI is to divide the pixel height by the physical height (960 / 2.91), yielding roughly 330 DPI.
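The arithmetic above can be checked directly. This sketch uses the iPhone 4 figures quoted in the answer; the diagonal-based variant is included because spec sheets usually quote PPI along the diagonal, and with square pixels the two calculations agree.

```python
import math

# Figures from the answer above (Apple iPhone 4 screen).
width_px, height_px = 640, 960
width_in, height_in = 1.94, 2.91

# PPI along each axis: pixels divided by physical inches.
ppi_h = width_px / width_in
ppi_v = height_px / height_in

# PPI along the diagonal - equal to the axis values when pixels are square.
diag_ppi = math.hypot(width_px, height_px) / math.hypot(width_in, height_in)

print(f"horizontal: {ppi_h:.1f}, vertical: {ppi_v:.1f}, diagonal: {diag_ppi:.1f}")
```

All three come out around 330 PPI, confirming the square-pixel assumption for this screen.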
This information helps to answer your question. Windows does not have any idea what the DPI of your display is, because it has no concept of the physical dimensions of the display. You can buy 20" monitors with 1920x1080 resolution, as well as 70" monitors with the same 1920x1080 resolution. The two have significantly different DPIs, yet Windows has no idea and no way to tell them apart.
While Windows offers the option of increasing or decreasing the DPI, all it really does is adjust system font sizes and the default sizes of icons and UI elements. Many other apps, graphics, websites and emails will render badly distorted if you change the DPI setting.
Apple Mac OS (especially iOS) has significantly better support for DPI, and knows, based on the devices it is installed on, which DPI setting to use.
It looks like you are referring to "zoom 200%" in some OS settings.
When you use 1280*720 everything is rendered in this resolution and then scaled up as a bitmap (by your monitor). The final image indeed consists of 2x2 pixel blocks.
When you use 2560*1440 resolution with zoom of 200% then every object is scaled up first, then rendered in the full resolution. With a bitmap it may not make a difference but objects like TrueType fonts or vector graphics scale "smoothly", they can alter every available pixel separately. In effect the resulting image doesn't necessarily form 2x2 pixel blocks on your screen as in the first case.
Example
Let's start with low resolution 4x4:
We draw an object described as "upper-left right triangle, 4x4, black":
The monitor gets the above bitmap and scales it to its native resolution 8x8, so each original pixel becomes a 2x2 pixel block:
Now let's use 8x8 resolution from the very beginning:
We consider an object described as "upper-left right triangle, 4x4, black":
But we tell the OS to use 200% zoom. The OS recalculates the object and gets "upper-left right triangle, 8x8, black":
This is then sent to the monitor and displayed.
Comparison:
Note that if we only had the original 4x4 triangle as a bitmap, the final result would be like the left one above, regardless of whether the scaling was done by the OS or by the monitor. It is the mathematical description of the triangle that allowed the OS to recalculate it to the new dimensions and produce the smooth image at the end.
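The two paths above can be sketched side by side. This is a toy rasteriser, assuming the "upper-left right triangle" means pixel (x, y) is filled when x + y < size; the `#`/`.` characters stand in for black and white pixels.

```python
def raster_triangle(size):
    """Rasterise an upper-left right triangle from its mathematical
    description: pixel (x, y) is filled when x + y < size (assumed)."""
    return [["#" if x + y < size else "." for x in range(size)]
            for y in range(size)]

def upscale_nearest(image, factor):
    """Monitor-style scaling: each pixel becomes a factor x factor block."""
    return [[p for p in row for _ in range(factor)]
            for row in image
            for _ in range(factor)]

# Path 1: render at 4x4, then let the monitor scale the bitmap to 8x8.
blocky = upscale_nearest(raster_triangle(4), 2)

# Path 2: tell the OS "200% zoom" - it re-renders the description at 8x8.
smooth = raster_triangle(8)

for a, b in zip(blocky, smooth):
    print("".join(a), " ", "".join(b))
```

The left column shows the staircase of 2x2 blocks from bitmap scaling; the right column, re-rendered from the description, steps one pixel at a time - the smoother result the answer describes.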
In modern operating systems many GUI elements, fonts etc. are available as "mathematical descriptions" that may be recalculated smoothly to given dimensions (zoomed). The general term is vector graphics.