You have to understand the difference between a hardware pixel and a software pixel. The resolution you set in your display settings is just a conversion parameter.
A monitor of a given screen size (17", for example) has a fixed number of hardware pixels, which corresponds to the maximum resolution of that specific monitor. One hardware pixel is one dot that can show any color. As technology evolves, the number of pixels that fit into one unit of measurement grows: smaller and smaller pixels, as well as bigger and bigger screens, give you more pixels, i.e. a higher maximum resolution. With more pixels, you can see smaller details of an image on screen.
The monitor has a maximum resolution; your OS detects it and offers only the resolutions the monitor supports. If you pick a software resolution lower than what your monitor can handle, the video card simplifies the image sent to the monitor, for example by making a square of 4 hardware pixels show the color that 1 pixel of your image contains. The lower the resolution, the more hardware pixels are "unused" in this sense, grouped to show the same color.
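The grouping described above can be sketched as nearest-neighbor upscaling, where each software pixel is duplicated into a block of hardware pixels. This is only an illustration of the idea, not how any real GPU scaler is implemented:

```python
# Sketch: nearest-neighbor upscaling. Each source pixel becomes a
# 2x2 block of identical hardware pixels (the "4 pixels, 1 color" case).

def upscale_2x(image):
    """Duplicate every pixel of a 2D grid into a 2x2 block."""
    out = []
    for row in image:
        doubled_row = []
        for pixel in row:
            doubled_row.extend([pixel, pixel])  # duplicate horizontally
        out.append(doubled_row)
        out.append(list(doubled_row))           # duplicate the row vertically
    return out

# A 2x2 "software" image fills a 4x4 grid of hardware pixels:
small = [["red", "green"],
         ["blue", "white"]]
big = upscale_2x(small)
print(big[0])  # ['red', 'red', 'green', 'green']
```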
An image [typically] contains a fixed number of pixels of its own, but the physical size of each pixel depends on the monitor and its resolution. So if you have a 1600x1200 screen but are running it at 800x600, an 800x600 image will show twice as large (in each dimension) as it would at the 1600x1200 resolution.
If you view a 1600x1200 image on an 800x600 screen, you will see only 1/4 of it at a time. If you zoom out to make it fit the screen, you will see all of it, but simplified: depending on the ratio (in this case 4:1), each screen pixel is given the average color/brightness of a square of 4 pixels from your image.
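The averaging step can be sketched as a simple 2x2 box filter over brightness values. Real scalers use more elaborate filters, but averaging captures the idea:

```python
# Sketch: downscaling by averaging each 2x2 block of brightness
# values into one screen pixel (assumes even image dimensions).

def downscale_2x(image):
    """Average each 2x2 block of brightness values into one pixel."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block_sum = (image[y][x] + image[y][x + 1] +
                         image[y + 1][x] + image[y + 1][x + 1])
            row.append(block_sum / 4)  # average of the 4 source pixels
        out.append(row)
    return out

# Four image pixels collapse into one averaged screen pixel:
src = [[0, 100],
       [50, 250]]
print(downscale_2x(src))  # [[100.0]]
```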
There is a chain that the maximum usable resolution depends on: monitor screen size, the pixel density of that screen, the graphics card's output capabilities, and the operating system's output capabilities. If any link in that chain fails to support your monitor's maximum resolution, you will not be able to use it at its full potential.
Best Answer
Some quick background
Pixels are the smallest physical "dots" that are lit up on the monitor to display an image. They are the building blocks and define all of the tradeoffs. The monitor is manufactured with a specific arrangement of pixels, which is its "native" resolution.
Characters are drawn on the screen by defining which pixels are illuminated within an imaginary grid. The number of pixels in the grid determines the size of the font on that monitor.
Normal Size
Let's start with screen fonts at their normal size and the computer configured to use the monitor's native resolution, and compare how the same font will look on two different size monitors. On each monitor, the actual size of the font on the screen will be determined by the physical size of the screen's pixels.
The density of the pixels the screen is manufactured with is measured in pixels per inch, and it determines the physical size of each pixel. A 19" monitor with a native resolution of 1600x900 pixels and a 23" monitor with a native resolution of 1920x1080 pixels both have roughly 96 pixels per inch. So, if these two screens were compared side-by-side, the font would be the same size on both displays.
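The ~96 PPI figure can be checked with a little arithmetic: pixel density is the diagonal measured in pixels divided by the diagonal measured in inches. A quick sketch:

```python
import math

# Sketch: computing pixels per inch (PPI) from resolution and
# diagonal size, to verify the ~96 PPI figure for both monitors.

def ppi(width_px, height_px, diagonal_in):
    """Pixel density = diagonal in pixels / diagonal in inches."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

print(round(ppi(1600, 900, 19)))   # 97 -- the 19" monitor
print(round(ppi(1920, 1080, 23)))  # 96 -- the 23" monitor
```

Both values come out within about one pixel per inch of each other, which is why the fonts render at essentially the same physical size.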
Magnification
If you want to select a larger font or set the computer to magnify the font, either option reduces how much will fit on the screen. The larger screen will give you more screen real estate (more pixels to work with). So, the larger, higher resolution monitor will allow you to display more of the enlarged content on the screen, offsetting the content loss from enlargement.
If you set the computer to a lower resolution and magnify it to fill the screen, it maps the content of the smaller image onto the larger space and interpolates to determine what each physical pixel displays. Mapping a 1280x720 resolution onto a 1600x900 display is equivalent to a magnification of 125%. If you wanted that same magnification on a 1920x1080 display, you would select a resolution of 1536x864 (or whatever was the closest standard resolution available) to map onto the full screen. That figure is nearly the same as the native resolution of your current monitor. If you selected a slightly higher resolution, you would get a slightly lower magnification.
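The magnification arithmetic above reduces to one ratio, native width divided by selected width (the function names here are just for illustration):

```python
# Sketch: the magnification figures from the paragraph above.
# Mapping a lower resolution onto a native panel magnifies content
# by native_width / selected_width.

def magnification(native_w, selected_w):
    """Magnification factor when a lower resolution fills the screen."""
    return native_w / selected_w

def resolution_for(native_w, native_h, mag):
    """Resolution to select for a desired magnification factor."""
    return round(native_w / mag), round(native_h / mag)

print(magnification(1600, 1280))         # 1.25, i.e. 125%
print(resolution_for(1920, 1080, 1.25))  # (1536, 864)
```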
So with the larger monitor, you could come close to displaying the content of your current monitor's native resolution at the magnification you like.
Hstoerr raises a good point in his comment. The question talks about crisp, clear text at the monitor's native resolution. Displaying content at the content's native resolution will be sharp. Using any form of magnification, or mapping lower resolution onto a higher-resolution screen, will lose that sharpness. The process of interpolating and averaging pixels degrades edges and fine detail. The sharpest results will be obtained by using larger native content (for example, large fonts and icons displayed without magnification).