Generically speaking, pixel = dot = point, but strictly they are different physical elements depending on the medium you're working in. On computer monitors, pixels matter. In printing, dots are what count. Points are more generic and could refer to pixels or dots. The terms are commonly interchanged and often confused.
"Resolution" is the total number of [pixels, points or dots] wide, by total number of [pixels, points or dots] high. So a printer could have a resolution of 1200x1200 dots per inch, while a monitor could have a resolution of 1280x1024.
DPI and PPI are simply ratios. DPI is "dots per inch," PPI is "points per inch" or "pixels per inch." Those ratios increase and decrease based on the resolution (width x height, in pixels) and size (in inches) of a given medium.
To calculate the DPI, you need to determine the actual physical widths and heights of the medium. A common example is the Apple iPhone 4 screen:
Physical Width = 1.94 inches
Physical Height = 2.91 inches
Width (in pixels) = 640
Height (in pixels) = 960
The assumption is that all pixels, dots, or points occupy a square space. Therefore, the simple equation to determine PPI / DPI is to divide the pixel height by the physical height (or the pixel width by the physical width), yielding roughly 329 DPI.
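For the arithmetic-minded, here is a minimal Python sketch of that calculation, using only the iPhone 4 numbers listed above:

```python
# PPI / DPI from resolution and physical size (iPhone 4 figures from above).
width_px, height_px = 640, 960
width_in, height_in = 1.94, 2.91

ppi_horizontal = width_px / width_in    # ~329.9
ppi_vertical = height_px / height_in    # ~329.9; square pixels, so both match

print(round(ppi_horizontal), round(ppi_vertical))
```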
This information helps to answer your question. Windows does not have any idea what the DPI of your display is, because it has no concept of the display's physical dimensions. You can buy 20" monitors with 1920x1080 resolution, as well as 70" monitors with the same 1920x1080 resolution. The two have significantly different DPIs, yet Windows has no way to tell them apart.
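As a rough illustration (assuming the quoted sizes are diagonal measurements, as monitor sizes usually are), a short sketch of how far apart those pixel densities end up:

```python
import math

def diagonal_ppi(width_px, height_px, diagonal_in):
    """Pixels per inch measured along the diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(diagonal_ppi(1920, 1080, 20)))  # ~110 PPI on a 20" panel
print(round(diagonal_ppi(1920, 1080, 70)))  # ~31 PPI on a 70" panel
```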
While Windows offers the option of increasing or decreasing the DPI, all it really does is adjust system font sizes and the default sizes of icons and UI elements. Many other apps, graphics, websites and emails can end up badly distorted if you change the DPI setting.
Apple's Mac OS (and especially iOS) has significantly better support for DPI, and knows which DPI setting to use based on the device it is installed on.
What is the resolution the computer is using?:
To find out the exact resolution that the computer itself is "sending" to the TV, take a screenshot: hit the PrintScreen button and paste the image from the clipboard into a picture editing program. Then read the dimensions of the captured picture.
That will confirm the resolution the computer itself is rendering. Screen captures for this test work with a single monitor or dual monitors, so you can find out exactly what the computer sees.
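If you would rather script the check than paste into a picture editor, here is a minimal sketch (assuming the Pillow package is installed) that grabs the screen and prints the captured dimensions:

```python
# Grab the (primary) screen and print the pixel dimensions of the capture --
# this is the resolution the computer itself is rendering.
from PIL import ImageGrab

shot = ImageGrab.grab()
print(shot.size)   # e.g. (1920, 1080)
```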
If the computer's own resolution is showing low, check the features of your video card using the video card software. HDMI is not always pixel-perfect with video cards; sometimes (depending on the version and manufacturer) it requires a bit of tweaking. ATI, for example, treated HDMI as if it were going to drive an analog TV. Some early digital LCD televisions did act that way (overscan).
DPI (not resolution) is system-wide, not per-monitor:
From what you showed (love the pics), the computer is indeed at the desired resolution, and for some reason your TV is doing an interpolation (it also looks interpolated). That does not explain the size difference, since DPI settings are system-wide, not per-monitor. While it looks like your usual 720-type interpolation, it still doesn't explain what looks like a DPI change.
Also, why are the icons the same clarity but the browser not? Have you tested what you're observing with many programs, or could it be something the browser is doing? (I don't know of any browsers that would scale their DPI based on a monitor, but you never know what stupid features they will put in next :-)
Resolution can be different on different monitors:
The Windows system is completely capable of running 2 monitors at 2 different resolutions. Each monitor can therefore have a resolution set for it. Off topic: color profiles, ClearType and refresh rates can also be different per monitor.
TVs for a computer, what fun:
If the screen capture shows the proper resolution, it can be assumed (but not guaranteed) that the changes you need to make are on the TV itself. First verify the actual model number, to ensure you got what you think you got (look on the back of the actual TV). Then try to find the format, aspect and other settings on the TV. Nothing seems cropped or zoomed, and the TV will always deal with the picture as a whole, so it does not make a lot of sense that only part of the image is affected.
Settings everywhere, find them all, learn their purpose:
TVs are TVs, they are not monitors, so they have a lot of settings designed to work with broadcast signals. Most TVs can be set to something other than pixel-by-pixel modes, and some will even still overscan (a throwback from CRTs).
Hooking up a TV often requires tweaking both the GPU software and the TV's own settings to keep everything pixel = pixel, so some tweaking may be in order (overscan, underscan, aspect, format; it can even be affected by refresh rates).
The pictures you posted are very useful, but I could not determine for a fact what is going on; maybe some of the above will help, or will help you provide more information.
Added:
Windows Display Resolution:
In the Windows display settings (Control Panel\All Control Panel Items\Display\Screen Resolution),
select the monitor/TV picture you are having trouble with first, and check the resolution settings there. If you have 2 monitors, make sure you have the right monitor (picture) selected before you make changes.
Monitor drivers (profiles):
Usually it is not very important if the TV/monitor name is not shown (that is, the profile driver for the monitor is not installed); "Generic" connections should still be capable of getting the correct resolutions without the added profile for the monitor. That said, it doesn't hurt to check whether the monitor came with a CD, or has a download on the manufacturer's website, that contains a "monitor driver", which is just a profile for that monitor. Monitor profiles also hold color calibration information.
What Windows won't do, GPU software often will:
The software that comes with the GPU card (or chip) often has many more advanced features for adjusting things than Windows' own simple and to-the-point display settings. Because you're using Nvidia, I don't know where the options are; ATI, for example, has a special section for HDMI-connected TVs that can resolve some of the issues and differences when hooking up an actual TV (instead of a monitor).
You have to understand what a hardware pixel is and what a software pixel is. The resolution that you set in display settings is just a conversion parameter.
The monitor screen (17", for example) has a fixed number of hardware pixels, which equals the maximum resolution of that specific monitor. One hardware pixel is one dot that can show any color. As technology evolves, the number of pixels that can be packed into one measurement unit grows: smaller and smaller pixels, along with bigger and bigger screen sizes, give more pixels, i.e. a bigger maximum resolution. With more pixels you can see smaller details of an image on screen.
The monitor reports its maximum resolution, your OS detects that and offers only the resolutions the monitor allows. If you set a lower software resolution than your monitor can handle, the video card simplifies the image going to the monitor, for example by making 4 hardware pixels (in a square) show the same color that 1 pixel of your image contains. The lower the resolution, the more pixels are "unused", i.e. filled with a duplicated color.
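A toy sketch of that duplication, using a hypothetical 2x2 "software" frame shown on a 4x4 "hardware" panel (the numbers just stand in for colors):

```python
import numpy as np

# A 2x2 "software" frame; each number stands in for a color.
frame = np.array([[10, 20],
                  [30, 40]])

# On a 4x4 "hardware" panel, each logical pixel fills a 2x2 block of one color.
panel = frame.repeat(2, axis=0).repeat(2, axis=1)
print(panel)
# [[10 10 20 20]
#  [10 10 20 20]
#  [30 30 40 40]
#  [30 30 40 40]]
```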
The image [typically] contains a fixed number of pixels - the same as your monitor. The size of a pixel depends on the monitor. So if you have a 1600x1200 screen but are running at 800x600 resolution, your 800x600 image will show twice as large as it would at 1600x1200.
If you view an image that is 1600x1200 on an 800x600 screen, you will see only 1/4 of it at a time. If you zoom out to make it fit the screen, you will see all of it, but it will be simplified: depending on the ratio (in this case 4 to 1), each square of 4 image pixels is averaged, and that average color/brightness becomes 1 pixel on screen.
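And a toy sketch of the opposite direction, averaging each 2x2 block of a hypothetical 4x4 image down to 1 screen pixel:

```python
import numpy as np

image = np.arange(16, dtype=float).reshape(4, 4)   # a 4x4 "image"

# Average every non-overlapping 2x2 block -> a 2x2 result (4 pixels become 1).
downscaled = image.reshape(2, 2, 2, 2).mean(axis=(1, 3))
print(downscaled)
# [[ 2.5  4.5]
#  [10.5 12.5]]
```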
There is a chain on which the maximum resolution depends: monitor screen size, pixel density of that screen, graphics card output capabilities, and operating system output capabilities. If any of these fails to support your monitor's maximum resolution, you will not be able to use it at its full potential.