Reading the manufacturer's specs is definitely the best way. That said, I can personally guarantee that any laptop with an HDMI port will be able to output 1920x1080; your performance at that resolution will vary depending on the graphics card.
Also, any laptop manufactured in the last several years will be able to output at that resolution. You may not be able to play games at high frame rates on every machine, but I don't think you'll have a resolution issue.
This isn't a very technical answer, but without more technical information (model number, GPU series, HDMI spec, etc.) it's hard to give a specific one.
What resolution is the computer using?:
To find out the exact resolution the computer itself is "sending" to the TV, take a screenshot: hit the PrintScreen key and paste the image from the clipboard into a picture-editing program, then read the dimensions of the captured picture.
That will confirm the computer's own render resolution. Screen captures for testing can be done with a single monitor or with duals, and you can find out exactly what the computer sees.
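If you'd rather script that check than paste into an image editor, here is a minimal sketch that does the same thing, assuming Python with the third-party Pillow library is available (neither is mentioned in the question):

```python
# Capture the desktop and print its dimensions: the scripted equivalent
# of PrintScreen + paste. Requires Pillow (pip install Pillow).
from PIL import ImageGrab

# all_screens=True (Windows, Pillow 6.2+) includes secondary monitors;
# the default grabs only the primary screen.
shot = ImageGrab.grab(all_screens=True)

width, height = shot.size  # the resolution the computer is rendering at
print(f"Computer is rendering at {width}x{height}")
```

Note that on a high-DPI setup the reported size can be virtualized unless the script is DPI-aware, so treat an unexpected value as a hint rather than proof.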
If the computer's own resolution shows up low, check out the features of your video card using the video card's software. HDMI is not always pixel-perfect out of the box; sometimes (depending on the version and manufacturer) it requires a bit of tweaking. ATI, for example, treated HDMI as if it were driving an analog TV, and some early digital LCD televisions did act that way (overscanning).
DPI (not resolution) is system-wide, not monitor-independent:
From what you showed (love the pics), the computer is indeed at the desired resolution, and for some reason your TV is doing interpolation (the picture also looks interpolated). That still does not explain the size difference, since DPI settings are system-wide, not per-monitor. It looks like your usual 720-type interpolation, but (again) that doesn't explain what looks like a DPI change.
Also, why are the icons the same clarity but the browser is not? Have you tested what you're observing with many programs, or could it be something the browser is doing? (I don't know of any browsers that would scale their DPI based on a monitor, but you never know what features they will put in next :-)
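For what it's worth, you can read that system-wide DPI value directly. A minimal sketch with Python's standard ctypes on Windows (GetDpiForSystem needs Windows 10 version 1607 or later; none of this comes from the question itself):

```python
# Read the system-wide DPI via the Win32 API.
import ctypes

user32 = ctypes.windll.user32
user32.SetProcessDPIAware()      # ask Windows for real, non-virtualized values

dpi = user32.GetDpiForSystem()   # 96 = 100% scaling, 120 = 125%, and so on
print(f"System DPI: {dpi} ({dpi / 96:.0%} scaling)")
```

If both screens report the same value here while the browser still renders at a different size, the scaling is happening in the application or the TV, not in Windows.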
Resolution can be different on different monitors:
Windows is completely capable of running two monitors at two different resolutions; each monitor can therefore have its own resolution set for it, as the sketch below shows. Off topic: color profiles, ClearType, and refresh rates can also differ per monitor.
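To see it for yourself, you can ask Windows for each attached display's current mode. A sketch using only the standard library's ctypes (Windows-only; the DEVMODEW struct is truncated to the display fields, which is fine because the API honours dmSize):

```python
# List every attached display and its current resolution / refresh rate
# via the Win32 EnumDisplayDevices / EnumDisplaySettings APIs.
import ctypes
from ctypes import wintypes

ENUM_CURRENT_SETTINGS = -1

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [("cb", wintypes.DWORD),
                ("DeviceName", wintypes.WCHAR * 32),
                ("DeviceString", wintypes.WCHAR * 128),
                ("StateFlags", wintypes.DWORD),
                ("DeviceID", wintypes.WCHAR * 128),
                ("DeviceKey", wintypes.WCHAR * 128)]

class DEVMODEW(ctypes.Structure):
    # Truncated DEVMODEW; dmSize tells the API how much of it we provide.
    _fields_ = [("dmDeviceName", wintypes.WCHAR * 32),
                ("dmSpecVersion", wintypes.WORD),
                ("dmDriverVersion", wintypes.WORD),
                ("dmSize", wintypes.WORD),
                ("dmDriverExtra", wintypes.WORD),
                ("dmFields", wintypes.DWORD),
                ("dmPositionX", wintypes.LONG),
                ("dmPositionY", wintypes.LONG),
                ("dmDisplayOrientation", wintypes.DWORD),
                ("dmDisplayFixedOutput", wintypes.DWORD),
                ("dmColor", ctypes.c_short),
                ("dmDuplex", ctypes.c_short),
                ("dmYResolution", ctypes.c_short),
                ("dmTTOption", ctypes.c_short),
                ("dmCollate", ctypes.c_short),
                ("dmFormName", wintypes.WCHAR * 32),
                ("dmLogPixels", wintypes.WORD),
                ("dmBitsPerPel", wintypes.DWORD),
                ("dmPelsWidth", wintypes.DWORD),
                ("dmPelsHeight", wintypes.DWORD),
                ("dmDisplayFlags", wintypes.DWORD),
                ("dmDisplayFrequency", wintypes.DWORD)]

user32 = ctypes.windll.user32
dev = DISPLAY_DEVICEW()
dev.cb = ctypes.sizeof(dev)
i = 0
while user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
    mode = DEVMODEW()
    mode.dmSize = ctypes.sizeof(mode)
    # Inactive adapters fail this call, so only live displays are printed.
    if user32.EnumDisplaySettingsW(dev.DeviceName, ENUM_CURRENT_SETTINGS,
                                   ctypes.byref(mode)):
        print(f"{dev.DeviceName}: {mode.dmPelsWidth}x{mode.dmPelsHeight} "
              f"@ {mode.dmDisplayFrequency} Hz")
    i += 1
```

If the TV shows up here at 1920x1080 but the picture still looks soft, the interpolation is happening after Windows hands off the signal.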
A TV for a computer, what fun:
If the screen capture shows the proper resolution, it can be assumed (but not guaranteed) that the changes you need to make are on the TV itself. First verify the actual model number to ensure you got what you think you got (look on the back of the actual TV). Then try to find the format, aspect, and other picture settings on the TV. Still, nothing seems cropped or zoomed, and a TV always deals with the picture as a whole, so it doesn't make a lot of sense that only part of the image is affected.
Settings everywhere, find them all, learn their purpose:
TVs are TVs, not monitors, so they have a lot of settings designed to work with broadcast signals. Most TVs can be set to modes other than pixel-by-pixel, and some will even still overscan (a throwback from CRTs).
Hooking up a TV has often required tweaking both the GPU software and the TV's own settings to keep everything pixel-for-pixel, so some tweaking may be in order (overscan, underscan, aspect, and format can even be affected by refresh rates).
The pictures you posted are very useful, but I could not determine for a fact what is going on; maybe some of the above will help, or will help you provide more information.
Added:
Windows Display Resolution:
In Windows, go to Control Panel\All Control Panel Items\Display\Screen Resolution,
then select the monitor/TV picture that you are having trouble with (first) and check the resolution setting there. If you have two monitors, make sure you have the right monitor (picture) selected before you make any changes.
Monitor drivers (profiles):
Usually it is not really important if the TV/monitor name is not shown (meaning the profile driver for the monitor is not installed); "generic" connections should still be capable of the correct resolutions without the added profile. That said, it doesn't hurt to check whether the monitor came with a CD, or has a download on the manufacturer's website, containing a "monitor driver", which is just a profile for that monitor. Monitor profiles also hold color-calibration information.
What Windows won't do, GPU software often will:
The software that comes with the GPU card (or even an integrated chip) often has many more advanced adjustment features than Windows' own simple, to-the-point display settings. Because you're using Nvidia, I don't know exactly where the options are; ATI's software, for example, has a special section for HDMI-connected TVs that can resolve some of the issues and differences that come up when hooking up an actual TV (instead of a monitor).
Best Answer
While monitors support resolutions below what their specifications state, most modern video cards also have the ability to scale or crop a user-defined resolution and output it at the monitor's native resolution (at a hardware/driver level). I would highly recommend the latter method, as you can be sure it will work with any monitor, so long as the monitor supports at least the resolution you need.
Do note that most monitor-based scaling will introduce noticeable blur into the image unless you use the video card's drivers to crop or resize it. Most monitors simply stretch the signal they are passed to their native resolution (which is where the blur comes from), so video-card scaling is preferred (especially since you require a very specific height in pixels).
Again, just note that if you deviate from the monitor's native resolution, you will have to either crop or scale the image. Neither is advisable, but scaling is usually worse, since it introduces more artifacts into the image.
If for some reason you really have to run below the native resolution, for example in games that are too heavy for your GPU, the most tolerable option is often 1/4 of the native pixel count, i.e. half the resolution on both axes. On a 1920x1080 panel that is 960x540: every in-game pixel then renders as a clean 2x2 block of 4 monitor pixels, which is usually the least bad of the options you have.
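To make the pixel math concrete, here is a small sketch comparing the two kinds of scaling, again assuming Pillow is available; the file names are hypothetical placeholders:

```python
# Compare monitor-style interpolation with clean integer ("pixel-doubling")
# scaling. Requires Pillow; "capture_960x540.png" is a stand-in test image.
from PIL import Image

native = (1920, 1080)
half = (native[0] // 2, native[1] // 2)   # 960x540 = 1/4 of the native pixels
print(f"Half resolution: {half[0]}x{half[1]}")

src = Image.open("capture_960x540.png")

blurry = src.resize(native, Image.BILINEAR)  # roughly what a monitor scaler does
crisp = src.resize(native, Image.NEAREST)    # each source pixel -> one 2x2 block

blurry.save("scaled_bilinear.png")
crisp.save("scaled_nearest.png")
```

Open the two outputs side by side: the nearest-neighbour version stays sharp because every source pixel maps exactly to four physical pixels, which is why the half-resolution trick is the most tolerable fallback.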