I see a range of problems in the answers and comments here (even in some highly voted answers that otherwise provide very good information), ranging from minor deficiencies that need explanation to serious inaccuracies, so I think some clarification is needed.
The question is specifically *What is the difference between 1080p and 1080i?*, so I will start by outlining the main similarities and differences, add some tips on how to choose the best format, and then explain the problems that I found here.
Some of the information presented below is adapted from my answer to *Interlacing on a computer monitor* but has been rewritten to stick strictly to the subject of the difference between 1080p and 1080i.
Note
Please note that this answer is specifically about HDTV and talks about signals and resolutions that can be transferred over a standard HDMI cable. Other resolutions and frame/field rates are certainly possible, but standard HD TV sets, game consoles, Blu-ray Discs, etc. use only the resolutions and frame/field rates described below (or at least they did at the time of writing this answer). Specifically, this answer doesn't cover ultra-high-definition television, Super Hi-Vision, Ultra HD television, UltraHD, UHDTV, UHD, 4K, 8K or anything else beyond the 1080p and 1080i that this question is about.
Resolution
Both 1080p and 1080i have 1080 horizontal lines of vertical resolution, which, with a widescreen aspect ratio of 16:9, results in a resolution of 1920 × 1080 pixels (2.1 megapixels). It is not true that 1080i has a lower vertical resolution than 1080p.
Frames vs. fields
1080p is a frame-based or progressive-scan video where you are dealing with frames. You have a frame rate, and it is expressed in frames per second.
1080i is a field-based (interlaced or interleaved) video where you are dealing with fields. You have a field rate, and it is expressed in fields per second.
A field contains half of the lines of the frame, either even lines or odd lines, and if one field is composed of even lines, then the next one will be composed of odd lines and so on.
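You can see this with ffmpeg's `separatefields` filter, which splits every interlaced frame into its two fields; each output image then has only half the height (1920 × 540 for 1080i). A minimal sketch, with the input filename being my assumption:

```
# Write the first few fields out as separate images, each 1920x540.
ffmpeg -i input_1080i.ts -vf separatefields -frames:v 4 field_%d.png
```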
Frequencies
1080p has a frame rate of 25 frames per second for TV in PAL countries, 30/1.001 frames per second for TV in NTSC countries and 24 frames per second for cinematography.
1080i has a field rate of 50 fields per second for TV in PAL countries and 60/1.001 fields per second in NTSC countries.
(Note that it is not 30 frames and 60 fields per second for NTSC but actually 30/1.001 and 60/1.001 which is approximately 29.97 and 59.94 but the difference is important. Read about the NTSC color encoding on Wikipedia to see why.)
How to think about it
1080p at 25 frames per second: Imagine that you are shooting 25 pictures per second and storing them as bitmaps. Every frame is a full picture from the given instant. Every pixel in that frame was captured at the same time.
1080i at 50 fields per second: Imagine that you are shooting 50 pictures per second but storing only half of the bitmaps every time - sometimes you store the odd lines and sometimes the even lines. (Note that it is not the same as storing pictures with lower vertical resolution.) Every field is a half of a full picture from the given instant. Every pixel in that field was captured at the same time.
50 halves ≠ 25 full pictures
Contrary to some comments here, interlaced video at 50 Hz does not mean that 25 full pictures per second are shown. It means that 50 halves of pictures are shown but those are halves of 50 different pictures that were shot at 50 distinct moments of time in every second. You not only don't have 25 full pictures per second - you don't have any full pictures at all.
Problems with 1080i
Interlacing causes a lot of problems. For example, you can't easily:
- scale the video
- rotate the video
- play the video in slow motion
- play the video in fast motion
- pause the video
- grab a still frame
- play the video in reverse
without resorting to tricks and losing quality. You don't get any of those problems with progressive video. In addition, video encoding is harder because the codec never has a full frame to work with.
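To illustrate the kind of trick involved: grabbing clean stills or stepping through 1080i material first requires deinterlacing, i.e. inventing the lines each field is missing. A minimal sketch using ffmpeg's yadif deinterlacer (filenames are my assumption); `mode=send_field` outputs one frame per field, turning 50 fields/s into 50 progressive frames/s:

```
# Bob-deinterlace: interpolate each 540-line field up to a full
# 1080-line frame, producing 50 progressive frames/s from 50 fields/s.
ffmpeg -i input_1080i.ts -vf yadif=mode=send_field -c:v libx264 output_1080p50.mp4
```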
Problems with 1080p
The drawback is that 1080p, as currently in use, has a frame rate that is only half the field rate of 1080i, so motion is noticeably less fluid - in fact, exactly half as fluid, which is a lot. You can see it on large flat-panel TVs, which often have to deinterlace the video to display it on their LCD screens (screens that, unlike CRT displays, are progressive in nature); that is why they display a picture of very high resolution but with jerky motion and some deinterlacing artifacts.
Another problem is that usually 1080i is required for TV broadcasting, which means that 1080p is simply out of the question for some applications.
Best of both worlds
Using progressive 1080p with 50 or 60/1.001 full frames per second has the potential to eventually solve the above problems, but it would require a whole new range of studio equipment, including cameras, storage and editing systems, so it probably won't happen anytime soon. The widely used SDI standard for connecting HD video equipment doesn't have enough bandwidth.
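A rough back-of-the-envelope calculation shows the problem (assuming 10-bit 4:2:2 studio video, i.e. 20 bits per pixel on average, and ignoring blanking intervals):

```
1920 × 1080 px/frame × 60 frames/s × 20 bits/px ≈ 2.5 Gbit/s
single-link HD-SDI capacity:                      1.485 Gbit/s
```

That is why 1080p50/60 needs dual-link HD-SDI or 3G-SDI (2.970 Gbit/s) equipment instead.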
Currently the only way to get fluid motion with progressive scanning is 720p, which has a frame rate twice that of 1080p but a resolution of only 1280 × 720 pixels (instead of 1920 × 1080 pixels), which may or may not be a problem for some applications. There is no 720i.
Conclusion
There is no one clear winner here.
Update: Here are some general guidelines for choosing the right format:
1. Is it for high-definition TV? Use 1080i or whatever is required.
2. Is it for standard-definition TV? Use 720p and then convert to 576i or 480i.*
3. Is it for the Internet, and resolution is more important than fluid motion? Use 1080p.
4. Is it for the Internet, and fluid motion is more important than resolution? Use 720p.
(It all assumes that 1080p has a frame rate of 25 or 30/1.001 frames/s, 1080i has a field rate of 50 or 60/1.001 fields/s and 720p has a frame rate of 50 or 60/1.001 frames/s as is currently the case. Hopefully a high resolution progressive format like 1080p with a frame rate of 50 or 60/1.001 frames/s or maybe even higher will make this recommendation obsolete in the future.)
*) For number 2, make sure that your 720p has a frame rate of 50 fps if your target format is PAL or SECAM and 60/1.001 fps if your target format is NTSC (unfortunately this means there is no single format that can be converted to both PAL/SECAM and NTSC). The reason I recommend using 720p for recording is that it greatly simplifies the editing process when every frame is complete with no interlacing (throwing out every other line at the end is easier than creating the missing lines if you need them), and you have some extra resolution to work with, so you can, for example, zoom the image slightly without making the result look blurry. (If anyone has had a bad experience using 720p to prepare material for SD PAL or NTSC TV broadcasting, then please comment so I can update this recommendation.)
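As for the conversion itself, ffmpeg's `tinterlace` filter can weave pairs of progressive frames into interlaced frames, which is the core of the 720p50 → 576i step. A minimal sketch for the PAL case (filenames and exact encoder flags are my assumptions; anamorphic pixel-aspect handling is omitted):

```
# 1280x720 @ 50 fps progressive -> 720x576 @ 25 fps (50 fields/s) interlaced
ffmpeg -i master_720p50.mp4 \
       -vf "scale=720:576,tinterlace=mode=interleave_top" \
       -c:v libx264 -flags +ildct+ilme pal_576i.mp4
```

`interleave_top` takes the upper field from one frame and the lower field from the next, halving the frame rate while keeping all 50 distinct moments of time per second.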
Explaining problems
These are the parts that I found in the answers and comments here that I think need some explanation:
> Progressive Scanning is more desirable in almost every case.
I think that progressive scanning is indeed better in every respect, but if we are not talking theoretically about the idea of interlacing but specifically about the 1080p and 1080i standards as used today, then one has to take into account the fact that 1080i is often required for TV broadcasting and converting 1080p to 1080i would result in jerky motion.
> P is better than I in most cases i believe, which is the important bit.
Again, yes, progressive is better than interlaced, all other things being equal, but progressive video with a frame rate that is half the field rate of interlaced video (which is the case with 1080p and 1080i) is something very different, especially if interlaced video with a high field rate is required for TV broadcasting and that high field rate cannot be reproduced from progressively recorded material with a lower frame rate.
> [In 1080i] all the odd lines are displayed, followed by all the even lines. This means that only 1/2 the resolution (540 lines or pixel rows) is displayed on the screen at any give time - in other words, only 540 pixel rows are displayed at any given time.
No. On an LCD all 1080 lines are always displayed; on a CRT display usually much less than half of the lines are displayed at any given time, which is equally true for both 1080i and 1080p.
The phrase "only 540 pixel rows are displayed at any given time" is extremely misleading. All 1080 rows-of-pixels usually are displayed at once (and even if they weren't, they'd still appear to be to the human eye), but only half of them will be updated in any given frame. It's effectively the refresh-rate, not the resolution, that's cut in half.
While it is true that the phrase "only 540 pixel rows are displayed at any given time" is extremely misleading, it is not true that the refresh rate is cut in half: in 1080i the refresh rate is twice that of 1080p, so it is actually the other way around.
> 1080i60 means that you're getting 60 half frames (alternating lines) per second, so only 30 complete frames per second.
With 1080i60 you actually get slightly fewer than 60 fields (or "half frames") per second, but that doesn't mean you get 30 (or almost 30) complete frames per second. In fact, you don't get even a single complete frame per second.
More resources
This is what I consider the best resource on the subject of field-based (aka interlaced or interleaved) and frame-based (aka progressive-scan) video:
See also the following articles on Wikipedia:
I hope it somewhat clarifies the subject.
Best Answer
Theory
Differences in the video probably won't be noticeable to an untrained eye. The 1080p video would have to be downscaled anyway. It won't be exactly the same, though, because compression and scaling are applied in a different order.
Let's assume that the original video was 1080p. In this case the 720p video was first scaled, then compressed. On the other hand, the 1080p clip was first compressed server-side, then scaled on your machine. The 1080p file will obviously be bigger (otherwise it would offer higher resolution but at lower quality, ruining the visual experience and invalidating the point of using the higher resolution¹).
Lossy compression usually causes visual artifacts that appear as square blocks with noticeable edges when the video is paused, but aren't visible when you play it at the normal framerate. The 1080p file will contain more square blocks (caused by compression) than the 720p video, but those blocks will be of approximately the same size in both videos.
Doing simple math, we can calculate that the 1080p video will contain 2.25 times more such blocks, so after scaling it down to 720p those blocks will be 2.25 times smaller (in area) than in the actual 720p video. The smaller those blocks are, the better the quality of the final video is, so 1080p video will look better than 720p video even on a 720p screen. The resized 1080p video will appear slightly sharper than the actual 720p clip.
Things get a bit more complicated if the source material was bigger than 1080p. The 1080p clip is first scaled down to 1080p and compressed before you play it, and then scaled once again while playing. The 720p clip is scaled only once and then compressed. The intermediate scaling step present in the 1080p case will make its quality slightly worse². The compression will make 720p even worse, though, so 1080p wins anyway.
One more thing: it's not only the video that is compressed, but the audio too. When people decide to use a higher bitrate¹ for video compression, they often do the same with the audio, so the 1080p version of the same video may offer better sound quality than the 720p version.
¹ The bitrate is the factor that decides how good the compressed video is, at the cost of file size. It's specified manually when the video is compressed and determines how much disk space can be used for every frame (or unit of time) of compressed video. Higher bitrate = better quality and a bigger file. Using the same bitrate with the same framerate will produce files of (approximately) the same size no matter what the video resolution is, but the higher the resolution, the less disk space can be spent on a single pixel, so increasing the output resolution without increasing the bitrate can make the compressed video look worse than it would at a lower output resolution.
² Try it yourself: open a photo in any editor and scale it to a slightly smaller size, then again and again, and save it as PNG. Then open the original photo again and scale it to the same size in one step. The second attempt will give better results.
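If you prefer to script it, the same experiment can be done with ffmpeg (the step factor and filenames are arbitrary choices of mine; 1.2³ ≈ 1.728):

```
# Three successive ~1.2x downscales...
ffmpeg -i photo.png -vf "scale=iw/1.2:-1,scale=iw/1.2:-1,scale=iw/1.2:-1" stepwise.png
# ...versus a single direct downscale to the same final size.
ffmpeg -i photo.png -vf "scale=iw/1.728:-1" direct.png
```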
Test
@Raestloz asked for actual videos for comparison in his comment. I couldn't find 1080p and 720p versions of the same video for comparison, so I made one.
I have used uncompressed frames from the "Elephants Dream" movie (http://www.elephantsdream.org/), which are available under CC-BY 2.5. I downloaded frames 1-6000 and converted them into videos using ffmpeg and a small batch file.
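The commands were along these lines (a sketch only; the frame-filename pattern and encoder flags are my assumptions):

```bat
rem Encode the PNG frame sequence at both resolutions for each bitrate.
for %%B in (500 700 1125 4000) do (
  ffmpeg -framerate 24 -i %%04d.png -vf scale=1280:720 -c:v libx264 -b:v %%Bk ed_720p_%%B.mp4
  ffmpeg -framerate 24 -i %%04d.png -c:v libx264 -b:v %%Bk ed_1080p_%%B.mp4
)
```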
500 kbps is low enough for compression artifacts and distortions to appear in the 720p video. 1125 kbps is the proportional bitrate per pixel for 1080p (500 × 2.25 = 1125, where 2.25 = (1920 × 1080) / (1280 × 720)). 700 kbps is an intermediate bitrate to check whether using a bitrate much lower than proportional makes sense for 1080p. 4000 kbps is high enough to create mostly lossless video in both resolutions for comparing resized 1080p to actual 720p.
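As a quick sanity check on that proportionality (both videos run at 24 fps):

```
720p  @  500 kbps:   500 000 / 24 / (1280 × 720)  ≈ 0.023 bit per pixel
1080p @ 1125 kbps: 1 125 000 / 24 / (1920 × 1080) ≈ 0.023 bit per pixel
```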
Then I split the videos back into single frames. I can't provide future-proof download links for the videos, but these steps can easily be replicated to create clips such as mine. (For the record, the output file sizes should be roughly identical for both resolutions, because the bitrate is constant per second.) Extracting all frames is slow and takes a lot of space (true story), so I recommend using ffmpeg's `-r` switch to extract only every 8th frame, i.e. `-r 3` for a 24 fps video, as in the sketch below.
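For example (the input filename matches the sketch above, and the output pattern is my choice):

```
# Extract 3 frames per second, i.e. every 8th frame of a 24 fps clip.
ffmpeg -i ed_1080p_1125.mp4 -r 3 frames_%04d.png
```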
Downloads for samples of extracted frames are available at the end of this post.
Increasing bitrate and resolution
Here's a comparison of the same region cropped from both frames after scaling to 720p (frame 2097). Look at the fingers, heads and the piece of equipment hanging from the ceiling: even going from 500 to 700 kbps makes a noticeable difference. Note that both images are already scaled to 720p.
Frame 3705. Notice rug's edge and the cable:
Frame 5697. This is an example of a frame that doesn't compress very well. The 1080p 700 kbps video is less detailed than the 720p 500 kbps clip (the ear's edge). Skin details are lost in all compressed frames.
GIFs of all three frames, with increasing bitrate. (Unfortunately I had to use dithering because GIF doesn't support more than 256 colors, so some pixels are a bit off.)
Constant bitrate, different resolutions
Inspired by @TimS.'s comment, here is the same region from frame 2097 with 720p and 1080p side by side.
At 500 kbps, 720p is a bit better than 1080p. 1080p appears sharper, but those details aren't actually present in the uncompressed image (the left guy's trousers). At 700 kbps I'd call it a draw. Finally, 1080p wins at 1125 kbps: both stills look mostly identical, but the picture on the right has more pronounced shadows (the pipes on the back wall and in the lower-right part).
Very high bitrate
@Noah asked a good question in the comments: will both images look identical with high enough bitrate? Here's 720p 4000 kbps vs. 1080p 4000 kbps vs. uncompressed frame 5697:
Now this is pretty subjective, but here's what I can see:
It's scaling that starts to play a role here. One could intuitively answer that 1080p will look worse than 720p on a 720p screen, because scaling always affects quality. That's not exactly true in this case, because the codec I used (H.264, but other codecs too) has some imperfections: it creates little boxes that are visible on contrasting edges. They appear in the 1080p snapshot too (see the links at the bottom), but resizing to 720p causes some details to be lost; in particular it smooths out these boxes and improves quality.
Okay, so let's calculate the difference between the 720p (left) and 1080p (right) frames and the original frame, and stretch the contrast so it's clearly visible:
This image gives us an even clearer picture of what is going on. Black pixels are represented perfectly in the compressed (and resized to 720p) frames; colored pixels are off proportionally to their intensity.
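For the record, a difference image like this can be produced with ffmpeg too (the filenames and the ×8 contrast stretch are my assumptions):

```
# Absolute per-pixel difference, then multiply to stretch the contrast.
ffmpeg -i resized_1080p.png -i original.png \
       -filter_complex "blend=all_mode=difference,lutrgb=r=val*8:g=val*8:b=val*8" \
       diff.png
```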
Disclaimer
This test is purely synthetic and doesn't prove that real-life 1080p video looks better than 720p when played on a smaller screen. However, it shows a strong relationship between video bitrate and the quality of video resized to the screen size. We can safely assume that a 1080p video will have a higher bitrate than a 720p one, so it will offer more detailed frames, most of the time enhancing the viewer's experience. It's not the resolution that plays the most important part but the video bitrate, which is higher in 1080p videos.
Using an insanely high bitrate for 720p video won't make it look better than 1080p. Post-compression downscaling can be beneficial for 1080p, because it will shape the compression noise and smooth out artifacts, and increasing the bitrate doesn't compensate for the lack of extra pixels to work with, because lossy codecs aren't perfect.
In rare cases (very detailed scenes), higher-resolution, higher-bitrate videos may actually look worse.
What's the difference between this artificial test and real-life video?
Once again: I'm not saying that this post proves anything. My test is based on artificially made video, and your mileage may vary with realistic examples. Still, the theory probably holds; there's nothing to suggest it could be wrong (except for the scaling issue, but the test deals with that).
In conclusion, in most cases 1080p video will look better than 720p video, no matter what the screen resolution is.
Downloads: