This isn’t what I normally write, but I thought it’d be good both from a consumer perspective and in a satisfy-your-curiosity kind of way. When you pick up a new video game or movie, it claims to support 720p or 1080i or native resolution X. So, what’s the difference between 1080p, 1080i, 720p, etc.? The short answer: it’s the number of horizontal lines (rows of pixels) in the picture, and whether the succession of frames is interlaced or progressive. To understand what that means we have to go back to the beginning. Starting….now
Long ago, when people got real pensions, television was invented. Those vacuum-tube machines operated on the same principle as film: a bunch of still frames flashed past the screen. The film industry had discovered that at anything slower than about 12 frames per second, the human eye can perceive one frame from the next. In the end, they settled on 24 frames per second, and they still use that standard today for the most part. At the movies, this is controlled very easily: the projector simply pulls the celluloid (35mm, 16mm, 8mm) past the lamp at a fixed speed. But you can’t do that on an analog set, especially since mechanical TV died. Whatever could they do?
Any method they used would have to last the life of the TV. Various television manufacturers came to the same conclusion: electricity. In the US, our alternating current runs at 60 Hz. That means every second, 60 cycles of electricity come through the outlet. In Europe, they use 50 Hz; that’s why you need a special adapter when you travel over there. Every time a cycle came through, a frame would be shown. This solution came with a huge flaw: the technology did not allow for 60 full frames a second. It was just too much information. The next solution: fuck it.
They decided that they could just drop frames. Television would simply be filmed at 30 frames a second (25 in Europe). That’s where the i from 1080i comes in. You have 60 pulses but only 30 frames to fill them. They could have shown every frame twice, but that would be a waste of bandwidth. Instead, they chose to do something called interlacing: each refresh shows only half the scan lines (this was before pixels), and the next refresh shows the rest of them. That’s why, when you paused a recording of a television broadcast, it sometimes looked like it was shaking. It was actually bouncing between the two fields of the same picture. It’s also why movies on standard-definition television (480i) never run at exactly their theatrical length: films are shot at 24 frames a second and television is shown at 30 (or 25 in Europe), so the frames have to be converted to fit, and even on movie cable channels the run times come out slightly different than they were in the theatres. So i stands for interlaced. What about p?
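To make the idea of interlacing concrete, here’s a toy Python sketch. It is not how a real NTSC signal works (that’s an analog waveform, and the fields come from the camera already separated); it just models a frame as a list of scan lines, splits it into the two alternating fields a TV would draw on successive refreshes, and weaves them back together. The function names are my own invention for illustration.

```python
def split_into_fields(frame):
    """Split a frame (a list of scan lines) into its two fields:
    the even-numbered lines and the odd-numbered lines."""
    return frame[0::2], frame[1::2]

def weave(even_field, odd_field):
    """Recombine two fields into one full frame by alternating lines."""
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)
        frame.append(odd_line)
    return frame

frame = [f"line {n}" for n in range(6)]
even, odd = split_into_fields(frame)
print(even)  # ['line 0', 'line 2', 'line 4'] -- drawn on refresh 1
print(odd)   # ['line 1', 'line 3', 'line 5'] -- drawn on refresh 2
print(weave(even, odd) == frame)  # True -- the fields rebuild the frame
```

The “shaking” pause effect described above is what you get when the two fields being alternated were captured at slightly different moments in time.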
Technically speaking, the first “high-definition” televisions were rolled out in the late 1930s, but they were primitive and expensive. Anyways, 720p comes along: that’s 1280×720 pixels. Television screens are always wider than they are tall because of our eyes’ orientation. Now we have 921,600 pixels instead of roughly 480 visible lines. And with increases in broadcasting technology, there was no need to drop half the picture on each refresh: progressive scan was created. Soon, 1920×1080 came along, which adds up to over two million pixels. Again, broadcasters hit a bandwidth limit, so interlacing made its return as 1080i; full 1080p has thus far been deemed too expensive to broadcast. However, since 720p shows every pixel all the time, it’s better at showing fast-moving objects, like a baseball being pitched. That’s why ESPN broadcasts at 720p. At present, this is the state of display resolution. Now you know, and you don’t have to ask your IT consultant.
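The bandwidth trade-off above is just arithmetic, so here’s a quick back-of-the-envelope Python sketch. One assumption to flag: I use 640×480 for standard definition so the pixels come out square; real digital SD is usually sampled at 720×480 with non-square pixels. Interlaced formats send only half the lines per refresh, which is why 1080i fits in roughly the same pipe as 720p even though its full frame is much bigger.

```python
# name: (width, height, interlaced?)
FORMATS = {
    "480i":  (640, 480, True),    # assumed square-pixel SD, see note above
    "720p":  (1280, 720, False),
    "1080i": (1920, 1080, True),
    "1080p": (1920, 1080, False),
}

REFRESHES_PER_SECOND = 60  # the US AC frequency, as described earlier

for name, (w, h, interlaced) in FORMATS.items():
    pixels_per_frame = w * h
    # an interlaced refresh carries only every other line
    lines_per_refresh = h // 2 if interlaced else h
    pixels_per_second = w * lines_per_refresh * REFRESHES_PER_SECOND
    print(f"{name}: {pixels_per_frame:,} px/frame, "
          f"{pixels_per_second:,} px/s on the wire")
```

Running this shows 720p pushing about 55 million pixels a second and 1080i about 62 million, while true 1080p would need roughly 124 million: double either of the broadcast formats, which is the bandwidth wall the paragraph above describes.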