Technically speaking
Everything you never wanted to know about digital
First off, digital is not necessarily better than analog, and digital is not necessarily high-definition. Standard digital signals can be transmitted by the same means analog signals can, but it takes a digital tuner to pick them up. While an analog picture gets gradually worse the farther the set is from the transmitter, a digital signal will either look perfect or it won’t come in at all. Techies call this the “cliff effect,” because with poor reception, the signal seems to drop off a cliff.
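To picture the difference, here is a toy sketch in Python. It is not a real reception model; the 0.3 decoding threshold and the sample signal values are made-up numbers. It simply contrasts a picture that fades gradually with one that works until the signal drops below a threshold and then fails completely.

```python
# Toy illustration of the "cliff effect" (illustrative numbers only):
# an analog picture degrades smoothly with signal strength, while a
# digital picture is perfect until the signal falls below a decoding
# threshold, then disappears entirely.

def analog_picture_quality(signal):
    # signal runs from 0.0 (none) to 1.0 (full strength)
    return signal                                  # fades gradually

def digital_picture_quality(signal, threshold=0.3):
    return 1.0 if signal >= threshold else 0.0     # perfect or nothing

for s in (1.0, 0.6, 0.3, 0.2):
    print(f"signal {s:.1f}: analog {analog_picture_quality(s):.1f}, "
          f"digital {digital_picture_quality(s):.1f}")
```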
Another detail to watch for is the “aspect ratio” of your signal source. This is basically the shape of the image you see on the screen: its width relative to its height. High-definition TV and most movies use widescreen ratios; HDTV’s standard is 16:9, meaning the image is 16 units wide for every 9 units high, giving a field of vision closer to what humans naturally see. The standard for analog TV is 4:3, which is nearly square. That’s why movies on regular TV often appear with black bars at the top and bottom. The ratio becomes crucial when you watch a 4:3 image on a 16:9 screen, because the unused portions of the screen can be permanently “burned” into the screen.
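For readers who want the arithmetic, here is a minimal Python sketch showing why the bars appear: the image is scaled to fit the screen without cropping or stretching, and whatever screen is left over stays black. The screen dimensions used below are illustrative assumptions, not figures from the article.

```python
def fit_with_bars(screen_w, screen_h, source_w, source_h):
    """Scale a source image to fit a screen without cropping or
    stretching, and report how much screen is left as black bars."""
    scale = min(screen_w / source_w, screen_h / source_h)
    shown_w = source_w * scale
    shown_h = source_h * scale
    side_bars = (screen_w - shown_w) / 2   # left/right ("pillarbox")
    top_bars = (screen_h - shown_h) / 2    # top/bottom ("letterbox")
    return side_bars, top_bars

# A 16:9 movie on a 4:3 set: bars appear at the top and bottom.
print(fit_with_bars(640, 480, 16, 9))    # -> (0.0, 60.0)

# A 4:3 broadcast on a 16:9 set: bars appear at the left and right.
print(fit_with_bars(1920, 1080, 4, 3))   # -> (240.0, 0.0)
```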
The quality of a TV image is largely determined by how many lines of picture information the set can display at once. The best TVs can show 1,080 lines, but the best DVD players (the highest-quality signal source besides true HDTV) provide only 480 lines of information. The worst signal source for a high-definition TV is probably a VCR, which, delivering roughly 250 lines, can look as if it’s being viewed through a rain cloud.
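As a rough back-of-the-envelope comparison, the snippet below uses the line counts quoted above to show how little of a 1,080-line screen each source can actually fill.

```python
# Rough comparison of signal sources against a 1,080-line display,
# using the line counts cited in the paragraph above.

sources = {"HDTV": 1080, "DVD": 480, "VHS": 250}

for name, lines in sources.items():
    share = lines / 1080
    print(f"{name}: {lines} lines ({share:.0%} of a 1,080-line screen)")
```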