High-Definition is the "new" way to watch video, but how do you know if what you're downloading is truly high-definition or not? Well, first we need to look at what HD actually is.
What is HD?
Officially, it is video with a resolution of 1280×720 (720p) or 1920×1080 (1080i or 1080p). All of the formats are widescreen 16:9 and display between 24 and 60 frames per second. Scanning is either progressive (the entire picture is drawn on every frame) or interlaced (used only for 1080i, where each frame refreshes only every other line of the picture).
Now that sounds like it should be a high-quality picture, and at the source it is. But once that video is processed for online distribution or another medium, the quality may be much lower because of the post-processing.
When a video is transferred to a new medium or format, a compression codec decreases the file size, and quality is lost along the way. What was originally an HD video file can be compressed so heavily that, to the human eye, it looks only as good as, or worse than, standard-definition video. Most video compression is 'lossy', meaning the algorithm discards data it judges unnecessary for maintaining a good-looking image.
How can you tell if the HD video you're downloading is really HD?
Well, for one, the file size will be quite large. Beyond that, you need to look at two things: the codec used and the bitrate. The bitrate is simply the amount of data processed per second of video. HDTV broadcasts at 15Mbit/s, while HD DVD runs at 36Mbit/s and Blu-ray at 54Mbit/s.
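To make the bitrate numbers concrete, here is a quick sketch (in Python, with a hypothetical function name) of how bitrate and running time determine file size:

```python
def file_size_mb(bitrate_mbps, duration_s):
    """Approximate file size in megabytes (decimal) for a video stream.

    bitrate_mbps: video bitrate in megabits per second
    duration_s:   running time in seconds
    """
    total_megabits = bitrate_mbps * duration_s
    return total_megabits / 8  # 8 bits per byte

# A two-hour movie at the full 15Mbit/s HDTV rate:
print(file_size_mb(15, 2 * 3600))  # -> 13500.0 MB, about 13.5GB
```

That 13.5GB is for broadcast-rate HDTV; at HD DVD or Blu-ray bitrates the same movie would be two to three times larger still.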
That video you get online probably isn't pushing even 15Mbit/s, as that would translate into enormous file sizes. Apple's iTunes HD video downloads run at only about 4Mbit/s, roughly half the bitrate of a standard DVD. Xbox Live Video downloads push about 7Mbit/s, half the standard HDTV bitrate. All of this translates into smaller file sizes and faster downloads. But it also means data loss and a lower-quality picture. So is it still HD video at that point? Technically, yes.
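A back-of-the-envelope comparison shows why those services cut the bitrate. The sketch below (assuming a constant bitrate and decimal units) estimates what a two-hour movie would weigh at each rate:

```python
# Approximate download size of a two-hour movie at each service's bitrate.
bitrates_mbps = {"HDTV broadcast": 15, "Xbox Live Video": 7, "iTunes HD": 4}
duration_s = 2 * 3600

for name, mbps in bitrates_mbps.items():
    size_gb = mbps * duration_s / 8 / 1000  # megabits -> megabytes -> gigabytes
    print(f"{name}: {size_gb:.1f} GB")
```

The drop from 13.5GB to under 4GB is exactly the trade these services are making: a far smaller, faster download at the cost of picture quality.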
What's the problem?
The problem is that most codecs allow video to be compressed at a wide range of bitrates. H.264, which is in widespread use on the web, can drop the bitrate to 1.5Mbit/s, roughly 10% of standard HDTV quality. DivX, MPEG-2 and every other codec that uses lossy compression likewise allows a range of bitrates, so the user can tweak the file size to exactly what they want when compressing. I have an "HDTV XVID" file on my computer. Looking at its properties, the resolution is 624x352, the length is 42:31, and it plays at 23 frames per second. The bitrate? 1.123Mbit/s, and the file size is 358MB.
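As a sanity check, those file properties are self-consistent; the same bitrate arithmetic as before recovers the file size:

```python
# Checking the "HDTV XVID" file's reported numbers against each other.
duration_s = 42 * 60 + 31       # 42:31 running time -> 2551 seconds
bitrate_mbps = 1.123            # reported bitrate in Mbit/s

size_mb = bitrate_mbps * duration_s / 8
print(round(size_mb))           # -> 358, matching the 358MB file size
```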
Is this then still HD video? The answer to that is simply no. The resolution has dropped below the specified HDTV resolutions, and while the video plays well and looks good on a computer, it is not HD quality. In fact, it's far from it.
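The resolution test applied above can be written as a one-liner (a sketch; `is_hd` is a hypothetical name, and it checks only the resolution floor, since that is all the HD definition specifies):

```python
def is_hd(width, height):
    # HD requires at least 1280x720; note that bitrate plays no part here.
    return width >= 1280 and height >= 720

print(is_hd(624, 352))    # the "HDTV XVID" file above -> False
print(is_hd(1920, 1080))  # full 1080p -> True
```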
What can be done?
Honestly? Nothing. Many people do not want to wait for, and cannot afford the storage space for, full-HD downloads. People are impatient, and many are clearly happy to trade some quality for savings in time and storage. While the Internet backbone keeps gaining speed and bandwidth, the lines into many homes are still speed-limited, creating a bottleneck. Until everyone has fiber optic lines directly into their homes, download speeds will not be able to keep up with full-HD video.
In fact, some satellite and cable providers (Comcast, DirecTV) were found to have been compressing their signals to fit more channels into the same pipeline, effectively delivering less-than-full-HD picture quality while still meeting the technical definition of HD.
The problem is that the HD definition only takes into consideration the resolution, frame rate and scanning method. It does not take bitrate into consideration. So technically any 720p, 1080i or 1080p video is HD, even though it might look like shit.