Tuesday, January 6th, 2009

When HDTV isn’t so HD – Why the cable company’s compression algorithms can really ruin your day (and your hardware investment)

During the recent holiday break, I took part in that great American pastime of lounging around watching the variety of college football bowl games. Having recently moved into the HDTV arena myself, I was anxious to see the action in all its 1080i glory.

I traveled to a friend’s house to watch one of the big games on ABC. On a side note, ABC currently broadcasts in 720p rather than the 1080i used by NBC, CBS and many other networks, but on most ‘normal’ size HDTVs the difference is not really noticeable. Anyway, as the big game came on, a few people started asking ‘Is this HD? Do you have the HD feed on?’

Although the image was obviously better than standard definition (SD) television, and in the widescreen format, it didn’t really look that clear. It wasn’t the television; in fact, the TV was a beautiful new Sony Bravia, and we had just seen it in action with a Blu-ray disc before the game. No, it was Comcast and an overly ambitious attempt to broadcast ‘HDTV’ using the absolute minimum amount of bandwidth possible.

Back in the old days of analog TV, the signal you got was essentially always the same regardless of who you were getting it from and regardless of whether it came over a cable wire or an antenna. The digital revolution has changed all that. Now, many homes have a single ‘data’ connection to the outside world which carries phone, video and the Internet all over the same wire. In fact, as far as the data network is concerned these things are all basically one and the same… just different streams of data all flowing over the same connection, which eventually get split up inside your home by various end devices (cable box, cable modem and VoIP adapter).

In the mid 1990s, round about the time that Windows 95 debuted, I read Bill Gates’ book “The Road Ahead.” The book, and some videos on an included CD-ROM, described the ‘future’ where our homes would only have one network connection that carried all our data services. It described technologies such as on-demand video and high speed in-home Internet connections fast enough to hold live video conferences.

Today, we take such technologies for granted, but only 15 years ago this seemed like a far-off, futuristic world. This was especially true considering that at the time most homes had analog cable, analog phone and a 14.4k modem. Compare that to today when we can have hundreds of channels of digital TV (including many HDTV channels), digital phone and Internet connections that are more than a thousand times faster than that 14.4k modem.

So great, we’ve got all this stuff… but now it’s all competing for the same bandwidth, and much of the home-delivery data network has effectively run out of capacity for the time being. All those cool services each require their own big chunk of bandwidth. For example, a full resolution HDTV signal requires about 19 Mbit/s of bandwidth… that’s likely more than is being allocated to your cable modem… for just a single TV channel!
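To put that 19 Mbit/s figure in perspective, here’s a rough back-of-envelope calculation (the frame rate and bits-per-pixel figures are my own simplifying assumptions, not exact broadcast specs):

    # Rough back-of-envelope: how compressed a "full rate" HD broadcast already is.
    # Assumptions (mine): 1920x1080 frame, ~30 full frames per second of pixels
    # for 1080i, and 12 bits per pixel for 8-bit 4:2:0 video.
    width, height = 1920, 1080
    frames_per_second = 30
    bits_per_pixel = 12

    raw_mbps = width * height * frames_per_second * bits_per_pixel / 1_000_000
    broadcast_mbps = 19  # approximate full-rate HD broadcast

    print(f"Uncompressed 1080i: ~{raw_mbps:.0f} Mbit/s")
    print(f"Broadcast HD:       ~{broadcast_mbps} Mbit/s")
    print(f"Compression already applied: ~{raw_mbps / broadcast_mbps:.0f}:1")

In other words, even the ‘full quality’ feed is already compressed by roughly 40:1 before the cable company touches it, so there isn’t a lot of headroom left to squeeze.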

To deliver all this content, the data delivery companies (namely, the cable companies) need to cut back in one area in order to deliver content in another. One way they’re doing this is by eliminating analog TV channels from the ‘basic’ cable service.

All of the digital bandwidth that the cable company delivers over the coaxial cable into your home is transmitted on frequency ‘channels’ (just like your cable modem). The difference, though, is that each frequency channel can deliver a chunk of data that can then be split between multiple services. So a single frequency channel might be able to carry one analog channel, but four digital channels. Since using a channel for digital data is vastly more efficient, the cable companies are phasing out many analog channels. Personally, I think that’s fine. Although basic cable customers will likely be very annoyed to see their channels go, it’s a necessary evil to advance the technology into the 21st century. Cable providers are required to keep analog versions of the local channels available until 2012, but after that look for them to ditch these channels real quick to free up the bandwidth.
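To give a rough sense of the arithmetic (the ~38.8 Mbit/s per-slot capacity and the per-channel bit rates below are my assumptions for a typical 256-QAM cable plant, not Comcast’s actual numbers, and the exact counts vary with how each stream is encoded):

    # Rough illustration: one 6 MHz cable slot carries one analog channel, or a
    # shared pool of digital bits. Figures are assumptions, not operator data.
    slot_capacity_mbps = 38.8   # assumed usable payload of one 256-QAM slot
    sd_channel_mbps = 3.75      # assumed typical SD digital channel
    hd_channel_mbps = 19.0      # assumed full-rate HD channel

    print("Per 6 MHz slot:")
    print("  analog channels:       1")
    print("  SD digital channels:  ", int(slot_capacity_mbps // sd_channel_mbps))
    print("  full-rate HD channels:", int(slot_capacity_mbps // hd_channel_mbps))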

However, the other dirty little secret of all this bandwidth shuffling is that the cable companies can also play around with the amount of bandwidth assigned to each digital TV channel. For example, a full-quality HDTV signal should get about 19 Mbit/s, but the cable company may decide to give it less than that and use the remaining bandwidth for something else. The result is, of course, that the HDTV signal just got a bit less HD.
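As a hypothetical example of how that trade-off plays out (again, the 38.8 Mbit/s slot capacity and the packing choices are my assumptions, not anything Comcast has published):

    # Hypothetical: squeeze 3 HD channels into one slot instead of 2, and each
    # channel ends up well short of a full-rate HD feed.
    slot_capacity_mbps = 38.8
    full_rate_hd_mbps = 19.0

    for hd_channels_per_slot in (2, 3):
        per_channel = slot_capacity_mbps / hd_channels_per_slot
        share = per_channel / full_rate_hd_mbps
        print(f"{hd_channels_per_slot} HD channels per slot -> "
              f"{per_channel:.1f} Mbit/s each (~{share:.0%} of full rate)")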

The bigger problem is that they often don’t do a very good job of further compressing these HD signals, so the resulting images can look quite poor. Compression artifacts (basically what makes such an image look like crap) are really easy to spot. The picture ends up looking more like a YouTube video than what one expects HDTV to look like. Compression artifacts are fine for streaming free goofy videos on YouTube, but they’re really not OK for broadcast quality HDTV (especially considering the rates some of these cable companies charge).

I live in the NYC metro region, where the cable companies are now forced to go head-to-head with Verizon and their FiOS fiber-to-the-home service. FiOS gives Verizon much more bandwidth to play with, and as a result the cable companies here are under enormous pressure to keep their HDTV as uncompressed as possible. Even so, I still sometimes notice poor quality images resulting from compression.

So what do I think should happen?

My major beef with all this is the lack of transparency between what the cable TV companies sell (HDTV) and what they deliver (often a squeezed-down, compressed-to-death ‘HD’TV).

I’d like to see the FCC step in and require any cable TV company (whether that be the traditional cable companies or now the telephone companies) to include live information on each channel showing what they’re actually delivering at that time (e.g., how much bandwidth has been allocated to that channel and what compression, if any, is being applied). Such data should be compiled on a monthly basis and made publicly available to consumers.
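Just to sketch what such a disclosure might look like, here’s a purely hypothetical example (the field names and values are mine, not any real standard or FCC proposal):

    # Purely hypothetical sketch of a per-channel disclosure record.
    from dataclasses import dataclass

    @dataclass
    class ChannelReport:
        channel: str
        advertised_format: str   # e.g. "1080i HD"
        allocated_mbps: float    # bandwidth actually assigned to the channel right now
        reference_mbps: float    # full-rate bandwidth for that format
        recompressed: bool       # whether the source feed was re-encoded

        def delivery_ratio(self) -> float:
            return self.allocated_mbps / self.reference_mbps

    report = ChannelReport("Example-HD", "1080i HD", 13.0, 19.0, True)
    print(f"{report.channel}: {report.delivery_ratio():.0%} of full-rate bandwidth")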

Obviously there are a lot of technical details and standards that would have to be decided upon, plus an easy-to-understand rating system for the average consumer to comprehend (in addition to the supporting data for techno-geeks to pore over). With the traditional divide between the phone, TV and Internet providers all mushing into one ‘data’ provider, consumers are increasingly faced with the problem of choosing between multiple providers offering what appear to be very similar services. Such data about what these companies are actually delivering would not only help consumers make more informed choices, but would also place even more competitive pressure on these companies to build bigger, better, faster and cheaper networks.

A car dealer can’t sell you a new Cadillac and then hand you a used Fiat Uno, so why should a cable company be allowed to sell you ‘amazing crystal clear HDTV’ and then deliver ‘crappy compressed-to-death HDTV’?

This thread shows some excellent examples of compression artifacts resulting from a cable company trying to squeeze so-called HDTV into a reduced amount of bandwidth:

http://www.avsforum.com/avs-vb/showthread.php?t=1008271

One Comment on “When HDTV isn’t so HD – Why the cable company’s compression algorithms can really ruin your day (and your hardware investment)”

  1. jj5

    Good summary. This is certainly an issue. I guess one issue is that HDTV is marketed based on the ‘size’ of the image and not really the true ‘resolution / bit rate’ of the image.
