Electronic Design

Motion Blur Distorts Digital Video's Future

Video has its own take on Moore's Law: As soon as something is finalized, things are sure to change.

Back in the dark ages, watching TV was a challenge. If you were on the fringes of the broadcast networks’ ranges, you and your siblings had to take turns holding the TV set’s rabbit ears to improve its reception. And if the set stopped working, you were in for a trip to the local drug store with a bag full of vacuum tubes.

These days, if you’re talking about a TV not functioning, you’re likely referring to the shift from analog to digital broadcasts on Feb. 17, 2009. In fact, according to iSuppli, about 50% of the world will move to digital broadcasts by 2010 (Fig. 1). But unlike that trip to the drug store, not everyone is running out to the nearest electronics store to get a giant digital flatscreen, as converter boxes for existing analog sets will be available.

Last month, the Federal Communications Commission announced that the Wilmington, N.C., area will serve as a test market for the upcoming analog-to-digital switchover. Full-power stations in the market will switch the analog spigot off on Sept. 8 and hope that viewers who still rely on National Television System Committee (NTSC) tuners won’t start a riot. Despite the potential for problems, the FCC does hope to work out all the kinks prior to next February. Of course, digital broadcasts are just the beginning of the changes ahead for the video industry.

The new Advanced Television Systems Committee (ATSC) standards adopted by the FCC only require broadcasters to deliver a digital signal in a minimum standard-definition (SD) format. Yet satellite, cable, and now even fiber providers have been pushing to deliver as much high-definition (HD) content as possible. It really doesn’t seem to matter if a given HD signal is poor. Apparently, he who broadcasts the most HD channels wins, for now. But when has the customer ever mattered to the cable, satellite, or phone companies?

There’s also a push to move to H.264 compression over MPEG-2. According to Ambarella, a company that provides high-definition video-compression and image-processing semiconductors, both major satellite providers and the vast majority of Internet Protocol TV (IPTV) providers have switched to H.264 compression. This has created a large market for H.264-to-MPEG-2 transcoders to serve all of the legacy set-top boxes still in use. Of course, when folks do upgrade their set-top boxes, they expect the technology to work harder, according to a study by Parks Associates (Fig. 2).

Robert Pleva, director of semiconductor product marketing at Sigma, says that IPTV’s rapid adoption rate is “taking off very robustly.” This trend is driving the features discussed by Parks Associates, such as sophisticated GUIs and Java code. These features will increase the demands on the set-top box CPU, which now typically runs at 500 MHz and includes large instruction/data caches.

Going forward, television will merge with Web-delivered content. “This trend introduces challenges because there is a need to move beyond well-defined video codec support, such as MPEG-2, H.264, and VC1,” says Pleva. This translates into the need to support tons of proprietary codecs found on the Web today. Plus, the Web-centric codecs tend to change much more frequently, making it difficult to keep up with the latest changes.

But the U.S. is certainly not alone when it comes to using H.264. Via the Digital Video Broadcast (DVB) association, many countries in Europe have been using H.264 for broadcast television since 2004. It’s additionally being used in Brazil, Korea, Hong Kong, and Japan, and it will find its way into many other places over the next few years. Of course, for the time being, MPEG-2 is still used heavily throughout the world.

Yet MPEG-2’s legacy is creating several challenges, or business opportunities. A lot of infrastructure is MPEG-2, and most broadcasting companies likely can’t afford to replace existing infrastructure wholesale. However, MathStar has learned while working with industry players on its high-performance field-programmable object array (FPOA) family that legacy MPEG-2 issues go well beyond the broadcasters.

“The hospitality and multi-tenant housing industries have not only long since begun switching out their analog TVs for digital ones in accordance with the FCC’s ruling, but they’re also using the opportunity to migrate to HDTV,” explains Sean Riley, vice president of marketing at MathStar. “The engineering challenge is that the broadcasters, like DirecTV, are transmitting HD content compressed using H.264, while the televisions in, say, hotel rooms are equipped to handle content compressed using the MPEG-2 standard.”

This challenge required MathStar to work with LG Electronics to engineer a 1-GHz FPOA-based transcoder that converts H.264 to MPEG-2. This allows the likes of hospitality customers to preserve existing infrastructure. “Without LG’s transcoders, these hotels would have to replace every one of their set-top boxes,” says Riley.

One of the huge issues that HD and television in general need to address going forward is quality. Some stations, such as ESPN, impose requirements like a minimum bit rate. Others, though, accept a lower bit rate per channel in order to squeeze in more HD signals. But what’s hard to understand is the absurd lack of quality on certain channels.

Have you ever switched on a sportscast and thought you were watching a classic game from a decade or two ago, only to find out it was a live event? In an age where YouTube became an overnight success, is it unreasonable to ask for quality now that we have so much quantity?

We also must ask if we’re future-proofing on purpose or because the salesperson at our local electronics store insisted we needed the latest 60-in., 1080p-capable television. Probably a little of each, but the broadcasters will need to play catch-up sooner or later. “This level of quality exceeds what most broadcasters can afford to transmit today, even with digital methods,” says Riley.


The norm in the U.S. is 720p or 1080i for most HD broadcasts. However, broadcasters are making tradeoffs that are detectable to most consumers, especially the videophiles willing to plop down hard-earned cash for better quality. According to Riley, “At MathStar, we believe that service providers need to find a way to deliver full 1080p content to this growing consumer base and that 1-GHz MathStar FPOA chips can enable the high-performance video encoding to make it happen.”

Furthermore, signals often are repackaged for another purpose. For example, broadcasts can be repurposed to suit alternate delivery mechanisms like IPTV.

“There is a need to ensure the original material is captured and edited at the highest resolution to avoid unacceptable artifacts when viewed in the home,” says John Hudson, Gennum’s director of connectivity technology. “Many broadcasters have recognized that 1080p 60/50 production provides an ideal format to enable them to realize the highest possible image quality regardless of the final delivery mechanism.”

We already know that the quantity is there. But how will broadcasters place high-quality videos in the storage pockets of high-quality mobile, PC, or DVR devices without straining the available bandwidth? Gennum believes the answer may lie in developing new connectivity technologies that deliver the best of both worlds without straining the system.

“Through our close collaboration with the video broadcast and consumer connectivity industry, we are working on innovative techniques and products that capture and deliver the highest-quality video while leveraging the existing distribution infrastructure,” says Hudson.

Speaking of getting more out of the infrastructure that’s already in place, Gefen Inc. has some ideas. “We see the next evolution involving wireless and powerline technologies replacing A/V extension solutions that use CAT-5, RGB, fiber optics, and coax cables to transmit HD signals with zero signal loss,” says Hagai Gefen, the company’s president and CEO.

Unfortunately, buying the latest video gadgetry is a lot like buying the latest computer. As soon as you bring it home, it seems as though some part of it will become obsolete. Take a look at the recently concluded next-generation DVD war.

Once Toshiba pulled the plug on its money-losing HD DVD business, Blu-ray became the de facto high-def DVD standard—leaving the small number of HD DVD adopters with little more than expensive paperweights. Yet while Sony celebrates its victory, how long will it be before Blu-ray goes the way of the laserdisc?

After all, how can Blu-ray compete with the likes of the more portable, more compact, easier to use, and more ubiquitous rewritable flash USB drive? With fiber now and cable next year promising 100 Mbits/s, downloading HD-quality movies at an average of 30 Gbytes each will take less than an hour. And if there’s one thing the music industry recently learned, it’s that consumers are willing to forgo a hard medium to get content cheaply and quickly.
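The arithmetic behind that download-time claim is easy to check. A quick sketch, assuming a 30-Gbyte movie and a sustained 100-Mbit/s link with protocol overhead ignored (real links deliver somewhat less):

```python
# Back-of-envelope check: how long does a 30-Gbyte movie take
# to download over a sustained 100-Mbit/s link? Assumes no
# protocol overhead, so the real-world figure runs longer.
movie_gbytes = 30
link_mbits_per_s = 100

movie_bits = movie_gbytes * 8 * 10**9             # 240 Gbits total
seconds = movie_bits / (link_mbits_per_s * 10**6)
print(f"{seconds / 60:.0f} minutes")              # 40 minutes
```

At 40 minutes per movie, the "less than an hour" figure holds even with a healthy margin for overhead.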

Blu-ray’s days already are numbered. The push to get a 100-Mbit/s spigot will only create plenty of opportunities for new modems and wireless technologies, like wireless HDMI extenders. For example, Gefen offers a device that extends HDMI signals up to 33 ft while supporting 1080p at 30 fps and 1080i at 60 fps with a 65-Mbit/s throughput (Fig. 3). Once these technologies hit the mainstream, you can expect more electronic waste left at the curb.
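Those throughput figures also show how much compression such a link relies on. A rough sketch of the ratio involved, assuming 24-bit color and ignoring blanking intervals and audio (both are simplifications):

```python
# Why a 65-Mbit/s wireless HDMI link can still carry 1080p at
# 30 fps: the raw stream must be heavily compressed. Assumes
# 24-bit color; blanking intervals and audio are ignored.
width, height, bits_per_pixel, fps = 1920, 1080, 24, 30
link_bits_per_s = 65 * 10**6

raw_bits_per_s = width * height * bits_per_pixel * fps
ratio = raw_bits_per_s / link_bits_per_s

print(f"raw 1080p30: {raw_bits_per_s / 1e9:.2f} Gbits/s, "
      f"roughly {ratio:.0f}:1 compression needed")
```

A raw 1080p30 stream runs near 1.5 Gbits/s, so squeezing it through 65 Mbits/s means compressing by more than 20:1, which is well within reach of modern codecs.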

While 720p and 1080i are today’s standard HD broadcast resolutions, new innovations are on the way. Before too long, 1080p will become the de facto standard, and emerging specifications already define formats with up to 2160 vertical lines, so there’s definitely some future-proofing going on. But what will happen when most consumers own televisions that can display 1080p?

UHDTV, with 4320 vertical lines, is being tested in Japan. It won’t be ready for consumption until at least 2020, and it will only make a difference on televisions that are 60 in. or larger. Meanwhile, 3DTV (3-dimensional television) should start to become available in about five years. So how far will technology need to advance to match what humans can visually process? Most people can’t easily differentiate between 1080i and 1080p, so why do we even need 1080p and beyond?
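The vertical-line figures above translate into steep jumps in pixel count, which is a big part of why the bandwidth question keeps coming back. A small comparison, assuming the standard 16:9 widths that typically accompany each line count:

```python
# Pixel counts behind the vertical-line figures in the text.
# Assumes 16:9 frames with the usual widths for each format.
formats = {
    "1080p": (1920, 1080),
    "2160-line": (3840, 2160),
    "UHDTV (4320-line)": (7680, 4320),
}

base = 1920 * 1080
for name, (w, h) in formats.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} Mpixels "
          f"({pixels // base}x the pixels of 1080p)")
```

Each doubling of vertical lines quadruples the pixel count, so UHDTV carries 16 times the pixels of 1080p before compression even enters the picture.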

One last consideration involves the pushing versus pulling of content. In the traditional push model, video is broadcast in a constant stream across hundreds of channels. The Internet has set up the expectations for a pull model in which consumers download video at a given time and can then choose to view the video later. Of course, video already can be viewed later using a DVR. But there are tradeoffs to each approach.

This sets the stage for a few final questions. Since consumers will soon have access to 100-Mbit/s connection speeds, will a pull model become more prevalent? The obvious answer is yes, so perhaps the more important question really is how this will tax the networks and infrastructure and what will be done to solve the inevitable problems created by this model.

Now, about audio... oh, never mind.
