As the new 4K Ultra High Definition (UHD) televisions arrive at your favorite outlet chain, designers are gearing up for the next dramatic change in video standards. Once again, we are facing challenges in both network delivery and back-office content management, as well as the design of 4K-compliant professional broadcast equipment—delivering over 530 million pixels a second!
The Big Picture
The introduction of the latest high-definition television standard—called ultra-high-definition version 1, 4K, or UHD-1—has quadrupled the number of pixels on a panel from about 2.1 million to over 8.2 million. Today, all major television manufacturers are already shipping Web-enabled panels that can stream (from limited sources) full 4K video along with up to 32 channels of audio. HDTV running at 1080 lines (1920 horizontal pixels) with progressive frames (1080p) and MPEG-2 transport can be sustained with roughly 6 to 12 Mbits/s, whereas 4K video (2160p) will require roughly twice that rate.
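The pixel arithmetic behind those figures is straightforward; a quick sketch (Python is used here only for the arithmetic):

```python
# Pixel counts behind the HDTV-to-UHD-1 jump described above.
def pixels(width, height):
    return width * height

hd = pixels(1920, 1080)    # 1080p panel: 2,073,600 pixels
uhd1 = pixels(3840, 2160)  # UHD-1 (4K) panel: 8,294,400 pixels
# uhd1 is exactly 4x hd: doubling both dimensions quadruples the pixel count.
```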
Additionally, studio equipment capable of the 4K standard must now route uncompressed serial digital interface (SDI) video at nearly 12 Gbits/s to maintain a 60-Hz frame rate. This introduces new signal-integrity challenges as well as the obvious issues with non-linear editing of content and media storage. Currently there are two new standards, simply called UHD-1 and UHD-2, which are distinguished by the resolution used by each (Fig. 1). UHD-1 specifies 4K resolution (3840 by 2160 pixels); UHD-2 specifies 8K resolution (7680 by 4320 pixels).
As with other Society of Motion Picture and Television Engineers (SMPTE) standards, both 10- and 12-bit color depth with chroma subsampling rates of 4:4:4, 4:2:2, and 4:2:0 will be used to move and store uncompressed video. Even with 4:2:2 chroma subsampling, the amount of data at 60 Hz is overwhelming. To make matters worse, in an effort to accommodate additional frame rates, the base frame rate is likely to be 120 Hz, doubling the uncompressed data content per second.
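To put numbers on "overwhelming": uncompressed payload scales linearly with resolution, frame rate, bit depth, and the chroma-subsampling factor. A minimal sketch of that arithmetic (active pixels only; blanking and ancillary data are ignored, so real link rates run higher):

```python
# Uncompressed video payload in Gbits/s for a given format.
# Chroma subsampling sets the average samples carried per pixel.
def gbps(width, height, fps, bit_depth, subsampling):
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[subsampling]
    return width * height * fps * bit_depth * samples_per_pixel / 1e9

r60 = gbps(3840, 2160, 60, 10, "4:2:2")    # ~10 Gbits/s at 60 Hz
r120 = gbps(3840, 2160, 120, 10, "4:2:2")  # doubles at a 120-Hz base rate
```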
Camera manufacturers such as Sony, Blackmagic Design, and RED Digital Cinema are now building equipment to capture 4K and 8K images. These cameras capture upwards of 100 frames per second of uncompressed video for cinematic and television production. Movies are now routinely being shot in 4K, such as Oblivion, which was shot with a Sony F65. The content is already here, and more is quickly becoming available, especially with the cost of pro-consumer (or "pro-sumer") cameras plunging.
Broadcasters that are still transmitting over the air (OTA) are sending 720p HDTV at most. Cable systems can handle higher resolutions including 1080i or 1080p. To move to 4K UHD-1, broadcasters will need to increase the bandwidth required for a single channel by over nine times. Given this limitation, the first appearance of “broadcast” content will come from high-speed Internet connections.
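The "over nine times" figure follows from the pixel counts alone (total bandwidth also depends on frame rate and codec, so this captures only the resolution component):

```python
# Resolution ratio from a 720p OTA broadcast to UHD-1 at the same frame rate.
ratio = (3840 * 2160) / (1280 * 720)
print(ratio)  # 9.0
```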
Today, DOCSIS version 3 modems can provide sustained downstream rates of more than 50 Mbits/s, enough for two or three H.264 4K transport streams (at roughly 15 Mbits/s each). For service providers such as Netflix, this is an extremely complicated issue. They have even begun partnering with Internet service providers (ISPs) to cache content closer to consumers to minimize the "rush hour" of video that occurs between 7 and 10 p.m.
But even these providers are dealing with compressed video that previously has been encoded. Yes, the overall file sizes are larger. But otherwise, H.264 transport streams can deliver 4K UHD-1 content without any changes, assuming a level 5.2 decoder. The real issue is with uncompressed video that exceeds 12 Gbits/s at 60 frames per second. This is where the real challenges can be found in cabling, routing, and switching.
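The Level 5.2 requirement can be sanity-checked against the macroblock-rate caps in the H.264 specification (Table A-1); the two limit values below are taken from that table:

```python
# H.264 levels cap the 16x16-macroblock rate a decoder must sustain.
# Table A-1 limits: Level 5.1 = 983,040 MB/s, Level 5.2 = 2,073,600 MB/s.
mb_per_frame = (3840 // 16) * (2160 // 16)   # 32,400 macroblocks per 4K frame
mbps_4k60 = mb_per_frame * 60                # 1,944,000 MB/s at 60 frames/s

fits_level_5_1 = mbps_4k60 <= 983_040    # False: 5.1 tops out near 30 fps at 4K
fits_level_5_2 = mbps_4k60 <= 2_073_600  # True: hence a Level 5.2 decoder
```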
Live events and other applications using uncompressed real-time video pose the greatest challenge for UHDTV in both encoding and transport. Under SMPTE 424M, also known as 3G-SDI (3-Gbit/s serial digital interface), 1080p60 (60-frame/s, 1080-line progressive-scan) video can be moved uncompressed with any chroma subsampling except 4:4:4, which exceeds the standard's maximum data rate and requires dual-link 3G-SDI instead.
So what happens when you want to move frames that are four times larger? You need four times the bandwidth. But even the proposed 12G-SDI SMPTE standard sits right at its limit for 10-bit 4:2:2 3840 x 2160p60 and cannot carry 12-bit or 4:4:4 variants, let alone the doubled 120-Hz base frame rate.
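SDI budgets the full raster, not just the active pixels, so link math uses total samples per frame. A sketch of that budgeting, assuming the common 4400 x 2250 total raster for 2160p60 timing (an illustrative assumption):

```python
# SDI link-rate budgeting over the full raster (active video plus blanking).
def sdi_gbps(total_w, total_h, fps, bit_depth, samples_per_pixel):
    return total_w * total_h * fps * bit_depth * samples_per_pixel / 1e9

rate_422_10 = sdi_gbps(4400, 2250, 60, 10, 2)  # 10-bit 4:2:2
rate_444_10 = sdi_gbps(4400, 2250, 60, 10, 3)  # 10-bit 4:4:4

print(round(rate_422_10, 2))  # 11.88 -- right at the 12G-SDI line rate
print(round(rate_444_10, 2))  # 17.82 -- beyond a single 12G link
```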
SMPTE is proposing ST 425-5 and ST 425-6, quad-link 3G standards with a combined throughput of roughly 12 Gbits/s. On the production-equipment side, the interfaces are the same as those used in 3G-SDI, so for now designers can continue to build equipment with standard 3G interface I/O devices, such as the LMH0387, which includes the return-loss matching network (Fig. 2).
HDMI 2.0 And 4K
Assuming that switch equipment vendors can upgrade their software to support quad-3G-SDI links, the studio and content provider issues for the near term can be addressed. However, those new panels require a video source such as a cable network decoder, often called a “set top” box since their predecessors mechanically sat on top of a television with a picture tube in it. To move the video data to the panel, analog cabling has given way to a digital standard called high-definition multimedia interface (HDMI), which has gone through several revisions.
HDMI uses a signaling system called transition-minimized differential signaling (TMDS). This standard uses current mode logic (CML) as the physical layer and a special form of 8b/10b encoding that provides additional signals for synchronization along with an encoder that minimizes transitions in the data (thus the standard's name). TMDS does an excellent job of reducing electromagnetic interference (EMI) by combining twisted-pair wires with that transition-minimized encoding.
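The transition-minimizing step can be sketched in a few lines. This is a simplified model of TMDS stage 1 only (the DC-balancing stage that produces the tenth bit is omitted), following the algorithm as published in the DVI/HDMI specifications:

```python
# TMDS stage 1: map an 8-bit pixel byte to 9 bits using XOR or XNOR
# chaining, chosen to minimize transitions on the wire.
def tmds_stage1(byte):
    d = [(byte >> i) & 1 for i in range(8)]  # bits, LSB first
    # Many ones in the input favor XNOR chaining; otherwise XOR is used.
    use_xnor = sum(d) > 4 or (sum(d) == 4 and d[0] == 0)
    q = [d[0]]
    for i in range(1, 8):
        bit = q[i - 1] ^ d[i]
        q.append(1 - bit if use_xnor else bit)
    q.append(0 if use_xnor else 1)  # bit 8 records which operator was used
    return q

def transitions(bits):
    return sum(b1 != b2 for b1, b2 in zip(bits, bits[1:]))
```

For a worst-case alternating byte such as 0b10101010, the encoded word has fewer transitions than the raw input, which is exactly the EMI benefit the text describes.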
HDMI cables use four twisted pairs to carry three lanes of data and a single clock lane. Prior to version 2.0 the data rate per lane was 3.4 Gbits/s, which could easily be carried using a 28 AWG twisted pair without any method to improve signal integrity. But starting with version 2.0, the rate was increased to 6 Gbits/s to handle 4K video and a larger number of audio channels along with many new features.
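With three data lanes and 8b/10b-style coding (80% efficiency), the aggregate payload works out as follows; a rough sketch of the lane arithmetic:

```python
# Aggregate HDMI payload: three data lanes, each 8b/10b-coded, so the
# usable throughput is 80% of the raw line rate.
def payload_gbps(lane_rate_gbps, lanes=3, coding_efficiency=0.8):
    return lane_rate_gbps * lanes * coding_efficiency

hdmi_pre_2_0 = payload_gbps(3.4)  # ~8.16 Gbits/s of video/audio payload
hdmi_2_0 = payload_gbps(6.0)      # ~14.4 Gbits/s, enough for 4K at 60 Hz
```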
This increased speed can be handled by reducing cable loss with 24 AWG twisted pair, by using active or passive equalization, or by combining both approaches. Longer cables pose a particular issue with version 2.0 because of the extra length often required between the panel and source equipment in home theaters.
One way to help fight spectral loss and jitter is equalization. HDMI connectors do specify a +5-V supply, but the current is severely limited to only 50 mA, restricting active equalization to solutions that use extremely low power. Passive equalization, however, is possible on a fixed-length cable. Devices such as the DS80EP100, an integrated passive equalizer, require no power to operate and compensate for the spectral loss of longer cable runs, improving deterministic jitter. These devices are small enough to fit inside the cable connector.
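The idea behind a passive equalizer can be modeled simply: copper loss from skin effect grows roughly with the square root of frequency, and a passive network flattens the channel by attenuating low frequencies rather than amplifying high ones, which is why it needs no supply current. This is a toy model, not the DS80EP100's actual response; the loss constant k is hypothetical and chosen only for illustration:

```python
import math

# Skin-effect-style cable loss in dB: grows with sqrt(frequency) and length.
# The constant k (dB per meter at 1 GHz) is a hypothetical illustration value.
def cable_loss_db(freq_ghz, length_m, k=2.0):
    return k * length_m * math.sqrt(freq_ghz)

# Passive EQ sized so the whole channel sits at the Nyquist-frequency loss:
# low frequencies are padded down instead of high frequencies boosted up.
def equalized_db(freq_ghz, length_m, nyquist_ghz=3.0, k=2.0):
    flat_floor = cable_loss_db(nyquist_ghz, length_m, k)
    eq_attenuation = flat_floor - cable_loss_db(freq_ghz, length_m, k)
    return cable_loss_db(freq_ghz, length_m, k) + eq_attenuation
```

After equalization the response is flat across frequency: the frequency-dependent slope that causes deterministic jitter is gone, at the cost of a fixed insertion loss.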
As consumers demand a more immersive experience at home, UHD-1- and UHD-2-compliant panels will begin to replace existing HDTV (1080p) panels as prices decline. Content providers are already using the standard to produce both TV shows and full-length movies, increasing the available 4K content daily. This continues to put pressure on broadcast networks, cable providers, and the Internet backbone to deliver the increased resolution of UHD. Japan is already planning to use 8K video for the upcoming 2020 Tokyo Olympics, which will be broadcast live worldwide.
The new ultra-high-definition panels utilizing HDMI 2.0 are already here. The challenge for all sectors of the broadcast industry will be to deal with the issues in creating and editing content, switching live feeds, and delivering an ever increasing visual and audible experience to their end customers.
Richard Zarr is a technologist at Texas Instruments focused on high-speed signal and data path technology. He has more than 30 years of practical engineering experience and has published numerous papers and articles worldwide. He is a member of the IEEE and holds a BSEE from the University of South Florida as well as several patents in LED lighting and cryptography.