The data networking and telecom markets are driving the bleeding edge of bit rates for high-speed digital interfaces. However, digital-signal-processing systems in the medical imaging, wireless infrastructure, industrial, and defense industries are experiencing an I/O gap between the volume of data sourced by their data-acquisition analog front ends (AFEs) and the ability of the digital-signal-processing elements to sink that data.
In most cases, data-acquisition AFEs interface directly to FPGAs, or indirectly to CPUs, graphics processing units, or Cell processors across backplanes such as PCI Express. According to the market research firm Databeans, the total volume of data generated by data converters will increase by a factor of four through 2012 (see the figure). But the line rates of backplane and FPGA interfaces haven't kept pace with Moore's Law. Signal-compression technology offers a way to bridge this gap.
For signal-compression technology to address this data bandwidth gap, it must be transparent to signal type, delivering useful compression across a broad range of applications and signals. Well-known video and audio compression technologies, such as MPEG-2/4 and MP3, are signal-specific. And unlike sampled-data compression, they can rely on the limits of human perception to achieve very high, lossy compression ratios.
For other industrial, scientific, and medical applications, though, signal compression should support a lossless mode so it can be integrated into these systems without repeating any system noise analysis. In addition, since most sampled-data systems operate on noisy signals, near-lossless compression modes can provide higher compression ratios, provided the noise introduced by the compression algorithm is spectrally white and the algorithm can control how much noise it adds.
Furthermore, the compression algorithm should offer a fixed-rate mode in which the compression ratio can be prescribed, simplifying transmission across fixed-bandwidth backplane and I/O interfaces. Clearly, signal compression delivers the most value when compression sits as close as possible to the analog domain and decompression as close as possible to the software domain, because bandwidth is gained and cost is saved throughout the hardware signal chain.
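To make these three modes concrete, here is a minimal sketch in Python. It is not Samplify's algorithm; it uses a generic stand-in compressor built from delta encoding plus controlled LSB truncation, and simply shows how lossless, near-lossless, and fixed-rate behavior differ on a block of 12-bit samples.

```python
# Illustrative sketch only -- not Samplify's Prism algorithm. It demonstrates the
# three generic modes described above (lossless, near-lossless, fixed-rate) using
# delta encoding and controlled LSB truncation on one block of ADC samples.

import numpy as np

def compress_block(samples, mode="lossless", dropped_lsbs=0, bit_budget=None):
    """Return (encoded deltas, LSBs dropped, total bits) for one sample block.

    "lossless"      -- delta-encode the samples exactly.
    "near_lossless" -- drop a fixed number of LSBs first (approximates white
                       quantization noise of a controlled level).
    "fixed_rate"    -- drop as many LSBs as needed to meet a prescribed bit budget.
    """
    samples = np.asarray(samples, dtype=np.int64)

    def encode(shift):
        q = samples >> shift                   # controlled LSB truncation
        deltas = np.diff(q, prepend=q[0])      # samples are correlated, deltas are small
        # simple cost model: sign bit + magnitude bits of the largest delta, per sample
        span = int(np.abs(deltas).max())
        bits_per_sample = 1 + max(span, 1).bit_length()
        return deltas, shift, bits_per_sample * len(samples)

    if mode == "lossless":
        return encode(0)
    if mode == "near_lossless":
        return encode(dropped_lsbs)
    if mode == "fixed_rate":                   # prescribe the output size
        for shift in range(16):                # drop LSBs until the budget is met
            result = encode(shift)
            if result[2] <= bit_budget:
                return result
        return result
    raise ValueError(mode)

# Example: a 12-bit sine wave plus noise, with a 3:1 target for the fixed-rate mode.
rng = np.random.default_rng(0)
x = (2047 * np.sin(2 * np.pi * 1e6 * np.arange(1024) / 65e6)).astype(int)
x += rng.integers(-8, 8, size=x.size)
for mode, kw in [("lossless", {}), ("near_lossless", {"dropped_lsbs": 2}),
                 ("fixed_rate", {"bit_budget": 1024 * 12 // 3})]:
    _, shift, bits = compress_block(x, mode, **kw)
    print(f"{mode:13s} dropped LSBs={shift}  {1024 * 12 / bits:.1f}:1 ratio")
```

In the fixed-rate case, the encoder drops just enough least-significant bits to meet the prescribed bit budget, which is what makes the output size predictable for a fixed-bandwidth link.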
Two example applications illustrate the benefits of signal compression in data converters. First, consider ultrasound machines. Each ultrasound probe can contain more than a hundred transducer elements, or more than a thousand in advanced 4D machines, each connected to a high-speed, 12-bit analog-to-digital converter (ADC) sampling at up to 65 Msamples/s. For a 256-element machine, this amounts to roughly 200 Gbits/s of backplane bandwidth and 512 pins between the AFE and the ultrasound beamforming array. With 3:1 compression, the backplane bandwidth and pin count are both reduced by about 66%, greatly reducing the complexity and I/O power within the console.
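Those backplane numbers follow directly from the channel count and sample format; a quick back-of-the-envelope check, assuming for illustration that each channel occupies one serialized LVDS pair (two pins):

```python
# Back-of-the-envelope check of the ultrasound backplane figures cited above,
# assuming one serialized LVDS pair (two pins) per 12-bit, 65-Msample/s channel.
channels, bits, fs, ratio = 256, 12, 65e6, 3
raw = channels * bits * fs                                   # raw AFE output, bits per second
print(f"raw backplane bandwidth: {raw / 1e9:.0f} Gbits/s")   # ~200 Gbits/s
print(f"LVDS pins (2 per channel): {2 * channels}")          # 512 pins
print(f"after {ratio}:1 compression: {raw / ratio / 1e9:.0f} Gbits/s, "
      f"a {int(100 * (1 - 1 / ratio))}% reduction in bandwidth and pin count")
```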
Second, basestations for fourth-generation (4G) wireless standards such as WiMAX and Long Term Evolution (LTE) use fiber-optic links between the radio electronics at the top of the tower and the baseband processor at the bottom. As systems move from 3G to 4G, the number of antennas increases from two to four or eight to support multiple-input multiple-output (MIMO) and smart-antenna technologies.
In addition, channel bandwidths increase from 5 MHz for W-CDMA to 10 or 20 MHz for LTE. Together, these factors drive fiber-optic bandwidth requirements to the point where the optical transceivers can cost more than the power amplifiers! Signal compression can enable 4G basestations to reuse the same fiber-optic infrastructure as existing 3G systems.
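The pressure on the fiber link can be seen with rough scaling arithmetic. The sketch below assumes the digitized IQ traffic grows in proportion to channel bandwidth times antenna count; actual line rates depend on the link protocol, framing, and oversampling, which are not modeled here, and the 3:1 figure is carried over from the ultrasound example purely for illustration.

```python
# Rough scaling of tower-to-baseband fiber traffic from 3G to 4G, assuming the
# digitized IQ traffic grows with channel bandwidth times antenna count. Actual
# line rates depend on the link protocol, framing, and oversampling.
def relative_load(channel_mhz, antennas):
    return channel_mhz * antennas

baseline = relative_load(channel_mhz=5, antennas=2)     # W-CDMA, two antennas
for mhz in (10, 20):
    for ants in (4, 8):
        growth = relative_load(mhz, ants) / baseline
        print(f"LTE {mhz} MHz, {ants} antennas: {growth:.0f}x the W-CDMA fiber traffic"
              f" ({growth / 3:.1f}x after an illustrative 3:1 compression)")
```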
One such signal-compression technology is Samplify’s Prism, which is now available in what the company says is the first compressing ADC. The ADC, dubbed the SAM1610, provides 16 input channels, each operating at 65 Msamples/s with 12 bits of resolution. As a result, it can address the data bandwidth gap in ultrasound equipment and 4G wireless infrastructure.
With decompression cores available for leading FPGA families, as well as for Intel-compatible CPUs, Cell processors, and graphics processing units, the benefits of signal compression are now available end to end throughout the hardware signal chain for these systems.