Today's sensor-rich displays for military command and control must integrate graphics, video, and radar components to provide tactical awareness. Although simulation of this data provides one method of training operators, direct recording of the display screens during realistic training missions allows an actual situation to be analysed and debriefed. Capturing the contents of a high-resolution computer display containing real-time sensor data and graphics has long been a challenge.
One key requirement of a direct recording system is the preservation of graphic fidelity, so that the fine detail of the picture survives on replay.
Developments in video-grabbing and image-compression technologies are offering solutions to this problem. They allow displays to be captured and stored on digital media that may be moved or streamed to remote locations for replay.
Of interest to system designers is the ability to employ cost-effective standard desktop PCs to reconstruct the image data and replay the mission scenario along with audio data from the operators. This provides trainers with the convenience of transferring the recorded data to a remote location for debriefing, where a standard desktop PC can be used to seek to a section of interest and then pause, fast forward, or rewind the scenario.
Today's military command and control consoles combine complex graphics and sensor information. The task of training personnel in the use of these consoles, commonly deployed onboard ships and in air-traffic control centers, is made more effective via digital data recorders (DDRs). DDRs capture the contents of the console display screen as well as supplementary data, such as audio and sensor signals. By recording and replaying real-world situations, a trainee can be exposed to and learn how to respond to actual mission problems.
Recorded mission data offers the important benefit of enabling all data sources to be replayed fully synchronised via time-stamping. Using a data recorder enables a trainer to use actual situations to demonstrate appropriate (or wrong) responses to a wide variety of scenarios. Similarly, the trainee's own performance can also be recorded and reviewed for analysis. Digital data recording also enables remote viewing and analysis of console data. Through use of a data network or a physically removable storage medium, mission files can be transferred to a debriefing room, where the recorded mission can then be studied on a standard PC.
One of the key challenges in data recording is preserving the detail of high-resolution, computer-generated displays at resolutions of 1600 x 1200 or even higher. The goal is to develop a data-recorder system that provides near-lossless capture of the video, so that the detailed text and colors of a mission display are faithfully captured for replay. Such a data-recorder system requires a sample rate in excess of 400 MHz for the highest-resolution video.
Because of screen refresh rates, it is not necessary to capture every single frame: preserving all user interactions and display events requires a system capable of capturing and recording the console display data at a rate of 5 or more frames per second. A typical data recorder would support screen resolutions up to 1600 x 1200 or 1280 x 1024. In addition, it would support one or more sensor inputs (for example, cameras) and a number of audio channels for recording operator communications.
Although hard-disk and tape storage densities continue to grow, the volume of captured data can become very large, especially for high-resolution computer video. At this point, data compression becomes a necessity. System decisions have to be made between the need to compress the data for extended recording times and lower data rates, and the need to preserve the fidelity of the data sources. Choosing the best compression technology involves balancing the conflicting requirements of image quality, processing times, and computational requirements.
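A back-of-envelope calculation makes the need for compression concrete. The figures below are illustrative assumptions (24-bit colour, 5 frames per second at 1600 x 1200), not product specifications:

```python
# Illustrative calculation: raw capture data rate and storage volume
# for a high-resolution console display. All figures are assumptions
# used for illustration, not recorder specifications.

def raw_rate_bytes_per_sec(width, height, bytes_per_pixel, fps):
    """Uncompressed data rate for a captured RGB display."""
    return width * height * bytes_per_pixel * fps

# Assumed capture parameters: 1600 x 1200, 24-bit colour, 5 frames/s.
rate = raw_rate_bytes_per_sec(1600, 1200, 3, 5)
print(f"Raw rate: {rate / 1e6:.1f} MB/s")      # → 28.8 MB/s

# One hour of uncompressed recording:
hour = rate * 3600
print(f"Per hour: {hour / 1e9:.1f} GB")        # → 103.7 GB

# A modest 10:1 compression ratio brings the same hour near 10 GB.
print(f"At 10:1: {hour / 10 / 1e9:.1f} GB")
```

Even at a low capture rate of 5 frames per second, an uncompressed hour runs to roughly 100 Gbytes, which is why the compression trade-offs above dominate the system design.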
The challenge, then, is to create a digital screen recorder capable of capturing, compressing, and digitally storing multiple channels of RGB video, TV video, audio, and discrete events.
The choice of image-compression technology is critical to achieving extended recording times without significant loss of image quality, whilst also being sensitive to the requirements of a cost-effective replay solution. As a mature compression technology with a choice of COTS implementations in both hardware and software, JPEG provides an established baseline for RGB compression.
The image quality is very good at modest compression ratios and a modern desktop computer can decompress upwards of 40 Mbytes/s without special-purpose hardware. Significantly, as a still-image-compression standard, JPEG processes each frame independently. Thus, the replay process can show each frame without dependencies on history.
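Because each JPEG frame is self-contained, a replay application can seek straight to any frame given its byte offset, with no need to decode earlier frames. The sketch below illustrates this with a hypothetical file layout (concatenated frames plus an offset table); it is not the layout of any particular recorder:

```python
# Sketch of random-access replay over a stream of independently coded
# frames. The file layout (concatenated frames plus an offset table)
# is a hypothetical illustration, not an actual recorder format.
import io

def read_frame(stream, offsets, index):
    """Return the raw bytes of frame `index` without touching any
    earlier frame -- possible only because each JPEG frame is
    self-contained (no inter-frame dependencies)."""
    start = offsets[index]
    end = offsets[index + 1] if index + 1 < len(offsets) else None
    stream.seek(start)
    return stream.read(None if end is None else end - start)

# Usage with an in-memory example: three fake "frames".
frames = [b"frame-0", b"frame-01", b"frame-012"]
blob, offsets, pos = b"", [], 0
for f in frames:
    offsets.append(pos)
    blob += f
    pos += len(f)
stream = io.BytesIO(blob)
assert read_frame(stream, offsets, 1) == b"frame-01"  # direct seek, no history
```

This frame independence is what makes the pause, fast-forward, rewind, and seek operations described earlier cheap to implement on a standard PC.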
One recent development in image compression is JPEG 2000, which is based on wavelet decomposition rather than the discrete cosine transform used by JPEG. The newer standard offers enhanced compression ratios for most video sources, but requires more computing resources for compression and decompression. JPEG 2000 supports both lossless and lossy modes with a programmable compression ratio, and offers excellent performance for real-time video sequences. This results in high compression ratios with the added benefits of low latency and minimal resynchronisation times. Unlike MPEG-based compression, which uses a sequence of frames to exploit temporal redundancy, JPEG 2000 compresses each frame separately and independently. In the event of a disturbance to the video stream, resynchronisation can occur on the next frame boundary.
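The wavelet decomposition at the heart of JPEG 2000 can be illustrated with the simplest wavelet, the Haar transform, in integer "lifting" form. The sketch below shows a one-level split into approximation and detail coefficients and its exact inverse, which is what makes a lossless mode possible; JPEG 2000 itself uses the reversible 5/3 wavelet plus quantisation and entropy coding, all omitted here:

```python
# One-level integer Haar wavelet transform (lifting form), illustrating
# how a reversible wavelet enables the lossless mode of wavelet codecs.
# JPEG 2000 actually uses the reversible 5/3 wavelet -- this Haar
# version is a simplified stand-in.

def haar_forward(samples):
    """Split even/odd sample pairs into detail (difference) and
    approximation (integer average) coefficients."""
    approx, details = [], []
    for a, b in zip(samples[0::2], samples[1::2]):
        d = a - b            # detail: high-frequency difference
        s = b + d // 2       # approximation: integer average
        details.append(d)
        approx.append(s)
    return approx, details

def haar_inverse(approx, details):
    """Exactly undo the lifting steps -- no information is lost."""
    out = []
    for s, d in zip(approx, details):
        b = s - d // 2
        a = d + b
        out.extend([a, b])
    return out

signal = [12, 10, 9, 14, 200, 3, 7, 7]
approx, details = haar_forward(signal)
assert haar_inverse(approx, details) == signal   # perfectly reversible
```

Because the lifting steps use only integer arithmetic, the inverse reproduces the input bit for bit; lossy operation comes from quantising the coefficients before entropy coding, not from the transform itself.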
A system that supports both JPEG and JPEG 2000 provides the best of both worlds. Applying JPEG to high-resolution RGB video compression, and JPEG 2000 to the low-resolution camera signals, strikes a good balance between the demand for high-fidelity recording and the limitations of the PC platform as the target playback device. Since JPEG can be replayed at higher frame rates than JPEG 2000, the core requirement of using a standard PC for replaying the recordings is best met with JPEG. Conversely, for real-time camera video displayed at 30 frames per second, the higher compression ratios of JPEG 2000 justify the cost of hardware acceleration for the replay.
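The hybrid policy described above reduces to a simple per-channel codec decision. The channel categories and the frame-rate threshold in this sketch are illustrative assumptions, not a published recorder configuration:

```python
# Hypothetical sketch of the hybrid codec policy: JPEG for
# high-resolution RGB console captures (cheap to decode on a plain PC),
# JPEG 2000 for full-rate camera video (better compression ratios,
# hardware-assisted replay). Categories and threshold are assumptions.

def choose_codec(source, fps):
    """Pick a codec for a recorded channel."""
    if source == "rgb_console":
        return "JPEG"        # frame-independent, fast software decode
    if source == "camera" and fps >= 25:
        return "JPEG 2000"   # higher ratio justifies decode hardware
    return "JPEG"

assert choose_codec("rgb_console", 5) == "JPEG"
assert choose_codec("camera", 30) == "JPEG 2000"
```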
An example of a data-recording system that meets the requirements described is Curtiss-Wright Controls Embedded Computing's new Sentric data recorder product. It provides integrated screen, video, and audio record and playback capabilities. The Sentric solution combines JPEG and JPEG 2000 in a single integrated solution that allows multiple video sources to be recorded into a single data stream and replayed either from the record unit itself, or remotely from a PC workstation. The architecture supports a number of RGB, video, and audio inputs, enabling a single unit to record up to four channels of each onto a multiplexed data stream.
Sentric can be deployed on a VME card that integrates into an existing console, interfacing to the video, audio, network, and SCSI signals through the rear VME connectors. As a compact card-level solution, it is attractive for adding recording capability to existing console displays where additional space is minimal. For replay, a PC-compatible playback application allows a high-performance PC to read and replay the video and audio data with no special hardware by replaying TV video at a slower-than-actual frame rate. An optional PCI decompression accelerator card enables real-time replay of TV video on the PC.
CWCEC also offers JPEG 2000 real-time video compression and decompression on PMC and PCI form factors with its Orion card. In input mode, Orion can accept up to ten analogue video inputs and select two for simultaneous compression with the JPEG 2000 standard. The compression engine supports full frame rate encoding of standard 625-line PAL or 525-line NTSC composite video, outputting a JPEG 2000 compliant data stream onto the PCI bus. In output mode, the card receives up to two JPEG-2000 data streams across the PCI bus, decompressing to give two independent PAL or NTSC video-output signals through the front panel.
With the Orion card, input video is de-interlaced and passed to the JPEG 2000 compression engine, which employs a transform processor to perform wavelet transformation and quantisation. The computationally expensive entropy coding is performed by three dedicated codecs that provide the context modeling and arithmetic coding of the wavelet coefficients. In a typical video-capture application, the Orion card is programmed to receive two video signals. It then creates two independent video streams that transfer over the PCI bus into an application program.
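The first stage of that pipeline, de-interlacing, can be sketched as a simple "weave" that interleaves the two fields of an interlaced frame into one progressive frame. This is a deliberate simplification for illustration, not the Orion hardware's actual method:

```python
# Minimal "weave" de-interlacer: merge the two fields of an interlaced
# video frame into a single progressive frame. A simplified illustration
# of the de-interlacing stage, not the Orion card's hardware path.

def weave(top_field, bottom_field):
    """Interleave two fields (lists of scan lines) row by row:
    the top field supplies even lines, the bottom field odd lines."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)
        frame.append(bottom_line)
    return frame

# A 4-line example frame rebuilt from two 2-line fields.
top = [[1, 1, 1], [3, 3, 3]]       # scan lines 0 and 2
bottom = [[2, 2, 2], [4, 4, 4]]    # scan lines 1 and 3
assert weave(top, bottom) == [[1, 1, 1], [2, 2, 2], [3, 3, 3], [4, 4, 4]]
```

Weaving preserves full vertical resolution for static content; real de-interlacers add motion-adaptive processing to avoid combing artefacts on moving imagery, which is part of what justifies dedicated hardware.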
As part of a recording system, these streams may be stored in local storage, or as part of a distribution solution, in which the streams may transfer over a network to a remote display.