Vision Quest

Jan. 1, 2008

Image sensors with high dynamic range, together with powerful processors and sophisticated algorithms, are enabling a new generation of vision-based systems that can detect and distinguish objects in a wide range of lighting conditions.

“The introduction of in-car video cameras is opening up new application areas for driver assistance systems,” said Bernd-Josef Schäfer, vice-president of the driver assistance systems business unit at Robert Bosch GmbH. “As each new functional enhancement appears, the car is gradually learning to ‘see,’” Schäfer said.

Vision systems are helping automakers differentiate their vehicles on the basis of safety. The rear vision camera system on the Buick Enclave, for example, has a caution symbol that changes in size and color to draw the driver's eye to the closest detected object (Figure 1).

Systems are looking inward as well as outward. Saab, for example, is developing a driver attention warning system that uses infrared cameras on the driver's door and the center console to record and analyze eye movement. The system issues audible and visual alerts if the driver's eyes close for longer than a normal blink. If the driver's condition persists, the audible warnings become more urgent and the driver's seat vibrates.
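Saab has not published the underlying logic, but a minimal sketch of such an escalation policy, written in Python with assumed blink and persistence thresholds (they are not Saab's figures), might look like this:

# Illustrative sketch only: a simple escalation policy of the kind described above.
# The thresholds and sensor interface are assumptions, not Saab's actual parameters.
NORMAL_BLINK_S = 0.4      # assumed upper bound for a normal blink, in seconds
ESCALATE_AFTER_S = 3.0    # assumed time before warnings become more urgent

def alert_level(eyes_closed_s, already_alerting_s):
    """Map continuous eye-closure time (from the IR cameras) to a warning level."""
    if eyes_closed_s <= NORMAL_BLINK_S:
        return "none"
    if already_alerting_s < ESCALATE_AFTER_S:
        return "audible and visual alert"
    return "urgent audible alert plus seat vibration"

print(alert_level(0.2, 0.0))   # a normal blink: none
print(alert_level(1.0, 0.0))   # prolonged closure: audible and visual alert
print(alert_level(1.0, 4.0))   # condition persists: urgent alert plus seat vibration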

Volvo in January launched a collision avoidance package that includes lane-departure warning, adaptive cruise control, collision warning with auto brake, distance alert, and driver alert. In the driver-alert application, a camera continuously measures the distance between the car and the road lane markings. Sensors register the car's movement while a control unit stores the information and calculates whether the driver risks losing control of the vehicle. If the risk is high, the driver is alerted via an audible signal and a text message. The lane-departure warning system employs a camera to monitor the car's position between the road markings.
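Volvo does not disclose how the risk calculation works; one plausible sketch, assuming the estimate keys on how erratically the car wanders within its lane, is shown below (the window length and threshold are illustrative only, not Volvo's values):

# A minimal sketch of an erratic-driving estimator of the kind described above.
from statistics import pstdev

WINDOW_SAMPLES = 300       # assumed analysis window (e.g., 30 s of data at 10 Hz)
WEAVE_THRESHOLD_M = 0.35   # assumed lateral-offset standard deviation deemed risky

def high_risk_of_losing_control(lateral_offsets_m):
    """lateral_offsets_m: camera-derived distances from lane center, in meters."""
    recent = list(lateral_offsets_m)[-WINDOW_SAMPLES:]
    if len(recent) < WINDOW_SAMPLES:
        return False               # not enough history yet
    return pstdev(recent) > WEAVE_THRESHOLD_M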

Using image sensors from Sensata Technologies B.V., Bosch developed the night-vision system deployed on Mercedes S- and CL-Class vehicles (Figure 2). Infrared high-beam headlights illuminate an area of more than 150 meters in front of the vehicle. A video camera picks up the infrared image, and electronics convert the signals into an image on the central display, enabling drivers to identify dangerous situations more quickly and giving them more time to react. The system, which includes a 400 MHz PowerPC and a field-programmable gate array (FPGA), can also recognize and display predefined road signs.

Bosch is readying Night Vision Plus for 2008 production. The Plus version adds color and is said to be capable of distinguishing between standing and moving pedestrians. Bosch is also developing video-based driver-assistance systems that use sensor data fusion to process spatial information. Its technology can merge signals from a video camera and a 77 GHz radar sensor, or from two video cameras. The system can recognize and analyze critical situations, predefined or otherwise.
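The details of Bosch's fusion are proprietary, but the core association step, matching each camera detection to the radar target closest in bearing within a gate, can be sketched as follows; the data layout and the two-degree gate are assumptions for illustration:

# Minimal sketch of camera/radar association, not Bosch's implementation.
import math

AZIMUTH_GATE_RAD = math.radians(2.0)   # assumed association gate

def fuse(camera_objects, radar_targets):
    """camera_objects: list of (azimuth_rad, label).
    radar_targets: list of (azimuth_rad, range_m, range_rate_mps).
    Returns fused objects carrying the camera label plus the radar range data."""
    fused = []
    for cam_az, label in camera_objects:
        if not radar_targets:
            continue
        best = min(radar_targets, key=lambda t: abs(t[0] - cam_az))
        if abs(best[0] - cam_az) <= AZIMUTH_GATE_RAD:
            fused.append({"label": label,
                          "azimuth_rad": cam_az,
                          "range_m": best[1],
                          "range_rate_mps": best[2]})
    return fused

# Example: one pedestrian seen by the camera, one radar return nearby in bearing
print(fuse([(0.01, "pedestrian")], [(0.02, 35.0, -4.0)]))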

Bosch is using its sensor data fusion technology to develop a Predictive Emergency Brake application that will automatically apply the brakes and minimize the severity of the consequences if it recognizes that the driver is failing to react to an impending collision.
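A hedged sketch of the kind of time-to-collision test such a decision could rest on appears below; the 1.5-second intervention threshold is an assumed value, not a Bosch calibration:

# Illustrative time-to-collision (TTC) check.
BRAKE_TTC_S = 1.5   # assumed: intervene only when a collision is this imminent

def should_auto_brake(range_m, closing_speed_mps, driver_braking, driver_steering):
    """range_m and closing_speed_mps come from the fused radar/video data."""
    if closing_speed_mps <= 0.0:
        return False                       # gap is constant or opening: no threat
    ttc_s = range_m / closing_speed_mps
    driver_reacting = driver_braking or driver_steering
    return ttc_s < BRAKE_TTC_S and not driver_reacting

print(should_auto_brake(12.0, 10.0, driver_braking=False, driver_steering=False))  # True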

Delphi is using sensor fusion in a collision mitigation system that combines cameras and long-range radar for applications including lane-departure warning, collision warning with auto brake, driver alert control, and adaptive cruise control.

Omron Automotive Electronics has fitted a vehicle with a sensor fusion demonstration that combines Omron's lidar sensor and high-dynamic-range camera to facilitate full-speed-range adaptive cruise control, lane-departure warning, and heading control systems. The heading control system, integrated with electric power steering, provides a gentle tug on the wheel in the proper direction when the vehicle drifts over a lane marker.
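Omron has not described its control law, but the "gentle tug" idea can be sketched as a small, capped torque request to the electric power steering, proportional to how far the car has drifted over the marker; the gain, cap, and interfaces below are assumptions:

MAX_TUG_NM = 1.0         # assumed cap, low enough for the driver to override easily
GAIN_NM_PER_M = 4.0      # assumed proportional gain

def heading_assist_torque(encroachment_m, turn_signal_on):
    """encroachment_m: how far the vehicle has crossed the lane marker (0 if inside lane).
    Returns the magnitude of the corrective torque; direction is back toward lane center."""
    if turn_signal_on or encroachment_m <= 0.0:
        return 0.0                          # intentional lane change, or still in lane
    return min(MAX_TUG_NM, GAIN_NM_PER_M * encroachment_m)

print(heading_assist_torque(0.10, turn_signal_on=False))   # 0.4 N-m nudge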

Hella KGaA Hueck & Company markets a rear-view CMOS color camera with a 130-degree aperture angle, plans to launch an ultrasonic park-assist application in Europe this year, and is readying other camera-based systems for launch in 2009.

Hella's active driver-assistance systems are based on camera and ultrasonic technology as well as lidar and 24 GHz radar. The Chrysler 300 offers Hella's lidar-based adaptive cruise control technology while Audi's Q7 SUV contains Hella's radar-based lane-change assistant system, also known as the Audi Side Assist. The lane-change assistant relies on two 24 GHz radar sensors integrated into the vehicle's bumper.

Siemens VDO, now part of Continental, has developed an Intelligent Passive and Active Safety (IPAS) system that networks driver-assistance systems with active and passive safety systems. The IPAS platform includes a “LiCam” sensor, which combines a CMOS camera and a lidar sensor to provide environmental and traffic information. The lidar sensor delivers data on vehicles in front, while the CMOS camera monitors lane markings in varying light conditions and provides information for traffic sign recognition and high-beam assist. Fusing the lidar and CMOS sensor data results in optimum lane attribution and object recognition for full-speed-range adaptive cruise control functionality, according to Dean McConnell, director of Occupant Safety & Driver Assistance Systems at Continental NA.
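One common way to turn a lidar-measured lead-vehicle range into a full-speed-range ACC command is a constant-time-gap following law; the sketch below illustrates the idea with assumed gains and limits, not Continental's calibration:

TIME_GAP_S = 1.8     # assumed desired headway
MIN_GAP_M = 2.0      # assumed standstill gap
K_GAP = 0.3          # assumed gap-error gain
K_SPEED = 0.5        # assumed speed-error gain

def acc_accel_cmd(own_speed_mps, set_speed_mps, lead_range_m=None, lead_range_rate_mps=0.0):
    """Returns a bounded longitudinal acceleration command in m/s^2."""
    a_cruise = K_SPEED * (set_speed_mps - own_speed_mps)
    if lead_range_m is not None:
        desired_gap = MIN_GAP_M + TIME_GAP_S * own_speed_mps
        a_follow = K_GAP * (lead_range_m - desired_gap) + K_SPEED * lead_range_rate_mps
        a_cmd = min(a_cruise, a_follow)     # never accelerate harder than cruising alone allows
    else:
        a_cmd = a_cruise
    return max(-3.5, min(1.5, a_cmd))       # assumed comfort and braking limits

# Example: closing on a slower lead vehicle commands a firm but bounded deceleration
print(acc_accel_cmd(own_speed_mps=25.0, set_speed_mps=30.0,
                    lead_range_m=30.0, lead_range_rate_mps=-2.0))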

Significant vision technology development is occurring at the chip level. Texas Instruments has numerous design wins for vision applications, according to automotive marketing and business manager Brooke Williams. Digital-signal-processor-based devices employing TI's DaVinci technology include the TMS320DM642 series and the newer TMS320DM643x. DM642 devices are deployed in rear-view and bird's-eye-view applications that use multiple cameras, as well as in night-vision applications with data feeds from infrared sensors.

Pin-compatible devices in the DM643x series (Figure 3), based on TI's C64 DSP core, offer clock speeds from 300 MHz to 500 MHz and a variety of feature sets, enabling automakers to deploy vision applications at different price points. “An OEM could offer a single function, such as lane-departure warning, at the low end of their line, and then add software and provide lane-departure warning, traffic sign recognition and high-beam dimming on a higher-priced model,” Williams suggested.

“The vision market is very dynamic, and each customer has a preferred way of developing algorithms,” he noted. “We want to give customers the ability to program in whatever way best allows them to differentiate their solutions and add value.” The DM643x provides hardware accelerators in a video-processing subsystem that takes image data directly from CMOS sensors and handles it without involving the DSP core.

Also noting that vision system developers have varying needs, Sensata offers image sensor chips, imaging modules, video cameras, and a vision system platform. Sensata's vision global marketing manager, Greg Noelte, noted a trend toward multiple applications served by a single camera. “That demands higher-resolution imagers with a wide dynamic range, and faster digital signal processors,” he said. Sensata's IM103 imager offers a 120 dB dynamic range. The firm's next-generation imager, Avocet, will feature a 150 dB dynamic range and 60-fps operation. Technology developed by SMaL, which Sensata acquired last March, enables vision systems to function in a wide range of lighting conditions, from dimly lit garages to bright sunlight or glare from oncoming headlights. Sensata also offers an evaluation platform called RapidView that includes a TI DM6437 DaVinci DSP, a Sensata Avocet or Micron MT9V022 image sensor, and Sensata's Vision System Support Library. The platform provides video input/output capabilities, multiple automotive communications protocols and interfaces, and onboard memory.
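For context on the dynamic-range figures quoted in this article: sensor dynamic range in decibels is 20 log10 of the ratio between the brightest and darkest signals the imager can resolve in one frame, so each additional 20 dB buys another factor of ten in scene contrast:

def db_to_ratio(db):
    # dynamic range expressed as a brightest:darkest signal ratio
    return 10 ** (db / 20.0)

for db in (120, 130, 150):        # IM103, VL5510, and Avocet figures cited in the article
    print(f"{db} dB ~ {db_to_ratio(db):,.0f}:1")
# 120 dB ~ 1,000,000:1
# 130 dB ~ 3,162,278:1
# 150 dB ~ 31,622,777:1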

NEC Electronics' IMAPCAR image processor, developed with help from Toyota and Denso, can integrate data from multiple sensors and/or different types of sensor elements, according to Jens Eltze, engineering manager for NEC Electronics America's Automotive Strategic Business Unit.

Fabricated on a 0.13 µm process and drawing less than 2 W of power, the IMAPCAR runs at up to 100 GOPS (billion operations per second), using 128 parallel processing elements, each with its own random-access memory, while relying on software for image-recognition processing. The chip has sufficient power to run multiple algorithms simultaneously, such as lane-departure warning and street-sign recognition. Communications tasks are delegated to a small companion microcontroller.

For day as well as night vision applications, STMicro is sampling the VL5510, a high-dynamic-range (130 dB) camera with a 2:1 aspect ratio for a wider field of vision than the more common 4:3 ratio. The camera offers 1024 x 512 5.6-micron pixels and can see up to 950 nm, beyond the range of the human eye. “A driver in a tunnel facing headlights will be able to see when they come out, and see what is on the other side of the tunnel,” said Martin Duncan, strategic marketing manager for ST's automotive product group.

STMicro is also starting work on SiGe radar as an alternative to far more costly GaAs radar. “Radar and camera systems are complementary,” Duncan said. “Cameras are good for identifying objects at distances up to 50 meters. A 77 GHz radar system can look out to 200 meters and do so in all weather conditions, but can't identify what's out there. Putting the two together (sensor fusion) is the way things are going at the high end.”

Duncan said that ST is continuing its collaboration with Mobileye on the EyeQ image processor. The EyeQ1 is in production, available on some Cadillac and Buick models, as well as on the Volvo XC90, V70, S80, and XC70 and the BMW 5 Series. The firms already have design wins for the EyeQ2, which will sample at the end of February and be released for production around mid-2009.

The EyeQ2 architecture consists of two 64-bit, floating-point, hyper-threaded MIPS34K RISC CPUs; five vision-computing engines (VCEs); three vector microcode processors (VMPs); a 64-bit Denali mobile double-data-rate (DDR) controller; a 128-bit Sonics interconnect; dual 16-bit video input and 18-bit video output controllers; 16 direct memory access (DMA) channels; and peripherals including dual CAN controllers, dual UART interfaces, and an I2C interface.

Engineers at OmniVision Technologies have developed proprietary process-level enhancements that extend the spectral sensitivity of the firm's near-infrared (NIR) sensors to 1050 nm. Inayat Khajasha, senior marketing manager, worldwide automotive, said the NIR sensors allow automotive cameras to see beyond and outside the range of a vehicle's headlights, and to perform object detection in complete darkness.

OmniVision's OV10620 system-on-chip, a high-dynamic-range color sensor, allows cameras to recognize the color of traffic signs, traffic lights, and road markings. An on-chip algorithm-processing pipeline switches the sensor into HDR mode to handle extreme variations between bright and dark areas within the same scene, and automatically switches it back to non-HDR mode when conditions return to normal.
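OmniVision has not published the switching algorithm, but a simple scene-contrast test with hysteresis, sketched below with assumed thresholds, captures the behavior described:

ENTER_HDR_RATIO = 1000.0   # assumed contrast that forces HDR mode
EXIT_HDR_RATIO = 300.0     # assumed lower threshold for switching back (hysteresis)

def next_mode(current_mode, brightest, darkest):
    """brightest/darkest: per-frame luminance statistics from the sensor pipeline."""
    contrast = brightest / max(darkest, 1e-6)
    if current_mode == "linear" and contrast > ENTER_HDR_RATIO:
        return "hdr"
    if current_mode == "hdr" and contrast < EXIT_HDR_RATIO:
        return "linear"
    return current_mode

print(next_mode("linear", brightest=50_000.0, darkest=10.0))   # 'hdr' (tunnel exit in sun)
print(next_mode("hdr", brightest=2_000.0, darkest=10.0))       # 'linear' (scene back to normal)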

Melexis is working on its third generation of automotive-grade CMOS camera circuits for visible and near-infrared light. It currently offers the CIF (352 x 288) MLX75006 and the PVGA (750 x 400) MLX75007 (Figure 4). Both offer programmable pixel response to achieve various dynamic range levels, and both feature an overmolded plastic package with an optional integrated glass lens stack designed to simplify assembly and to protect the chip and bond wires against scratches and light.

“The automotive vision market is absolutely exploding,” said STMicro's Duncan. He added, however, that automakers in North America, Europe, and Japan differ in their focus. “There is a lot of interest in rear-view systems in Japan,” he said, “whereas in Europe and North America, interest is greater in front-view applications. Lane-departure warning and headlamp control systems are popular in North America. Lane-departure warning systems are dominant in Europe, but there is also a lot of interest there in traffic sign recognition and collision avoidance, with links to adaptive cruise control.” Duncan predicts that a geographic convergence in vision applications will occur in 2013-14.

ABOUT THE AUTHOR

John Day writes regularly about automotive electronics and other technology topics. He holds a BA degree in liberal arts from Northeastern University and an MA in journalism from Penn State. He is based in Michigan and can be reached by e-mail at [email protected].

COMPANY MENTIONS

Bosch www.bosch.com
Buick www.buick.com
Continental www.conti-online.com
Delphi www.delphi.com
Hella www.hella.com
Melexis www.melexis.com
Mercedes www.mercedes.com
Micron www.micron.com
Mobileye www.mobileye.com
NEC www.necel.com
OmniVision www.omnivision.com
Omron www.omron.com
Saab www.saab.com
Sensata www.sensata.com
STMicroelectronics www.st.com
Texas Instruments www.ti.com
Volvo www.volvo.com
