Electronic Design
    Vyacheslav Kurmashov | Dreamstime.com

    SWIR Sensor Tech Promises Enhanced Driver Visibility and Safety

    Aug. 24, 2020
    With the ability to “see beyond the visible,” a mass-producible CMOS-based short-wave infrared sensing solution brings a higher level of ADAS capability to vehicles.
    Ziv Livne

    While autonomous vehicles, or self-driving cars, are still far from being available to the average driver, most cars on the market now offer advanced driver-assistance systems (ADAS). These semi-autonomous features recognize driving threats and react accordingly, drastically reducing traffic fatalities and enhancing vehicle reliability and safety. ADAS not only help vehicles avoid hazards outside of human control, but they also help those behind the wheel become better drivers so that they can learn to recognize danger and avert imminent threats on their own.

    ADAS are able to diminish human error by notifying the driver of risks and responding to those hazards efficiently and effectively. But just like drivers, the cameras on cars can only react to what they can see. Consequently, in the absence of “super” vision that can deliver crucial image data, the safety and reliability of these systems are frequently compromised.

     What’s the Low-Visibility Challenge?

Today, ADAS aren’t functional in many common adverse weather conditions or in low light. Although only 25% of travel occurs at night, nearly 49% of accidents do, meaning drivers are roughly twice as likely to crash when driving after dark. A major obstacle in building a dependable ADAS has been integrating a sensor that can see in low-visibility scenarios such as fog, haze, dust, glare, rain, and darkness. As a result, driver-assistance systems lack critical information about the immediate environment that’s required to make smart, safe decisions.
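The nighttime-risk figure above can be sanity-checked with simple arithmetic: comparing the share of accidents occurring at night with the share of travel occurring at night gives the over-representation factor behind the “nearly twice as likely” claim.

```python
# Sanity check of the article's nighttime-risk claim.
night_travel_share = 0.25    # 25% of travel occurs at night
night_accident_share = 0.49  # 49% of accidents occur at night

# If risk were uniform across day and night, nighttime would account
# for 25% of accidents. The over-representation factor is the
# "nearly twice as likely" figure cited in the article:
overrepresentation = night_accident_share / night_travel_share
print(f"Night accidents are ~{overrepresentation:.2f}x what uniform risk predicts")
```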

Moreover, even under optimal visibility conditions, ADAS hazard detection remains deficient and poses a serious risk for drivers who depend on the technology’s accuracy. For example, today’s sensing technologies struggle to detect pedestrians wearing dark clothes or dark-furred animals crossing the road.

A recent AAA study tested pedestrian-detection automatic emergency-braking systems and found that none could detect an adult pedestrian crossing in front of a vehicle at night. This deficiency leaves room for many avoidable accidents. For instance, more than 1.3 million deer-related accidents occur in the United States every year. In addition, invisible hazards such as oil slicks and black ice can’t be detected from a safe distance. The need for a sensor that can recognize hazards nearly impossible to detect with current ADAS technology is therefore urgent.

    Seeing Beyond the Visible

ADAS found in most new vehicles today rely mainly on cameras to “see,” supported by radar. Autonomous-vehicle systems also add light detection and ranging (LiDAR), which currently comes at an extremely high cost.

It’s undeniable that the ADAS technology on the market today has reduced the likelihood of car accidents. Until now, however, the industry’s consensus sensor-fusion stack hasn’t offered a consistently reliable solution. In low-visibility conditions, most of today’s systems fall back on radar, whose low resolution and high false-positive rate make it unreliable on its own. The ADAS sensor-fusion suite still lacks an affordable sensor modality that can provide high resolution in common low-visibility conditions for the automotive market.

An example of how TriEye's SWIR camera works in low-visibility conditions.

Fortunately, a viable solution to the low-visibility problem exists. Short-wave infrared (SWIR) sensing has the potential to enhance driver capabilities and enable precise hazard detection. In contrast to light in the visible spectrum, SWIR wavelengths are scattered significantly less by atmospheric particles such as fog, haze, and dust, because scattering falls off sharply as wavelength increases. Existing SWIR cameras are based on an exotic compound semiconductor, indium gallium arsenide (InGaAs). They’re used in industries such as defense and aerospace, but until now they haven’t found their way into mass-market applications due to high cost and long lead times.
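As a rough illustration of why longer wavelengths scatter less, Rayleigh scattering intensity scales as 1/λ⁴. The wavelengths below are my own illustrative assumptions (≈550 nm for mid-visible light, ≈1550 nm for a common SWIR band), not figures from the article, and real fog droplets sit in the Mie regime where the advantage is smaller; the sketch only shows the trend.

```python
# Illustrative only: Rayleigh scattering intensity scales as 1/lambda^4.
# Wavelengths are assumptions, not figures from the article:
# ~550 nm mid-visible light, ~1550 nm for a common SWIR band.
VISIBLE_NM = 550
SWIR_NM = 1550

# How much more strongly visible light is scattered than SWIR light,
# under the idealized Rayleigh (particles << wavelength) assumption:
scattering_ratio = (SWIR_NM / VISIBLE_NM) ** 4
print(f"Visible light is scattered ~{scattering_ratio:.0f}x more than SWIR")
# Fog droplets are larger (Mie regime), so the practical gap is smaller,
# but the trend -- longer wavelength, less scatter -- still holds.
```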

    Based on almost a decade of academic research, TriEye was able to fabricate the industry’s first CMOS-based SWIR sensing solution that can be mass-produced. As a result, SWIR, which is able to “see beyond the visible,” can be applied to ADAS to help cars perceive what standard visible cameras are unable to see. It also allows for the detection of obscured and unseen objects at longer ranges so that the ADAS can alert the driver and react to hazards before it’s too late. The combination of SWIR's vision capabilities and the manufacturability of CMOS holds the promise of a considerably safer driver experience in the near future.

    Ziv Livne is the VP of Product and Business Development at TriEye.
