Road Hazards: If You Can’t See Them, You Can’t Detect Them

February 19, 2020
[Image: driver's point of view in low-visibility conditions with glare]

While autonomous vehicles, or “self-driving cars,” are still far from being available to the average driver, most cars on the market offer semi-autonomous features that significantly enhance vehicle reliability and safety.

Vehicles with these Advanced Driver Assistance Systems (ADAS) offer features such as lane-marking detection, pedestrian warning, and emergency braking, all of which help drivers make better, smarter decisions. ADAS applications have proven their value and have helped reduce road fatalities globally. As of 2007, 94% of driving collisions in the United States were attributed to human error; since then, the development and adoption of ADAS has reduced the room for human error, markedly improving driver safety.

However, as valuable as ADAS technology has proven to be at promoting safe driving and reducing the risk of car accidents, there is still considerable room for improvement. Because ADAS algorithms depend heavily on visible-spectrum camera data, they perform poorly in low-visibility scenarios such as adverse weather and nighttime driving, which are precisely the scenarios in which human drivers are most prone to error. Unfortunately, the sensor systems currently on the market fail to address hazards that are hard to see, and so they often fail when they are needed most.


The Magnitude of the Low Visibility Challenge

Recent research from AAA reveals that automatic emergency braking systems proved completely ineffective at night. According to the National Highway Traffic Safety Administration (NHTSA), although only 25% of travel in the United States occurs during hours of darkness, nearly 49% of accidents occur during that time. In other words, a driver is nearly twice as likely to crash when driving at night due to decreased visibility. Pedestrians are also at greater risk after dark, with 75% of pedestrian fatalities occurring at night. Environmental factors such as slick roads, glare, fog, rain, and other weather-related conditions cause additional crashes.
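The "nearly twice as likely" figure follows directly from the two percentages quoted above: night driving accounts for 49% of crashes but only 25% of travel, so night miles are over-represented in crashes by roughly a factor of two. A quick back-of-the-envelope check, using only the numbers cited here:

```python
# Shares quoted from the NHTSA figures above.
night_travel_share = 0.25   # fraction of U.S. travel during darkness
night_crash_share = 0.49    # fraction of crashes during darkness

# How over-represented is each kind of driving in the crash statistics?
night_factor = night_crash_share / night_travel_share
day_factor = (1 - night_crash_share) / (1 - night_travel_share)

print(f"night driving: {night_factor:.2f}x over-represented in crashes")
print(f"day driving:   {day_factor:.2f}x over-represented in crashes")
# night_factor ≈ 1.96, i.e. "nearly twice as likely"
```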

In addition to performing poorly in adverse weather and low-light conditions, existing sensor-fusion solutions fall short in accurate object detection. Even under optimal visibility, their detection of pedestrians wearing dark clothes or of dark-furred animals on the road is severely deficient and poses a serious danger. This deficiency contributes to the more than 1.3 million deer-related accidents that occur in the United States every year.

Furthermore, while standard visible cameras can differentiate between the shape of a person and that of an animal, they cannot always distinguish a real person from a picture of a person. This flaw leaves object-detection algorithms vulnerable to detecting "phantom" objects.


Our Approach

We at TriEye have developed a viable solution to the low-visibility challenge. Short-Wave Infrared (SWIR) sensing has the potential to enhance driver capabilities and enable precise hazard detection. In contrast to cameras operating in the visible spectrum, a SWIR camera works at longer wavelengths, which are scattered far less by the atmosphere, so the light reaching the sensor forms a significantly clearer image. Existing SWIR cameras are based on an exotic compound, Indium Gallium Arsenide (InGaAs), and are currently used in industries such as defense and aerospace to solve the low-visibility challenge. Until now, SWIR had not been used in mass-market applications due to its high cost and long lead times.
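The "longer wavelengths scatter less" claim can be illustrated with the Rayleigh approximation, under which scattering intensity falls off as the fourth power of wavelength. The wavelengths below are illustrative choices on my part (550 nm for mid-visible light, 1550 nm for a wavelength commonly used in SWIR imaging); real fog droplets are often larger than these wavelengths, where Mie scattering applies and the advantage shrinks, so treat this as a bounding sketch rather than a fog model:

```python
# Rayleigh scattering intensity scales as wavelength**-4 (valid for
# scatterers much smaller than the wavelength, e.g. fine haze).
visible_nm = 550.0   # illustrative mid-visible wavelength
swir_nm = 1550.0     # illustrative SWIR wavelength

ratio = (swir_nm / visible_nm) ** 4
print(f"under the Rayleigh approximation, visible light is scattered "
      f"~{ratio:.0f}x more strongly than SWIR")
# ratio ≈ 63
```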

Based on almost a decade of academic research, TriEye succeeded in fabricating the industry's first Complementary Metal-Oxide Semiconductor (CMOS) based SWIR sensing solution that can be mass-produced. Thus SWIR, which is able to "see beyond the visible," can be applied to ADAS to help cars perceive what standard visible cameras cannot and to provide an accurate depiction of the environment the car is driving in. It also enables the detection of obscured and unseen objects at longer range, so ADAS can alert the driver and react to hazards before it is too late. TriEye's combination of SWIR vision capabilities and the manufacturability of CMOS-based sensors promises a bright picture ahead for driver safety.


Final Thoughts

ADAS needs better sensors in order to help drivers identify both visible and invisible hazards under all weather and light conditions. By reimagining this technology, these advanced systems will be able to better "see" the world in front of and around them. TriEye's SWIR camera leverages technology that was not previously available at such a low cost, turning invisible road hazards visible.
