Awareness of advanced driver assistance systems (ADAS), which consist of various sensors that help a vehicle detect and respond to objects, is rising.
The vision element in ADAS serves as the eye of the system and has various applications. The camera, either visual or thermal, is the most important component of the ADAS vision system. “There are four sides in a vehicle where cameras can be installed for ADAS applications — that is, front, rear, and on both the sides of the vehicle,” said a recent blog post by eInfochips. “Front cameras are deployed in front of the vehicle behind the rearview mirror, rear cameras are mounted near the vehicle number plate, and side cameras are mounted near the side mirrors.”
The tasks that the cameras perform are as follows:
According to the post, front cameras in a vision-based ADAS are monocular cameras, which run algorithms for tasks such as forward collision warning, traffic-sign recognition, pedestrian recognition, lane-departure warning, and vehicle detection.
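Forward collision warning, one of the front-camera tasks listed above, typically reduces to a time-to-collision check. The sketch below is illustrative only, assuming the camera pipeline already estimates the distance to the lead vehicle and the closing speed; the 2.5-second threshold is a common rule-of-thumb value, not one drawn from the posts quoted here.

```python
def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact; infinite if the gap is not closing."""
    if closing_speed_mps <= 0:
        return float("inf")
    return distance_m / closing_speed_mps


def forward_collision_warning(distance_m: float,
                              closing_speed_mps: float,
                              threshold_s: float = 2.5) -> bool:
    """Warn when time to collision drops below the threshold."""
    return time_to_collision(distance_m, closing_speed_mps) < threshold_s


# A lead vehicle 20 m ahead, closing at 10 m/s, gives a 2 s TTC and a warning.
```

In a real system the distance and speed estimates would come from the monocular camera's object-detection and tracking stages, with their own uncertainty handling.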
As for rearview cameras, they are usually used for parking assistance and object detection, the eInfochips post said. “In some vehicles, parking assistance is supported by four fish eyes cameras or wide-angle cameras, which provide a bird’s eye view to assist in parking.”
Driver monitoring systems:
A dashcam mounted in a vehicle’s cabin supports the driver monitoring system for ADAS. This is assisted by advanced facial-analysis algorithms that track the driver’s eye gaze, head pose, and mouth status, the eInfochips post said. “[The] Driver monitoring system in ADAS recognizes and authenticates [a] driver, detects drowsiness and distraction and provides real-time alerts to reduce possibilities of on-road accidents.”
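The drowsiness detection described above can be sketched at its simplest as a check on eye openness over consecutive frames. This is a minimal illustration, not eInfochips’ implementation: the eye-openness values are assumed to come from an upstream facial-landmark stage, and the threshold and frame count are invented for the example.

```python
def detect_drowsiness(eye_openness: list[float],
                      threshold: float = 0.2,
                      min_frames: int = 3) -> bool:
    """Flag drowsiness when eyes stay nearly closed for min_frames in a row."""
    consecutive_closed = 0
    for openness in eye_openness:
        consecutive_closed = consecutive_closed + 1 if openness < threshold else 0
        if consecutive_closed >= min_frames:
            return True
    return False


# Three consecutive near-closed frames trigger the flag; blinks do not.
```

Production systems combine several such cues (gaze, head pose, mouth status) rather than relying on a single signal.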
Surround view system:
Surround view systems in vehicles help with left- and right-turn awareness, blind-spot detection, lane-change assistance, obstacle and pedestrian detection to the sides of the vehicle, and top-view parking assistance, the eInfochips post said.
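Of the surround-view tasks above, blind-spot detection is the easiest to sketch: once the side cameras report object positions in vehicle coordinates, the check is a zone test. Everything here is an assumption for illustration, including the coordinate convention (x forward, y lateral, in metres) and the zone bounds.

```python
def in_blind_spot(x_m: float, y_m: float,
                  x_rear: float = -4.0, x_front: float = 1.0,
                  y_min: float = 1.0, y_max: float = 3.5) -> bool:
    """True if an object sits in the rectangular blind-spot zone
    alongside the vehicle, on either side (abs(y) covers left and right)."""
    return x_rear <= x_m <= x_front and y_min <= abs(y_m) <= y_max


# A car 2 m behind and 2 m to the left is in the blind spot;
# one 10 m ahead is not.
```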
Hardware aside, the importance of software and algorithms can’t be ignored. On the subject of pedestrian detection, a blog post by RSIP Vision maintains that one of the most challenging tasks for an ADAS operating in urban or rural environments is detecting pedestrians. “As human behavior can sometimes be unpredictable, ADAS systems can be programmed to track pedestrians and predict with high levels of accuracy their orientation and intentions: human lives are at stake.”
As for lane-departure warning, the RSIP Vision post said lane lines must be clearly detected, and that an ADAS vision system should know with certainty when a car is approaching one of them, when it is crossing it, and when it is driving in a different lane from the one it started in.
“The system needs to be perfectly calibrated in order never to lose contact with the ideal straight lines delimitating both sides of the lane, regardless of weather or traffic conditions. There is no need for more than one camera, in order to make this camera calibration system robust enough,” it said. “The algorithm takes into account the perspective given by the incoming images and exploits the properties of vanishing points, computed by identifying the two parallel lines. Knowing at every single moment the width of the lane leaves the developer with a relatively simple geometric problem to resolve.”
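The vanishing-point computation the post mentions boils down to intersecting the two detected lane lines in the image: lines that are parallel on the road converge at a single image point under perspective. A minimal sketch, assuming the lane-detection stage has already fitted each line as a slope/intercept pair in pixel coordinates (the example coefficients are invented):

```python
def vanishing_point(line1: tuple[float, float],
                    line2: tuple[float, float]) -> tuple[float, float]:
    """Intersect two image lines given as (slope, intercept) pairs,
    returning the (x, y) vanishing point in pixel coordinates."""
    m1, b1 = line1
    m2, b2 = line2
    if m1 == m2:
        raise ValueError("Lines are parallel in the image; no vanishing point.")
    x = (b2 - b1) / (m1 - m2)
    return (x, m1 * x + b1)


# Two lane lines with opposite slopes in a 640-pixel-wide frame
# meet at the image centre.
```

Once the vanishing point and the lane width are known at every frame, locating the vehicle within the lane becomes the “relatively simple geometric problem” the post refers to.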
While vision plays an important role in ADAS, other sensors should not be ignored, RSIP Vision concluded. The post from eInfochips supported this, saying that “the future of ADAS lies in more advanced and sophisticated ADAS systems supported by LIDAR and RADAR sensors, in addition to vision-based ADAS, to provide many more applications for assisted and automated driving.”