Connectivity and autonomous cars drive the future

The automotive world is now experiencing major disruptions. Innovative technologies are coming together in a way that has the potential to change how we interact with our cars. Car companies are experimenting with autonomous cars. Thanks to the combination of sensors and connectivity, cars will be constantly connected to the web, transmitting and receiving telematics and effectively becoming an Internet of Things (IoT) sensor.

Advanced driver assistance systems (ADAS) are systems created to help drivers with the driving process. They are meant to increase car safety and, more generally, road safety. These come equipped with semi-autonomous features such as lane keeping, adaptive cruise control, emergency braking, parking assist, blind spot detection and warning alerts. The driver remains in control and allows the vehicle to assist at his or her discretion.
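
For a rough sense of how one of these assist features works under the hood, the sketch below implements a simplified forward-collision check based on time-to-collision. The sensor fields, thresholds and function names are illustrative assumptions, not taken from any particular production system.

```python
# Simplified forward-collision warning / emergency-braking logic.
# All thresholds and sensor values are illustrative, not production-calibrated.

from dataclasses import dataclass

@dataclass
class RadarTarget:
    distance_m: float         # gap to the lead vehicle
    closing_speed_mps: float  # positive when we are approaching it

def time_to_collision(target: RadarTarget) -> float:
    """Return seconds until impact if nothing changes (inf if not closing)."""
    if target.closing_speed_mps <= 0:
        return float("inf")
    return target.distance_m / target.closing_speed_mps

def assistance_action(target: RadarTarget) -> str:
    """Map time-to-collision to a driver-assist response."""
    ttc = time_to_collision(target)
    if ttc < 1.0:
        return "EMERGENCY_BRAKE"   # intervene autonomously
    if ttc < 2.5:
        return "WARN_DRIVER"       # audible / visual alert
    return "NO_ACTION"

if __name__ == "__main__":
    # Closing at 10 m/s on a car 18 m ahead -> 1.8 s to collision -> warn.
    print(assistance_action(RadarTarget(distance_m=18.0, closing_speed_mps=10.0)))
```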

According to research and advisory firm Frost & Sullivan, ADAS/automation is the fastest growing segment in the European and U.S. automotive markets. It is driven by consumer safety ratings like the New Car Assessment Program (NCAP) in markets such as Europe, the U.S., Japan and China, as well as by insurance companies and legislation that require new vehicles to have features like autonomous emergency braking.

By 2028, Frost & Sullivan expects 6.2 million vehicles to have automated features, the majority of them semi-automated and the rest highly or even fully automated. Although these features improve safety, the existing technology holds much more potential.

Car sensor bottlenecks
The sensors accompanying ADAS should be coupled with programming that will allow the vehicle to react appropriately. “The biggest challenge is to develop algorithms so that the autonomous car will be able to handle complex situations and take the correct action to navigate on the roadways,” said Elizabeth Beasley, Marketing Coordinator for Velodyne LiDAR.

The act of driving requires a person to assess situations, evaluate different factors and make correct decisions on how to proceed, usually within an incredibly short period of time. Because human reaction times are limited, complex situations involving more than one vehicle, or another road user who is not paying attention, often end in accidents.
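
To put those reaction times in perspective, the back-of-the-envelope calculation below shows how much distance a human driver gives up before braking even begins. The reaction time and deceleration figures are typical textbook values, used here only for illustration.

```python
# Illustrative stopping-distance arithmetic; the reaction time and
# deceleration figures are typical textbook values, not measured data.

def stopping_distances_m(speed_kmh: float, reaction_s: float, decel_mps2: float) -> tuple[float, float]:
    """Return (distance travelled during reaction, braking distance) in metres."""
    v = speed_kmh / 3.6                 # convert km/h to m/s
    reaction_distance = v * reaction_s  # the car keeps moving while the human reacts
    braking_distance = v ** 2 / (2 * decel_mps2)
    return reaction_distance, braking_distance

if __name__ == "__main__":
    reaction, braking = stopping_distances_m(speed_kmh=100, reaction_s=1.5, decel_mps2=7.0)
    # Roughly 42 m lost to reaction alone versus about 55 m of actual braking at 100 km/h.
    print(f"reaction: {reaction:.0f} m, braking: {braking:.0f} m")
```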

One of the goals of autonomous cars, then, is to remove the human element, allow a computer to process a scene, and immediately take action to avoid a collision. However, while autonomous vehicles can detect surrounding cars, people and other objects, the technology does not yet have the sophistication required to interpret the thousands of common-sense and social cues that the average human driver uses to predict other people’s behavior.

“Hand signals, eye contact or just knowing that someone who is looking down at his phone may absent-mindedly walk across the street are social behaviors that are currently beyond the ability of a computer to process, but all hope is not lost. Companies, such as Google, have already logged thousands of hours teaching their software how to interpret such behaviors and make adjustments accordingly,” Beasley added. This points to a very interesting phenomenon — the new automotive landscape is attracting non-traditional players, most notably companies from the computer industry like Google and NVIDIA.

Current technology focuses on vision processing modules or algorithms that combine object recognition data (from cameras) and obstacle detection data (from radars) to perform environment scans to support different applications. “In the future, these algorithms are expected to include more data from high-definition 3D maps custom built for autonomous driving, including lane geometry and road attribute data, along with Lidar and camera data, to accurately predict the environment and also help the vehicle plan its course for the next one mile. The big difference that companies like NVIDIA are trying to bring to the table is infusing these algorithms with deep learning or artificial intelligence to help these algorithms behave like human beings in object recognition and classification, and reacting to variables and unexpected scenarios on the road,” said Praveen Chandrasekar, Consulting Director for Mobility at Frost & Sullivan.
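
A highly simplified sketch of that kind of fusion step is shown below: it merely associates camera detections with radar returns by bearing and keeps the radar’s range measurement, whereas production stacks use probabilistic tracking, HD-map priors and learned models. The class names and thresholds here are assumptions for illustration only.

```python
# Toy camera/radar fusion: match each camera detection to the nearest radar
# return by bearing and attach the radar's range measurement.
# Real pipelines use probabilistic data association and tracking filters.

from dataclasses import dataclass
from typing import Optional

@dataclass
class CameraDetection:
    label: str          # e.g. "car", "pedestrian" from an object-recognition model
    bearing_deg: float  # angle off the vehicle's centreline

@dataclass
class RadarReturn:
    bearing_deg: float
    range_m: float

def fuse(detections: list[CameraDetection],
         returns: list[RadarReturn],
         max_gap_deg: float = 3.0) -> list[tuple[str, Optional[float]]]:
    """Pair each camera label with the range of the closest radar return in bearing."""
    fused = []
    for det in detections:
        best = min(returns,
                   key=lambda r: abs(r.bearing_deg - det.bearing_deg),
                   default=None)
        if best and abs(best.bearing_deg - det.bearing_deg) <= max_gap_deg:
            fused.append((det.label, best.range_m))
        else:
            fused.append((det.label, None))   # seen by the camera only
    return fused

if __name__ == "__main__":
    cams = [CameraDetection("car", 1.5), CameraDetection("pedestrian", -20.0)]
    radar = [RadarReturn(2.0, 35.0), RadarReturn(15.0, 60.0)]
    print(fuse(cams, radar))  # [('car', 35.0), ('pedestrian', None)]
```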

Data analytics and deep learning drive autonomous cars
Today, NVIDIA has moved beyond graphics into high-performance computing and artificial intelligence. “Our cards are capable of massive data analytics. We can do the same thing in a car to process the data coming from sensors,” said Danny Shapiro, Senior Director of Automotive at NVIDIA. NVIDIA introduced the “DRIVE PX” — a small form factor “supercomputer” for the car that can fuse data from 12 cameras as well as LIDAR, radar and ultrasonic sensors, allowing algorithms to accurately understand the full 360-degree environment around the car. Classical computer vision is being replaced with deep learning.

“The DRIVE PX has the computing power of about 150 MacBook Pros in one small box the size of a license plate. Today, every feature in the car is using a separate box, a separate module. This is expensive and complicated from an engineering point of view. We can have one box that centralizes everything and can be updated over the lifetime of the car. If you want to detect cars, you feed the system millions of pictures of cars, and it is much easier to train the system, experiment and enhance it,” said Shapiro. “Our partner Audi will be introducing technology that will be a precursor to our DRIVE PX, where it will be in charge of autopilot and autonomous features for navigation in traffic jams. It will allow owners to have the car driven autonomously on the highway, stay in the lane, maintain a safe distance, and accelerate and brake if needed. Volvo will also be using DRIVE PX 2 in its Drive Me program — piloting fully autonomous driving on specific routes in Sweden.”
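
The “feed it millions of pictures” workflow Shapiro describes is, at its core, supervised training of an image classifier. The minimal PyTorch sketch below stands in for that idea, using randomly generated tensors in place of a real labelled image set; the network size, labels and data are placeholders, not NVIDIA’s actual stack.

```python
# Minimal supervised-training sketch (PyTorch) standing in for the
# "show the system millions of labelled images" workflow described above.
# Random tensors replace a real dataset; the tiny network is illustrative only.

import torch
import torch.nn as nn

NUM_CLASSES = 2  # e.g. "car" vs. "not a car" (placeholder labels)

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(32, NUM_CLASSES),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(100):                           # a real system trains for far longer
    images = torch.randn(8, 3, 64, 64)            # stand-in for a batch of camera frames
    labels = torch.randint(0, NUM_CLASSES, (8,))  # stand-in for human annotations
    logits = model(images)
    loss = loss_fn(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    if step % 20 == 0:
        print(f"step {step}: loss {loss.item():.3f}")
```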

Vehicle-to-X communication connects cars
Another often-mentioned feature is vehicle-to-X (V2X) communication, which enables the vehicle to get information from other cars or from infrastructure beyond the reach of traditional sensors. “We are developing two different technologies: on one hand, communication via the backend, and on the other, vehicle-to-X communication (vehicle-to-vehicle and vehicle-to-infrastructure). V2X enhances forward-looking driving and the car is able to see around corners. The feature also enables different safety functions such as left turn assist, electronic brake light and roadworks assist, to name a few,” explained Christian Schumacher, Global Head of ADAS Customer Programs at Continental. The possibilities in vehicle-to-vehicle or vehicle-to-infrastructure communication are extremely promising.
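
For a rough feel of what vehicle-to-vehicle messaging involves, the sketch below broadcasts a simplified “basic safety message” over UDP. Real deployments use dedicated radio standards such as DSRC or C-V2X with standardized, cryptographically signed messages, so the transport, field names and port number here are purely illustrative assumptions.

```python
# Toy vehicle-to-vehicle broadcast: periodically send position/speed/heading
# as JSON over UDP. Real V2X uses DSRC or C-V2X radios with standardized,
# signed messages; everything below is a simplified stand-in.

import json
import socket
import time

BROADCAST_ADDR = ("255.255.255.255", 47001)   # illustrative port, not a V2X standard

def basic_safety_message(vehicle_id: str, lat: float, lon: float,
                         speed_mps: float, heading_deg: float) -> bytes:
    """Build a simplified status message other vehicles could consume."""
    return json.dumps({
        "id": vehicle_id,
        "lat": lat,
        "lon": lon,
        "speed_mps": speed_mps,
        "heading_deg": heading_deg,
        "timestamp": time.time(),
    }).encode()

def broadcast_loop(vehicle_id: str) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    for _ in range(3):  # real systems broadcast roughly ten times per second, continuously
        msg = basic_safety_message(vehicle_id, 48.137, 11.575, 22.0, 90.0)
        sock.sendto(msg, BROADCAST_ADDR)
        time.sleep(0.1)

if __name__ == "__main__":
    broadcast_loop("demo-vehicle-1")
```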

Apart from better road safety, the information can be used for traffic management, parking management, remote updating of the car’s firmware without the need to recall vehicles to a service center, and more. However, V2X adoption is still slow, and widespread deployment remains far off.

There are no unified communication standards to ensure interoperability between different brands and across different countries; there are cybersecurity threats that need to be addressed (how do we prevent cars from being hacked and hijacked?); and new business models must be developed to support this connectivity. For example, who will bear the cost of installing the roadside infrastructure? How will vehicles consume and pay for the bandwidth they use?

These new business models will change the way we drive and interact with our cars. Some predict a future in which “sharing economy” models reduce the number of cars on the road and push the utilization of the remaining vehicles to the maximum. Automotive giants Volkswagen and Toyota have recently invested in the ride-hailing apps Gett and Uber, respectively. It is possible that within our lifetime, we will be using fully autonomous shared vehicles for mobility inside cities. Even if we still own our vehicles, a new factor will be added to our selection criteria. “In the future, we will choose cars according to which company has better programmers — each company will choose its own features, and we will choose the brand according to the comfort and driving experience its software offers,” predicted Shapiro.

The future of autonomous vehicles
As in-vehicle software advances, the “software-defined vehicle” could offer an endless array of rich, powerful and personalized experiences for passengers and drivers alike. Passengers and drivers will be able to tune the car’s driving modes across a much wider range of handling characteristics and enjoy new forms of in-vehicle entertainment such as immersive, high-definition video conferencing or 4K video. “One other area we’re watching closely is the flying car, which would, of course, present the ultimate challenge and engagement for driving enthusiasts,” summarized Shaun Kirby, CTO of Rapid Prototyping at Cisco Systems.