Edge processing is trending in the age of AI

Edge processing is on track to see greater adoption in the coming years, and its application will be more prevalent in smart speakers, security cameras and robots, among other product categories, according to Tractica, a market intelligence firm that focuses on human interactions with technologies.
 
The AI edge device chipset market is expected to grow at a 68.5 percent compound annual growth rate (CAGR), from US$792.8 million in 2017 to US$51.6 billion by 2025, according to the “How Will 5G + AI Transform the Wireless Edge” whitepaper published by Tractica.
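As a quick sanity check on that projection, the compound-growth arithmetic can be reproduced in a few lines of Python. The figures are the ones quoted above; treating 2017 to 2025 as eight compounding periods is an assumption about how the CAGR is counted.

```python
# Rough sanity check of the quoted forecast (figures as cited above;
# treating 2017-2025 as 8 compounding periods is an assumption).
start_value = 792.8e6      # US$792.8 million in 2017
cagr = 0.685               # 68.5% compound annual growth rate
periods = 2025 - 2017      # 8 years of compounding

projected = start_value * (1 + cagr) ** periods
print(f"Projected 2025 market size: US${projected / 1e9:.1f} billion")
# Prints roughly 51.5, consistent with the US$51.6 billion figure
```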
 
Concerns about privacy, latency, bandwidth and costs are some of the drivers pushing the adoption of AI processing at the edge.
 
Privacy is a concern because training AI algorithms requires collecting user-generated voice, facial and location data. A 2018 survey by Janrain showed that 94% of U.S. consumers were concerned about their data, while 62% were considering closing their Facebook accounts.
 
Against this backdrop, many manufacturers are championing data privacy in their products and services, while “on-device AI processing for facial recognition is becoming the trend,” Tractica says.

Cloud's latency and bandwidth issues

It is worth noting that most AI services offered today are cloud-based, but this is changing. Network latency, which refers to the time it takes to transfer data between the cloud and the edge device, is becoming a concern because users expect real-time responses. For example, latency can degrade a voice assistant’s quality of service if answers always arrive after a long pause. Self-driving cars are another case in point. “Even a few milliseconds of delay can be the difference between life and death, which is why AI inference will happen on the vehicle,” Tractica says.
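To put that claim in perspective, a short Python sketch shows how far a vehicle travels while waiting for a cloud round trip; the speed and latency values are illustrative assumptions, not figures from the whitepaper.

```python
# Illustrative arithmetic: how far a vehicle travels while waiting for a
# cloud round trip. Speed and latency values are assumptions for scale only.
speed_kmh = 100                      # assumed highway speed
round_trips_ms = [20, 100, 250]      # assumed cloud round-trip latencies

speed_mps = speed_kmh * 1000 / 3600  # ~27.8 metres per second
for rtt in round_trips_ms:
    distance_m = speed_mps * rtt / 1000
    print(f"{rtt:>4} ms round trip -> vehicle travels {distance_m:.1f} m before a response")
```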
 
One might think that with its higher connection speeds, 5G could address the latency issue. The problem is that 5G will be deployed selectively, rather than everywhere. “Thus, on-device processing, which is always available, will be a requirement for AI processing in the car or other mission-critical computing tasks,” Tractica notes.
 
AI at the edge is also preferred because of its lower bandwidth requirements. Bandwidth refers to the volume of data transferred in a given time. Applications involving computer vision and facial, object and emotion recognition may require higher bandwidth, because the resolution at which images or videos are captured and transferred determines the bandwidth needed. “Even at a low resolution, the volume of data is phenomenal for an always-on use case,” the whitepaper points out.
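To make the bandwidth point concrete, here is a back-of-the-envelope sketch in Python; the resolution, frame rate and compression figures are illustrative assumptions rather than numbers from the whitepaper.

```python
# Back-of-the-envelope upstream bandwidth for an always-on camera that
# streams every frame to the cloud. All parameters are illustrative
# assumptions, not figures from the whitepaper.
width, height = 1280, 720       # frame resolution in pixels
bytes_per_pixel = 1.5           # e.g. YUV 4:2:0 before compression
fps = 15                        # frames per second
compression_ratio = 50          # rough H.264-style reduction factor

raw_bps = width * height * bytes_per_pixel * 8 * fps   # uncompressed bits/second
compressed_bps = raw_bps / compression_ratio
daily_gb = compressed_bps / 8 * 86400 / 1e9             # bytes uploaded per day

print(f"Uncompressed: {raw_bps / 1e6:.0f} Mbps")
print(f"Compressed:   {compressed_bps / 1e6:.1f} Mbps, ~{daily_gb:.0f} GB uploaded per day, per camera")
```

Even under these modest assumptions, a single camera pushes tens of gigabytes upstream every day, which is why always-on use cases favor processing at the edge.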

Edge's security and cost advantage

Cloud-based AI processing is vulnerable to security attacks, but edge-based processing can decrease security vulnerabilities, according to Tractica. “On-device AI can also detect anomalies in data communications and intrusion attacks more quickly through on-device data and personal behaviors compared to a cloud-based solution.”
 
Also, in the long run, edge-based processing is very likely to be cheaper than cloud-based processing. Two factors at play are edge hardware’s competitive pricing and the elimination of bandwidth costs for transferring data between edge devices and the cloud.
 
“In one example cited by Amazon Web Services (AWS) and NVIDIA, the cost of deploying an occupancy detection solution in a fleet of 20 buses using a solution with on-device processing is 1,000 times cheaper than deploying the same solution in the cloud,” the whitepaper says.
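The underlying trade-off can be sketched as a simple cost model. All of the numbers below are hypothetical placeholders meant to show the structure of the comparison, not the figures behind the AWS/NVIDIA example.

```python
# Simplified cost model comparing cloud and on-device processing for a
# small fleet. Every number is a hypothetical placeholder chosen to show
# the structure of the trade-off, not data from the AWS/NVIDIA example.
devices = 20                      # e.g. a fleet of 20 buses
months = 36                       # evaluation horizon

# Cloud: recurring data-transfer and inference charges per device.
gb_per_device_month = 500         # assumed upstream video volume
cost_per_gb = 0.09                # assumed transfer cost, US$/GB
cloud_inference_month = 40        # assumed monthly cloud inference cost, US$

cloud_total = devices * months * (gb_per_device_month * cost_per_gb + cloud_inference_month)

# Edge: one-off hardware cost per device, negligible upstream traffic.
edge_hw_per_device = 300          # assumed one-time cost of an edge AI module
edge_total = devices * edge_hw_per_device

print(f"Cloud processing over {months} months: ~US${cloud_total:,.0f}")
print(f"Edge processing (one-time hardware):  ~US${edge_total:,.0f}")
```

Under these placeholder assumptions the gap is about an order of magnitude; the 1,000-times multiplier quoted above depends entirely on the specifics of that deployment.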

Moving forward

“5G is transforming the wireless edge. It will connect trillions of devices, as opposed to the billions connected today,” Tractica says, adding that this will increase the intelligence and capabilities of those devices.
 
“The paradigm of centralized processing architectures with the cloud/servers as the primary hubs for AI models is giving way to a decentralized architecture where part or full AI processing will be performed at the edge device,” Tractica concludes. “Therefore, on-device AI processing will play a critical role in the evolution of AI and the internet going forward.”