The many applications of emotion recognition
Opsis, a developer of emotion recognition solutions, touts the technology as being able to help a range of industries, from retail to healthcare, achieve their security and business objectives.
This is an update to the original article published in 2017.
 
Human beings have various emotions, which machines can now recognize thanks to advanced algorithms. One developer of such algorithms, Opsis, touts emotion recognition as being able to help a range of industries, from retail to healthcare, achieve their security and business objectives.
 

What makes their solutions unique

 
According to Stefan Winkler, CEO and Co-Founder of Opsis, his company’s solution is unique in that it offers fine-grained estimations along two dimensions: valence (positive vs. negative emotions) and arousal (energetic vs. passive expressions). This allows the system to recognize a wider range of emotions than the seven basic categories – neutral, sad, happy, surprised, afraid, angry and disgusted – covered by competing solutions.
 
“Facial expressions and emotions are continuous entities with numerous variations. Consequently, they cannot be limited to only seven specific predefined cases. For example, existing systems can identify a happy and a surprised face. However, classification will not be as accurate for a transitional expression between two basic emotions (like a mixture of happy and surprised, or sad and disgusted). Since existing systems build explicit models for each of the seven specific emotions, they are not able to robustly identify combined expressions,” Winkler said. “We detect faces and track 49 facial feature points, which is much more accurate compared to our competitors. Our circumplex model handles many more expressions than the seven prototypical ones in current market offerings.”
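To make the circumplex idea concrete, here is a minimal Python sketch (not Opsis’s code) that maps a continuous valence/arousal estimate to its nearest prototypical labels; the anchor coordinates are illustrative assumptions, not published values.

# Illustrative sketch: map a continuous (valence, arousal) estimate onto
# the nearest prototypical emotion labels. Anchor positions are assumed.
import math

PROTOTYPES = {
    "neutral":   (0.0,  0.0),
    "happy":     (0.8,  0.5),
    "surprised": (0.4,  0.9),
    "afraid":    (-0.6, 0.8),
    "angry":     (-0.7, 0.6),
    "disgusted": (-0.7, 0.2),
    "sad":       (-0.7, -0.5),
}

def nearest_emotions(valence, arousal, k=2):
    """Return the k prototypical labels closest to a continuous estimate."""
    def dist(item):
        v, a = item[1]
        return math.hypot(v - valence, a - arousal)
    return [label for label, _ in sorted(PROTOTYPES.items(), key=dist)[:k]]

# A point between "happy" and "surprised" is a blend that a seven-class
# classifier would have to force into a single category:
print(nearest_emotions(0.6, 0.7))  # ['happy', 'surprised']

Because the estimate is a point in a continuous two-dimensional space rather than one of seven class scores, intermediate and mixed expressions fall naturally between the anchors instead of being forced into a single bin.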
 

What they offer

 
Today, Opsis offers four solutions, each trained on scientifically annotated datasets:
 
SenseCare: Annotated datasets from the clinical, health, counselling and psychological fields are used to train the Opsis algorithm to recognize emotions, aiding the detection of mental health conditions and early intervention.
 
SenseSafe: Annotated datasets from the cognitive, experimental and social psychology fields are used to train the algorithm to detect the emotions of drivers and large groups for safety, interactive infotainment and malicious-intent detection.
 
SenseCrowd: Annotated datasets from the cross-cultural, developmental, social and forensic psychology fields are used to train the algorithm to detect the emotions of people in crowded places for public safety and visitor experience.
 
SenseLearn: Annotated datasets from the behavioral, educational and human factors psychology fields are used to train the algorithm to detect the attention and engagement levels of participants in education and advertising research.
 

Various applications

 
According to Winkler, emotion recognition has applications in different sectors, for example retail. “Based on our current customer and partner engagements, one of the most promising use cases is marketing/advertising. Our customers want to know how people respond to ads, products, packaging and store design,” he said.
 
Other verticals that can benefit from the technology include the following.
 
Tele-consultation and health prediction: An AI model for social service agencies, hospitals and healthcare institutions can identify tell-tale signs of pre-stroke conditions and epilepsy. It can also serve as an occupational therapy and health-prediction tool, identifying non-motor stage 1/2/3 signs of depression, anxiety and/or cognitive decline.
 
Education and HR: The Opsis solution can use emotion and eye tracking to report students’ attention levels, assisting teachers in adjusting content or teaching methods and in detecting learning disabilities. In HR, emotion-based psychosomatic assessment helps remove unconscious bias, especially in large-scale recruiting.
 
Public safety and sentiment analysis: For security, the solution provides real-time information on crowd mood to support emergency control and risk management. It can detect persons of interest or suspicious individuals and provide early warning signals, helping identify the coalescence of aggressive groups and gauge the likelihood of violent behavior; detecting a person’s “true” inner state (lack of empathy, guilt or shame) sheds light on their likelihood of committing a crime.
 
“Our customers have been very receptive to this new avenue of recognizing and understanding customers’ emotions. Our partners, like SPs/SIs, have expressed interest in incorporating emotion for better campaigns and visualizing how customers react to their marketing,” Winkler said. “OEM/SDK manufacturers are interested in incorporating it into their surveillance solutions for smart nation initiative rollouts. They foresee that emotion recognition has strong potential to be embedded into IoT and smart nation deployments for surveillance, wearable and end sensing devices.”
 

Use of AI

 
Emotion recognition is certain to grow and find wider acceptance among users. MarketsandMarkets, for example, forecasts that the global emotion detection and recognition market will grow from US$23.6 billion in 2022 to $43.3 billion by 2027, at a compound annual growth rate (CAGR) of 12.9 percent during the forecast period.
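As a quick sanity check (this snippet is not from the report), compounding the 2022 base at the quoted CAGR reproduces the 2027 figure:

# Compound the 2022 market size at 12.9% per year for five years (2022 -> 2027).
base_2022, cagr, years = 23.6, 0.129, 5
forecast_2027 = base_2022 * (1 + cagr) ** years
print(round(forecast_2027, 1))  # 43.3 (US$ billion), matching the quoted forecast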
 
The technology will gain further momentum amid advances in AI and deep learning, which have become more powerful yet more lightweight than ever. This allows emotion recognition to be hosted on a range of devices, from the cloud to the edge.
 
Opsis’s emotion AI, for example, runs on edge devices requiring less than 1 GHz of processing power, meaning any IoT sensor, actuator, surveillance camera, PCB camera, Raspberry Pi or wearable optical device can capitalize on Opsis’s emotion analytics. Further, low computation cost and minimal bandwidth requirements help achieve savings on virtual machines, storage, backup and disaster recovery in the cloud. The solutions also enable platform providers, such as CRM, automated marketing and other SaaS and PaaS services, to analyze EQ with emotion insights.
 
“Some high-profile acquisitions also highlight the enormous potential and growing demand for emotion recognition solutions. They show that AI is set to grow and that such technologies are highly sought after,” Winkler said.

