
Google Assistant is smarter than Alexa and Siri: Study

Stone Temple, a digital marketing company, has released the results of a test measuring the smartness of various voice assistants, which showed that Google Assistant is the smartest of them all.

The company asked each voice assistant (Amazon Alexa, Google Assistant, Microsoft Cortana and Apple Siri) about 5,000 questions, evaluating two factors: how many questions each assistant attempted to answer and how accurate those answers were.

Google Assistant was evaluated on both smartphones and smart speakers. On smartphones, it was able to answer about 95% of the questions with an accuracy rate close to 80%. On Google Home, it achieved an 85% answering rate with 65% accuracy. In other words, the assistant performed better on phones than on smart speakers.

The second smartest assistant was Microsoft's Cortana, which was able to answer 90% of the questions with 63% accuracy. Apple's Siri ranked last, answering 80% of the questions with 40% accuracy.

Amazon Alexa, which was able to answer over 80% of the questions with 53% accuracy, took third place. Alexa also showed the biggest improvement since last year: when the company conducted a similar test then, Alexa could answer only about half of the questions.

Answers counted as inaccurate weren't necessarily outright wrong; if an answer failed to deliver the full information requested by the query, it was classified as inaccurate.

“Every competing personal assistant made significant progress in closing the gap with Google,” Stone Temple wrote in its statement.

The test didn’t evaluate each personal assistant’s overall connectivity with other apps and services. The questions asked were about general knowledge, such as “How fast does a jaguar run?” or “How long is one inch?”