Loup Ventures recently tested four smart speakers, asking Alexa, Siri, Google Assistant, and Cortana 800 questions each. Google Assistant answered 88% of them correctly, vs. Siri at 75%, Alexa at 73%, and Cortana at 63%. Last year, Google Assistant answered 81% correctly, vs. Siri at 52% (as of Feb-18), Alexa at 64%, and Cortana at 56%.
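The year-over-year improvement implied by those figures can be tallied directly; a quick sketch using only the percentages quoted above (labels "last year"/"this year" are my own shorthand for the two test rounds):

```python
# Answer-accuracy percentages quoted in the Loup Ventures results above.
results = {
    "Google Assistant": {"last year": 81, "this year": 88},
    "Siri":             {"last year": 52, "this year": 75},
    "Alexa":            {"last year": 64, "this year": 73},
    "Cortana":          {"last year": 56, "this year": 63},
}

# Print each assistant's change in percentage points between rounds.
for assistant, scores in results.items():
    delta = scores["this year"] - scores["last year"]
    print(f"{assistant}: {scores['last year']}% -> {scores['this year']}% (+{delta} pts)")
```

Every assistant improved, with Siri showing the largest gain (23 percentage points) and Google Assistant retaining the lead overall.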
Loup Ventures asked each smart speaker the same 800 questions, grading each answer on two metrics: (1) did it understand what was said, and (2) did it deliver a correct response? The question set, designed to comprehensively test a smart speaker's ability and utility, is broken into five categories:
- Local – Where is the nearest coffee shop?
- Commerce – Can you order me more paper towels?
- Navigation – How do I get to uptown on the bus?
- Information – Who do the Twins play tonight?
- Command – Remind me to call Steve at 2 pm today.