A new report about Apple Inc.’s (Nasdaq: AAPL) voice-recognition software Siri concludes what many users have been saying for a while: It doesn’t work all that well.
Of 1,600 common searches, the speech technology accurately resolved the request 62 percent of the time on a loud street and 68 percent in a quieter setting, according to a report released today by Piper Jaffray Cos., the Minneapolis-based investment bank.
The report graded the technology “D” for accuracy, while predicting it will improve as more features are added.
“You’re playing the lottery when you’re using Siri,” said Gene Munster, the Piper Jaffray analyst who conducted the study. “They have a plan to be more competitive, but it’s going to take a couple of years.”
Apple has made Siri the defining characteristic of its iPhone 4S, spending heavily on advertisements featuring actors such as Samuel L. Jackson and Zooey Deschanel. The ads have contrasted with the experience of many users, who have taken to customer forums and websites to complain that Siri doesn’t work as well as advertised. One group of customers even filed a class-action suit against Apple for false advertising. Apple has denied any wrongdoing.
Apple continues to build features for Siri. Earlier this month, the company said its new iOS 6 mobile operating system will support sports statistics and movie listings. Munster predicts that as features like these are added and accuracy improves, the speech-recognition technology will displace searches that many users now perform through Google Inc.’s search engine.
“Apple would love nothing more than to take that away from Google,” Munster said. He expects commerce features to be added eventually so people can shop by speaking commands aloud.
Munster said that while Siri is good at comprehending what a user is saying and will accurately repeat the question, it struggles to turn those words into a correct answer. For instance, Siri will repeat old answers when a user is trying to ask a new question. The technology also has trouble handling spoken requests for directions, Munster said.
In Piper Jaffray’s tests, Siri was able to accurately decipher what a user was saying 83 percent of the time on the street and 89 percent in an area with low noise.
“Apple right now gets a ‘B’ in comprehension and a ‘D’ in accuracy,” Munster said. “There’s a big difference between comprehension and her actually doing what you want her to do.”