The voice recognition software known as Siri is one of Apple’s most iconic services. There is nothing quite like speaking to your smartphone and getting a response. It evokes a very futuristic feeling. Well, AT&T would chime in and say that its voice recognition software has been around for over twenty years. Now, to take it to the next level, AT&T is going to release a limited number of Watson APIs to developers to plug directly into their apps.
The move is smart. Watson works best within a specific category, because the category helps it anticipate what you’re talking about. For example, if Watson is running in a restaurant app, it will assume that what you’re trying to tell it is about food and dining. So if you said, “Blue Plate Special,” it would bring up information about diners rather than a place that has a sale on blue plates. Watson is also designed to transcribe spoken words into text, which is impressive in its own right. There’s little detail out there about how the central server will communicate with mobile devices; more information is to come.
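The category idea can be sketched in a few lines of code. This is purely illustrative, not AT&T's API: the function, the domain names, and the sample "knowledge" table are all made up to show how the same phrase can resolve differently depending on the app's context.

```python
# Hypothetical sketch of context-biased interpretation, the idea behind
# Watson's category system. All names and data here are illustrative.

def interpret(phrase, domain, knowledge):
    """Return the domain-specific reading of a phrase, falling back to
    a literal reading when the domain has none."""
    readings = knowledge.get(phrase.lower(), {})
    return readings.get(domain, readings.get("literal", phrase))

# Toy knowledge base: one phrase, different meanings per app category.
KNOWLEDGE = {
    "blue plate special": {
        "restaurants": "diners offering a fixed-price daily meal",
        "shopping": "stores with a sale on blue plates",
        "literal": "blue plate special",
    },
}

# A restaurant app and a shopping app hear the exact same words:
print(interpret("Blue Plate Special", "restaurants", KNOWLEDGE))
print(interpret("Blue Plate Special", "shopping", KNOWLEDGE))
```

The point is simply that narrowing the domain turns an ambiguous utterance into a confident guess, which is why AT&T scopes Watson to categories rather than shipping one general-purpose recognizer.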
What does this mean to you? Since the program is being sent out to independent developers, anything is possible. More than likely, every niche will be filled. IT service providers could transcribe troubleshooting logs while they work, gamers could play completely hands free, and when you’re talking to an automated customer service system, it’ll finally understand that you don’t want to talk to it.
That does bring up a point. Is this the first step toward automated systems becoming more efficient than human service providers? There is something appealing about dealing with a machine. It never has an attitude, and it never gets annoyed when you can’t understand what it is saying. On the other hand, automated service is limited by its programming. Only time will tell if Watson and Siri will evolve into dynamic systems that can actually understand the complexities of human language, all the way down to sarcasm and turns of phrase.