Getting Siri-ous about Ontologies.

Original article: http://www.unwiredview.com/2011/10/12/how-siri-on-iphone-4s-works-and-why-it’s-a-big-deal-apple’s-ai-tech-details-in-230-pages-of-patent-app/


The brouhaha over the iPhone 4S's "Siri" made one wonder about the ontologies used in designing such an artificially intelligent service. With people having multiple accents and different ways (read: vocabularies) of phrasing requests, it was intriguing to see how Siri approaches the problem. The linked article analyzes Apple's patent application on Siri, describing the layers behind the service.

Essentially, Siri is an interpreter of spoken requests: it figures out the user's intent using active ontologies, analyzes the request in question, and calls the relevant partner API to gather suggestions. For example, a request like "suggest me some good Italian place for dinner" is interpreted using its domain ontology for restaurants, which comprises a domain-specific vocabulary, rules of interaction, and reviews from partner APIs, and which leverages the user's current location (if needed) to provide relevant suggestions.

The game-changer here is that Siri is proactive: it tries to home in on exactly what the user means by questioning the user back about seemingly ambiguous requests, traversing its set of ontological rules until it has exactly the artifacts it needs to make the relevant API calls. In all, it is not a search engine but an answer engine, meant to make interactions inherently semantic, and hence more personal and meaningful.

For now, the ontologies in place cover restaurants, weather, sports, and travel, and they have been integrated with various partner APIs like those of Yelp and Zagat to help users accomplish mundane tasks. The technology is still in beta and holds great promise for expansion to other domains as Siri grows its clientele among the iPhone fanboys!
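To make the "active ontology" idea concrete, here is a minimal sketch in Python of the pattern described above: a request is matched against a domain-specific vocabulary, missing artifacts (slots) are elicited through follow-up questions, and a partner API is called once everything required is in hand. All of the domain data, slot names, and the yelp_search() stub here are hypothetical illustrations, not Apple's actual implementation from the patent.

# A minimal sketch of active-ontology-style intent handling.
# The ontology data and yelp_search() stub are hypothetical; the
# patent describes this pattern at a much richer level of detail.

RESTAURANT_ONTOLOGY = {
    "triggers": {"dinner", "lunch", "restaurant", "place", "eat"},
    "slots": {
        "cuisine":  {"italian", "chinese", "mexican", "thai"},
        "meal":     {"breakfast", "lunch", "dinner"},
        "location": None,  # filled from GPS if the user doesn't say
    },
    "required": ["cuisine", "location"],
}

def yelp_search(cuisine, location):
    """Stand-in for a partner API call (e.g. Yelp/Zagat)."""
    return [f"Trattoria Demo ({cuisine}, near {location})"]

def interpret(request, ask, current_location=None):
    words = set(request.lower().split())
    if not words & RESTAURANT_ONTOLOGY["triggers"]:
        return None  # some other domain ontology should handle this

    # Fill slots from the request's domain-specific vocabulary.
    slots = {}
    for name, vocab in RESTAURANT_ONTOLOGY["slots"].items():
        if vocab and words & vocab:
            slots[name] = (words & vocab).pop()
    if current_location:
        slots.setdefault("location", current_location)

    # Proactively question the user back until every required
    # artifact is present, then make the partner API call.
    for name in RESTAURANT_ONTOLOGY["required"]:
        while not slots.get(name):
            slots[name] = ask(f"Which {name} did you have in mind?")

    return yelp_search(slots["cuisine"], slots["location"])

print(interpret("suggest me some good italian place for dinner",
                ask=input, current_location="Cupertino, CA"))

The design point this sketch tries to capture is that each domain ontology bundles its own vocabulary, rules, and service bindings, so extending Siri to a new domain such as weather or travel would mean adding another such structure rather than rebuilding the interpreter.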


[Image: Siri-iPhone-4S-iOS-active-ontology.jpg]