At this year’s Apple Worldwide Developers Conference in San Francisco, one of the underlying themes Apple introduced was the concept of extensions. The goal is to blur the boundaries between apps on the platform, through widgets, but more notably, by allowing third-party developers access to many of Apple’s fundamental applications.
Opening up Siri through SiriKit will allow developers to extend their apps by using speech recognition as a trigger for features and actions within third-party apps. Extensions have also allowed Apple to open up Maps, Payments, Messaging, and VoIP for deeper third-party app interaction.
In this article, we take a look at Apple’s path toward a task-focused, app-infused platform, and explain, feature by feature, how it will change the way the app ecosystem operates.
Whilst it was highly anticipated that Apple would finally bring Siri to macOS, it came as something of a surprise that Apple also announced Siri would be opened up to developers through a new API, SiriKit. This is, in fact, Apple’s first publicly acknowledged foray into deep learning, playing catch-up to the likes of the Amazon Echo (which already has over 1,000 integrated services), Google, and Microsoft’s Cortana. Apple’s view is that it is ready to let third-party developers create an ecosystem for Siri extensibility, the same way it created an app community.
On stage, Apple demonstrated Siri’s understanding of specific domains and intents, which are categories of behaviors that apps are able to support, allowing Siri to recognize the intended action. Although Apple will inevitably open up more domains, the domains that Siri in iOS 10, as well as Maps, will initially support include:
- Audio or video calling
- Sending or receiving payments
- Searching photos
- Booking a ride
- Managing workouts
Extensions work as follows:
- A user makes a request, such as “Send a message to John on WeChat”; that request is then converted to an intent object within iOS.
- The appropriate app, such as WeChat, having advertised itself as capable of receiving that intent, is handed the intent object.
- The app then processes the intent and provides a response, such as confirming that the message to John on WeChat has been sent; any custom UI or audible feedback that is part of the response is also triggered.
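The three steps above can be sketched in plain Swift. The types below are hypothetical stand-ins used purely for illustration; the real SiriKit equivalents are `INIntent` subclasses such as `INSendMessageIntent` and the per-domain response classes in the Intents framework.

```swift
// 1. The spoken request is parsed into a structured intent object.
struct MessageIntent {
    let service: String     // e.g. "WeChat"
    let recipient: String   // e.g. "John"
    let body: String
}

// 2. Each app advertises which services (intents) it can receive.
struct App {
    let name: String
    let handledServices: [String]

    // 3. The app processes the intent and returns a response, which
    //    Siri turns into UI or audible feedback.
    func handle(_ intent: MessageIntent) -> String {
        return "\(name): message to \(intent.recipient) has been sent"
    }
}

let apps = [App(name: "WeChat", handledServices: ["WeChat"]),
            App(name: "Messages", handledServices: ["iMessage"])]

let intent = MessageIntent(service: "WeChat", recipient: "John",
                           body: "On my way")

// The system routes the intent to the app that advertised support.
let response = apps.first { $0.handledServices.contains(intent.service) }?
    .handle(intent)
```

The key design point is that the app never sees the raw speech: Siri does the parsing, and the app only receives (and responds to) the structured intent it declared it could handle.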
Through SiriKit, apps can also advertise back to Siri more information on the interactions and activities they support, letting the system determine whether an app can indeed handle the user’s request and subsequently pass the request on to that app.
Apple demonstrated on stage the use of the ride-sharing app Uber and the restaurant-booking app OpenTable within Apple Maps. A user viewing a restaurant’s location in Maps was able to contextually book a reservation through OpenTable, and then immediately call a ride through Uber, all without leaving Maps.
Extensions work in the same way whether the request originated from within Maps or via Siri: Intents extensions handle the request itself, while Intents UI extensions handle the user experience, which can differ depending on whether the request came from Maps or Siri.
The extensions platform also supports payment services, allowing a user to either send a payment to another user or request a payment from a user, through the INSendPaymentIntent and INRequestPaymentIntent intents, respectively.
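A minimal sketch of the two payment directions, again using hypothetical types rather than the real INSendPaymentIntent and INRequestPaymentIntent classes:

```swift
// Hypothetical stand-ins for the two SiriKit payment intents;
// the shapes below are illustrative only.
enum PaymentIntent {
    case send(to: String, amount: Double, currency: String)
    case request(from: String, amount: Double, currency: String)
}

// A payment app's extension would confirm (or decline) the transfer
// and hand a response back to Siri to display or speak.
func handle(_ intent: PaymentIntent) -> String {
    switch intent {
    case .send(let to, let amount, let currency):
        return "Sent \(amount) \(currency) to \(to)"
    case .request(let from, let amount, let currency):
        return "Requested \(amount) \(currency) from \(from)"
    }
}

let confirmation = handle(.send(to: "Anna", amount: 20.0, currency: "USD"))
```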
Messaging was also demonstrated on stage. Whereas previously users could only send messages through iMessage, developers can now let users send and search messages using third-party messaging apps such as WhatsApp, Viber, and WeChat, to name a few. Using Siri or Maps, users are able to select their messaging platform of choice when interacting with their friends.
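The “search messages” side of this can be sketched as a simple filter over an app’s message store. In SiriKit the corresponding intent is INSearchForMessagesIntent; the types below are hypothetical.

```swift
// A hypothetical message record held by a third-party messaging app.
struct Message {
    let service: String
    let sender: String
    let text: String
}

let inbox = [
    Message(service: "WhatsApp", sender: "John", text: "Lunch at noon?"),
    Message(service: "Viber",    sender: "Anna", text: "Call me back"),
    Message(service: "WhatsApp", sender: "Anna", text: "See you soon"),
]

// Siri parses the spoken query ("show my WhatsApp messages from Anna")
// and passes the structured search terms to the chosen app.
func search(inbox: [Message], service: String, sender: String) -> [Message] {
    return inbox.filter { $0.service == service && $0.sender == sender }
}

let results = search(inbox: inbox, service: "WhatsApp", sender: "Anna")
```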
Analogous to messaging, developers are now able to tap into initiating voice calls through VoIP, bypassing traditional carrier calling channels. In addition to obvious candidates like Viber and WhatsApp, we may see Facebook Messenger, Slack, and Zoom also implement VoIP calling, with their calls annotated in the system Phone app.
Finally, in light of the popularity of ride-sharing apps like Lyft and Uber, Apple also added ride booking as a domain and intent in SiriKit; requests can originate from either Maps or Siri. The API allows you to list available rides, book a ride, and get the status of an already-booked ride.
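The three ride-booking operations map, in SiriKit, to INListRideOptionsIntent, INRequestRideIntent, and INGetRideStatusIntent. A schematic model, with all names below hypothetical:

```swift
// A hypothetical ride option a service would advertise to Siri or Maps.
struct RideOption {
    let name: String
    let price: Double
}

// The state a "get ride status" request would report back.
enum RideStatus {
    case unbooked
    case enRoute(option: String, driver: String)
}

struct RideService {
    // Listing available rides: the options the app offers.
    let options = [RideOption(name: "Pool", price: 8.0),
                   RideOption(name: "Standard", price: 14.0)]

    // Status of the current (possibly unbooked) ride.
    private(set) var status: RideStatus = .unbooked

    // Booking a ride transitions the status, which a later
    // status request can then report.
    mutating func book(option: String, driver: String) {
        status = .enRoute(option: option, driver: driver)
    }
}

var service = RideService()
service.book(option: "Pool", driver: "Dana")
```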
On the surface, the features and changes introduced in iOS 10 may seem incremental at best, but the underlying premise of this year’s Worldwide Developers Conference is the blurring of app boundaries through extensions and intents, and contextual awareness through deep learning. By opening up the platform via SiriKit and extensions, iOS 10 changes the entire realm of how apps are interacted with.
Granted, some aspects are still somewhat limited, or not opened up entirely, as is the case with domains, but this is an extremely positive step, and one Apple needed to make to compete with Amazon and Google. More so, this is another big leap toward Apple maturing its HomeKit IoT platform, allowing smarter home appliances through greater intent-understanding.