Starting in December, developers will be able to build bots for Google Assistant. Interactions will be created through the recently announced Actions on Google program. Google Assistant is an AI assistant much like Amazon's Alexa, which makes Actions on Google the platform's equivalent of Amazon's Alexa Skills.
Upon the opening of the Actions on Google program, developers will have access to the Assistant SDK and will be able to create two types of requests: direct actions and conversational actions. Direct actions suit one-shot transactional requests such as home automation commands or media playback. Conversational actions, on the other hand, handle tasks that require more back-and-forth interaction: requesting an Uber, providing your destination, and specifying which type of car you need, for example. These interactions are powered by API.ai, which Google acquired last month.
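To make the distinction concrete, the Uber-style back-and-forth above can be sketched as a simple fulfillment handler. This is a minimal illustration only: the field names (`query`, `speech`, `expect_reply`, `context`) are hypothetical and do not reflect the actual Actions on Google or API.ai wire formats.

```python
# Hypothetical sketch of a conversational-action fulfillment handler.
# All field names here are illustrative, not the real Actions on Google schema.

def handle_request(request: dict) -> dict:
    """Route a user utterance through a simple multi-turn ride-booking flow."""
    query = request.get("query", "").lower()
    context = request.get("context", {})

    if "ride" in query and "destination" not in context:
        # First turn: ask a follow-up question and keep the conversation open.
        return {"speech": "Where would you like to go?",
                "expect_reply": True,
                "context": {"awaiting": "destination"}}
    if context.get("awaiting") == "destination":
        # Second turn: capture the destination, then ask about the car type.
        return {"speech": f"Which type of car do you need to get to {request['query']}?",
                "expect_reply": True,
                "context": {"destination": request["query"], "awaiting": "car_type"}}
    # Final turn: complete the transaction and close the conversation.
    return {"speech": "Booking your ride now.", "expect_reply": False, "context": {}}
```

A direct action, by contrast, would resolve in a single turn with `expect_reply` set to false; the conversational version carries context across turns until the request is fully specified.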
Google Assistant is already featured in the Allo chat app and Google Home. With the opening of the platform, developers will be able to integrate it into their own services and devices. Bots already built with API.ai will also be available to integrate into Assistant. It's worth noting, however, that Assistant is currently only available on Pixel phones. A Google spokesperson confirmed this to TechCrunch, stating, "Our goal is to make the Google Assistant widely available to users, and we’ll continue to launch new surfaces over the course of the next year."
Google has already lined up a number of partners for the Actions on Google launch, including Spotify and CNN for media, Uber for ride sharing, OpenTable for restaurant reservations, and more. Interested developers can sign up for news and updates.