Google Awareness API Brings Contextually Aware Apps to Android

Contextually aware apps are part of the science fiction future many dream about. Imagine your phone automatically reminding you to pick up some milk as you drive past the supermarket, or suggesting you call your mother on her birthday. The new Awareness API from Google will help bring functions like these to Android apps.

The Awareness API unifies data from seven different location and context signals behind a single API. Google says developers can use it to build context-based features without much impact on system resources; the API handles concerns such as battery life while combining the signals in ways that were not previously possible.

Google says the API is really, really smart. For example, it can process raw sensor data to accurately determine what the user is doing. "Context is at the heart of the Awareness API," says Google. "Contextual data includes sensor-derived data such as location (lat/long), place (home, work, coffee shop), and activity (walking, driving). These basic signals can be combined to extrapolate the user's situation in more specific detail."

The seven signals include time, location, places, beacons, headphones, activity, and weather. The Awareness API is able to read all these items at the same time, analyze how they relate to one another, and then make suggestions to the end user. For example, your phone knows you're walking and senses that you plugged your headphones in, so it might suggest a Spotify playlist for exercising. Or, the weather app knows a pop-up rain storm is about to drench you and gently suggests you dig out an umbrella or step inside for a few moments.

The API is split in two and each half behaves somewhat differently

The Fence API can react to changes in the user's environment. It lets developers combine multiple signals to create fences. When certain defined fence conditions are met, the app receives a callback so it can interact with users, even if only in the background. Speaking logically, it would go something like: "If X and Y, then Z."
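That "if X and Y, then Z" pattern can be sketched in plain Java. This is a conceptual model only, not the actual Awareness API: the `Fence` interface, the `and` combinator, and the signal values below are hypothetical stand-ins for the real fence types Google will ship.

```java
// Conceptual sketch of fence logic -- not the actual Play Services API.
// A "fence" wraps a boolean condition over context signals; combining
// fences fires a callback only when every condition holds.
public class FenceSketch {
    interface Fence { boolean isTriggered(); }

    // Logical AND over any number of fences ("if X and Y...").
    static Fence and(Fence... fences) {
        return () -> {
            for (Fence f : fences) {
                if (!f.isTriggered()) return false;
            }
            return true;
        };
    }

    public static void main(String[] args) {
        // Hypothetical signal readings for illustration.
        boolean userIsWalking = true;
        boolean headphonesPluggedIn = true;

        Fence walking = () -> userIsWalking;
        Fence headphones = () -> headphonesPluggedIn;
        Fence workoutFence = and(walking, headphones);

        if (workoutFence.isTriggered()) {
            // "...then Z": in a real app, this callback might
            // suggest an exercise playlist.
            System.out.println("fence triggered");
        }
    }
}
```

The real API would evaluate these conditions in the background and deliver the callback to the app; the sketch only shows how the boolean composition of signals works.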

The Snapshot API provides instant details about the user's environment by taking a picture of all seven signals at a particular point in time. Google says the Snapshot API keeps its power and memory footprint low thanks to intelligent caching and cross-app optimizations.
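Conceptually, a snapshot is one immutable reading of all seven signals, so an app asks "what is the user's context right now?" in a single call instead of querying each sensor separately. The sketch below is illustrative only; the class and field names are hypothetical, not the real Play Services types.

```java
// Conceptual sketch of a context snapshot -- not the real Play Services API.
public class SnapshotSketch {
    // One immutable reading of all seven signals; names are illustrative.
    static final class ContextSnapshot {
        final long timeMillis;             // time
        final double lat, lng;             // location
        final String place;                // places, e.g. "home", "coffee shop"
        final int nearbyBeacons;           // beacons
        final boolean headphonesPluggedIn; // headphones
        final String activity;             // activity, e.g. "walking"
        final double temperatureCelsius;   // weather

        ContextSnapshot(long timeMillis, double lat, double lng, String place,
                        int nearbyBeacons, boolean headphonesPluggedIn,
                        String activity, double temperatureCelsius) {
            this.timeMillis = timeMillis;
            this.lat = lat;
            this.lng = lng;
            this.place = place;
            this.nearbyBeacons = nearbyBeacons;
            this.headphonesPluggedIn = headphonesPluggedIn;
            this.activity = activity;
            this.temperatureCelsius = temperatureCelsius;
        }
    }

    public static void main(String[] args) {
        // A single read stands in for seven separate sensor queries.
        ContextSnapshot now = new ContextSnapshot(
                System.currentTimeMillis(), 40.7128, -74.0060,
                "coffee shop", 0, true, "walking", 21.5);
        System.out.println(now.place + " / " + now.activity);
    }
}
```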

The bad news? The Awareness API isn't available just yet. Right now, developers interested in the Awareness API can sign up for early access. There's a modicum of information available on the sign-up page, but not much. Google hasn't said when it will make the API available, nor when Android devices may be able to support it.

Eric Zeman I am a journalist who covers the mobile telecommunications industry. I freelance for ProgrammableWeb and other online properties.
