
We all love it when our device does something unexpected and helpful. Often these little assists are possible because our device understands our context. Do you have an app that tells you it’s about to rain? Then it knows where you are, and it knows the current and forecast weather for your location.

We take this sort of functionality for granted, with the likes of Google Now having radically transformed our devices from mere smartphones into personal assistants. And there’s more to come, with services such as Google Assistant due for launch this year.

In the meantime, Google wants to make it easier for developers to build this sort of contextually aware intelligence into their apps. To that end, it announced the Awareness API at Google I/O 2016. The new API includes support for seven signals, giving app developers more context with less effort.

The signals included in the API are:

  • time
  • location
  • places
  • beacons
  • headphones
  • activity
  • weather

You may think those are simple things that many apps already include, and you’d be right. However, with the new API developers can simply plug into these signals and get a known response back, and this could be just the beginning. As more sensors, services and technologies come online, we could see more signals added to the API suite.
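To give a feel for how little effort is involved, here is a minimal sketch of querying two of those signals (weather and headphones) with the Awareness Snapshot API as it shipped around I/O 2016. Treat it as an illustration rather than a drop-in implementation: it assumes you have already built and connected a GoogleApiClient with Awareness.API, and that the ACCESS_FINE_LOCATION permission has been granted for the weather snapshot; the class and log tag names are our own.

import android.util.Log;

import com.google.android.gms.awareness.Awareness;
import com.google.android.gms.awareness.snapshot.HeadphoneStateResult;
import com.google.android.gms.awareness.snapshot.WeatherResult;
import com.google.android.gms.awareness.state.HeadphoneState;
import com.google.android.gms.awareness.state.Weather;
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.common.api.ResultCallback;

public class AwarenessSnapshots {

    private static final String TAG = "AwarenessDemo";

    // Ask the Awareness API for the current weather at the device's location.
    // Requires ACCESS_FINE_LOCATION to have been granted.
    static void logCurrentWeather(GoogleApiClient client) {
        Awareness.SnapshotApi.getWeather(client)
                .setResultCallback(new ResultCallback<WeatherResult>() {
                    @Override
                    public void onResult(WeatherResult result) {
                        if (!result.getStatus().isSuccess()) {
                            Log.e(TAG, "Could not get weather.");
                            return;
                        }
                        Weather weather = result.getWeather();
                        Log.i(TAG, "Temperature: "
                                + weather.getTemperature(Weather.CELSIUS) + "°C");
                    }
                });
    }

    // Check whether headphones are currently plugged in.
    static void logHeadphoneState(GoogleApiClient client) {
        Awareness.SnapshotApi.getHeadphoneState(client)
                .setResultCallback(new ResultCallback<HeadphoneStateResult>() {
                    @Override
                    public void onResult(HeadphoneStateResult result) {
                        if (!result.getStatus().isSuccess()) {
                            Log.e(TAG, "Could not get headphone state.");
                            return;
                        }
                        HeadphoneState state = result.getHeadphoneState();
                        boolean pluggedIn =
                                state.getState() == HeadphoneState.PLUGGED_IN;
                        Log.i(TAG, "Headphones plugged in: " + pluggedIn);
                    }
                });
    }
}

The client these methods expect would be created with new GoogleApiClient.Builder(context).addApi(Awareness.API).build() and connected before either call; the other signals (time, location, places, beacons, activity) follow the same snapshot pattern.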

If you’re interested in this sort of thing, we’ve attached the full I/O presentation below.

Source: Google Developers.
1 Comment
Dennis Bareis:

The most important bit here is that it should reduce battery drain.