Over the past 12 months, we have seen continual growth in the accessibility features added to Android and Google services. Live Transcribe has given users with hearing impairment better engagement in social settings. Live Relay takes that a step further, with a chat interface that handles the conversation with callers on your behalf.

One feature that’s been around for a while but often goes unnoticed is the accessibility feature on Nest and Google Home devices. Many readers will be aware that my son lost his sight at the age of three to a cancerous brain tumour. When we got our first Google Home speaker, a realisation hit: we could see the lights come on when “Hey Google” triggered, but he didn’t have that visual feedback.

It didn’t take long to get a response from Google on the matter, pointing us in the direction of the accessibility features.

Over time it has evolved into a simple three-step process for any individual speaker you wish to enable. In the Home app, navigate to the speaker you want to configure, then:

  • Go to the device settings
  • Open the Accessibility tab
  • Turn on the start sound, the end sound, or both

For the vast majority of users this is unnecessary, but if you can’t see the lights to know you’ve triggered the Assistant and can now hear it instead, that’s a big change. It means users are more independent in their technology use and in their connection to the outside world. Google has made so much progress with accessibility that it is now catching the market leaders and creating new pathways of its own.
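For developers who want to offer the same kind of audible cue in their own voice features, the idea translates fairly directly: play a short tone when the microphone opens and another when listening stops. The Kotlin sketch below illustrates that pattern with Android’s standard SpeechRecognizer and ToneGenerator APIs. It isn’t how Google implements the Nest feature, the class name and chosen tones are placeholders, and a real app would also need the RECORD_AUDIO permission.

```kotlin
import android.content.Context
import android.content.Intent
import android.media.AudioManager
import android.media.ToneGenerator
import android.os.Bundle
import android.speech.RecognitionListener
import android.speech.RecognizerIntent
import android.speech.SpeechRecognizer

// Sketch only: mirrors the Nest speakers' start/end accessibility sounds
// by playing a tone when listening begins and another when it ends.
class AudibleFeedbackRecognizer(context: Context) {

    private val tones = ToneGenerator(AudioManager.STREAM_NOTIFICATION, 80)

    private val recognizer = SpeechRecognizer.createSpeechRecognizer(context).apply {
        setRecognitionListener(object : RecognitionListener {
            override fun onReadyForSpeech(params: Bundle?) {
                // "Start" sound: the microphone is now open.
                tones.startTone(ToneGenerator.TONE_PROP_BEEP, 150)
            }

            override fun onEndOfSpeech() {
                // "End" sound: the device has stopped listening.
                tones.startTone(ToneGenerator.TONE_PROP_ACK, 150)
            }

            override fun onResults(results: Bundle?) {
                val spoken = results
                    ?.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                    ?.firstOrNull()
                // Act on the recognised text here.
            }

            // The remaining callbacks aren't needed for the feedback sounds.
            override fun onBeginningOfSpeech() {}
            override fun onRmsChanged(rmsdB: Float) {}
            override fun onBufferReceived(buffer: ByteArray?) {}
            override fun onError(error: Int) {}
            override fun onPartialResults(partialResults: Bundle?) {}
            override fun onEvent(eventType: Int, params: Bundle?) {}
        })
    }

    fun startListening() =
        recognizer.startListening(Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH))
}
```

The important part is simply that the start and the end of the listening window each get their own distinct sound, which is exactly what the Nest setting exposes.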
