At the recent Google I/O 2019, Google released some amazing new accessibility features which utilise machine learning and artificial intelligence on Android devices.

Apple is no slouch either when it comes to accessibility for iOS. In this blog post I would like to examine both the technology implemented and the context of this technology for iOS and Android.

Android – making technology accessible to everyone

Google I/O 2019, held in May, contained some very exciting announcements for accessibility, including Live Transcribe, Google Lens text-to-speech, Live Caption and Project Euphonia. Let’s have a look at each technology in turn.

Live Transcribe

Live Transcribe helps people with hearing difficulties to communicate. The phone listens in real time and transcribes what is being said onto the screen.

Google Lens

Google Lens offers the ability to convert text to speech, so those who find reading difficult can have text read aloud in their native language.

Live Caption

Live Caption automatically captions video playing on the device, so those with hearing difficulties can enjoy video content that is currently not captioned. In principle this covers virtually all video content, making it accessible to all.

Project Euphonia

Project Euphonia helps those who have difficulty talking by teaching Android to understand what is being said. Using AI and machine learning, the technology can accurately transcribe what is being said. To quote Google:

“Most aspects of life involve communicating with others—and being understood by those people as well. Many of us take this understanding for granted, but you can imagine the extreme difficulty and frustration you’d feel if people couldn’t easily understand the way you talk or express yourself. That’s the reality for millions of people living with speech impairments caused by neurologic conditions such as stroke, ALS, multiple sclerosis, traumatic brain injuries and Parkinson’s.”

It is a fantastic use of the power of AI.

Apple iOS

Apple has always been very strong with accessibility in iOS and macOS. This can be categorised into vision, hearing and mobility.


Vision

Vision accessibility is handled by VoiceOver, a screen reader which works with all of Apple’s built-in apps and many third-party ones.
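VoiceOver support in third-party apps comes largely from UIKit’s accessibility properties. As a rough sketch (the button and all the strings below are illustrative, not from any particular app), a developer might expose a control to VoiceOver like this:

```swift
import UIKit

// Hypothetical example: an icon-only play button in a podcast app.
let playButton = UIButton(type: .system)
playButton.setImage(UIImage(systemName: "play.fill"), for: .normal)

// Without a label, VoiceOver could only announce "button".
playButton.accessibilityLabel = "Play"
playButton.accessibilityHint = "Plays the current episode"
playButton.accessibilityTraits = .button
```

With those properties set, VoiceOver announces “Play, button” when the user touches the control, and reads the hint after a short pause, which is why apps that set them feel first-class to screen-reader users.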

VoiceOver on an iPhone

VoiceOver can be turned on with a triple-click of the Home button, after it has been enabled in Settings on your iPhone.
Apple also includes vision features such as colour filters, an inverted screen and adjustable font sizes.


Hearing

To quote Apple: “Apple has worked with top manufacturers to create hearing aids and sound processors designed specifically for iPhone and iPad.”
Closed captions are also supported, along with Textphone calls and iMessage for keeping in touch with friends and family.


Mobility

Mobility features include Dictation, the Siri assistant, AssistiveTouch to control the phone in line with your physical situation, custom shortcuts for frequently used tasks, and many other features built into iOS.

Siri Shortcuts
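For the custom shortcuts mentioned above, apps make a task available to Siri by “donating” it. A minimal sketch, assuming a hypothetical workout app (the activity type, titles and phrase below are placeholders, not real identifiers):

```swift
import UIKit
import Intents

// Hypothetical example: donate "start a workout" as a Siri Shortcut.
let activity = NSUserActivity(activityType: "com.example.app.startWorkout")
activity.title = "Start workout"
activity.isEligibleForSearch = true
activity.isEligibleForPrediction = true   // lets Siri suggest it as a shortcut
activity.suggestedInvocationPhrase = "Start my workout"

// Assigning the activity to the visible view controller marks it as current,
// which is what donates it to the system:
// viewController.userActivity = activity
```

Once donated, the task appears in the Shortcuts app and in Siri’s suggestions, and the user can attach their own voice phrase to it.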


It’s clear that both Apple and Google are working hard on the accessibility features of their mobile operating systems. However, it’s equally clear that Google is ahead of Apple at the moment, and should be applauded for the effort and investment it has put into accessibility. Technology is opening up all sorts of opportunities and breaking down all sorts of boundaries.


Ric McCorriston

Managing Director

