Accessibility is one of the main themes in Google’s latest Android features.
Individuals with speech and motor impairments will now be able to navigate their smartphones through self-selected facial gestures and eye movements, thanks to a new feature called Camera Switches.
When using this feature, the phone's front-facing camera becomes a switch. Users choose one movement to scan through options on the screen and a second movement to make a selection. The movements can be anything from an open mouth or smile to a raised eyebrow or a glance in a certain direction.
Also focused on the accessibility space, Project Activate is targeted at individuals who do not speak and those with neurological conditions. This feature lets users communicate through their Android device, also via facial gestures. Individuals using the tool can text a contact or get a caregiver's attention through voice activation.
WHY IT MATTERS
Speech and motor impairments are fairly common in the United States. In fact, around 25% to 30% of children with autism spectrum disorder are minimally verbal or do not speak, according to Healthline.
Several common conditions are also linked to motor impairment, including spinal cord injury, cerebral palsy, multiple sclerosis, Parkinson's disease and stroke.
A total of 26% of the U.S. population has some type of disability, according to the CDC.
THE LARGER TREND
Google isn’t the only tech giant looking to make its tech more accessible. In May, Apple announced a slew of new features for people with mobility, vision, hearing and cognitive disabilities. For example, Apple Watch’s AssistiveTouch feature uses the built-in gyroscope and the accelerometer sensor to pinpoint subtle muscle movements and let individuals navigate via gestures.
Facebook is also looking at the disability space. In July, the company published a study in the New England Journal of Medicine demonstrating a potential breakthrough for individuals who have lost the ability to speak or are affected by paralysis.
The UCSF study, which was backed by Facebook, implanted a multi-electrode array into the part of the brain that controls speech in patients who had lost the ability to speak. In the study, a computer algorithm was able to translate brain activity into sentences in real time.