Google Expands Hands-free and Eyes-free Interfaces for Android | TechCrunch

As part of Global Accessibility Awareness Day 2024, Google is introducing some updates for Android that should be useful for people with mobility or visual impairments.

Project Gameface allows players to use their faces to move the cursor and perform common click-like actions on the desktop, and now it’s coming to Android too.

The project lets people with limited mobility activate various functions through facial movements such as raising an eyebrow, moving the mouth, or turning the head. There are basics like a virtual cursor, but also composite gestures: you might mark the start of a swipe by opening your mouth, trace its path by moving your head, and end it by closing your mouth.
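That open-close sequence can be thought of as a small state machine driven by per-frame gesture scores. The sketch below is purely illustrative, assuming a hypothetical `mouth_open_score` signal and a user-tunable threshold; it is not Project Gameface's actual code or API.

```python
# Hypothetical sketch of the swipe gesture described above:
# open mouth -> move head -> close mouth. Names and thresholds
# are illustrative assumptions, not Project Gameface's real API.

from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class SwipeRecognizer:
    # User-tunable sensitivity: how open the mouth must be to count.
    mouth_open_threshold: float = 0.5
    _swiping: bool = False
    _path: List[Tuple[float, float]] = field(default_factory=list)

    def update(
        self, mouth_open_score: float, head_pos: Tuple[float, float]
    ) -> Optional[List[Tuple[float, float]]]:
        """Feed one frame; returns the completed swipe path, or None."""
        if not self._swiping:
            if mouth_open_score >= self.mouth_open_threshold:
                self._swiping = True          # open mouth: swipe begins
                self._path = [head_pos]
            return None
        self._path.append(head_pos)           # head movement traces the swipe
        if mouth_open_score < self.mouth_open_threshold:
            self._swiping = False             # closed mouth: swipe ends
            return self._path
        return None
```

Feeding it three frames (mouth open, head moves, mouth closes) yields the traced path; raising or lowering `mouth_open_threshold` is exactly the kind of per-user tuning the article describes.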

It can be customized to suit an individual's abilities, and Google researchers are working with Incluzza in India to test and improve the tool. For many people, being able to play the thousands of games on Android (probably millions, but thousands of good ones) easily and without hassle will certainly be more than welcome.

There’s a great video showing the product being fitted and used; Jeeja, seen in the preview image, describes how far she has to move her head to trigger a gesture.

This kind of granular customization matters just as much to these users as being able to adjust the sensitivity of a mouse or trackpad does to anyone else.

Another feature serves people who can’t easily use a keyboard, whether on-screen or physical: a new non-text “look-to-speak” mode lets users select and send emojis, either on their own or as stand-ins for a phrase or action.

You can also add your own photos, so commonly used phrases, emojis, and frequently contacted people can be tied to pictures of them, all accessible with a few glances.

For people with visual impairments, there is a variety of tools (of varying effectiveness, no doubt) that help a user identify what the phone’s camera sees. The use cases are endless, so sometimes it’s best to start simple, like finding an empty chair or spotting and pointing out the user’s keychain.

Users can add custom object or location detection so that the instant description feature gives them what they need, not just a list of generic objects like “a cup and a plate on a table.” Which cup?!
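One way to picture the idea: a description routine that surfaces a user's registered personal items ahead of generic labels. This is a minimal sketch under assumed inputs (a list of label/confidence detections and a set of custom labels), not Google's actual detection API.

```python
# Illustrative sketch: prioritize a user's custom-registered objects
# over generic labels in a scene description. The function, its inputs,
# and the labels are hypothetical assumptions, not a real Google API.

from typing import List, Set, Tuple

def describe_scene(
    detections: List[Tuple[str, float]], custom_labels: Set[str]
) -> str:
    """Return a description favoring the user's own registered items."""
    personal = [label for label, _conf in detections if label in custom_labels]
    if personal:
        # The user's own items answer "Which cup?" directly.
        return "Found: " + ", ".join(personal)
    generic = [label for label, _conf in detections if label not in custom_labels]
    if generic:
        return "I see " + ", ".join(generic)
    return "Nothing recognized"
```

With `"my red mug"` registered as a custom label, a scene containing both a generic cup and the mug would be described by the specific item rather than the generic list.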

Apple also unveiled some accessibility features yesterday, and Microsoft has some too. Take a minute to read through these projects; they rarely get main-stage coverage (Gameface is an exception), but they are of great importance to the people they were developed for.
