Google is Rolling Out a New Accessibility Feature That Lets You Talk by Staring at Emojis

Sure, Google I/O was a massive AI palooza focused primarily on Gemini. But for Global Accessibility Awareness Day, Google is adding a few more AI-powered vision features to some of its accessibility tools, including the Look to Speak app, which lets users speak by staring at images, symbols and – yes – even emoji. That could offer a new way for people with speech impairments or reading and writing difficulties to communicate in everyday life.

The Look to Speak app has been around since 2020, but for those who don’t know, it’s essentially an eye-tracking app that lets users select from a series of phrases and words that the phone then reads out loud. In a blog post, Google described updates to the existing Look to Speak app that let users select emoji as well. Users can personalize which symbols or emoji they want to use to streamline these interactions. It’s a pretty useful addition that makes an existing accessibility app even more accessible, including for users who don’t speak English.

Along with this update, several other Google apps are gaining features that make it easier for visually impaired users to find objects around them. The existing Lookout app should now be able to help you find what you’re looking for in a room simply by moving your phone’s camera in the general direction of the object.

The app is getting a search mode in beta. Essentially, you select an item from seven categories, including “Seating and Tables,” “Doors and Windows,” or “Bathrooms.” After selecting a category, you move the phone around the room, and the camera is used to determine the direction of, and your distance from, the object or door. Lookout will also generate AI-written descriptions of images, letting you learn more about photos you take directly in the app. Google clarified to Gizmodo that Lookout uses a visual model developed by Google DeepMind, the same group working on the Project Astra AI digital assistant, which has similar vision abilities.

Google Maps is also receiving an update that brings Lens in Maps descriptions to all supported languages. These added languages should work with the updated voice guidance and screen reader capabilities in Lens in Maps that the company made available for Search and Chrome in October.

At the end of last year, Google added wheelchair icons to Google Maps to identify ramps and accessible locations. Previously the feature was limited to Android and iOS, but now it’s also available to those browsing Maps on desktop. Additionally, you can now filter reviews for mentions of wheelchair accessibility when using the mobile app. It takes many users to flag all the accessible locations out there; still, the Mountain View company says accessibility information is available for more than 50 million locations thanks to Maps users. Google is also letting businesses indicate whether they use Auracast to support deaf or hard-of-hearing customers.

Last year’s big accessibility update included Project Gameface, a feature that lets users control on-screen characters with facial expressions. The code was limited to PCs when it was first released, but now the feature will be available to Android developers for use in mobile apps.

