
Android Accessibility will find objects with the camera

Google has announced new features for Android accessibility apps, such as Lookout and Look to Speak

On Global Accessibility Awareness Day, celebrated this Thursday (16), Google announced a series of accessibility features for Android and Google Maps. One of the new features is the ability to find objects, signs and more with the phone's camera in the Lookout app.

Finding objects

The Lookout app is designed for people with visual impairments and uses the camera to identify items and read text aloud. The tool has gained a feature called "Find Mode", which lets you locate certain objects, such as bathrooms, seats and tables, by moving your phone around an environment.

The app recognizes the object and announces its direction and distance. Google also confirmed that it will use artificial intelligence to generate automatic descriptions of photos taken directly in the app; initially, the new feature will only be available in English.

Lookout finds objects and creates image summaries with artificial intelligence (image: Disclosure/Google)

Free text in Look to Speak

The Look to Speak app, which lets users speak selected phrases aloud using only their eyes, has gained a free-text mode in which expressions can be customized. Users can choose emojis, photos and other icons to trigger each command.

More accessible maps

Google Maps is also gaining a series of accessibility-focused features. First, reporting of wheelchair-accessible places has been extended to the desktop version: previously, it was only available on Android and iOS.

The platform's integration with Google Lens, which offers screen reading and voice guidance for blind and visually impaired people, has also been improved. The mobile app can now say aloud the categories of nearby places and the distance needed to reach them.

Additionally, businesses and other facilities that support the Auracast protocol can list this attribute in their Maps profile. Auracast is used to broadcast audio over Bluetooth and is supported by several hearing aid models, eliminating the need for a separate device to hear a location's audio description.

Project GameFace

Project GameFace is a developer platform that lets you control Android using facial gestures. The computer interface was launched last year, and the mobile version supports up to 52 different gestures, customizable for every need.

More news

Finally, the Mountain View giant revealed changes to two other platforms. The first is Project Relate, aimed at creating a customizable speech recognition model: the tool has gained more customization options and can import phrases from other applications, such as Google Docs.

Sound Notifications, which identifies surrounding noises, has a new design that makes it easier to find detected sounds. Accessibility was a major focus in the launch of Android 14, and the same should hold for Android 15, which also received a new beta version.


Source: Terra
