
Google Lookout describes surroundings to visually impaired users using AI



Google has launched a new app called Lookout, which uses artificial intelligence to give visually impaired users spoken information about their surroundings.

The app, which was first announced at Google’s I/O developer conference last year, is currently available to US users with a Pixel device, although Google Accessibility Engineering product manager Patrick Clary said in a blog post that Google “hope[s] to bring Lookout to more devices, countries and platforms soon.”

Clary says that the new app “draws upon similar underlying technology as Google Lens, which lets you search and take action on the objects around you, simply by pointing your phone [at them].”

Lookout is primarily designed to work in “situations where people might typically have to ask for help”; Google cites examples like “learning about a new space for the first time, reading text or documents” and daily tasks like “cooking, cleaning, and shopping.”

Looking out for users

Clary recommends that users wear their device “in a lanyard around [their] neck” or in a shirt’s front pocket. Once the app is opened, users just need to keep the phone pointed forward – Lookout will then describe the environment out loud for the user to hear. 

Because the app relies on artificial intelligence to interpret the user's surroundings, Clary cautions that "Lookout will not always be 100 percent perfect", and that the app "detects items in the scene and takes a best guess at what they are, reporting this to you."
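Google hasn't published Lookout's internals, but the "best guess" behavior Clary describes matches a common pattern in vision apps: a model returns object labels with confidence scores, and only guesses above a threshold are spoken aloud. Below is a minimal, hypothetical Python sketch of that pattern; the Detection class, describe_scene function, and threshold value are illustrative assumptions, not Google's actual code.

```python
# Illustrative sketch only: Lookout's real models and APIs are not public.
# All names here (Detection, describe_scene, CONFIDENCE_THRESHOLD) are
# hypothetical, showing the general "best guess above a threshold" pattern.

from dataclasses import dataclass
from typing import List

CONFIDENCE_THRESHOLD = 0.6  # guesses below this are too uncertain to report


@dataclass
class Detection:
    label: str         # what the model thinks the object is
    confidence: float  # model's certainty, from 0.0 to 1.0


def describe_scene(detections: List[Detection]) -> List[str]:
    """Keep only confident detections and phrase each as a spoken report."""
    confident = [d for d in detections if d.confidence >= CONFIDENCE_THRESHOLD]
    # Report the most confident guesses first, as a screen reader might
    confident.sort(key=lambda d: d.confidence, reverse=True)
    return [f"{d.label}, {d.confidence:.0%} confident" for d in confident]


if __name__ == "__main__":
    # Fake model output for a single camera frame
    frame_detections = [
        Detection("door", 0.92),
        Detection("chair", 0.75),
        Detection("cat", 0.31),  # too uncertain; dropped by the filter
    ]
    for sentence in describe_scene(frame_detections):
        print(sentence)  # a real app would route this to text-to-speech
```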

The launch of the Lookout app is the latest step in Google's push to make its apps more accessible. In February, the tech giant launched two new apps for Android, Sound Amplifier and Live Transcribe, which are designed to "make life a little easier" for those with hearing loss.

Via ZDNet


