iPhones can now tell blind users where and how far away people are – ProWellTech

Apple has added an interesting new accessibility feature to the latest beta of iOS: a system that detects the presence and distance of people in the iPhone camera’s view, letting blind users effectively keep their distance from others, among other things.

The feature grew out of Apple’s ARKit, for which the company developed “People Occlusion,” which detects people’s shapes and lets virtual objects pass in front of and behind them. The accessibility team realized that, combined with the accurate distance measurements of the lidar units on the iPhone 12 Pro and Pro Max, this could be an extremely useful tool for people with visual impairments.
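Apple hasn’t published its implementation, but the two ARKit ingredients described above correspond to public API. A minimal sketch in Swift, assuming an iOS 14 deployment target (the function name is hypothetical):

```swift
import ARKit

// Sketch: configure an ARKit session so that people are segmented with
// depth (the basis of People Occlusion) and, on lidar-equipped devices,
// per-pixel scene depth is also delivered with every frame.
func makePeopleAwareConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()

    // Person segmentation with depth is the frame semantic behind
    // People Occlusion (requires an A12 chip or newer).
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        config.frameSemantics.insert(.personSegmentationWithDepth)
    }

    // Scene depth comes from the lidar unit on the iPhone 12 Pro / Pro Max.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)
    }
    return config
}
```

With both semantics enabled, each ARFrame carries a person segmentation mask alongside a lidar depth map, which together are enough to estimate how far away a detected person is.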

Of course, during the pandemic the idea of staying six feet away from other people comes to mind immediately. But knowing where others are and how far away they are is a basic visual task that we use all the time to plan where to walk, which line to join at the store, whether to cross the street and so on.

The new feature, which will be part of the Magnifier app, uses the Pro and Pro Max’s lidar and wide-angle cameras and gives the user feedback in a number of ways.


The lidar in the iPhone 12 Pro is shown in this infrared video. Each point indicates the exact distance from which it is reflected.

First, it tells the user whether people are visible at all. If someone is in view, it says how far away the nearest person is, in feet or meters, and updates regularly as they move closer or farther away. The sound plays in stereo, corresponding to the direction the person occupies in the camera’s view.
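How this feedback is wired up internally isn’t documented, but one plausible sketch, assuming the app has already computed the nearest person’s distance and horizontal position in the frame (the ProximityFeedback class and its parameters are illustrative, not Apple’s API):

```swift
import AVFoundation

// Illustrative sketch, not Apple's API: speak the nearest person's
// distance and pan a cue tone toward their position in the camera view.
final class ProximityFeedback {
    private let synthesizer = AVSpeechSynthesizer()
    // Assumed to be attached to a running AVAudioEngine elsewhere,
    // with a looping tone buffer scheduled on it.
    let toneNode = AVAudioPlayerNode()

    /// - Parameters:
    ///   - distanceMeters: distance to the nearest detected person.
    ///   - horizontalPosition: 0.0 = left edge of the view, 1.0 = right edge.
    func update(distanceMeters: Float, horizontalPosition: Float) {
        // Map view position to stereo pan: -1.0 (hard left) to 1.0 (hard right).
        toneNode.pan = horizontalPosition * 2.0 - 1.0

        // Announce the distance, rounded to one decimal place.
        let utterance = AVSpeechUtterance(string: String(format: "%.1f meters", distanceMeters))
        synthesizer.speak(utterance)
    }
}
```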

Second, the user can set tones that correspond to certain distances. For example, with the distance set at six feet, they’ll hear one tone while a person is more than six feet away and a different one once the person comes within that range. After all, not everyone wants a constant stream of exact distances when all they care about is staying two paces away.
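One plausible reading of that behavior is to signal only when the threshold is crossed rather than continuously; a small sketch of that logic (the names and the crossing-only choice are assumptions):

```swift
// Illustrative sketch of the threshold logic: report a zone only when the
// nearest person crosses the user-set distance, so the user hears a tone
// change rather than a continuous readout.
enum ProximityZone { case beyondThreshold, withinThreshold }

struct ThresholdMonitor {
    let thresholdMeters: Float          // e.g. 1.83 m, about six feet
    private(set) var currentZone: ProximityZone?

    /// Returns the new zone when it changes, or nil while it stays the same.
    mutating func update(distanceMeters: Float) -> ProximityZone? {
        let zone: ProximityZone =
            distanceMeters > thresholdMeters ? .beyondThreshold : .withinThreshold
        guard zone != currentZone else { return nil }
        currentZone = zone
        return zone
    }
}
```

A caller would then play one tone for the beyond-threshold zone and another for the within-threshold zone whenever the monitor reports a change.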

The third feature, which may be especially useful for people with both visual and hearing impairments, is a haptic pulse that beats faster as a person approaches.
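On iOS, this kind of distance-driven pulsing could be built on Core Haptics. A hedged sketch, in which the distance-to-interval mapping is an illustrative guess rather than Apple’s tuning:

```swift
import CoreHaptics

// Illustrative sketch: transient haptic taps whose repetition rate rises
// as the detected person gets closer. The 0.1 to 1.0 s interval bounds
// and the linear mapping are assumptions, not Apple's values.
final class ProximityHaptics {
    private var engine: CHHapticEngine?
    private var timer: Timer?

    init() {
        engine = try? CHHapticEngine()
        try? engine?.start()
    }

    func update(distanceMeters: Float) {
        // Closer person -> shorter interval between pulses.
        let interval = TimeInterval(min(1.0, max(0.1, Double(distanceMeters) * 0.2)))
        timer?.invalidate()
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { [weak self] _ in
            self?.playTap()
        }
    }

    private func playTap() {
        let tap = CHHapticEvent(eventType: .hapticTransient, parameters: [], relativeTime: 0)
        guard let pattern = try? CHHapticPattern(events: [tap], parameters: []),
              let player = try? engine?.makePlayer(with: pattern) else { return }
        try? player.start(atTime: CHHapticTimeImmediate)
    }
}
```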

Last is a visual feature for people who need a little help understanding the world around them. An arrow points to the recognized person on the screen. Blindness is a spectrum, after all, and any number of vision problems can cause a person to need a little help in this regard.

The system needs a decent image from the wide-angle camera, so it won’t work in the dark. And while restricting the feature to the top end of the iPhone line limits its reach somewhat, the ever-growing usefulness of such a device as a kind of visual prosthesis likely makes the hardware investment more palatable to the people who need it.

This is far from the first tool of its kind; plenty of phones and dedicated devices offer object- and person-detection features. But it isn’t often that such a capability comes built in by default.

People detection should be available to iPhone 12 Pro and Pro Max owners running the iOS 14.2 release candidate, which has just been made available. Details should appear soon on Apple’s dedicated iPhone accessibility site.
