Researchers at the University of Alicante in Spain developed an app that uses a phone's built-in 3D camera to detect obstacles, then produces a vibration or tone to alert the user.
The app was tested with nine participants who had vision disabilities severe enough that they couldn't see objects in their way without help.
During the test, the participants wore a cell phone on a lanyard around the neck, with the camera's lenses facing forward. As the subjects walked, the cameras detected objects in their path.
Because the 3D camera has two lenses, the phone had binocular vision, just like human eyes. This allowed the software to estimate the distance to objects within its field of view.
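The study doesn't publish the app's code, but the stereo principle it relies on can be sketched in a few lines. This is a hypothetical illustration using the standard pinhole-camera relation (the function name and parameter values are assumptions, not from the article): a point's depth is the focal length times the lens baseline, divided by the disparity between the two images.

```python
def stereo_depth_m(disparity_px, focal_px, baseline_m):
    """Estimate depth from two horizontally offset lenses.

    Standard stereo relation Z = f * B / d: the larger the
    disparity d (how far the same point shifts between the two
    images), the closer the point is to the camera.
    Illustrative only; not the app's actual implementation.
    """
    if disparity_px <= 0:
        # A point visible in only one image (or at infinity)
        # yields no usable disparity.
        raise ValueError("point not matched in both images")
    return focal_px * baseline_m / disparity_px


# Example: a 5 cm baseline, 1000 px focal length, 100 px disparity
# puts the object half a meter away.
distance = stereo_depth_m(disparity_px=100, focal_px=1000, baseline_m=0.05)
```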
When the app calculated that an object was closer than about six feet, the phone vibrated or sounded a tone.
As the obstacle drew closer, the vibrations pulsed faster or the tone grew louder, LiveScience reported.
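The alert logic described above, a roughly six-foot threshold with feedback that intensifies as the distance shrinks, could be sketched like this. The threshold value, function name, and the linear mapping from distance to pulse interval are illustrative assumptions; the article doesn't specify how the app scales its feedback.

```python
OBSTACLE_THRESHOLD_M = 1.8  # roughly six feet (assumed value)


def alert_interval_s(distance_m, threshold_m=OBSTACLE_THRESHOLD_M):
    """Return seconds between alert pulses, or None for no alert.

    Pulses arrive faster (the interval shrinks) as the obstacle
    nears, mirroring the rising vibration frequency the article
    describes. The linear scaling here is an assumption.
    """
    if distance_m >= threshold_m:
        return None  # nothing within range: stay silent
    # At the threshold, pulses are ~1 s apart; at point-blank
    # range they tighten to 0.1 s apart.
    return 0.1 + 0.9 * (distance_m / threshold_m)


# An obstacle at 0.9 m triggers faster pulses than one at 1.7 m.
near, far = alert_interval_s(0.9), alert_interval_s(1.7)
```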
The app isn't yet ready for prime time, researchers said, noting that the particular model of phone they used in their testing has been discontinued.
The team is developing a version for Google Glass, after winning a grant from the Vodafone Spain Foundation in 2013 for an earlier version of the app.
A full version is expected to be available in 2015.
The study appeared in the IEEE Journal of Biomedical and Health Informatics.