updated - November 15, 2019 Friday EST
The vOICe system, a technology that lets blind people see their surroundings using sound, is now available in a smartphone app.
The system builds on research by scientists who studied the brains of blind people as they learned to perceive their surroundings through images converted into soundscapes, according to Wired.
The app is now available on Android and uses the phone's camera to capture the user's surroundings. It helps users learn which sounds correspond to which shapes, according to Daily Mail.
The vOICe system was created in 1992 as a pair of goggles with a built-in camera and software that turns captured images into sounds. The pitch of a sound indicates an object's vertical position, while its timing and duration indicate the object's horizontal position and width, Wired reported.
The system, created by Dutch engineer Peter Meijer, can convert different forms of light into different sounds. A bright spot, such as a lamp, would be a beep, while a bright rectangle, such as a window during daylight, would turn into a noise burst. A grid, such as a trellis, becomes a rhythm, Daily Mail reported.
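The mapping described above can be sketched in code. The snippet below is an illustrative simplification, not Meijer's actual implementation: it assumes a left-to-right sweep where each image column becomes a moment in time, each row a pitch (top rows highest), and pixel brightness sets a tone's amplitude. The function name, frequency range and sweep length are all invented for the example.

```python
# Illustrative vOICe-style image-to-sound mapping (a sketch, not the
# real vOICe algorithm). Assumptions: columns map to time, rows to
# pitch, and brightness to amplitude.

def image_to_soundscape(image, lowest_hz=500.0, highest_hz=5000.0,
                        sweep_seconds=1.0):
    """Map a grayscale image (rows of 0.0-1.0 values, row 0 = top)
    to a list of (onset_time, frequency_hz, amplitude) tones."""
    rows = len(image)
    cols = len(image[0])
    tones = []
    for col in range(cols):                 # left-to-right sweep = time
        onset = sweep_seconds * col / cols
        for row in range(rows):             # top row = highest pitch
            brightness = image[row][col]
            if brightness > 0.0:            # silent where the image is dark
                frac = 1.0 - row / max(rows - 1, 1)
                freq = lowest_hz + (highest_hz - lowest_hz) * frac
                tones.append((onset, freq, brightness))
    return tones

# A single bright spot in the upper-left corner yields one early,
# high-pitched tone -- the "beep" described for a lamp.
spot = [[1.0, 0.0],
        [0.0, 0.0]]
print(image_to_soundscape(spot))  # → [(0.0, 5000.0, 1.0)]
```

Under this scheme, a bright vertical bar produces a brief burst of many pitches at once, while a grid spreads bursts evenly across the sweep, which is consistent with the rhythm-like sound the article describes for a trellis.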
Amir Amedi, a neuroscience professor at the Hebrew University of Jerusalem, has been using the technology to train blind people to see since 2007. Using brain scans, Amedi found that silhouettes of bodies caused activity in an area of the participants' visual cortex that processes body shapes. Participants could recognize the presence of a human form in an image, identify the person's posture and imitate it, according to Daily Mail.
In addition to turning camera views into soundscapes through computer vision and sensory substitution, the vOICe app features a talking compass, a talking color identifier, talking GPS and a talking face detector, Daily Mail reported.
In his most recent work, Amedi created the EyeMusic app, which uses an algorithm to convert the black-and-white images produced by vOICe into color, Daily Mail reported.
"The idea is to replace information from a missing sense by using input from a different sense," Amedi said. "It's just like bats and dolphins use sounds and echolocation to 'see' using their ears."