Monday, July 6, 2009

Acoustic Imaging For The Blind

Via KurzweilAI:

Researchers at the University of Bristol and the University of La Laguna have developed a system that uses video from portable cameras to calculate the distance of obstacles, predict the movements of people and cars, and generate three-dimensional acoustic maps, compensating for head position with a gyroscopic sensor. This information is then transformed and relayed to a blind person as a three-dimensional 'picture' of sound.

This would appear to be very similar in principle to the echolocation used by whales, dolphins and bats.
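To make the idea concrete, here is a minimal sketch (not the researchers' actual method) of the head-compensation step: an obstacle's world-frame position is rotated into the listener's head frame using a gyroscope yaw reading, then mapped to simple spatial-audio cues. The function name, coordinate conventions, and gain formulas are assumptions for illustration only.

```python
import math

def head_relative_cue(obstacle_xy, head_yaw_rad):
    """Hypothetical sketch: convert a world-frame obstacle position into
    head-relative spatial-audio cues, compensating for head rotation."""
    x, y = obstacle_xy                        # metres, world frame (x = right, y = forward)
    # Rotate by -yaw so the coordinates are expressed relative to where the head points
    cos_y, sin_y = math.cos(-head_yaw_rad), math.sin(-head_yaw_rad)
    hx = cos_y * x - sin_y * y
    hy = sin_y * x + cos_y * y
    azimuth = math.atan2(hx, hy)              # 0 = straight ahead, positive = to the right
    distance = math.hypot(hx, hy)
    loudness = 1.0 / max(distance, 0.5) ** 2  # nearer obstacles sound louder (assumed falloff)
    pan = (math.sin(azimuth) + 1.0) / 2.0     # crude pan position: 0 = full left, 1 = full right
    left_gain = loudness * math.sqrt(1.0 - pan)
    right_gain = loudness * math.sqrt(pan)
    return azimuth, distance, left_gain, right_gain

# Example: obstacle 2 m ahead and 1 m to the right, with the head turned 30 degrees
print(head_relative_cue((1.0, 2.0), math.radians(30)))
```

In a real system this per-obstacle cue would presumably be rendered with proper head-related transfer functions rather than simple stereo panning, but the gyroscope-driven rotation is the part that keeps the acoustic 'picture' anchored to the world as the head moves.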
Posted by Ken at 1:10 AM