Shashank Garg, Rohit Kumar Singh, and Ravi Raj Saxena of Delhi Technological University, working under the mentorship of Prof. Rajiv Kapoor, developed a gesture-based user interface for visually impaired persons.
“Sensing of gestures made in the air needs fairly expensive hardware and complex image processing algorithms,” says Shashank. “We were inspired by a project called SoundWave by Microsoft, which uses simple hardware such as a microphone and a speaker.”
“We used the idea of the Doppler effect to detect gestures,” says Rohit. “The wave reflected from a hand that is making a gesture is shifted in frequency compared to the pilot tone. This frequency shift is analyzed to detect a particular gesture.”
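The shift can be estimated from the short-time spectrum of the microphone signal: locate the spectral peak near the pilot tone and measure how far it has moved. The following Python sketch illustrates the idea; the sampling rate, frame length, and search band are assumptions for illustration, not details from the project.

```python
import numpy as np

FS = 48000          # sampling rate (Hz), assumed
PILOT = 18892.0     # pilot tone frequency (Hz), from the project
C = 343.0           # speed of sound in air (m/s)

def doppler_shift(frame):
    """Estimate the frequency shift of the reflected wave
    relative to the pilot tone from one microphone frame."""
    window = np.hanning(len(frame))
    spectrum = np.abs(np.fft.rfft(frame * window))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / FS)
    # Search only near the pilot tone; the rest of the band is irrelevant.
    band = (freqs > PILOT - 500) & (freqs < PILOT + 500)
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak - PILOT   # positive: hand moving toward the mic

def hand_velocity(shift_hz):
    """Doppler relation for a reflecting target: shift = 2*v*f0/c."""
    return shift_hz * C / (2.0 * PILOT)
```

For a 2048-sample frame at 48 kHz the bin spacing is about 23 Hz, while a hand moving at 0.5 m/s produces a shift of roughly 2 × 0.5 × 18892 / 343 ≈ 55 Hz, so such motion is comfortably resolvable.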
“We used a BeagleBoard-xM in our project,” adds Ravi. “An inaudible sound wave of 18.892 kHz is given out by the speaker connected to the BeagleBoard. The wave reflected by a moving obstacle is captured using a microphone. Gesture detection algorithms run on the BeagleBoard-xM, and information on the detected gesture is displayed on an LCD screen. An MSP430 microcontroller from TI handles the display. We also played back the interpretation of the gesture to help a visually impaired user.”
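The article does not describe the team's detection algorithm itself, but a toy end-to-end illustration might look like the sketch below: generate the pilot tone that is looped through the speaker, then map a sequence of per-frame frequency shifts to a gesture label. The thresholds and the classification rule are hypothetical; distinguishing a left swipe from a right swipe would need additional cues (such as a second microphone), which the article does not cover.

```python
import numpy as np

def pilot_tone(duration_s, fs=48000, f0=18892.0, amplitude=0.5):
    """Inaudible pilot tone to be looped through the speaker."""
    t = np.arange(int(duration_s * fs)) / fs
    return (amplitude * np.sin(2 * np.pi * f0 * t)).astype(np.float32)

def classify(shifts, threshold=20.0):
    """Map a sequence of per-frame Doppler shifts (Hz) to a gesture label.

    The sign of the shift encodes the direction of motion relative to
    the microphone; counting direction reversals separates a single
    sweep from a 'double throw'. All values here are illustrative.
    """
    signs = [1 if s > threshold else -1 if s < -threshold else 0
             for s in shifts]
    motion = [s for s in signs if s != 0]
    if not motion:
        return "no gesture"
    reversals = sum(1 for a, b in zip(motion, motion[1:]) if a != b)
    if reversals >= 3:
        return "double throw"
    return "swipe"
```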
“We developed the software in Simulink and then used the target support package for BeagleBoard to translate it into C code that could be compiled for the BeagleBoard,” reveals Shashank.
“We tested our system with three kinds of gestures: left swipe, right swipe, and double throw,” says Rohit. “Gestures were correctly recognized in more than 80% of cases.”
Reference: http://e2e.ti.com/