Existing non-visual wayfinding
- Most wayfinding/navigation applications for non-visual users describe location relative to the user.
- Examples: BlindSquare (YouTube video) and Sendero Seeing Eye GPS (YouTube video).
- BlindSquare has a feature which lets a user discover surroundings by panning the phone around.
- During navigation, Seeing Eye GPS uses audio cues and narration to give updates and alerts (e.g. a description plus a turn-signal sound indicating which direction to turn).
Handisco smart white cane:
- Video on PBS about Handisco.
- Uses vibration and sound (like a Geiger counter) to convey proximity, and narration for prompts and alerts.
- SmartCane video on YouTube:
- Uses vibration exclusively to convey proximity to objects, so as not to compete with auditory information from the environment.
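A minimal sketch of the Geiger-counter-style mapping described above: obstacle distance drives the gap between feedback pulses (clicks or vibration bursts), so closer obstacles produce faster pulsing. All parameter values here are assumptions for illustration, not taken from Handisco or SmartCane.

```python
def pulse_interval_s(distance_m, max_range_m=3.0,
                     min_interval_s=0.05, max_interval_s=1.0):
    """Map obstacle distance to the gap between feedback pulses.

    Closer obstacles -> shorter interval -> faster "Geiger counter"
    clicking (or vibration bursts). Beyond max_range_m there is no
    feedback at all, signalled here by returning None.
    Ranges and intervals are made-up defaults, not device specs.
    """
    if distance_m >= max_range_m:
        return None  # obstacle out of range: stay silent
    # Linear interpolation between the fastest and slowest pulse rates.
    frac = distance_m / max_range_m  # 0.0 (touching) .. 1.0 (edge of range)
    return min_interval_s + frac * (max_interval_s - min_interval_s)
```

A scheduler would then sleep for the returned interval between pulses; a nonlinear (e.g. logarithmic) mapping might feel more natural but the linear version keeps the idea clear.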
Model Based Sonification
- Chapter 16 in the Sonification Handbook has some interesting videos.
- Example S16.3: Tangible Data Scanning - similar to dragging the skater around to scan the scene
- Example S16.4: Principal Curve Sonification of a noisy spiral data set - convey shape of a dataset (i.e. the track)
- Example S16.6: Particle Trajectory Sonification for 1 particle - conveying motion of a particle (i.e. skater). Note the difference in the quality of sound with different masses
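The mass-dependent sound quality in S16.6 can be illustrated with a toy physical model (my own sketch, not the handbook's implementation): treat the particle as sitting in a harmonic potential, where the oscillation frequency is f = (1/2π)√(k/m). Heavier particles oscillate more slowly and are heard as lower tones; the stiffness value is an arbitrary assumption.

```python
import math

def oscillation_freq_hz(mass_kg, stiffness=1000.0):
    """Frequency of a particle oscillating in a harmonic potential.

    f = (1 / 2*pi) * sqrt(k / m): heavier particles oscillate more
    slowly, so they sound lower -- a toy account of why mass changes
    the sound quality in Particle Trajectory Sonification (S16.6).
    The stiffness default is arbitrary, chosen only for illustration.
    """
    return math.sqrt(stiffness / mass_kg) / (2.0 * math.pi)
```

Note the scaling: quadrupling the mass halves the frequency, i.e. drops the tone by an octave.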
- Chapter 17 in the Sonification Handbook
- Using k-sonar to depict the shape of objects (examples S17.5 to S17.11)
- Example S17.15: Smartsight square - conveying a square shape
- Chapter 20 in the Sonification Handbook
- Example S20.2: SWAN video VR demo - user navigating a physical space using sonar "pings" and earcons to confirm selections
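A hypothetical sketch of the kind of beacon mapping a SWAN-style system might use (not SWAN's actual implementation): the bearing to the next waypoint, relative to the listener's heading, sets the stereo pan of the ping, and the ping repetition rate rises as the listener approaches. Coordinates are assumed x-east/y-north; all constants are invented.

```python
import math

def beacon_params(listener_xy, listener_heading_rad, target_xy):
    """Derive simple spatial-audio cues for a waypoint beacon.

    Returns (pan, ping_interval_s): pan in [-1, 1] (-1 = hard left,
    +1 = hard right, assuming x-east/y-north coordinates with heading
    measured counterclockwise from east), and a ping interval that
    shortens as the listener approaches the waypoint. Constants are
    illustrative guesses, not values from SWAN.
    """
    dx = target_xy[0] - listener_xy[0]
    dy = target_xy[1] - listener_xy[1]
    distance = math.hypot(dx, dy)
    # Bearing of the target relative to the heading, wrapped to (-pi, pi].
    rel = math.atan2(dy, dx) - listener_heading_rad
    rel = math.atan2(math.sin(rel), math.cos(rel))
    # Positive rel = target counterclockwise (left) of heading -> pan left.
    pan = max(-1.0, min(1.0, -rel / (math.pi / 2)))  # +-90 deg = full pan
    # 0.1 s between pings when on top of the waypoint, 1.0 s at >= 10 m.
    interval = 0.1 + min(distance, 10.0) / 10.0 * 0.9
    return pan, interval
```

Earcons for confirming selections, as in S20.2, would be triggered separately as one-shot sounds; this sketch only covers the continuous ping beacon.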