Navigating Accessibility:
Exploring Non-Visual Modalities in an Obstacle Race Game
Overview
Developed an obstacle race game in Unity to explore non-visual modalities, such as audio and haptics, for spatial navigation. The project focused on improving gaming accessibility by examining how visually impaired users navigate virtual spaces using speech and tactile input combined with audio output and haptic feedback.
Key Skills and Tools
Designed and implemented a Unity-based game for multimodal interface evaluation.
Integrated Microsoft Speech Recognition API, spatial audio, and gamepad haptic feedback.
Conducted user evaluations guided by Nielsen’s Usability Heuristics to assess usability and interaction design.
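The kind of feedback loop described above, spatial audio cues plus gamepad rumble that intensify as the player nears an obstacle, can be sketched in Unity C#. This is a minimal illustration, not the project's actual code; the component, field names, and distance-to-intensity mapping are all assumptions, and it relies on Unity's Input System package for gamepad haptics.

```csharp
using UnityEngine;
using UnityEngine.InputSystem; // Unity Input System package (assumed dependency)

// Illustrative sketch: a 3D "beacon" sound and gamepad rumble both scale
// with the player's distance to the nearest obstacle, giving a non-visual
// sense of proximity. All names here are hypothetical.
public class ObstacleProximityFeedback : MonoBehaviour
{
    public Transform obstacle;          // nearest obstacle to warn about
    public AudioSource beaconSource;    // audio cue attached to the obstacle
    public float maxWarnDistance = 5f;  // distance at which feedback begins

    void Start()
    {
        beaconSource.spatialBlend = 1f; // fully spatialised (3D) audio
        beaconSource.loop = true;
        beaconSource.Play();
    }

    void Update()
    {
        float d = Vector3.Distance(transform.position, obstacle.position);
        // Map distance to [0, 1]: 1 when touching, 0 beyond the warning radius.
        float intensity = Mathf.Clamp01(1f - d / maxWarnDistance);

        beaconSource.volume = intensity;

        var pad = Gamepad.current;
        if (pad != null)
            pad.SetMotorSpeeds(intensity * 0.5f, intensity); // low/high-frequency motors
    }
}
```

Tying audio volume and rumble strength to the same distance mapping keeps the two modalities consistent, which matters when a player relies on them instead of vision.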
Course: Multimodal Interaction and Interfaces
Date: October 2023 – January 2024
Collaborators: Guanyu Lin, Guoqing Liang, Haohao Yu