Navigating Accessibility: Exploring Non-Visual Modalities in an Obstacle Race Game

Overview

Developed an obstacle race game in Unity to investigate non-visual modalities, such as audio and haptics, for spatial navigation. The project aimed to improve gaming accessibility by exploring how visually impaired users navigate virtual spaces using speech input, tactile input, audio output, and haptic feedback.

Course: Multimodal Interaction and Interfaces
Date: October 2023 – January 2024
Collaborators: Guanyu Lin, Guoqing Liang, Haohao Yu

Key Skills and Tools

  • Designed and implemented a Unity-based game for multimodal interface evaluation.

  • Integrated Microsoft Speech Recognition API, spatial audio, and gamepad haptic feedback.

  • Conducted user evaluations, guided by Nielsen’s usability heuristics, to assess the game’s usability and interaction design.
