Other Projects
Tandem Membrane Clarinet
Designed and analyzed the tandem membrane clarinet, an experimental wind instrument. Conducted acoustic spectrum analysis to evaluate sound production, harmonic structure, and formants, and investigated its potential for polyphonic, collaborative performance.
Course: Music Acoustics
Date: January–March 2024
Collaborators: Molly Gidfors Haraldsson, Sol Nordmark
Exploring Non-Visual Modalities in an Obstacle Race Game
Developed a Unity-based obstacle race game to explore non-visual interaction modalities for spatial navigation and accessibility. Integrated Microsoft Speech Recognition API, spatial audio, and gamepad haptic feedback, and conducted a usability evaluation based on Nielsen’s Usability Heuristics.
Course: Multimodal Interaction and Interfaces
Date: October 2023 – January 2024
Collaborators: Guanyu Lin, Guoqing Liang, Haohao Yu
Music Mood Detection Using Machine Learning
Developed a machine learning model for Music Emotion Recognition (MER) to predict valence and arousal in audio. Combined audio signal processing with regression and classification models, including XGBoost, to analyze large-scale audio datasets and extract acoustic features.
Implemented feature extraction and model pipelines in Python using scikit-learn, XGBoost, Pandas, and NumPy, with a focus on feature selection, model optimization, and evaluation metrics.
Course: Music Informatics
Date: August–October 2024
Collaborators: Mingcheng Kou, Daichi Taguchi, Haoyun Zhou
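The MER pipeline above can be illustrated with a minimal sketch: regressing a continuous valence target from a matrix of acoustic features. The feature values and target here are synthetic placeholders (real features would come from audio analysis), and scikit-learn's GradientBoostingRegressor stands in for XGBoost so the example stays self-contained; the project itself used XGBoost on extracted acoustic features.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Hypothetical feature matrix: 500 audio clips x 20 acoustic features
# (e.g. spectral and timbral descriptors). Synthetic for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))
# Synthetic valence target driven by the first feature plus noise.
valence = 0.8 * X[:, 0] + rng.normal(scale=0.2, size=500)

X_train, X_test, y_train, y_test = train_test_split(
    X, valence, test_size=0.2, random_state=0)

# Gradient-boosted trees as a stand-in for an XGBoost regressor.
model = GradientBoostingRegressor(random_state=0)
model.fit(X_train, y_train)

# Evaluate with R^2, one of the regression metrics such a project
# would report for valence/arousal prediction.
score = r2_score(y_test, model.predict(X_test))
print(f"R^2 on held-out clips: {score:.2f}")
```

An arousal model would follow the same pattern with a second target column; in practice the two dimensions are often modeled as separate regressors over the same feature matrix.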