Penn PURM Linguistics Research Projects


Published on Jan 21, 2024 · 8 min read

In the summer of 2023, I developed three online musicolinguistic games using JavaScript, MongoDB, and HTML for the University of Pennsylvania's Phonetics Laboratory. The research aimed to explore how language acquisition influences rhythm and pitch perception, collecting data from over 200 runs of the software. Findings were presented at the Annual CURF Research Exposition at the University of Pennsylvania, Philadelphia, PA.

Overview

The original websites for these experiments were sunset after the studies concluded. However, for those interested, the archived code of each project remains available on my GitHub. You can explore how each game was designed to capture data and investigate correlations between language learning, rhythm, and pitch perception.

Penn PURM Taiko

Penn PURM Taiko is a JavaScript-based rhythm game inspired by *Taiko no Tatsujin*. It measures users' rhythmic perception so that it can be correlated with their levels of linguistic and musical training. Data collected through this game is stored in a MongoDB database.


Screenshot of Penn PURM Taiko, measuring rhythmic perception.

Penn PURM Timbre

Penn PURM Timbre is another JavaScript game, this one measuring players' ability to differentiate between instrument timbres. It compares how well players distinguish vocal and instrumental sounds, contributing to an understanding of auditory discrimination.


Screenshot of Penn PURM Timbre, assessing instrument timbre differentiation.

Penn PURM VocalSense

Penn PURM VocalSense, developed with Next.js, tests players' abilities to distinguish between the intonations of two speakers. The game involves identifying which “elephant” speaks in a higher tone, adding a playful element to the study of intonation recognition and its ties to language learning.


Screenshot of Penn PURM VocalSense, testing intonation recognition between speakers.

Technologies and Methods

In this research, I used Convolutional Neural Networks (CNNs) alongside Librosa, a powerful audio analysis library, to classify instruments and voices based on their timbral qualities. Here’s how they worked together:

Librosa for Feature Extraction: Librosa was used to extract key audio features like mel-frequency cepstral coefficients (MFCCs), spectral centroid, and zero-crossing rate. These features capture characteristics of the audio signal, allowing us to measure elements that relate closely to timbre, such as brightness, harmonic structure, and roughness.
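To give a flavor of what two of these features actually compute, here is a NumPy-only sketch (not the project's code; in practice Librosa provides these directly via `librosa.feature.spectral_centroid` and `librosa.feature.zero_crossing_rate`, alongside `librosa.feature.mfcc`):

```python
import numpy as np

def zero_crossing_rate(frame):
    """Fraction of adjacent samples whose sign flips -- a rough proxy for noisiness."""
    signs = np.sign(frame)
    return np.mean(signs[:-1] != signs[1:])

def spectral_centroid(frame, sr):
    """Magnitude-weighted mean frequency -- higher values sound 'brighter'."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)
    return np.sum(freqs * spectrum) / np.sum(spectrum)

sr = 22050
t = np.linspace(0, 1, sr, endpoint=False)
low = np.sin(2 * np.pi * 220 * t)   # pure 220 Hz tone
high = np.sin(2 * np.pi * 880 * t)  # pure 880 Hz tone

# The brighter (higher) tone has a higher centroid and more zero crossings.
print(spectral_centroid(low, sr), spectral_centroid(high, sr))  # ≈ 220, ≈ 880
```

A pure sine's centroid simply recovers its frequency; for a real instrument, the centroid summarizes how its harmonic energy is distributed, which is part of what we perceive as timbre.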

CNNs for Classification: Convolutional Neural Networks are ideal for pattern recognition in image-like data. By converting audio samples into spectrograms, the CNN could “see” the sound and identify patterns associated with unique timbral qualities. The CNN learned to distinguish instruments and voices based on these timbral characteristics, which allowed sounds to be grouped by timbral similarity even across different pitches and volumes.
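The core operation behind this can be sketched in plain NumPy (the actual project used a deep-learning framework): build a spectrogram from the waveform, then slide a small filter over it, which is exactly what one channel of a CNN layer does.

```python
import numpy as np

def stft_magnitude(signal, n_fft=512, hop=256):
    """Short-time Fourier transform magnitudes: the 'image' a CNN sees."""
    frames = [signal[i:i + n_fft] * np.hanning(n_fft)
              for i in range(0, len(signal) - n_fft, hop)]
    return np.abs(np.fft.rfft(frames, axis=1)).T  # (freq_bins, time_frames)

def conv2d(image, kernel):
    """'Valid' 2-D convolution -- one channel of a CNN layer, no padding or stride."""
    kh, kw = kernel.shape
    h, w = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

sr = 22050
t = np.linspace(0, 1, sr, endpoint=False)
spec = stft_magnitude(np.sin(2 * np.pi * 440 * t))

# A horizontal-line detector: responds to the sustained harmonics that
# distinguish one timbre from another in a spectrogram.
kernel = np.array([[-1.0, -1.0, -1.0],
                   [ 2.0,  2.0,  2.0],
                   [-1.0, -1.0, -1.0]])
feature_map = conv2d(spec, kernel)
```

A trained CNN learns dozens of such filters instead of hand-picking them, and stacks layers so that later filters respond to combinations of harmonic patterns rather than single lines.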

Combining Librosa and CNNs: By using features extracted from Librosa, the CNN could learn subtle distinctions between sounds with similar timbres but different harmonic textures or resonant frequencies. This allowed the system to classify and group sounds by timbral qualities alone, providing powerful insights into auditory discrimination.
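As a toy end-to-end illustration of this pipeline (with a simple nearest-centroid classifier standing in for the CNN, and a two-number feature vector standing in for the full MFCC set), the idea is: extract features per sound, average them per class, and classify new sounds by distance:

```python
import numpy as np

def feature_vector(signal, sr):
    """Tiny stand-in for a real feature set: spectral centroid + zero-crossing rate."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sr)
    centroid = np.sum(freqs * spectrum) / np.sum(spectrum)
    signs = np.sign(signal)
    zcr = np.mean(signs[:-1] != signs[1:])
    return np.array([centroid / (sr / 2), zcr])  # normalize to comparable scales

def nearest_centroid(train, labels, query):
    """Classify by distance to each class's mean feature vector."""
    classes = sorted(set(labels))
    centroids = {c: np.mean([f for f, l in zip(train, labels) if l == c], axis=0)
                 for c in classes}
    return min(classes, key=lambda c: np.linalg.norm(query - centroids[c]))

sr = 22050
t = np.linspace(0, 1, sr, endpoint=False)
def tone(f): return np.sin(2 * np.pi * f * t)

# "dark" vs "bright" is a crude proxy for timbral classes
train = [feature_vector(tone(f), sr) for f in (200, 250, 1800, 2000)]
labels = ["dark", "dark", "bright", "bright"]
query = feature_vector(tone(1900), sr)
print(nearest_centroid(train, labels, query))  # → bright
```

The real system replaces both stand-ins: Librosa supplies a much richer feature set, and the CNN learns a far more discriminative decision boundary than distance-to-centroid.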

Spotlight: Project Overview Video

My work was highlighted by Penn’s Center for Undergraduate Research and Fellowships (CURF) because it showcases an innovative approach to understanding how language acquisition influences auditory processing, particularly rhythm and pitch perception. Through a series of interactive musicolinguistic games, I developed tools that collect and analyze data on participants’ auditory responses in a fun, engaging way. This research contributes to a deeper understanding of phonetic cognition and may offer valuable insights into language and music education. The project’s impact and interdisciplinary nature earned it recognition at the Annual CURF Research Exposition.
