Auditory Imagery of Speech and Non-Speech Sounds

2023

A team of students led by researchers in the O-Lab for auditory neuroscience investigated whether the imagination of speech and non-speech sounds can be distinguished using a non-invasive measure of brain activity, electroencephalography (EEG). Students collected and analyzed EEG data from human participants in response to both imagined and perceived sounds, and trained a machine learning classifier to distinguish between the resulting patterns of brain activity. The results of this study have the potential to advance understanding of imagination and related cognitive processes, such as memory. In addition, showing that imagined speech and non-speech sounds elicit different patterns of brain activity could inform the creation of algorithms for brain-computer interfaces that support communication, especially for people with severe paralysis.
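The classification step described above can be sketched roughly as follows. This is an illustrative assumption, not the team's actual pipeline: the synthetic data, the band-power-style feature vectors, and the logistic regression model are all stand-ins to show the general shape of training a classifier on per-trial EEG features.

```python
# Minimal sketch of EEG trial classification (illustrative only; the data,
# features, and model here are assumptions, not the project's pipeline).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Pretend each trial is a 64-dimensional feature vector (e.g., one
# band-power value per EEG channel); labels mark the imagined category.
n_trials, n_features = 200, 64
X = rng.normal(size=(n_trials, n_features))
y = rng.integers(0, 2, size=n_trials)   # 0 = speech, 1 = non-speech (synthetic)
X[y == 1] += 0.5                        # inject a separable class difference

# Standardize features, then fit a linear classifier; cross-validation
# estimates how well the two imagined categories can be distinguished.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```

On real data, the feature extraction (epoching, filtering, band-power or time-domain features) would replace the synthetic `X`, and chance-level performance would be assessed against properly permuted labels.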

Project Lead: Tobias Overath

Project Manager: Evan Hare

View the team’s project poster here: Group5_AuditoryImagery_DataPlus

Watch a video from the Duke Center for Computational Thinking (CCT) about the project that features students’ learning experiences:

Contact: Mathematics
