Computational and Experimental Methods

Perceptual integration of acoustic cues to laryngeal contrasts in Korean fricatives

Sarah Lee
Jonah Katz
2016

This paper provides evidence that multiple acoustic cues involving the presence of low-frequency energy integrate in the perception of Korean coronal fricatives. This finding helps explain a surprising asymmetry between the production and perception of these fricatives found in previous studies: lower F0 onset in the following vowel leads to a response bias for plain [s] over fortis [s*], despite the fact that there is no evidence for a corresponding acoustic asymmetry in the production of [s] and [s*]. A fixed classification task using the Garner paradigm provides evidence that low...

The influence of lexical statistics on temporal lobe cortical dynamics during spoken word listening

Emily Cibelli
Matthew Leonard
Keith Johnson
Edward F. Chang
2015

Neural representations of words are thought to have a complex spatio-temporal cortical basis. It has been suggested that spoken word recognition is not a process of feed-forward computations from phonetic to lexical forms, but rather involves the online integration of bottom-up input with stored lexical knowledge. Using direct neural recordings from the temporal lobe, we examined cortical responses to words and pseudowords. We found that neural populations were not only sensitive to lexical status (real vs. pseudo), but also to cohort size (number of words matching the phonetic input at...

The auditory representation of speech sounds in human motor cortex

Connie Cheung
Liberty S. Hamilton
Keith Johnson
Edward F. Chang
2016

In humans, listening to speech evokes neural responses in the motor cortex. This has been controversially interpreted as evidence that speech sounds are processed as articulatory gestures. However, it is unclear what information is actually encoded by such neural activity. We used high-density direct human cortical recordings while participants spoke and listened to speech sounds. Motor cortex neural patterns during listening were substantially different from those during articulation of the same sounds. During listening, we observed neural activity in the superior and inferior regions of...

High-Resolution, Non-Invasive Imaging of Upper Vocal Tract Articulators Compatible with Human Brain Recordings

Kristofer E. Bouchard
David F. Conant
Gopala K. Anumanchipalli
Benjamin Dichter
Kris S. Chaisanguanthum
Keith Johnson
Edward F. Chang
2016

A complete neurobiological understanding of speech motor control requires determination of the relationship between simultaneously recorded neural activity and the kinematics of the lips, jaw, tongue, and larynx. Many speech articulators are internal to the vocal tract, and therefore simultaneously tracking the kinematics of all articulators is nontrivial — especially in the context of human electrophysiology recordings. Here, we describe a noninvasive, multi-modal imaging system to monitor vocal tract kinematics, demonstrate this system in six speakers during production of nine American...

Phonological encoding and phonetic duration

Melinda D. Fricke
2013

Co-advisors: Susanne Gahl and Keith Johnson