Language and Cognition

Many neighborhoods: Phonological and perceptual neighborhood density in lexical production and perception

Susanne Gahl
Julia Strand
2016

We examine the relationship among lexical representations, pronunciation variation, and word recognition by investigating effects of two lexical variables: Phonological Neighborhood Density (the number of words that can be formed by a single phoneme substitution, addition, or deletion from the target word) and a measure of the perceptual similarity of a target word to other words in the lexicon. We show that perceptual similarity to other words affects recognition, but not production. Phonological Neighborhood Density, on the other hand, affects both word durations and...
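The neighborhood-density measure defined above is a count of lexicon entries at phoneme-level edit distance 1 from the target word. The following is a minimal sketch of that count in Python, assuming words are represented as phoneme sequences; the function names and the toy lexicon are illustrative and not taken from the paper.

def is_one_edit_apart(a, b):
    """True if phoneme sequences a and b differ by exactly one
    substitution, addition, or deletion (phoneme edit distance 1)."""
    if a == b or abs(len(a) - len(b)) > 1:
        return False
    if len(a) == len(b):
        # Same length: a single substitution means exactly one mismatch.
        return sum(x != y for x, y in zip(a, b)) == 1
    # Lengths differ by one: the shorter sequence must equal the longer
    # one with a single phoneme removed.
    short, longer = (a, b) if len(a) < len(b) else (b, a)
    i = j = 0
    skipped = False
    while i < len(short) and j < len(longer):
        if short[i] == longer[j]:
            i += 1
            j += 1
        elif not skipped:
            skipped = True
            j += 1  # skip the extra phoneme in the longer sequence
        else:
            return False
    return True

def neighborhood_density(target, lexicon):
    """Phonological Neighborhood Density: number of lexicon entries at
    phoneme edit distance 1 from the target word."""
    return sum(is_one_edit_apart(tuple(target), tuple(w)) for w in lexicon)

# Toy example: "cat" has neighbors "bat" (substitution), "cats" (addition),
# and "at" (deletion), but not "dog".
lexicon = [("b", "ae", "t"), ("k", "ae", "t", "s"), ("ae", "t"), ("d", "ao", "g")]
print(neighborhood_density(("k", "ae", "t"), lexicon))  # -> 3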

The influence of lexical statistics on temporal lobe cortical dynamics during spoken word listening

Emily Cibelli
Matthew Leonard
Keith Johnson
Edward F. Chang
2015

Neural representations of words are thought to have a complex spatio-temporal cortical basis. It has been suggested that spoken word recognition is not a process of feed-forward computations from phonetic to lexical forms, but rather involves the online integration of bottom-up input with stored lexical knowledge. Using direct neural recordings from the temporal lobe, we examined cortical responses to words and pseudowords. We found that neural populations were not only sensitive to lexical status (real vs. pseudo), but also to cohort size (number of words matching the phonetic input at...
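Cohort size, in the sense used above, is the number of lexical items consistent with the phonetic input heard up to a given point in the word. A minimal sketch of that count, assuming a lexicon of phoneme sequences; the names and the toy lexicon are illustrative, not drawn from the paper.

def cohort_size(heard, lexicon):
    """Number of lexicon entries whose initial phonemes match the
    phonetic input received so far."""
    heard = tuple(heard)
    return sum(tuple(w)[:len(heard)] == heard for w in lexicon)

# Toy example: after hearing /k ae/, "cat", "cap", and "cats" remain in the
# cohort; "bat" does not. The cohort shrinks as more input arrives.
lexicon = [("k", "ae", "t"), ("k", "ae", "p"), ("k", "ae", "t", "s"), ("b", "ae", "t")]
print(cohort_size(("k", "ae"), lexicon))       # -> 3
print(cohort_size(("k", "ae", "t"), lexicon))  # -> 2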

The auditory representation of speech sounds in human motor cortex

Connie Cheung
Liberty S. Hamilton
Keith Johnson
Edward F. Chang
2016

In humans, listening to speech evokes neural responses in the motor cortex. This has been controversially interpreted as evidence that speech sounds are processed as articulatory gestures. However, it is unclear what information is actually encoded by such neural activity. We used high-density direct human cortical recordings while participants spoke and listened to speech sounds. Motor cortex neural patterns during listening were substantially different from those during articulation of the same sounds. During listening, we observed neural activity in the superior and inferior regions of...

High-Resolution, Non-Invasive Imaging of Upper Vocal Tract Articulators Compatible with Human Brain Recordings

Kristofer E. Bouchard
David F. Conant
Gopala K. Anumanchipalli
Benjamin Dichter
Kris S. Chaisanguanthum
Keith Johnson
Edward F. Chang
2016

A complete neurobiological understanding of speech motor control requires determination of the relationship between simultaneously recorded neural activity and the kinematics of the lips, jaw, tongue, and larynx. Many speech articulators are internal to the vocal tract, and therefore simultaneously tracking the kinematics of all articulators is nontrivial, especially in the context of human electrophysiology recordings. Here, we describe a noninvasive, multi-modal imaging system to monitor vocal tract kinematics, demonstrate this system in six speakers during production of nine American...

Phonological encoding and phonetic duration

Melinda D. Fricke
2013

Co-advisors: Susanne Gahl and Keith Johnson