Language and Cognition

Encoding of Articulatory Kinematic Trajectories in Human Speech Sensorimotor Cortex

Josh Chartier
Gopala K. Anumanchipalli
Keith Johnson
Edward F. Chang
2018

When speaking, we dynamically coordinate movements of our jaw, tongue, lips, and larynx. To investigate the neural mechanisms underlying articulation, we used direct cortical recordings from human sensorimotor cortex while participants spoke natural sentences that included sounds spanning the entire English phonetic inventory. We used deep neural networks to infer speakers’ articulator movements from produced speech acoustics.

Color naming reflects both perceptual structure and communicative need

Noga Zaslavsky
Charles Kemp
Naftali Tishby
Terry Regier
2018

Gibson et al. (2017) argued that color naming is shaped by patterns of communicative need. In support of this claim, they showed that color naming systems across languages support more precise communication about warm colors than cool colors, and that the objects we talk about tend to be warm-colored rather than cool-colored. Here, we present new analyses that alter this picture. We show that greater communicative precision for warm than for cool colors, and greater communicative need, may both be explained by perceptual structure.

Efficient compression in color naming and its evolution

Noga Zaslavsky
Charles Kemp
Terry Regier
Naftali Tishby
2018

We derive a principled information-theoretic account of cross-language semantic variation. Specifically, we argue that languages efficiently compress ideas into words by optimizing the information bottleneck (IB) trade-off between the complexity and accuracy of the lexicon.
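The complexity–accuracy trade-off in that last sentence can be written compactly. As a sketch only (the notation below is assumed, not quoted from the paper):

```latex
% M = meanings, W = words, U = the meaning a listener reconstructs
% from a word. A naming system q(w|m) is taken to be IB-optimal if it
% minimizes, for some trade-off parameter \beta \ge 1,
\[
  \mathcal{F}_{\beta}\bigl[q(w \mid m)\bigr]
    \;=\; \underbrace{I_q(M;W)}_{\text{complexity}}
    \;-\; \beta\, \underbrace{I_q(W;U)}_{\text{accuracy}} .
\]
```

Smaller \(I_q(M;W)\) means a simpler lexicon; larger \(I_q(W;U)\) means words carry more information about intended meanings, so \(\beta\) sets how the two pressures are balanced.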

Lexical competition in vowel articulation revisited: Vowel dispersion in the Easy/Hard database


Susanne Gahl
2015

A widely cited study investigating effects of recognition difficulty on the phonetic realization of words (Wright, R. (2004). Factors of lexical competition in vowel articulation. In J. Local, R. Ogden & R. Temple (Eds.), Papers in laboratory phonology, Vol. VI (pp. 26–50)) reported that vowel dispersion, i.e. the distance of vowel tokens from the center of the speaker's vowel space, was greater in words that are difficult to recognize than in easily recognized words.

The Sapir-Whorf hypothesis and probabilistic inference: Evidence from the domain of color

Emily Cibelli
Yang Xu
Joseph Austerweil
Thomas Griffiths
Terry Regier
2016

The Sapir-Whorf hypothesis holds that our thoughts are shaped by our native language, and that speakers of different languages therefore think differently. This hypothesis is controversial in part because it appears to deny the possibility of a universal groundwork for human cognition, and in part because some findings taken to support it have not reliably replicated. We argue that considering this hypothesis through the lens of probabilistic inference has the potential to resolve both issues, at least with respect to certain prominent findings in the domain of color cognition.
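In broad strokes, the "lens of probabilistic inference" treats a remembered color as a reconstruction that combines a noisy perceptual trace with a prior supplied by the color category one's language names. A minimal Gaussian cue-combination sketch (the function name and parameters are illustrative, not the authors' model):

```python
def bayes_estimate(stimulus, category_mean, sigma_stim, sigma_cat):
    """Precision-weighted combination of a noisy perceptual trace
    (mean `stimulus`, s.d. `sigma_stim`) with a Gaussian category
    prior (mean `category_mean`, s.d. `sigma_cat`)."""
    precision_stim = 1.0 / sigma_stim ** 2
    precision_cat = 1.0 / sigma_cat ** 2
    w = precision_stim / (precision_stim + precision_cat)
    return w * stimulus + (1.0 - w) * category_mean

# Equal reliability: the estimate lands halfway between trace and prior.
print(bayes_estimate(10.0, 20.0, 1.0, 1.0))  # 15.0
# Noisier percept: the estimate is pulled toward the category mean.
print(bayes_estimate(10.0, 20.0, 2.0, 1.0))  # 18.0
```

On this view a language effect on memory is not all-or-nothing: the categorical pull grows as perceptual uncertainty grows, which is one way findings that replicate inconsistently could still be systematic.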

Many neighborhoods: Phonological and perceptual neighborhood density in lexical production and perception

Susanne Gahl
Julia Strand
2016

We examine the relationship of lexical representations, pronunciation variation, and word recognition by investigating effects of two lexical variables: Phonological Neighborhood Density (the number of words that can be formed by a single phoneme substitution, addition, or deletion from the target word) and a measure of the perceptual similarity of a target word to other words in the lexicon. We show that perceptual similarity to other words affects recognition, but not production.
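The parenthetical definition of Phonological Neighborhood Density is operational enough to sketch directly. The snippet below uses orthographic strings as a stand-in for phonemic transcriptions, and the toy lexicon is illustrative:

```python
def is_neighbor(a, b):
    """True iff b differs from a by exactly one segment:
    a substitution, an addition, or a deletion."""
    if len(a) == len(b):
        return sum(x != y for x, y in zip(a, b)) == 1
    if abs(len(a) - len(b)) != 1:
        return False
    shorter, longer = sorted((a, b), key=len)
    # Deleting one segment from the longer form must yield the shorter.
    return any(longer[:i] + longer[i + 1:] == shorter
               for i in range(len(longer)))

def neighborhood_density(word, lexicon):
    """Number of lexicon entries at edit distance 1 from `word`."""
    return sum(is_neighbor(word, w) for w in lexicon if w != word)

lexicon = ["cat", "bat", "cut", "cats", "at", "dog"]
print(neighborhood_density("cat", lexicon))  # bat, cut, cats, at -> 4
```

In actual analyses the computation runs over phonemic transcriptions (e.g. from a pronouncing dictionary) rather than spelling, since neighbors are defined by phonemes, not letters.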

The influence of lexical statistics on temporal lobe cortical dynamics during spoken word listening

Emily Cibelli
Matthew Leonard
Keith Johnson
Edward F. Chang
2015

Neural representations of words are thought to have a complex spatio-temporal cortical basis. It has been suggested that spoken word recognition is not a process of feed-forward computations from phonetic to lexical forms, but rather involves the online integration of bottom-up input with stored lexical knowledge. Using direct neural recordings from the temporal lobe, we examined cortical responses to words and pseudowords. We found that neural populations were sensitive not only to lexical status (real words vs. pseudowords), but also to finer-grained lexical statistics.

The auditory representation of speech sounds in human motor cortex

Connie Cheung
Liberty S. Hamilton
Keith Johnson
Edward F. Chang
2016

In humans, listening to speech evokes neural responses in the motor cortex. This has been controversially interpreted as evidence that speech sounds are processed as articulatory gestures. However, it is unclear what information is actually encoded by such neural activity. We used high-density direct human cortical recordings while participants spoke and listened to speech sounds. Motor cortex neural patterns during listening were substantially different from those during articulation of the same sounds.