Encoding of Articulatory Kinematic Trajectories in Human Speech Sensorimotor Cortex

When speaking, we dynamically coordinate movements of our jaw, tongue, lips, and larynx. To investigate the neural mechanisms underlying articulation, we used direct cortical recordings from human sensorimotor cortex while participants spoke natural sentences that included sounds spanning the entire English phonetic inventory. We used deep neural networks to infer speakers’ articulator movements from produced speech acoustics. Individual electrodes encoded a diversity of articulatory kinematic trajectories (AKTs), each revealing coordinated articulator movements toward specific vocal tract shapes. AKTs captured a wide range of movement types, yet they could be differentiated by the place of vocal tract constriction. Additionally, AKTs manifested out-and-back trajectories with harmonic oscillator dynamics. While AKTs were functionally stereotyped across different sentences, context-dependent encoding of preceding and following movements during production of the same phoneme demonstrated the cortical representation of coarticulation. Articulatory movements encoded in sensorimotor cortex give rise to the complex kinematics underlying continuous speech production.
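As an illustrative aside (not taken from the paper's methods), the abstract's "out-and-back trajectories with harmonic oscillator dynamics" can be sketched as a damped second-order system: an articulator displaced toward a constriction target returns smoothly to rest. The function name and all parameter values below are hypothetical choices for the sketch.

```python
import math

def damped_oscillator(x0, zeta=0.8, omega=2 * math.pi, dt=0.01, steps=300):
    """Simulate x'' + 2*zeta*omega*x' + omega^2 * x = 0 via explicit Euler.

    x0 is the initial displacement (the "out" excursion); the damped
    dynamics bring the trajectory "back" toward rest. All parameters
    (damping ratio zeta, natural frequency omega, step size dt) are
    illustrative, not values from the paper.
    """
    x, v = x0, 0.0
    traj = []
    for _ in range(steps):
        a = -2 * zeta * omega * v - omega ** 2 * x  # restoring + damping force
        v += a * dt
        x += v * dt
        traj.append(x)
    return traj

# The excursion decays back toward the rest position (x = 0).
traj = damped_oscillator(1.0)
```

With the damping ratio below 1 the trajectory overshoots rest slightly before settling, which is the kind of smooth, non-ballistic return the abstract attributes to AKTs.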

Josh Chartier
Gopala K. Anumanchipalli
Keith Johnson
Edward F. Chang
Publication date: May 17, 2018
Publication type: Recent Publication
Chartier, Josh; Anumanchipalli, Gopala K.; Johnson, Keith; Chang, Edward F. (2018). Encoding of articulatory kinematic trajectories in human speech sensorimotor cortex. Neuron 98, 1042–1054.