A widely cited study investigating effects of recognition difficulty on the phonetic realization of words (Wright, R. (2004). Factors of lexical competition in vowel articulation. In J. Local, R. Ogden & R. Temple (Eds.), Papers in Laboratory Phonology VI (pp. 26–50)) reported that vowel dispersion, i.e.
We examine the relationships among lexical representations, pronunciation variation, and word recognition by investigating the effects of two lexical variables: phonological neighborhood density (the number of words that can be formed from the target word by a single phoneme substitution, addition, or deletion) and a measure of the perceptual similarity of a target word to other words in the lexicon. We show that perceptual similarity to other words affects recognition, but not production.
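The neighborhood-density definition above can be sketched in code: a word's neighbors are all lexicon entries at phoneme edit distance one. This is a minimal illustration only; the toy lexicon and phoneme transcriptions below are hypothetical, not the materials used in the study.

```python
# Sketch: counting phonological neighborhood density over a toy lexicon.
# Words are tuples of phoneme symbols (ARPAbet-like labels, illustrative only).

def is_neighbor(a, b):
    """True if b differs from a by exactly one phoneme
    substitution, addition, or deletion (edit distance 1)."""
    if a == b:
        return False
    la, lb = len(a), len(b)
    if abs(la - lb) > 1:
        return False
    # Substitution: same length, exactly one mismatching position.
    if la == lb:
        return sum(x != y for x, y in zip(a, b)) == 1
    # Addition/deletion: the longer word with one phoneme removed
    # must equal the shorter word.
    short, long_ = (a, b) if la < lb else (b, a)
    return any(long_[:i] + long_[i + 1:] == short for i in range(len(long_)))

def neighborhood_density(target, lexicon):
    """Number of words in the lexicon that are neighbors of the target."""
    return sum(is_neighbor(target, w) for w in lexicon if w != target)

# Hypothetical mini-lexicon: cat, bat, cap, cats, at, dog.
lexicon = [("k", "ae", "t"), ("b", "ae", "t"), ("k", "ae", "p"),
           ("k", "ae", "t", "s"), ("ae", "t"), ("d", "ao", "g")]
print(neighborhood_density(("k", "ae", "t"), lexicon))  # 4 neighbors of "cat"
```

Note that this structural count treats every one-phoneme change equally; the perceptual-similarity measure discussed above differs precisely in weighting neighbors by confusability rather than counting them uniformly.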
This paper investigates whether compensation for coarticulation in speech perception can be mediated by the listener's native language. Substantial work has studied compensation as a consequence of general auditory processing or as a consequence of perceptual gestural recovery processes. The role of linguistic experience in compensation for coarticulation potentially cross-cuts this controversy and may shed light on the phonetic basis of compensation.
This paper provides evidence that multiple acoustic cues involving the presence of low-frequency energy integrate in the perception of Korean coronal fricatives. This finding helps explain a surprising asymmetry between the production and perception of these fricatives found in previous studies: lower F0 onset in the following vowel leads to a response bias for plain [s] over fortis [s*], despite the fact that there is no evidence for a corresponding acoustic asymmetry in the production of [s] and [s*].
In humans, listening to speech evokes neural responses in the motor cortex. This has been controversially interpreted as evidence that speech sounds are processed as articulatory gestures. However, it is unclear what information is actually encoded by such neural activity. We used high-density direct human cortical recordings while participants spoke and listened to speech sounds. Motor cortex neural patterns during listening were substantially different from those during articulation of the same sounds.
A complete neurobiological understanding of speech motor control requires determination of the relationship between simultaneously recorded neural activity and the kinematics of the lips, jaw, tongue, and larynx. Many speech articulators are internal to the vocal tract, and therefore simultaneously tracking the kinematics of all articulators is nontrivial — especially in the context of human electrophysiology recordings.