Real-time lexical comprehension in young children learning American Sign Language (ASL)

Kyle MacDonald, Todd LaMarr, David Corina, Virginia A. Marchman, & Anne Fernald

Welcome to the website for the (upcoming) paper "Real-time lexical comprehension in young children learning American Sign Language (ASL)." Here you will find links to several relevant materials:

  • Links to stimuli: Stimuli Set 1, Stimuli Set 2.
  • PDF version of the manuscript
  • PDF version of the online supplement containing details about the Bayesian Data Analysis
  • GitHub page with data cleaning and analysis code
  • Link to the gating experiment for measuring critical sign onset


When children interpret spoken language in real time, linguistic information drives rapid shifts in visual attention, which can provide insights into the developing efficiency of lexical access. But how does language influence visual attention when the linguistic signal and the visual world are both processed via the visual channel? We used precise measures of eye movements during real-time comprehension of a visual-manual language, American Sign Language (ASL), by 29 native, monolingual ASL-learning children (16–53 months; 16 deaf, 13 hearing) and 16 fluent deaf adult signers. Deaf and hearing ASL-learners showed remarkably similar gaze patterns, suggesting comparable sensitivity to the constraints of processing ASL. All signers showed incremental language comprehension, initiating eye movements prior to sign offset. Finally, variation in children’s ASL processing was positively correlated with age and vocabulary size. Thus, despite channel competition, the deployment of visual attention during ASL comprehension reflects information-processing skills that are fundamental for language acquisition.