Purpose: Many aided augmentative and alternative communication (AAC) systems require the use of an external display presented via the visual modality. It is therefore critical to evaluate and understand visual-perceptual processing in individuals with disabilities who could benefit from AAC. One way to evaluate how individuals process visual materials is through research-based automated eye-tracking technologies that obtain a fine-grained stream of data concerning gaze paths of visual attention.

Method: The current study examined how individuals with autism spectrum disorder (n = 13), Down syndrome (n = 13), intellectual and developmental disabilities (n = 9), or typical development (n = 20) responded to a spoken prompt to find a thumbnail-sized navigation key within a complex AAC display consisting of a main visual scene display (VSD) and a navigation bar of four thumbnail-sized VSDs. Stimuli were presented on a monitor equipped with automated eye-tracking research technology that recorded patterns of visual attention.

Results: Participants across groups spent more time fixating on the target thumbnail VSD navigation image after the presentation of the spoken cue to look at the target than before it; after the cue, they also spent more time looking at the target thumbnail VSD than at the other thumbnail-sized VSDs in the navigation bar.

Discussion: Participants were able to locate the target thumbnail VSDs, even within the context of a visually complex AAC display. Implications for the design of AAC displays and for the assessment of comprehension are discussed.
