To assess how much receptive language a person with ASD has (that is, how much language they understand), researchers typically use tools that rely on verbal responses or overt behaviors such as pointing and gesturing. This is problematic because many minimally verbal people with ASD have difficulty producing these kinds of responses. Understanding language and appropriately expressing it are distinct processes, and although expressive language (verbal and nonverbal communication of wants and needs) is easier to observe and evaluate, more accurate ways to assess receptive language are needed to avoid underestimating the cognitive abilities of the large and diverse population of minimally verbal people with ASD.
This article reviews a variety of assessments and technologies, some currently in use and some with promising future applications, all aimed at evaluating language comprehension without the use of verbal or physical feedback. The tools discussed include eye tracking, event-related potentials (ERPs) recorded with EEG, and MEG brain imaging. These approaches allow for a more passive study session, one in which a participant is asked only to watch videos or listen to sounds and voices while researchers still obtain data about brain activity and language comprehension.
The article also reviews intervention strategies aimed at helping minimally verbal children develop more expressive language. The strategies and tools referenced are naturalistic behavioral intervention, the Picture Exchange Communication System (PECS), and speech-generating devices (such as the Proloquo2Go app). The evidence for how much each strategy has been shown to increase communication is discussed.
The article is easy to read, informative, and concludes with future directions for research and intervention in the field.