Visual search has traditionally been divided into two classes: slow, difficult search that requires focused attention, with search time increasing as set size grows, and easy feature search, in which the target is recognized quickly and set size has a negligible effect. Reverse hierarchy theory attributes these classes to high-level representations that support fast, high-cortical-level "vision at a glance," as opposed to slow, low-level "vision with scrutiny." On this account, faces "pop out" of displays of diverse objects.
In one study, researchers asked whether the face-recognition difficulties associated with autism also interfere with searching for faces. They compared search times and set-size slopes for children with autism spectrum disorders (ASDs) and neurotypically developing (NT) children looking for faces. Human-face targets were located quickly, with small set-size slopes. The between-group difference in slopes (18.8 vs. 11.3 ms/item) was considerable, suggesting that faces may not "pop out" quite as readily for the ASD group, but in the authors' view it was not large enough to categorically distinguish ASD face search from NT face search. They also tested searches for several other target categories, including dog and lion faces as well as nonface basic categories such as houses and vehicles.
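The set-size slope quantifies how much each additional distractor adds to search time, under a simple linear model of reaction time. The sketch below illustrates what the reported slopes (18.8 vs. 11.3 ms/item) imply at a few display sizes; the 250 ms intercept and the set sizes are illustrative assumptions, not values from the study.

```python
# Linear search model: RT = intercept + slope * set_size.
# Slopes (ms/item) are the study's reported values for ASD and NT groups;
# the intercept and set sizes below are assumptions for illustration only.

def predicted_rt(intercept_ms: float, slope_ms_per_item: float, set_size: int) -> float:
    """Predicted reaction time for a display of `set_size` items."""
    return intercept_ms + slope_ms_per_item * set_size

for set_size in (4, 8, 16):
    asd = predicted_rt(250.0, 18.8, set_size)  # ASD group slope
    nt = predicted_rt(250.0, 11.3, set_size)   # NT group slope
    print(f"set size {set_size:2d}: ASD {asd:.0f} ms, NT {nt:.0f} ms, "
          f"difference {asd - nt:.0f} ms")
```

Even at 16 items, the predicted between-group gap stays on the order of a hundred milliseconds, which is consistent with the authors' view that both groups show fast, weakly set-size-dependent face search.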
Overall, the ASD group searched slightly more slowly than the NT group, and their slopes were steeper. The dependence on target category, however, was comparable across groups: human-face search was fastest, nonface categories were slowest, and dog and lion faces fell in between. The investigators concluded that although autism has documented effects on face identification, which may require vision with scrutiny, it may spare vision at a glance, including face detection. This distinction is congruent with the two perceptual modes posited by reverse hierarchy theory.