Categorical search: A (selective) review of behavioral and computational work

16 April 2013

Categorical search, the task of finding and recognizing categorically-defined targets (e.g., cups or trash bins), has been a neglected research topic; the majority of studies in the search literature instead use picture previews of a target, or other paradigms that provide searchers with precise knowledge of a target's exact appearance. This talk reviews behavioral and computational work from my laboratory on how eye movements are directed during categorical search. Behaviorally, we show that eye movements can be guided to categorically-defined targets, a possibility that had been debated. We also show that this categorical guidance is proportional to the availability of target-defining information, is sensitive to subtle categorical similarity relationships, and is modulated by factors known to affect categorization, such as the hierarchical level used to specify a categorical target. Computationally, we borrow features and techniques from computer vision to model the eye movements made during categorical search, and we show that this model can predict several core aspects of search behavior, including set size effects and the percentage of initial eye movements landing on a target (a conservative measure of search guidance). I will also discuss recent work that uses SVM-based classifiers to decode the target category a person is searching for from the nontarget objects they preferentially fixate during target-absent trials, in effect reading a searcher's mind by analyzing their fixations. We conclude that categorical search is very similar to target-specific search, with the critical difference being that the visual features discriminating a target category from nontargets must be learned and retrieved from long-term memory before they can be used to guide movements of attention and gaze.
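To make the SVM-based decoding idea concrete, the following is a minimal sketch of the general approach, not the laboratory's actual pipeline: feature vectors extracted from fixated nontarget objects are used to train a linear SVM that predicts which target category the searcher had in mind. The feature vectors, category labels, and class names here are synthetic stand-ins.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Simulate computer-vision feature vectors for nontarget objects fixated
# while searching for one of two hypothetical target categories.
# Assumption: fixated nontargets share category-diagnostic features with
# the searched-for target, so the two classes are linearly separable.
n_per_class, n_features = 100, 20
fixated_while_seeking_cat_a = rng.normal(loc=0.5, size=(n_per_class, n_features))
fixated_while_seeking_cat_b = rng.normal(loc=-0.5, size=(n_per_class, n_features))

X = np.vstack([fixated_while_seeking_cat_a, fixated_while_seeking_cat_b])
y = np.array([0] * n_per_class + [1] * n_per_class)  # searched-for category label

# A linear SVM decodes the target category from fixated-nontarget features;
# cross-validated accuracy above chance (0.5) indicates decodable guidance.
clf = LinearSVC()
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean decoding accuracy: {scores.mean():.2f}")
```

With well-separated synthetic classes the cross-validated accuracy is far above the 0.5 chance level; with real fixation data the margin over chance is the quantity of interest.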