Title | Visual Search Asymmetry: Deep Nets and Humans Share Similar Inherent Biases |
Publication Type | Conference Poster |
Year of Publication | 2021 |
Authors | Gupta, S. K., Zhang, M., Wu, C.-C., Wolfe, J., Kreiman, G |
Conference Name | NeurIPS 2021 |
Date Published | 12/2021 |
Abstract | Visual search is a ubiquitous and often challenging daily task, exemplified by looking for the car keys at home or a friend in a crowd. An intriguing property of some classical search tasks is an asymmetry such that finding a target A among distractors B can be easier than finding B among A. To elucidate the mechanisms responsible for asymmetry in visual search, we propose a computational model that takes a target and a search image as inputs and produces a sequence of eye movements until the target is found. The model integrates eccentricity-dependent visual recognition with target-dependent top-down cues. We compared the model against human behavior in six paradigmatic search tasks that show asymmetry in humans. Without prior exposure to the stimuli or task-specific training, the model provides a plausible mechanism for search asymmetry. We hypothesized that the polarity of search asymmetry arises from experience with the natural environment. We tested this hypothesis by training the model on augmented versions of ImageNet where the biases of natural images were either removed or reversed. The polarity of search asymmetry disappeared or was altered depending on the training protocol. This study highlights how classical perceptual properties can emerge in neural network models, without the need for task-specific training, but rather as a consequence of the statistical properties of the developmental diet fed to the model. All source code and data are publicly available at https://github.com/kreimanlab/VisualSearchAsymmetry. |
URL | https://nips.cc/Conferences/2021/Schedule?showEvent=28848 |
Citation Key | 5037 |
CBMM Relationship | CBMM Funded |