A recent study found that our perception of everyday objects is influenced by knowledge of how large or small the objects actually are, rather than by how they appear.
Researchers in the Department of Psychology found that a person’s knowledge about the size of everyday objects affects how the brain processes and interacts with the visual environment, according to a new paper published in Nature Human Behaviour.
Sarah Shomstein, professor of cognitive neuroscience and head of GW’s Attention and Cognition Laboratory, and Andrew Collegio, a former cognitive neuroscience graduate student in the psychology department, found that people attend more efficiently to smaller objects than to larger ones, and that prior knowledge of an object’s size, not its perceived size, influences how our brains engage with objects.
For example, you know that a stop sign is smaller than a car, or that the moon is larger than your hand. Because attentional focus is more concentrated within small objects, our brains process them faster—an adaptive behavior that aids cognitive scene processing. However, an object’s apparent size depends on how near or far it is from you, which introduces a mismatch between actual and perceived size. A stop sign at the intersection will appear bigger than a car at the end of the next block, and your outstretched palm appears larger than the moon.
“We know that we can only pay attention to a small subset of information around us, and object size is one of the factors that determines how much attention is paid to any particular object,” Dr. Shomstein said. “For the first time, we have been able to show that our brain has a mechanism in place to ensure that attention does not get stuck on objects that are perceptually large; rather, attention is adjusted according to our knowledge of how large or small objects are.”
In other words, what you know about the world around you determines what you see.
The researchers measured how long it took individuals to respond to stimuli presented in different regions of object images which, while all the same size on the screen, depicted objects of different real-world sizes. They found that participants responded faster to targets presented in images of small objects, such as a domino, than in images of larger objects, such as a pool table, even though every image occupied the same amount of space on the screen. Participants were also asked to rate the items on a scale from very small to very big, Dr. Collegio said.
“Your own personal ratings determine how fast or slow your attention is going to be at traversing that space,” he said. “If you think the pool table is really large, then your attention is going to be slower.”
This finding suggests that participants’ prior knowledge of objects’ size warps space contained within mental representations of everyday objects.
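The pattern described above can be illustrated with a minimal sketch. All object names, ratings, and response times here are hypothetical stand-ins, not the study’s actual data; the point is only to show the reported relationship, in which higher rated real-world size goes with slower responses even though every image occupies the same area on screen.

```python
# Illustrative sketch of the reported effect; values are invented, not from the study.
from statistics import mean

# Each object: a rated real-world size (1 = very small, 7 = very big)
# and hypothetical response times (ms) to targets shown on its image.
trials = {
    "domino":     {"rating": 1, "rts": [412, 398, 405, 420]},
    "coffee_mug": {"rating": 2, "rts": [430, 441, 425, 436]},
    "armchair":   {"rating": 5, "rts": [468, 455, 472, 460]},
    "pool_table": {"rating": 6, "rts": [495, 488, 502, 491]},
}

def mean_rt(obj):
    """Mean response time for one object's image."""
    return mean(trials[obj]["rts"])

# Sort by rated size: responses slow as rated real-world size grows.
for obj in sorted(trials, key=lambda o: trials[o]["rating"]):
    print(f"{obj}: rating={trials[obj]['rating']}, mean RT={mean_rt(obj):.1f} ms")
```

In this toy version, the domino (rated smallest) yields the fastest mean response and the pool table the slowest, mirroring the finding that participants’ own size ratings predicted how quickly attention traversed each object.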
This work will help researchers better understand what environmental information attracts attention and predict how effectively people process particular objects in the environment. The brain can only process so much information at a time. How it allocates those limited resources is an important question that predicts performance in jobs involving visual search, such as air traffic controllers tracking multiple planes or radiologists looking for small tumors. How we process information shapes our everyday lives, and solving object-guided attention is an integral piece of visual neuroscience.
Paul Scotti, B.S. ’17, and Joseph Nah, a CCAS doctoral candidate, were also co-authors of the paper.