
Earlier this year, Brian Butterworth decided to figure out how many numbers the average person encounters in a day. He picked a Saturday for his self-experiment—as a cognitive neuroscientist and professor emeritus at University College London, Butterworth works with numbers, so a typical weekday wouldn’t have been fair. He went about his day as usual, but kept track of how frequently he saw or heard a number, whether that was a symbol, such as 4 or 5, or a word such as “four” or “five.” He flicked through the newspaper, listened to the radio, popped out for a bit of shopping (taking special note of price tags and car license plates), and then, at last, sat down to calculate a grand total.

“Would you like to take a guess?” he asks me when we speak over Zoom a couple of weeks later. I hazard that it’s well into the hundreds, but admit I’ve never thought about it before. He says: “I reckoned that I experienced about a thousand numbers an hour. A thousand numbers an hour is sixteen thousand numbers a day, is about five or six million a year. . . . That’s an awful lot of numbers.”
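Butterworth's back-of-envelope scaling is easy to check. A minimal sketch, assuming roughly 16 waking hours per day (a figure implied by his totals, not stated explicitly in our conversation):

```python
# Back-of-envelope check of Butterworth's estimate.
# Assumption: ~16 waking hours per day (not stated explicitly in the article).
numbers_per_hour = 1_000
waking_hours_per_day = 16

per_day = numbers_per_hour * waking_hours_per_day  # 16,000 numbers a day
per_year = per_day * 365                           # ~5.8 million a year

print(f"per day:  {per_day:,}")
print(f"per year: {per_year:,}")  # lands in his "five or six million" range
```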

Butterworth didn’t conduct his thought experiment just to satisfy his own curiosity. He’s including the calculation in an upcoming book, Can Fish Count?, slated for publication next year. In it, he argues that humans and other animals are constantly exposed to and make use of numbers—not just in the form of symbols and words, but as quantities of objects, of events, and of abstract concepts. Butterworth is one of several researchers who believe that the human brain can be thought of as having a “sense” for number, and that we, like our evolutionary ancestors, are neurologically hardwired to perceive all sorts of quantities in our environments, whether that serves for selecting the bush with more fruit on it, recognizing when a few predators on the horizon become too many, or telling from a show of hands when a consensus has been reached.

Indeed, that most humans, even from a very early age, can quickly and accurately distinguish among different quantities of things is so obvious that it’s frequently taken for granted. This ability, known as numerosity perception, is distinct from counting—the process of keeping a tally while going through a set of objects—and is present in infants long before they learn words or symbols for particular numbers. It is evident, too, among adults in isolated human populations that typically don’t use numbers much in their daily lives. Moreover, it’s not human-specific: experiments with monkeys, crows, fish, and even bees indicate that numerosity perception, at least for relatively small quantities, is widely distributed across the animal kingdom. (See “Numerosity Around the Animal Kingdom.”)

How numerosity perception works neurologically, and how important it is in human cognition, are tougher questions to answer, and ones that have sparked debate among researchers. While some scientists propose that the so-called sense of number is just an offshoot of a more general perception of magnitude—an ability to say roughly how big something is in relation to something else—others argue that numerosity perception is an independent phenomenon, something that gives a special meaning to “four” and “five” as discrete quantities. Some scientists assign this number sense an even greater importance, claiming that it’s the foundation for humans’ capacity for numerical reasoning and arithmetic—that there’s a connection between our ability to quickly recognize the number of flowers in a vase, and our ability to understand why 2 + 4 = 6.

Thanks to growing interest in the subject, combined with advances in experimental and technological methods for studying the brain, data are trickling in to support these arguments—and occasionally confuse them. The result is far from being a clear picture, and disagreements still abound, but for Butterworth and others like him, multiple threads of evidence are now coming together to support numerosity, and our brain’s encoding of it, as a fundamental part of how we experience the world.

Making the case for humans’ sense of number

When David Burr first shared what would become some of his most famous work on numerosity perception around a decade ago, he was taken aback by the heckling he received. His findings, he’d argued to the crowd at a visual perception conference, indicated that humans automatically perceived an image’s numerosity in much the same way they perceive its color. But “some very angry people were yelling from the audience, ‘It’s got nothing to do with number!’” he recalls. “I didn’t expect that it was going to be so controversial.” 

A physiological psychologist at the University of Florence, Burr was suggesting that a person’s ability to estimate the number of dots on a screen is susceptible to a visual quirk known as adaptation, whereby looking at one thing can subconsciously influence our perception of another. A classic example is the color illusion in which staring at a red square for 30 seconds will cause you to perceive a white square as bluish-green. The effect is typically attributed to the brain’s tendency to become desensitized—red-sensitive photoreceptors fire for a while but then stop firing as much; the resulting imbalance between the excitability of red- and green-sensitive cells causes the illusion of greenness when looking at a white image. Psychologists interpret the existence of such adaptation effects as evidence of dedicated neural mechanisms for seeing color, and, accordingly, of color’s importance to how we perceive the world. “When something adapts, and adapts so readily,” Burr says, “that’s sort of saying that it’s a primary visual attribute.” 


He and University of Western Australia psychologist John Ross had found that looking at large numbers of dots on a screen tended to make people underestimate the number of dots in subsequent pictures, while looking at small numbers of dots had the opposite effect. Control experiments with dots in different patterns and densities indicated that the effect was specific to the number of dots, Burr adds, and not, as the hecklers were yelling, to some other quality of the image, such as the density of the dot pattern. This finding put number in the same category as color, motion, texture, and other visual attributes as something that the brain automatically picks up on, the pair wrote in a 2008 paper describing the results. “We propose that just as we have a direct visual sense of the reddishness of half a dozen ripe cherries, so we do of their sixishness.” 

The argument about whether numerosity is a fundamental feature of perception hardwired into our brains, or just a byproduct of perceiving something else, has continued ever since. Multiple neuroscientists now agree, at least to an extent, with the view that numerosity perception is important in its own right, especially when it comes to relatively low numbers of objects. (For images with more than 50, perhaps 100 objects, Burr says, it’s likely that object density, or what’s known as texture, does take over as the feature our brains are using to estimate quantity.)  

Some researchers have explored the idea using imaging techniques to study patterns of activity in the brain as people perform numerosity tasks. Elizabeth Brannon, a cognitive neuroscientist at the University of Pennsylvania, and colleagues presented people with images of dots that varied in dot size and number as well as the overall density and area of the pattern, while recording electrical activity from participants’ scalps or imaging their brains with functional MRI (fMRI). They’ve found in multiple studies that activity in the visual cortex is particularly sensitive to numerosity: changing the number of dots causes bigger spikes in activity than changes in other visual metrics, even when people aren’t asked to pay attention to how many dots they’re seeing.

Numerosity seems to be more than a static attribute, though. Burr has found that adaptation effects can be elicited with sequences of flashes—people who see a fast series of flashes will underestimate the number of flashes in a medium-paced sequence, for example. It’s not just visual stimuli, either: the same holds true for series of beeps. And Utrecht University’s Ben Harvey and colleagues have recently studied brain activity in people exploring different numbers of objects placed in their hands, a task designed to test so-called “haptic numerosity.” Using fMRI, they compared brain activity during this task with that of a visual numerosity task and found that while “there’s a distinct representation of haptic numerosity,” activity maps for the two types of numerosity overlap substantially, Harvey says.

It seems that at least somewhere in the processing of numerosity, too, the brain recognizes a connection between all these different formats—that three dots and three beeps are both versions of three. Burr has tried mixing types of stimuli in his adaptation experiments, and found that “a spatial display of 20 dots would appear like 15, let’s say, if [you] had been adapted to a fast sequence of either sounds or flashes,” he says. “It really gives us this idea of a generalized sense of number.” 

Neural mechanisms representing number in the brain

One clue about what a general number-sensing mechanism might look like, neurologically speaking, comes from the kinds of mistakes humans make when estimating or discriminating between different numbers of items. People will often confuse similar quantities, for example—they might have difficulty quickly telling the difference between a plate of eight chocolates and a plate of nine chocolates. Burr and many other researchers have repeatedly shown that this kind of error tends to get worse the smaller the proportional difference between two numbers: you’re more likely to confuse 23 chocolates with 25 than to confuse three with five or 10 with 40. People tend to get better at perceiving numbers of objects as they age: children are better at distinguishing similar numerosities than young infants are, and adults are more precise still.
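The ratio dependence described above is often explained by assuming that the noise in an internal estimate grows in proportion to the quantity itself, so-called scalar variability. A minimal simulation of that assumption (the Weber fraction of 0.15 is an illustrative choice, not a value from these studies) shows why 23 vs. 25 chocolates is confusable while 3 vs. 5 is not:

```python
import random

def noisy_estimate(n, weber=0.15, rng=random):
    """Internal estimate of n, with noise proportional to n (scalar variability)."""
    return rng.gauss(n, weber * n)

def error_rate(a, b, trials=100_000, weber=0.15):
    """Fraction of trials in which the smaller quantity a is judged >= the larger b."""
    rng = random.Random(0)  # seeded for reproducibility
    wrong = sum(noisy_estimate(a, weber, rng) >= noisy_estimate(b, weber, rng)
                for _ in range(trials))
    return wrong / trials

print(error_rate(3, 5))    # rare confusions: large ratio (5/3 ≈ 1.67)
print(error_rate(23, 25))  # frequent confusions: small ratio (25/23 ≈ 1.09)
```

With proportional noise, only the ratio between the two quantities matters, which is exactly the error pattern the behavioral studies report.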

Partly based on these observations, scientists have come up with a proposed cognitive system thought to be involved in numerosity estimation: the approximate number system (ANS). This system could receive sensory information—from the visual cortex, say—and extract the quantity of things in a way that gets less precise with magnitude. 

Andreas Nieder, a neuroscientist at the University of Tübingen in Germany and the author of A Brain for Numbers, has spent years working on the possible neural mechanisms underpinning such a system. By the time he started working on the problem around the turn of the millennium, fMRI studies by cognitive neuroscientist Stanislas Dehaene and others had started to link numerical processing in humans to particular brain regions such as the intraparietal sulcus (IPS), a groove in the parietal lobe involved in visual attention and various other cognitive processes. Nieder was curious about how exactly neurons in the brain might be coding for number, so he took advantage of something rather more direct than brain imaging: single-cell, electrophysiological recordings.

In some of their earliest experiments, Nieder and colleagues recorded from hundreds of individual neurons in the brains of macaques trained to discriminate between small numbers of dots on a screen. In a handful of studies in the 2000s and 2010s, the team reported that certain neurons in the macaque IPS and prefrontal cortex, an area typically associated with higher cognitive processes such as planning and decision making, were specifically sensitive to certain numerosities. That is, a particular neuron “has highest activity to a specific preferred or beloved number, say three,” Nieder says. “Three elicits the highest [electrical] discharge.” When the monkey viewed images containing numbers of items that were to either side of that preferred number—that is, two or four dots—there was a small discharge in the three-preferring neuron, but nothing like the spike it had emitted for three.

The same group also reported that the breadth of a particular neuron’s range—how likely it was to fire at quantities near but not equal to its preferred number—increased with increasing magnitude: a 12-preferring neuron was far more responsive to pictures of 11 and 13 items, say, than a three-preferring neuron was to two or four. The researchers later recorded from the brains of crows trained to discriminate one to five items. Despite crows lacking a neocortex—where the parietal and prefrontal cortices are located in primates—the researchers again reported evidence of number-specific neurons, this time in the avian endbrain, which some scientists have suggested governs more-complex cognitive functions. The findings hint at convergent evolution of number-representing systems, Nieder and colleagues argued in their paper.
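Tuning curves like these are commonly modeled as Gaussians on a logarithmic number axis, which automatically makes a 12-preferring neuron's response spread across more neighboring quantities than a 3-preferring neuron's. A toy sketch of that standard model (the tuning width of 0.25 log units is illustrative, not a value from the recordings):

```python
import math

def response(n, preferred, sigma=0.25):
    """Normalized firing of a number-tuned neuron: Gaussian on a log number axis."""
    return math.exp(-((math.log(n) - math.log(preferred)) ** 2) / (2 * sigma ** 2))

# A 3-preferring neuron's response falls off sharply at 2 and 4...
for n in (2, 3, 4):
    print(n, round(response(n, preferred=3), 2))

# ...while a 12-preferring neuron still fires strongly at 11 and 13,
# because 11/12 and 13/12 are much smaller ratios than 2/3 and 4/3.
for n in (11, 12, 13):
    print(n, round(response(n, preferred=12), 2))
```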

See “Number-Selective Neurons Found in Untrained Crows.”

The work points to a common neural mechanism that’s pretty good at estimation, but prone to error when numbers are close together, and that gets less precise as numbers get bigger, Nieder says. In other words, “it’s entirely in line with the approximate number system.” The group has since replicated their results in macaques that weren’t trained to discriminate particular numerosities, finding a handful of neurons, albeit fewer than in trained monkeys, “that were already selectively representing these numerical quantities, without the monkey having a need to use this information to do any sort of discrimination,” Nieder says, suggesting “that there is something probably innate.” A fraction of these neurons may even respond to both visual and auditory stimuli, he notes—although they don’t necessarily respond to the same number in one modality as they do in the other.

A Favorite Number

In search of the neural pathways that allow the brain to perceive numbers of objects, neuroscientists including Andreas Nieder of the University of Tübingen have carried out single-cell recordings in the brains of macaques, among other animals. In a series of experiments, Nieder’s team demonstrated that certain neurons in the intraparietal sulcus (IPS) and the prefrontal cortex (PFC)—parts of the brain that have been implicated in visual attention and higher cognitive processes, respectively—respond selectively to particular quantities of objects. The findings hint at the existence of dedicated “number neurons” that extract numerical information from sensory input. Some researchers propose that these number neurons actually lie downstream of a separate number-extracting system in the visual cortex, while others dispute the idea of a dedicated number-sensing system in the brain at all. More research is needed—in humans as well as in animals—to get to the bottom of how brains perceive and process numerical quantity, or numerosity.

In the pictured experiment, a macaque views a monitor screen displaying variable numbers of dots while researchers record from individual neurons in the IPS. The results reveal different activity profiles for different cells: some cells fire rapidly when the screen shows three dots (shown here) while others fire more in response to one or four. Importantly, cells do show some activity for numerosities close to their preferred numerosity, perhaps helping to explain how and why the brain sometimes makes mistakes when distinguishing between close-together quantities.


Not everyone’s convinced that the IPS and prefrontal cortex neurons that Nieder studies are doing the numerosity-extracting, at least in humans. Harvey notes that at least some activity reported as number-specific may instead be related to attention or other aspects of task performance rather than to numerosity per se, and adds in an email that it’s unlikely that macaques and humans, which diverged more than 20 million years ago and have different brain structures, are using exactly the same neural machinery. Brannon and others have proposed that the mechanism for numerosity estimation in humans relies partly on populations of neurons involved in earlier sensory processing in the visual cortex—specifically, cell populations that increase their activity with increasing number. Harvey’s group, meanwhile, has used high-precision fMRI to detect distinct clusters of neurons in parts of the frontal and parietal lobes—although not the IPS—that respond to specific numerosities. 

It’s possible that multiple steps are involved, of course: populations of neurons whose activity increases with growing quantities could feed into nearby number-specific populations, which could pass signals to number-selective neurons or neuronal populations in other parts of the brain for later stages of perception or processing. “The way I see it, it’s not one process,” Harvey says. “There’s this initial process of giving a response to numerosity, then refining that response and distributing it to give this tuned response to numerosity . . . and then distributing it to lots of different brain areas that do different things with it, like multisensory integration, guiding attention, planning actions.”

So far, Nieder and his team haven’t been able to carry out single-cell recordings in the prefrontal cortex and IPS of humans, although they have done so from another part of the brain called the medial temporal lobe, which is involved in learning, in patients already undergoing surgery for epilepsy. “It’s not in the strict sense the area that’s associated with numerical processing,” Nieder says, but his team did find that there seem to be neurons that fire specifically in response to preferred numbers, as determined by experiments where patients had to estimate the number of dots on a screen.

Yet another point of uncertainty concerns the perception of very small numerosities, from one to four or so. These quantities, which most people instantaneously recognize, fall into what’s called the subitizing range (from the Latin subitus, meaning “sudden”). Psychologists hypothesize that the perception of these quantities is governed by a separate system from the ANS, but neither Harvey nor Nieder has seen evidence of this in their work, they say. Nieder adds that his team’s number-preferring neurons can account for behavioral observations all the way down to one—and perhaps below that, according to a recent study from his team reporting that trained crows may have a neural representation of zero.

Butterworth is also ambivalent. He notes that cognitive systems beyond the ANS may kick in for low numerosities—allowing us to recognize three dots as a triangle, say, or paying particular attention to individual items in the set—but that they’re probably not what’s assessing numerosity per se.   

The debate about the origins of arithmetic

Research on numerical cognition is rife with scientific debate, but nowhere does the conversation get gnarlier than when it comes to the putative relationship between humans’ understanding of nonsymbolic numbers, such as visual patterns of dots, and their use of symbolic numbers, such as number words or Arabic numerals used in math. 

Humans’ use of symbols to represent numerical concepts and perform calculations is unique across the phylogenetic tree (although some recent studies of Neanderthal artifacts suggest that our ancient hominin cousins might have gotten as far as basic numerical representation by scoring sets of lines into bone). Many researchers in linguistics and psychology have framed this ability as an offshoot of language, a sort of cognitive piggy-backing on existing brain circuits used to handle grammar and other linguistic concepts.

But some proponents of a more numerical worldview argue that there’s already a perfectly good system available in the brain to provide a framework for arithmetic and other forms of mathematics—the same one that handles the perception of number. “My position is that we inherit a mechanism for assessing the numerosity of objects in the environment . . . and my claim is that this forms the basis of learning about counting words and about arithmetic,” says Butterworth, who has worked with people who have specific difficulties with arithmetic and some types of numerical reasoning but typically not with language or other functions—a condition known as dyscalculia. While language is clearly important for using number words and extending the range of countable items, Butterworth argues, it’s not the basis of numerical cognition.

Scientists who share Butterworth’s mindset point to various pieces of evidence to support the view. Multiple studies of children’s math achievement, for example, have linked the ability to estimate numbers (without counting) to scores on standardized math tests. Functional MRI studies of professional mathematicians and non-mathematicians suggest that the areas of the brain active during numerical tasks don’t overlap with the regions typically associated with language processing, but instead involve intraparietal and other areas linked to number processing. And then there’s the literature on people with localized brain injuries, “where humans have very specific deficits in their dealings with numbers,” Nieder notes, but “language functions and other functions are preserved”—and vice versa. (See “One Hundred Years of Brain Lesion Case Reports” below.)

Neurological studies have also tried to carry over some of the concepts discovered with nonsymbolic number to symbolic number. In addition to neurons that fire in response to three dots, for example, Nieder’s research on epilepsy patients identified separate neurons that seem to respond specifically to certain Arabic numerals. There are fewer of them than there are of neurons responding to dot patterns, he notes, and they seem to be less specific to their number of preference. It’s not clear how this kind of neuronal representation might arise, nor how it could support arithmetic, he adds. “That’s something that we are actively exploring, and I hope that we can have some answers about this in the near future.” 

In the meantime, several groups are moving ahead with the practical implications of a connection between nonsymbolic and symbolic number. Brannon and others have reported that training adults to improve their number sense, or approximation abilities, helps boost arithmetic performance—although her team also recently failed to replicate this effect in a study of 300 or so comparable subjects. Butterworth is working on several projects aiming to use a computer game to train the ANS in people with dyscalculia and subsequently improve their arithmetic skills. “I think it’s terribly important,” he says, noting that math education could be informed by a better understanding of how the brain handles numbers. Recalling his thought experiment from a few Saturdays ago, he adds, “If you’re not very good at interpreting these numbers, this is going to cause you a problem in everyday life.”

Like the rest of the research on numerical thinking, though, the question of approximation’s role in arithmetic is hardly resolved. Harvey suggests that researchers may be unhelpfully blurring boundaries between different scientific concepts, and different neural mechanisms, when they talk in the same breath about numerosity perception, number processing, and math or other higher cognitive functions. Nieder, meanwhile, says he’d “have a hard time thinking of how else the brain would have information about what a numerical quantity means without having this approximate number system,” but acknowledges that the debate is “probably not on the way to being settled. . . . Almost nothing in this domain is really agreed upon by everyone.” 

One Hundred Years of Brain Lesion Case Reports

Many insights into the human brain have come from observations of people who have had parts of their brain damaged, typically through injury or disease. Research on number processing is no exception. Below is a selection of studies, highlighted in the 2019 book A Brain for Numbers and other reviews of the subject, of people who suffered lesions and later struggled to estimate or use numbers. 

The Scientist Staff

1908

Neurologists Max Lewandowsky and Ernst Stadelmann describe a 27-year-old man with damage to the cerebral cortex who is unable to recognize arithmetic symbols and has trouble performing calculations, but doesn’t show any deficits in language.

1918

Physician Georg Peritz describes WWI soldiers who suffered gunshot wounds to the head and developed specific difficulties performing simple mathematical operations. Based on the position of injuries, he suggests that the left angular gyrus in the parietal lobe might play an important role in numerical processing.

1919

Swedish neurologist Salomon Eberhard Henschen studies multiple cases of people who have problems interpreting numerals but have no corresponding difficulties with word or music processing. He proposes that the three functions are separate, and coins the term “acalculia” to describe the inability to perform simple mathematical operations. He suggests a link between the angular gyrus and this deficit.   

1982

Neuropsychologist Elizabeth Warrington describes patient “DRC,” who became acalculic after suffering a stroke affecting the left parietal lobe. While he had maintained a “concept of quantity,” she writes, he had difficulty performing mathematical operations—an issue Warrington connects to a deficit in language-based neural processes. 

2003

Researchers in France describe two acalculic patients: one, who has an injury to the left parietal lobe, struggles to approximate, compare, and subtract numbers with dot arrays or Arabic numerals; the other has damage to the left temporal lobe and has trouble performing calculations but can approximate and process dot patterns. The team interprets the cases as having different underlying deficits, the first in numerosity perception and the second in verbal processing—support for the two systems being separate in the brain.