Repeating a word: as the brain receives (yellow), interprets (red), and responds (blue) within a second, the prefrontal cortex (red) coordinates all areas of the brain involved. (video credit: Avgusta Shestyuk/UC Berkeley).
By recording the electrical activity of neurons directly from the surface of the brain using electrocorticography (ECoG)*, neuroscientists were able to track the flow of thought across the brain in real time for the first time. They showed clearly how the prefrontal cortex at the front of the brain coordinates activity to help us act in response to a perception.
Here’s what they found.
For a simple task, such as repeating a word seen or heard:
The visual and auditory cortices react first to perceive the word. The prefrontal cortex then kicks in to interpret the meaning, followed by activation of the motor cortex (preparing for a response). During the half-second between stimulus and response, the prefrontal cortex remains active to coordinate all the other brain areas.
For a particularly hard task, like determining the antonym of a word:
While the brain takes several seconds to respond, the prefrontal cortex recruits other areas of the brain — probably including memory networks (which were not tracked). The prefrontal cortex then hands off to the motor cortex to generate a spoken response.
In both cases, the brain begins to prepare the motor areas to respond very early (during initial stimulus presentation) — suggesting that we get ready to respond even before we know what the response will be.
“This might explain why people sometimes say things before they think,” said Avgusta Shestyuk, a senior researcher in UC Berkeley’s Helen Wills Neuroscience Institute and lead author of a paper reporting the results in the current issue of Nature Human Behaviour.
For a more difficult task, like saying a word that is the opposite of another word, people’s brains required 2–3 seconds to detect (yellow), interpret and search for an answer (red), and respond (blue) — with sustained prefrontal lobe activity (red) coordinating all areas of the brain involved. (video credit: Avgusta Shestyuk/UC Berkeley).
The research backs up what neuroscientists have pieced together over the past decades from studies in monkeys and humans.
“These very selective studies have found that the frontal cortex is the orchestrator, linking things together for a final output,” said co-author Robert Knight, a UC Berkeley professor of psychology and neuroscience and a professor of neurology and neurosurgery at UCSF. “Here we have eight different experiments, some where the patients have to talk and others where they have to push a button, where some are visual and others auditory, and all found a universal signature of activity centered in the prefrontal lobe that links perception and action. It’s the glue of cognition.”
Researchers at Johns Hopkins University, California Pacific Medical Center, and Stanford University were also involved. The work was supported by the National Science Foundation, National Institute of Mental Health, and National Institute of Neurological Disorders and Stroke.
* Other neuroscientists have used functional magnetic resonance imaging (fMRI) and electroencephalography (EEG) to record activity in the thinking brain. The UC Berkeley scientists instead employed a much more precise technique, electrocorticography (ECoG), which records from several hundred electrodes placed on the brain surface and detects activity in the thin outer region, the cortex, where thinking occurs. ECoG provides better time resolution than fMRI and better spatial resolution than EEG, but requires access to epilepsy patients undergoing highly invasive surgery involving opening the skull to pinpoint the location of seizures. The new study enrolled 16 epilepsy patients who agreed to participate in experiments while undergoing epilepsy surgery at UC San Francisco and California Pacific Medical Center in San Francisco, Stanford University in Palo Alto, and Johns Hopkins University in Baltimore. Once the electrodes were placed on each patient’s brain, the researchers conducted a series of eight tasks that included visual and auditory stimuli. The tasks ranged from simple, such as repeating a word or identifying the gender of a face or a voice, to complex, such as determining a facial emotion, uttering the antonym of a word, or assessing whether an adjective describes the patient’s personality.
Abstract of Persistent neuronal activity in human prefrontal cortex links perception and action
How do humans flexibly respond to changing environmental demands on a subsecond temporal scale? Extensive research has highlighted the key role of the prefrontal cortex in flexible decision-making and adaptive behaviour, yet the core mechanisms that translate sensory information into behaviour remain undefined. Using direct human cortical recordings, we investigated the temporal and spatial evolution of neuronal activity (indexed by the broadband gamma signal) in 16 participants while they performed a broad range of self-paced cognitive tasks. Here we describe a robust domain- and modality-independent pattern of persistent stimulus-to-response neural activation that encodes stimulus features and predicts motor output on a trial-by-trial basis with near-perfect accuracy. Observed across a distributed network of brain areas, this persistent neural activation is centred in the prefrontal cortex and is required for successful response implementation, providing a functional substrate for domain-general transformation of perception into action, critical for flexible behaviour.