Like most people,
I have been walking more than usual during the pandemic and enjoying
it. For over a year, my meetings with students and colleagues have been walking meetings around campus. Now I have a problem: School is
starting soon, and I don't want to go back to the classroom. We all saw
this coming. Give employees a taste of the outdoors, and they might not
want to go back to their offices and desks. So I am thinking of
teaching my fall courses outside.
Yet while I was researching this possibility, I discovered a problem.
I had always read that walking increased cognitive functioning and problem solving, but it turns out that it's not that simple. In 2014, a study showed that walking decreased rational, linear thinking and increased divergent thinking and imaginative mind-wandering. Uh oh. Will my students learn less if I teach them while walking?
I teach philosophy, a discipline that prides itself on rational and
analytic thinking. So is walking going to clash with the values of my
profession?
But wait, aren't philosophers famous for walking and being rational?
Socrates loved to stroll and philosophize, and Aristotle taught his
classes while he walked up and down the walkways of the Lyceum. The
Stoics walked and talked on outdoor porches with art on display. Seneca,
a Roman Stoic, told other Stoics,
"We should take wandering outdoor walks so that the mind might be
nourished and refreshed by the open air and deep breathing." Plato's
school was even outside, in a grove of trees called Akadēmía. So either the Greeks weren't as obsessed with rationality as we like to think — or maybe we are missing something.
What is the connection between walking and thinking, and is it still
good for us if it makes us more irrational? We may have heard by now how walking makes us feel good by releasing endorphins; lowers our risk of depression; increases cognitive functioning; strengthens memory; enhances creativity; and produces a protein essential for neuronal development and survival, synaptic plasticity, and cognitive function. It sounds great, but how are all these related,
and why do we have to sacrifice our hard-earned rationality to get them?
This is what I wanted to know before sending my students to the realms
of unreason.
A lot is happening to our bodies and brains on a walk, but one fascinating thing stands out: These benefits are all related to an increase in what neuroscientists call "spontaneous cognitive fluctuations."
For almost a century, scientists have told us that the background noise our brains make is random and unimportant; hence, they have filtered and averaged it out of their studies. Yet increasing evidence shows that this "noise" is neither random nor unimportant.
The brain is always active. Even when we lie motionless in bed thinking of nothing, billions of neurons in our brain are firing. Yet, in almost a hundred years of research, scientists still have not pinned down the cause or consequences of this activity.
Spontaneous fluctuations are like the dark matter or "junk" DNA of the
brain. They make up the vast majority of brain activity but are shrouded
in mystery.
Yet, what little we have recently discovered about them is already
profoundly shifting our models of consciousness. Moreover, we now know
that this flux is not just present when we are inactive but is involved
in all brain functioning. It even eats up two-thirds of the brain's total energy supply.
That's a big deal. We also know for sure it's not coming from bodily activities like breathing or the heartbeat, or from the electrical instruments used to measure it.
The most interesting positive finding we have so far is that these
spontaneous fluctuations are neither random nor deterministic, but have
an unpredictable "fractal" structure.
A fractal is a pattern that roughly repeats across scales — like a tree
whose few big branches have many smaller branches with even more leaves
that look like tiny branches. Scientists have found that spontaneous neural activity follows a similar branching pattern throughout the brain: a few slow, strong frequencies alongside many faster, weaker ones, a distribution scientists call "pink noise."
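To make "pink noise" concrete, here is a minimal Python sketch (my own illustration of the general signal-processing idea, not code from any study mentioned here) that shapes white noise so its power falls off in inverse proportion to frequency, then checks the characteristic 1/f slope:

```python
import numpy as np

def pink_noise(n_samples, seed=None):
    """Approximate pink (1/f) noise: shape white noise in the frequency
    domain so that power falls off in proportion to 1/frequency."""
    rng = np.random.default_rng(seed)
    spectrum = np.fft.rfft(rng.standard_normal(n_samples))
    freqs = np.fft.rfftfreq(n_samples)
    scale = np.ones_like(freqs)
    scale[1:] = 1.0 / np.sqrt(freqs[1:])  # amplitude ~ 1/sqrt(f), so power ~ 1/f
    pink = np.fft.irfft(spectrum * scale, n=n_samples)
    return pink / pink.std()

signal = pink_noise(2**14, seed=42)
power = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(signal.size)
# Fit log(power) against log(frequency); pink noise gives a slope near -1.
slope = np.polyfit(np.log(freqs[1:]), np.log(power[1:]), 1)[0]
print(f"spectral slope: {slope:.2f} (pink ~ -1, white ~ 0)")
```

A slope near -1 on this log-log fit is the signature of pink noise; pure white noise would sit near 0. The fractal point is that the same proportion of slow to fast activity holds at every timescale.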
With this discovery, researchers are now starting to observe changes in
these fluctuations related to aging, consciousness, mental health, experiences of art and nature, and memory.
One of the most compelling explanations for why healthy cognitive
fluctuations have this fractal structure is that they were an
evolutionary adaptation to aid humans in identifying, navigating, and
remembering the fractal patterns ubiquitous in nature. For example,
early humans spent a lot of their time walking around looking for things
bathed in a world of fractal sights and sounds. This is why our eye movements and searching patterns employ fractal sequences of a few long and many short motions. Even the way we walk is fractal, and it becomes less so as we age. Fractal patterns are easy on the eyes, endlessly fascinating to see and hear, and they even inspire feelings of beauty.
When we take a walk outside, the fractal rhythms of our heart synchronize with the fractal rhythms of our lungs
and our fractal gait. Researchers have also shown that our wandering
bodies make our minds wander too. On a walk, our brain waves slow down.
The underlying spontaneous fluctuations bubble up more easily, creating
experiences of spontaneous thoughts and associations that seem to come
from nowhere. We often call them "moments of inspiration."
Seeing and hearing natural fractals helps slow our brain waves, and slower brain waves let our spontaneous fluctuations identify and memorize patterns and rhythms more easily, or work through problems unconsciously, much as dreaming does.
The exciting conclusion of this research is that while it's wonderful
that walking increases blood circulation, the primary source of
walking's cognitive benefits seems to come from its effects on the
mysterious spontaneous fluctuations of our brains. This has certainly
changed the way I think about the nature of intelligence, consciousness,
memory, and education. Reason is not the source of intelligence; it's
the product of it.
So while my students are out on their walks, they might be
mind-wandering more than in the classroom, but this is a good thing.
Perhaps this is what those Greek philosophers understood and what we
have forgotten.
Recent studies on walking also show that walking with other people synchronizes their brain and bodily rhythms, resulting in increased empathy, cooperation, and sharing. So walking may also benefit everyone's social and emotional education.
Knowing all of this, part of me still wants my students to sit down
in a classroom and listen to every word I say, even if the studies show
they are less likely to remember it than if we were walking through the
trees like Plato's students did. But I am going to take them outside
anyway, against my rationalist bias, and see what our spontaneous
fluctuations are capable of this quarter.
Familiar
categories of mental functions such as perception, memory and attention
reflect our experience of ourselves, but they are misleading about how
the brain works. More revealing approaches are emerging.
Neuroscientists
have tried to map various categories of mental function to specific
regions of the brain, but recent work has shown that the definitions and
boundaries of those regions are complex and context-dependent.
Neuroscientists
are the cartographers of the brain’s diverse domains and territories —
the features and activities that define them, the roads and highways
that connect them, and the boundaries that delineate them. Toward the
front of the brain, just behind the forehead, is the prefrontal cortex,
celebrated as the seat of judgment. Behind it lies the motor cortex,
responsible for planning and coordinating movement. To the sides: the
temporal lobes, crucial for memory and the processing of emotion. Above
them, the somatosensory cortex; behind them, the visual cortex.
Not only do researchers often depict the brain and its functions much
as mapmakers might draw nations on continents, but they do so “the way
old-fashioned mapmakers” did, according to Lisa Feldman Barrett,
a psychologist at Northeastern University. “They parse the brain in
terms of what they’re interested in psychologically or mentally or
behaviorally,” and then they assign the functions to different networks
of neurons “as if they’re Lego blocks, as if there are firm boundaries
there.”
But a brain map with neat borders is not just oversimplified — it’s
misleading. “Scientists for over 100 years have searched fruitlessly for
brain boundaries between thinking, feeling, deciding, remembering,
moving and other everyday experiences,” Barrett said. A host of recent
neurological studies further confirm that these mental categories “are
poor guides for understanding how brains are structured or how they
work.”
Neuroscientists generally agree about how the physical tissue of the
brain is organized: into particular regions, networks, cell types. But
when it comes to relating those to the task the brain might be
performing — perception, memory, attention, emotion or action — “things
get a lot more dodgy,” said David Poeppel, a neuroscientist at New York University.
No one disputes that the visual cortex enables sight, that the
auditory cortex enables hearing, or that the hippocampus is essential
for memory. Damage to those regions impairs those abilities, and
researchers have identified mechanisms underlying them in those areas.
But memory, for example, also requires brain networks other than the
hippocampus, and the hippocampus is turning out to be key to a growing
number of cognitive processes other than memory. Sometimes the degree of
overlap is so great that the labels start to lose their meaning.
“The idea that there’s some kind of strong parallelism between mental
categories that neuroscientists use to try and understand the brain and
the neural implementation of mental events is just wrong,” Barrett
said.
And while the current framework has led to important insights, “it’s
gotten us stuck in certain traps that are really stifling research,”
said Paul Cisek,
a neuroscientist at the University of Montreal — an outcome that has
also directly hobbled the development of treatments for neurological and
psychological conditions.
That is why Barrett, Cisek and other scientists argue that for us to
truly understand how the brain works, concepts at the field’s core may
need to be revised, perhaps radically. As they grapple with that
challenge, they are uncovering new ways to frame their questions about
the brain, and new answers: This month alone, one such approach revealed an unexpected link
between memory formation and metabolic regulation. But even if a new
framework succeeds in explaining the brain’s operation, some researchers
wonder whether the price of that success will be a loss of connection
to our human experience.
‘More Aliases Than Sherlock Holmes’
When functional magnetic resonance imaging (fMRI) and other powerful
technologies made it possible to examine living brains in increasingly
sophisticated ways, neuroscientists enthusiastically started searching
for the physical basis of our mental faculties. They made great strides
in understanding the neural foundations of perception, attention,
learning, memory, decision-making, motor control and other classic
categories of mental activity.
But they also found unsettling evidence that those categories and the
neural networks that support them don’t work as expected. It’s not just
that the architecture of the brain disrespects the boundaries between
the established mental categories. It’s that there’s so much overlap
that a single brain network “has more aliases than Sherlock Holmes,”
Barrett said.
Recent work
has found, for instance, that two-thirds of the brain is involved in
simple eye movements; meanwhile, half of the brain gets activated during
respiration. In 2019, several teams of scientists found that most of
the neural activity in “perception” areas such as the visual cortex was
encoding information about the animals’ movements rather than sensory inputs.
This identity crisis isn’t limited to neural centers of perception or
other cognitive functions. The cerebellum, a structure in the brains of
all vertebrates, was thought to be dedicated almost exclusively to
motor control, but scientists have found that it’s also instrumental in
attention processes, the regulation of emotions, language processing and
decision-making. The basal ganglia, another ancient part of the brain
usually associated with motor control, has been similarly implicated in
several high-level cognitive processes.
Some of these confusing results may come from methodological
problems. To find where the human brain performs different functions,
for instance, neuroscientists typically correlate cognitive processes
with patterns of brain activity measured by fMRI. But studies suggest
that researchers need to be more alert to irrelevant muscle twitches and fidgets that may contaminate the readings.
“You think that your results are telling you something about high-level cognition,” said György Buzsáki,
a neuroscientist at the NYU School of Medicine, “when in fact, it may
reflect nothing else except that, because of the task, [the subject’s]
eyes are moving differently.”
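Buzsáki's worry is easy to demonstrate with a toy simulation. In the hedged sketch below (made-up numbers, not a real fMRI pipeline), a voxel responds only to eye movements, yet because the task provokes those eye movements, the task-to-voxel correlation looks "cognitive" anyway:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 300                                          # imaging time points
task = (np.arange(n) % 40 < 20).astype(float)    # alternating task blocks
# Suppose the task makes the subject's eyes move more (the confound).
eye_movement = task * rng.uniform(0.5, 1.0, n)
# A voxel driven purely by eye movement, not by the cognitive task.
voxel = 0.8 * eye_movement + 0.3 * rng.standard_normal(n)

r = np.corrcoef(task, voxel)[0, 1]
print(f"task-voxel correlation: {r:.2f} (looks cognitive, is motion)")
```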
But he and other scientists believe the recent findings also
highlight a deeper conceptual problem in neuroscience. “We divide the
real estate of the brain according to our preconceived ideas, assuming —
wrongly, as far as I’m concerned — that those preconceived ideas have
boundaries, and the same boundaries exist in brain function,” Buzsáki
said.
In 2019, Russell Poldrack,
a neuroscientist at Stanford University, and his colleagues set out to
test how appropriate the recognized categories for mental function are.
They gathered a massive amount of behavioral data — obtained from
experiments designed to test different aspects of cognitive control,
including working memory, response inhibition and learning — and ran it
through a machine learning classifier. The resulting classifications
defied expectations, mixing up traditional categories of brain results
and sorting them into new groups that seemed to “move together in terms
of some much more generic constructs,” Poldrack said — constructs for
which we don’t yet have labels, and which might not relate directly to
our conscious experience.
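As a rough illustration of such a data-driven regrouping (with hypothetical measure names and random scores standing in for Poldrack's actual dataset and machine learning pipeline), one can cluster the behavioral measures themselves and let the data decide which ones travel together:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import AgglomerativeClustering

# Hypothetical stand-ins for measures usually filed under "working
# memory", "response inhibition", and "learning".
measures = ["wm_span", "wm_updating", "stop_signal_rt",
            "go_nogo_errors", "learning_rate", "reversal_errors"]

rng = np.random.default_rng(0)
scores = rng.standard_normal((200, len(measures)))  # rows = participants

# Cluster the measures (columns), not the participants, so the data
# rather than the textbook decides which measures group together.
X = StandardScaler().fit_transform(scores).T
labels = AgglomerativeClustering(n_clusters=3).fit_predict(X)

for k in range(3):
    print(f"data-driven group {k}:",
          [m for m, lab in zip(measures, labels) if lab == k])
```

With real scores, measures filed under different textbook categories can land in the same data-driven group, which is the kind of unexpected regrouping the study reported.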
Another study by Poldrack’s colleagues found that tasks meant to
measure either perception or memory “weren’t really measuring different
constructs after all,” Poldrack said. “It suggests that those two
categories are really imprecise.” It’s not that “perception” or “memory”
is a useless term, he emphasized. But “if we want to understand what
the brain does, we probably need much more precise ways to understand
particular functions.”
The fact that it’s not even clear how to differentiate tests of
perception from those of memory suggests that those categorical
constructs “may not actually be the real organizing features of the
mind,” Poldrack said.
Some scientists push back, arguing that so long as we know that the
visual cortex isn’t just involved in vision, or that a memory network is
doing more than its name suggests, we don’t necessarily need to rethink
the categories themselves. But “sometimes an overly broad, vague use of
a term can have detrimental effects on the types of experiments and
hypotheses we generate,” said John Krakauer, a neuroscientist at Johns Hopkins University.
That’s perhaps been most obvious in research on emotions and mood.
Fear and Confusion
Joseph LeDoux
is a neuroscientist at NYU known for his pioneering work on the
amygdala, which is often referred to as the fear center of the brain.
But that framing, he says, is very wrong — and very harmful. “I kept
being introduced over the years as someone who discovered how feelings
of fear come out of the amygdala,” he said. “But I would always kind of
flinch when I would be introduced this way. Finally, I had enough.”
LeDoux has spent the past decade emphasizing that the amygdala isn’t
involved in generating fear at all. Fear, he points out, is a cognitive
interpretation of a situation, a subjective experience tied up in memory
and other processes. The psychological phenomena that some people
experience as fear may be experienced as something very different by
others. Research shows that the feeling of fear arises in the prefrontal
cortex and related brain areas.
The amygdala, on the other hand, is involved with processing and
responding to threats — an ancient, subconscious behavioral and
physiological mechanism. “The evidence shows that it’s not always fear
that causes the behavior,” LeDoux said.
Calling the amygdala the fear center might seem innocuous, he
continued, but “then the amygdala inherits all the semantic baggage of
fear.” That mistake can distort attempts to develop medications,
including those aiming to reduce anxiety. When potential treatments are
tested in animals under stress, if the animals behave less timidly or
show less physiological arousal, it’s usually interpreted as a reduction
in anxiety or fear levels. But a medication can change someone’s
behavioral or physiological responses — those outputs of the amygdala —
without curing feelings of anxiety, LeDoux said.
“The whole field is suffering because of this confusion,” he said.
Similar problems occur in other areas, he added, such as studies of
perception, where the physical processing of the sensory stimulus and
the conscious experience of it are often bundled together. In both
cases, LeDoux believes “these need to be pulled apart.”
Functional in Context
But teasing apart the significance of different brain areas is
further complicated by the discovery that the involvement of neural
systems in particular functions isn’t simply all or nothing. Sometimes
it’s contingent on the details of what’s being processed.
Take the part of the medial temporal lobe called the perirhinal
cortex — a crucial component of the classic “memory” system in the
cortex. Elisabeth Murray
of the National Institute of Mental Health and others did experiments
in which humans and monkeys were asked to select a desired image from a
pair that were morphed to resemble each other to varying degrees.
They found that the perirhinal cortex was involved in the performance
of the task only when a particular amount of feature overlap was
present. If the images were more or less similar than that, the perirhinal cortex had nothing to do with how well the humans or monkeys did. Similarly,
the inferior temporal cortex, traditionally assigned a role in visual
perception, was found to be crucial for memory tasks, but only in
certain contexts.
To the retired neurobiologist Steven Wise, formerly of NIMH, the
findings imply that instead of categorizing cortical areas in terms of
their specialized visual, auditory, somatosensory or executive
functions, researchers should study the different combinations of
information they represent. One region might be involved in representing
simple combinations of features, such as “orange” and “square” for an
orange square. Other regions might have evolved to represent more
complex combinations of visual features, or combinations of acoustic or
quantitative information.
Wise argues that this brain organization scheme explains why there’s
so much unexpected functional overlap in the traditional maps of mental
activity. When each region represents a particular combination of
information, “it does that for memory, and for perception, and for
attention, and for the control of action,” Wise said.
That’s also why the perception and memory tasks that Murray used in
her experiments only sometimes involved the perirhinal cortex: As the
images in each task morphed, the combinations of features that
distinguished them changed.
Wise’s representational framework is just one way of rethinking the
brain’s subdivisions. While other researchers agree that the parts list
guiding most neuroscientific research has problems, there’s little
consensus about how to address it.
And even scientists in favor of a more radical rethinking of the
field find it difficult to outline. “It’s easy to show how things are
not working. The hard part now is where to go from here,” said Luiz Pessoa,
a neuroscientist at the University of Maryland. “I’ve [often] caught
myself using a whole lot of terms that I was criticizing the very use
of. How can I say everything without saying ‘attention,’ ‘emotion,’
‘motivation’?”
Cisek, in Montreal, is one of several researchers starting to rebuild
the conceptual categories from an evolutionary perspective. For the
past five years, he has been painstakingly making his way through
vertebrate evolution, examining the progressive specialization of behavioral systems.
“Functional subdivisions do exist in the brain,” he said. “And they
actually have an evolutionary history to them. If we could identify that
history, it’ll help us identify the concepts better.”
Cisek has already used his new breakdown of brain activities to
explain why, for instance, the basal ganglia plays a key role in some
decision-making tasks but not others. “You realize that neither the term
‘decision-making’ nor the term ‘attention’ actually corresponds to a
thing in the brain,” he said. “Instead, there are certain very pragmatic
circuits in the brain, and they do certain things like ‘approach’ or
‘avoid.’ … Some of those things are going to look a bit like attention.”
Buzsáki takes a similar view. “We have to look at brain mechanisms
first, and why and how those things evolved,” he said. For instance,
memories, future planning and imagination are all partly encoded by the
same neural mechanisms, which makes sense from an evolutionary
perspective because the same system can be recycled for different
purposes. “You may be better off thinking about all of [those] as one,”
he said.
This approach is already leading to some intriguing discoveries. For
years, Buzsáki has studied sharp wave-ripples, a type of brain activity
in the hippocampus that enables the storage and retrieval of memories.
But this month in Nature,
his former doctoral student David Tingley and others in Buzsáki’s lab
revealed an entirely new function for them: helping to regulate blood
sugar levels.
“We are linking two very different extremes,” Buzsáki said — a basic
metabolic process and a high-level cognitive one. He’s now hoping to
uncover a deeper connection between the two, and to obtain insights into
how sharp wave-ripples for body regulation might have been repurposed
for memory formation.
Don’t Panic
Alternative approaches to studying mental categories are possible,
too. Barrett, Pessoa and others, for instance, are considering
whole-brain neural activity and an assortment of behaviors at the same
time. “You study the whole system as its parts interact,” Barrett said.
Functional categories such as memory, perception and attention can then
be understood as “features of the brain state.”
Because of the counterintuitive groupings that emerged in his earlier
study of behavioral data, Poldrack continues to be interested in
model-free, data-driven searches for new categories. He thinks mental
concepts could potentially be rewritten in computational terms — perhaps
as a simplified version of the mathematical descriptions that define
layers in artificial neural networks.
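To give a sense of what "the mathematical descriptions that define layers" means, here is a minimal sketch of a single artificial-network layer; it illustrates the formalism only, and makes no claim about how mental concepts would map onto it:

```python
import numpy as np

def dense_layer(x, W, b):
    """One network layer: a linear map followed by a pointwise
    nonlinearity, y = relu(W @ x + b)."""
    return np.maximum(0.0, W @ x + b)

rng = np.random.default_rng(1)
x = rng.standard_normal(8)         # input features
W = rng.standard_normal((4, 8))    # weights (learned in a real network)
b = np.zeros(4)                    # biases
print(dense_layer(x, W, b))        # the layer's 4-dimensional output
```

The speculation is that a rewritten mental concept would be defined by compact expressions of this kind rather than by folk-psychological labels.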
Each of these potential solutions has shortcomings. “But you don’t
evaluate a new approach by all the questions it answers that the old one
couldn’t,” Barrett said. “You evaluate it on the basis of what new
questions does it stimulate.”
“There is no right way to do this,” she added. “There are only better ways and worse ways.”
Poldrack agreed. “I don’t think any of us would want to tell people:
Don’t use the word ‘memory’ anymore,” he said. But to understand the
brain, we might need to challenge our intuitions about how it works —
“in the same way that quantum mechanics is challenging to comport with
our understanding of physical phenomena in the world.”
Another important consideration is how meaningful a new framework
might end up being. “You may gain in terms of knowledge, but you may
actually stop understanding yourself,” Krakauer said.
When we wonder how the brain works, he explained, we want it to mean:
What’s happening in my brain when I fall in love? Or when I’m excited?
If we move too far away from our subjective experience and familiar
cognitive concepts, he worries that what we learn about the brain might
be like “42” in The Hitchhiker’s Guide to the Galaxy: the correct answer, but not to the question we had in mind. “Now, are we willing to live with that?” Krakauer asked.
Educational psychologist Kenneth Kiewra has some advice to help you retain and remember more.
Minda Zetlin
You’re sitting in an important meeting with a client, a professional workshop that you’ve paid a lot of money for, or an MBA class.
You want to absorb as much of the information you’re getting as you
possibly can. Is there a way to take notes that will help you both
remember more information and be able to bring back as much as possible
when you read over your notes later on?
Yes, it turns out. Kenneth Kiewra is a professor of
educational psychology at the University of Nebraska-Lincoln who's been
studying note-taking techniques for 40 years. In 2019, he distilled that
knowledge into a Quartz piece offering seven steps to taking better notes. It's all great advice and well worth the read. Here are my favorites among his tips:
1. Write as much as you can, writing by hand rather than on a keyboard or mobile device.
If you’re anything like me, you may find this advice
contradictory. After all, I can more or less capture everything someone
says when typing on a computer keyboard and much less if I’m writing
longhand, even when I’m using abbreviations, such as “bsns” for business
and “mgt” for management.
But, Kiewra explains, there are two reasons that taking
notes on paper is better than taking notes on a laptop. The first is
that students who are using a computer are much more likely to
multitask, checking email, doing other homework or even playing video
games whenever they get bored during a lecture. (Yup, I’ve done that.)
The second reason is that my tendency to write down everything someone
says, what Kiewra calls verbatim notes, may be useful when I’m doing
interviews, but it’s not the best way to absorb information. For one
thing, it’s easy to miss visual information, such as charts and graphs.
For another, research shows that verbatim notes are associated with
“shallow, non-meaningful learning,” he writes. “Because longhand notes
are qualitatively better than laptop notes, reviewing them leads to
higher achievement than reviewing laptop notes.”
2. Sweat the details.
Most of us, and most college students as well, are pretty
good at writing down the main points of any lecture or presentation, or
what Kiewra refers to as Level 1 learning. But we gain much more
knowledge and understanding when we go deeper than Level 1, past the
main points and general principles into the facts and details.
As an example, let’s take Brexit (Britain's planned exit from the European Union), something I’ve been writing about a lot lately.
Level 1.
Brexit is a mess. British leadership can’t seem to agree among
themselves or with European leaders about what the relationship between
the E.U. and Britain should look like when (and if) Britain leaves the
E.U.
Level 2. The
biggest disagreement among Britons is what should happen if — as seems likely — Britain is unable to negotiate trade agreements with the E.U.
that Parliament will approve by October 31, which is currently the
deadline for Brexit. Some, including Britain’s new prime minister, Boris
Johnson, favor a no-deal or “hard” Brexit, but most members of
Parliament, and most British people, seem to be against that.
Level 3. The
biggest area of disagreement, and thus the likeliest reason for a hard
Brexit, focuses on Ireland, where there’s a line dividing Northern
Ireland, which is part of the United Kingdom, from the Republic of Ireland, which
is a member of the E.U. Back when that line was an international border,
it was a focal point for violence. No one wants to see an international
border, complete with checkpoints and customs and immigration officials
there again. Unless Britain is willing to abide by E.U. customs laws,
at least temporarily, there’s no way to avoid it. But some Brexit
proponents say abiding by E.U. customs laws defeats the purpose of
leaving the E.U.
You get the idea. Kiewra writes that in one study, students were able to retain 80 percent of a lesson's main ideas but recalled less and less at Level 2, Level 3, and Level 4. In particular, only 13 percent of students wrote down examples, even though examples are often the best way to understand new ideas.
Go deeper than Level 1 whenever you can, writing down as
many details as you can capture. And for heaven’s sake, make sure to
write down any examples that come up.
3. Revise your notes as soon as you can.
Kiewra writes that one common mistake people make is that
they take notes, and then review notes, but never revise them. You
should revise your notes as soon as possible after a lecture, meeting,
or workshop, or even during the event if there’s a pause or break. Read
over your notes, Kiewra advises, using them to try to recall what was
said. Write down any additional details or points of information or
ideas that reading your notes helps bring to mind. (It’ll help if you
leave plenty of space in your original notes for these additions.)
Your revised notes will contain a lot more information and detail than your notes taken in the moment did. And both the notes themselves and the act of having written them and then added to them will help you retain much, much more of what you learned for future use.
Minda Zetlin is a business technology writer and speaker, co-author of The Geek Gap, and former president of the American Society of Journalists and Authors. She lives in Snohomish, Washington.
After
a year of lockdown, many of us are finding it hard to think clearly, or
remember what happened when. Neuroscientists and behavioural experts
explain why
Before
the pandemic, psychoanalyst Josh Cohen’s patients might come into his
consulting room, lie down on the couch and talk about the traffic or the
weather, or the rude person on the tube. Now they appear on his
computer screen and tell him about brain fog. They talk with urgency of
feeling unable to concentrate in meetings, to read, to follow
intricately plotted television programmes. “There’s this sense of
debilitation, of losing ordinary facility with everyday life; a
forgetfulness and a kind of deskilling,” says Cohen, author of the
self-help book How to Live. What to Do. Although restrictions are now
easing across the UK, with greater freedom to circulate and socialise,
he says lockdown for many of us has been “a contraction of life, and an
almost parallel contraction of mental capacity”.
This
dulled, useless state of mind – epitomised by the act of going into a
room and then forgetting why we are there – is so boring, so lifeless.
But researchers believe it is far more interesting than it feels: even
that this common experience can be explained by cutting-edge
neuroscience theories, and that studying it could further scientific
understanding of the brain and how it changes. I ask Jon Simons,
professor of cognitive neuroscience at the University of Cambridge,
could it really be something “sciencey”? “Yes, it’s definitely something
sciencey – and it’s helpful to understand that this feeling isn’t
unusual or weird,” he says. “There isn’t something wrong with us. It’s a
completely normal reaction to this quite traumatic experience we’ve
collectively had over the last 12 months or so.”
What
we call brain fog, Catherine Loveday, professor of cognitive
neuroscience at the University of Westminster, calls poor “cognitive
function”. That covers “everything from our memory, our attention and
our ability to problem-solve to our capacity to be creative.
Essentially, it’s thinking.” And recently, she’s heard a lot of
complaints about it: “Because I’m a memory scientist, so many people are
telling me their memory is really poor, and reporting this cognitive
fog,” she says. She knows of only two studies exploring the phenomenon
as it relates to lockdown (as opposed to what some people report as a
symptom of Covid-19, or long Covid): one from Italy, in which
participants subjectively reported these sorts of problems with
attention, time perception and organisation; another in Scotland, which
objectively measured participants’ cognitive function across a range of
tasks at particular times during the first lockdown and into the summer.
Results showed that people performed worse when lockdown started, but
improved as restrictions loosened, with those who continued shielding
improving more slowly than those who went out more.
Loveday
and Simons are not surprised. Given the isolation and stasis we have
had to endure until very recently, these complaints are exactly what
they expected – and they provide the opportunity to test their theories
as to why such brain fog might come about. There is no one explanation,
no single source, Simons says: “There are bound to be a lot of different
factors that are coming together, interacting with each other, to cause
these memory impairments, attentional deficits and other processing
difficulties.”
One powerful factor could be the
fact that everything is so samey. Loveday explains that the brain is
stimulated by the new, the different, and this is known as the orienting
response: “From the minute we’re born – in fact, from before we’re born
– when there is a new stimulus, a baby will turn its head towards it.
And if as adults we are watching a boring lecture and someone walks into
the room, it will stir our brain back into action.”
Most
of us are likely to feel that nobody new has walked into our room for
quite some time, which might help to explain this sluggish feeling
neurologically: “We have effectively evolved to stop paying attention
when nothing changes, but to pay particular attention when things do
change,” she says. Loveday suggests that if we can attend a work meeting
by phone while walking in a park, we might find we are more awake and
better able to concentrate, thanks to the changing scenery and the
exercise; she is recording some lectures as podcasts, rather than
videos, so students can walk while listening. She also suggests spending
time in different rooms at home – or if you only have one room, try
“changing what the room looks like. I’m not saying redecorate – but you
could change the pictures on the walls or move things around for
variety, even in the smallest space.”
The
blending of one day into the next with no commute, no change of scene,
no change of cast, could also have an important impact on the way the
brain processes memories, Simons explains. Experiences under lockdown
lack “distinctiveness” – a crucial factor in “pattern separation”. This
process, which takes place in the hippocampus, at the centre of the
brain, allows individual memories to be successfully encoded, ensuring
there are few overlapping features, so we can distinguish one memory
from another and retrieve them efficiently. The fuggy, confused
sensation that many of us will recognise, of not being able to remember
whether something happened last week or last month, may well be with us
for a while, Simons says: “Our memories are going to be so difficult to
differentiate. It’s highly likely that in a year or two, we’re still
going to look back on some particular event from this last year and say,
when on earth did that happen?”
Perhaps one of
the most important features of this period for brain fog has been what
Loveday calls the “degraded social interaction” we have endured. “It’s
not the same as natural social interaction that we would have,” she
says. “Our brains wake up in the presence of other people – being with
others is stimulating.” We each have our own optimum level of
stimulation – some might feel better able to function in lockdown with
less socialising; others are left feeling dozy, deadened. Loveday is
investigating the science of how levels of social interaction, among
other factors, have affected memory function in lockdown. She also
wonders if our alternative to face-to-face communication – platforms
such as Zoom – could have an impact on concentration and attention. She
theorises – and is conducting a study to explore this – that the lower
audio-visual quality could “create a bigger cognitive load for the
brain, which has to fill in the gaps, so you have to concentrate much
harder.” If this is more cognitively demanding, as she thinks, we could
be left feeling foggier, with “less brain space available to actually
listen to what people are saying and process it, or to concentrate on
anything else.”
Carmine
Pariante, professor of biological psychiatry at King’s College London,
is also intrigued by brain fog. “It’s a common experience, but it’s very
complex,” he says. “I think it is the cognitive equivalent of feeling
emotionally distressed; it’s almost the way the brain expresses sadness,
beyond the emotion.” He takes a psycho-neuro-immuno-endocrinological
approach to the phenomenon – which is even more fascinating than it is
difficult to say. He believes we need to think about the mind, the
brain, the immune and the hormonal systems to understand the various
mental and physical processes that might underlie this lockdown haze,
which he sees as a consequence of stress.
We
might all agree that the uncertainty of the last year has been quite
stressful – more so for some than for others. When our mind appraises a
situation as stressful, Pariante explains, our brain immediately
transmits the message to our immune and endocrine systems. These systems
respond in exactly the same way they did in early humans two million years ago
on the African savannah, when stress did not relate to home schooling,
but to fear of being eaten by a large animal. The heart beats faster so
we can run away, inflammation is initiated by the immune system to
protect against bacterial infection in case we are bitten, the hormone
cortisol is released to focus our attention on the predator in front of
us and nothing else. Studies have demonstrated that a dose of cortisol
will lower a person’s attention, concentration and memory for their
immediate environment. Pariante explains: “This fog that people feel is
just one manifestation of this mechanism. We’ve lost the function of
these mechanisms, but they are still there.” Useful for fighting a lion –
not for remembering where we put our glasses.
When
I have experienced brain fog, I have seen it as a distraction, a kind
of laziness, and tried to push through, to force myself to concentrate.
But listening to Loveday, Simons and Pariante, I’m starting to think
about it differently; perhaps brain fog is a signal we should listen to.
“Absolutely, I think it’s exactly that,” says Pariante. “It’s our body
and our brain telling us that we’re pushing it too much at the moment.
It’s definitely a signal – an alarm bell.” When we hear this alarm, he
says, we should stop and ask ourselves, “Why is my brain fog worse today
than yesterday?” – and take as much time off as we can, rather than
pushing ourselves harder and risking further emotional suffering, and
even burnout.
For
Cohen, the phenomenon of brain fog is an experience of one of the most
disturbing aspects of the unconscious. He talks of Freud’s theory of
drives – the idea that we have one force inside us that propels us
towards life; another that pulls us towards death. The life drive, Cohen
explains, impels us to create, make connections with others, seek “the
expansion of life”. The death drive, by contrast, urges “a kind of
contraction. It’s a move away from life and into a kind of stasis or
entropy”. Lockdown – which, paradoxically, has done so much to preserve
life – is like the death drive made lifestyle. With brain fog, he says,
we are seeing “an atrophy of liveliness. People are finding themselves
to be more sluggish, that their physical and mental weight is somehow
heavier, it’s hard to carry around – to drag.” Freud has a word for
this: Trägheit – translated as “sluggishness”, but which
Cohen says literally translates as “draggyness”. We could understand
brain fog as an encounter with our death drive – with the part of us
which, in Cohen’s words, is “going in the opposite direction of
awareness and sparkiness, and in the direction of inanimacy and shutting
down”.
This brings to mind another psychoanalyst: Wilfred Bion.
He theorised that we have – at some moments – a will to know something
about ourselves and our lives, even when that knowledge is profoundly
painful. This, he called being in “K”. But there is also a powerful will
not to know, a wish to defend against this awareness so that we can
continue to live cosseted by lies; this is to be in “–K” (spoken as
“minus K”). I wonder if the pandemic has been a reality some of us feel
is too horrific to bear. The uncertainty, the deaths, the trauma, the
precarity; perhaps we have unconsciously chosen to live in the misty,
murky brain fog of –K rather than to face, to suffer, the true pain and
horror of our situation. Perhaps we are having problems with our
thinking because the truth of the experience, for many of us, is simply
unthinkable.
I ask Simons if, after the
pandemic, he thinks the structure of our brains will look different on a
brain scan: “Probably not,” he says. For some of us, brain fog will be a
temporary state, and will clear as we begin to live more varied lives.
But, he says, “It’s possible for some people – and we are particularly
concerned about older adults – that where there is natural neurological
decline, it will be accelerated.”
Simons
and a team of colleagues are running a study to investigate the impact
of lockdown on memory in people aged over 65 – participants from a
memory study that took place shortly before the pandemic, who have now
agreed to sit the same tests a year on, and answer questions about life
in the interim. One aim of this study is to test the hypothesis of
cognitive reserve – the idea that having a rich and varied social life,
filled with intellectual stimulation, challenging, novel experiences and
fulfilling relationships, might help to keep the brain stimulated and
protect against age-related cognitive decline. Simons’ advice to us all
is to get out into the world, to have as rich and varied experiences and
interactions as we can, to maximise our cognitive reserve within the
remaining restrictions. The more we do, the more the brain fog should
clear, he says: “We all experience grief, times in our lives where we
feel like we can’t function at all,” he says. “These things are
mercifully temporary, and we do recover.”
Having been indoors for two weeks because someone in my family tested positive for the coronavirus, I am longing to get out and walk and sample the fresh air once more.
Regular exercise changes the structure of our bodies’ tissues in
obvious ways, such as reducing the size of fat stores and increasing
muscle mass. Less visible, but perhaps even more important, is the
profound influence exercise has on the structure of our brains – an influence that can protect and preserve brain health and function throughout life. In fact, some experts believe that the human brain may depend on regular physical activity to function optimally throughout our lifetime.
Here are just a few ways exercise changes the structure of our brain.
Memory
Many studies suggest that exercise can help protect our memory as we
age. This is because exercise has been shown to prevent the loss of
total brain volume (which can lead to lower cognitive function), as well
as preventing shrinkage in specific brain regions associated with
memory. For example, one magnetic resonance imaging (MRI) scan study
revealed that in older adults, six months of exercise training increases brain volume.
Another study showed that shrinkage of the hippocampus (a brain
region essential for learning and memory) in older people can be reversed by regular walking. This change was accompanied by improved memory function and an increase of the protein brain-derived neurotrophic factor (BDNF) in the bloodstream.
BDNF is essential for healthy cognitive function due to its roles in cell survival, plasticity (the brain’s ability to change and adapt from experience) and function. Positive links between exercise, BDNF and memory have been widely investigated and have been demonstrated in young adults and older people.
BDNF is also one of several proteins linked with adult neurogenesis, the brain’s ability to modify its structure by developing new neurons
throughout adulthood. Neurogenesis occurs only in very few brain
regions – one of which is the hippocampus – and thus may be a central
mechanism involved in learning and memory. Regular physical activity may
protect memory in the long term by inducing neurogenesis via BDNF.
While this link between exercise, BDNF, neurogenesis, and memory is
very well described in animal models, experimental and ethical
constraints mean that its importance to human brain function is not quite so clear. Nevertheless, exercise-induced neurogenesis is being actively researched as a potential therapy for neurological and psychiatric disorders, such as Alzheimer’s disease, Parkinson’s disease and depression.
Blood vessels
The brain is highly dependent on blood flow, receiving approximately
15% of the body’s entire supply – despite being only 2-3% of our body’s
total mass. This is because our nervous tissues need a constant supply
of oxygen to function and survive. When neurons become more active,
blood flow in the region where these neurons are located increases to meet demand. As such, maintaining a healthy brain depends on maintaining a healthy network of blood vessels.
Regular exercise increases the growth of new blood vessels in the
brain regions where neurogenesis occurs, providing the increased blood
supply that supports the development of these new neurons. Exercise also improves the health and function
of existing blood vessels, ensuring that brain tissue consistently
receives adequate blood supply to meet its needs and preserve its
function.
Finally, regular exercise can prevent, and even treat, hypertension (high blood pressure), which is a risk factor for development of dementia. Exercise works in multiple ways to enhance the health and function of blood vessels in the brain.
Inflammation
Recently, a growing body of research has centred on microglia, which
are the resident immune cells of the brain. Their main function is to
constantly check the brain for potential threats from microbes or dying or damaged cells, and to clear any damage they find.
With age, normal immune function declines and chronic, low-level
inflammation occurs in body organs, including the brain, where it
increases the risk of neurodegenerative diseases,
such as Alzheimer’s disease. As we age, microglia become less efficient
at clearing damage, and less able to prevent disease and inflammation.
This means neuroinflammation can progress, impairing brain functions – including memory.
But recently, we’ve shown that exercise can reprogramme these microglia
in the aged brain. Exercise was shown to make the microglia more energy
efficient and capable of counteracting neuroinflammatory changes that
impair brain function. Exercise can also modulate neuroinflammation in
degenerative conditions like Alzheimer’s disease and multiple sclerosis.
This shows us that the effects of physical activity on immune function may
be an important target for therapy and disease prevention.
So how can we ensure that we’re doing the right kind of exercise – or
getting enough of it – to protect the brain? As yet, we don’t have
robust enough evidence to develop specific guidelines for brain health,
though findings to date suggest that the greatest benefits are to be
gained by aerobic exercise – such as walking, running, or cycling. It’s recommended adults get a minimum of 150 minutes per week
of moderate intensity aerobic exercise, combined with activities that
maintain strength and flexibility, to maintain good general health.
It must also be noted that researchers don’t always find that exercise has a beneficial effect
on the brain in their studies – likely because different studies use
different exercise training programmes and measures of cognitive
function, making it difficult to directly compare studies and results.
But regardless, plenty of research shows us that exercise is beneficial
for many aspects of our health, so it’s important to make sure you’re
getting enough. We need to be conscious of making time in our day to be
active – our brains will thank us for it in years to come.