Showing posts with label Brain.

Tuesday, 24 November 2020

Overtaxed Working Memory Knocks the Brain Out of Sync

Memory 


Researchers find that when working memory gets overburdened, dialogue between brain regions breaks down. The discovery provides new support for a broader theory about how the brain operates.

Quanta Magazine


Humans can hold only four or five items in their conscious awareness, or working memory, at one time. Overloading that capacity causes the neurological juggling act behind working memory to fall apart. Credit: Malina Omut for Quanta Magazine.

In 1956, the renowned cognitive psychologist George Miller published one of the field’s most widely cited papers, “The Magical Number Seven, Plus or Minus Two.” In it, he argued that although the brain can store a whole lifetime of knowledge in its trillions of connections, the number of items that humans can actively hold in their conscious awareness at once is limited, on average, to seven.

Those items might be a series of digits, a handful of objects scattered around a room, words in a list, or overlapping sounds. Whatever they are, Miller wrote, only seven of them can fit in what’s called working memory, where they are available for our focused attention and other cognitive processes. Their retention in working memory is short-lived and bounded: When they’re no longer actively being thought about, they’re stored elsewhere or forgotten.

Since Miller’s time, neuroscientists and psychologists have continued to study working memory and its surprisingly strict limitations. They have found that the limit may really be closer to four or five items than seven. And they have studied the ways in which people work around this constraint: We can remember all the digits of a phone number by “chunking” digits (remembering 1, then 4, as the single item 14, for instance), or develop mnemonic devices for shuffling random digits of pi out of longer-term storage.
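To make the chunking trick concrete, here is a minimal illustrative sketch (a toy example, not taken from Miller's paper) of how re-grouping a digit string cuts the number of items you have to hold in mind:

```python
# Toy illustration of "chunking": re-group a digit string so that the number
# of separate items to hold in working memory drops below the ~4-item limit.

def chunk(digits: str, size: int = 2) -> list[str]:
    """Group a digit string into chunks of `size` digits."""
    return [digits[i:i + size] for i in range(0, len(digits), size)]

number = "1471625983"
print(len(number), "single digits:", list(number))   # 10 separate items
print(len(chunk(number)), "chunks:", chunk(number))  # 5 items, within the limit
```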

But the explanation for why working memory starts to falter at such a seemingly low threshold has been elusive. Scientists can see that any attempt to exceed that limit causes the information to degrade: Neuronal representations get “thinner,” brain rhythms change and memories break down. This seems to occur with an even smaller number of items in patients who have been diagnosed with neurological disorders, such as schizophrenia.

The mechanism causing these failures, however, has remained unknown until recently.

In a paper published in Cerebral Cortex in March 2018, three scientists found that a significant weakening in “feedback” signals between different parts of the brain is responsible for the breakdown. The work not only provides insights into memory function and dysfunction, but also offers further evidence for a burgeoning theory of how the brain processes information.

Synchronized Humming in the Brain

Earl Miller, a neuroscientist at the Picower Institute for Learning and Memory at the Massachusetts Institute of Technology; Dimitris Pinotsis, a research affiliate in his lab; and Timothy Buschman, an assistant professor at Princeton University, wanted to know what sets the capacity limit of working memory so low.

They already knew that a network involving three brain regions — the prefrontal cortex, the frontal eye fields and the lateral intraparietal area — is active in working memory. But they had yet to observe a change in neural activity that corresponded to the steep transition between remembering and not remembering that comes with exceeding the working memory limit.

So they returned to a working memory test that Miller’s lab had performed a few years earlier, in which the researchers showed monkeys a series of screens: first, a set of colored squares, followed briefly by a blank screen, and then the initial screen once more, this time with the color of one square changed. The animals had to detect the difference between the screens. Sometimes the number of squares fell below their working memory capacity, sometimes above. Electrodes placed deep in the monkeys’ brains recorded the timing and frequency of brain waves produced by various populations of neurons as they completed each task.

These waves are essentially the coordinated rhythms of millions of neurons that become active and go quiet simultaneously. When brain areas exhibit matching oscillations, both in time and in frequency, they’re said to be synchronized. “It’s like they’re humming together,” Miller said. “And the neurons that hum together are talking.” He likens it to a traffic system: The brain’s physical connections act like roads and highways, while the patterns of resonance created by these oscillating brain waves “humming” together are the traffic lights that actually direct the flow of traffic. This setup, researchers hypothesize, somehow seems to help “bind” active networks into a firmer representation of an experience.

In their recent work, Miller and his colleagues mined the oscillation data they’d collected from the monkeys for information about how this three-part memory network functions. They built a detailed mechanistic model that incorporated assumptions about the network’s structure and activity, based on previous research: the locations and behaviors (say, excitatory or inhibitory) of specific neural populations, for example, or the frequencies of certain oscillations. The researchers then generated several competing hypotheses for how the different brain areas might be “talking” to one another — including the direction and strength of that dialogue — as the monkeys had to remember more and more items. They compared those computations to their experimental data to determine which of the scenarios was most likely.
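The sketch below is only a toy illustration of that general workflow: simulate each competing hypothesis about the network and score it against the recorded signal. It is not the authors' actual model (they fit a detailed biophysical model to the monkey recordings), and every name and number in it is invented for illustration.

```python
# Hypothetical toy example: compare two "connectivity" hypotheses by simulating
# each one and scoring it against an observed signal. All values are invented.

import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1000)                     # one second of "recording"

def simulate(feedback_strength):
    """Toy signal: a slow top-down rhythm modulating a fast bottom-up rhythm."""
    top_down = np.sin(2 * np.pi * 8 * t)        # slow "feedback" rhythm
    bottom_up = np.sin(2 * np.pi * 40 * t)      # fast "feedforward" rhythm
    return bottom_up * (1 + feedback_strength * top_down)

# Pretend this is the measured signal (strong feedback plus recording noise).
observed = simulate(0.8) + 0.2 * rng.standard_normal(t.size)

# Competing hypotheses about the strength of the top-down connection.
hypotheses = {"feedback intact": 0.8, "feedback weakened": 0.1}

for name, strength in hypotheses.items():
    mse = np.mean((observed - simulate(strength)) ** 2)
    print(f"{name}: mean squared error = {mse:.3f}")
# The hypothesis with the lower error is the better account of the data.
```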

Credit: Lucy Reading-Ikkanda/Quanta Magazine.

Their modeling confirmed that the three brain regions act like jugglers engaged in a complex game of catch. The prefrontal cortex seems to help construct an internal model of the world, sending so-called “top-down,” or feedback, signals that convey this model to lower-level brain areas. Meanwhile, the superficial frontal eye fields and lateral intraparietal area send raw sensory input to the deeper areas in the prefrontal cortex, in the form of bottom-up or feedforward signals. Differences between the top-down model and the bottom-up sensory information allow the brain to figure out what it’s experiencing, and to tweak its internal models accordingly.

Miller and his colleagues found that when the number of items to be remembered exceeded the capacity of the monkeys’ working memory, the top-down feedback connection from the prefrontal cortex to the other two regions broke down. The feedforward connections, on the other hand, remained just fine.

The weakening of the feedback signals, according to the group’s models, led to a loss of synchrony between the brain areas. Without the prediction-oriented communications from the prefrontal cortex, the working memory network fell out of sync.

Updating the Model

But why is the top-down feedback so vulnerable to an increase in the number of items to be remembered? The researchers’ hypothesis is that the modeled information coming from the prefrontal cortex essentially represents a set of predictions about what the brain will perceive in the world — in this case, the contents of the items being held in working memory. “For example, as you are reading this sentence, you will have expectations about the current word, phrase and sentence,” Karl Friston, a neuroscientist at University College London who was not involved with the study, wrote in an email. “Having a representation or expectation about the current sentence means you have an implicit representation of the past and future.”

Many neuroscientists believe that the brain relies heavily on such “predictive coding” of sensory data to perform its routine cognitive and command functions. But Miller and his colleagues theorize that when the quantity of items placed in working memory gets too large, the number of possible predictions for those items cannot easily be encoded into the feedback signal. As a result, the feedback fails and the overloaded working memory system collapses.

Miller’s lab and others are working to carve out a more important role for the interplay between brain waves in scientists’ model of working memory, which traditionally places most of the emphasis on the firing activity of individual neurons. They’re also currently investigating why the upper bound on working memory hovers around four or five items, and not some other number. Miller thinks the brain is juggling the items being held in working memory one at a time, in alternation. “That means all the information has to fit into one brain wave,” he said. “When you exceed the capacity of that one brain wave, you’ve reached the limit on working memory.”
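As a loose analogy for that capacity idea (not the lab's model, just a toy), the sketch below superimposes several randomly chosen item vectors onto a single fixed-size "carrier" and then tries to read each item back; readout tends to degrade as more items are packed in.

```python
# Toy analogy only: pack N item vectors into one fixed-size carrier by
# superposition, then try to identify each item again. Accuracy tends to fall
# as N grows, echoing the idea that only so much fits into a single wave.

import numpy as np

rng = np.random.default_rng(1)
DIM = 32                                      # fixed "capacity" of the carrier

def readout_accuracy(n_items: int, n_distractors: int = 9) -> float:
    items = rng.standard_normal((n_items, DIM))
    carrier = items.sum(axis=0)               # everything packed in at once
    correct = 0
    for target in items:
        # Decode by checking whether the true item beats random distractors.
        lineup = np.vstack([target, rng.standard_normal((n_distractors, DIM))])
        correct += int(np.argmax(lineup @ carrier) == 0)
    return correct / n_items

for n in (2, 4, 8, 16, 32):
    print(f"{n:2d} items held at once -> readout accuracy {readout_accuracy(n):.2f}")
```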

“The question now is where all this is going to take us,” said Rufin VanRullen, a researcher at the French National Center for Scientific Research who finds the team’s modeling and conclusions “powerful,” pending further experimental confirmation. “We need to actually go inside the brain and find more direct evidence for these connections.”

The potential payoff is high. Cementing a predictive coding model for working memory won’t just enable a better understanding of how the brain works and what might go wrong in neurological diseases. It also has critical implications for what we mean by “intelligence” — and even selfhood, according to Friston. As a start, having a better grasp of what the brain’s feedback connections are doing could lead to big steps in artificial intelligence research, which currently focuses more on feedforward signals and classification algorithms. “But sometimes a system might need to make a decision not about what it sees but based on what it remembers,” Pinotsis said.

Jordana Cepelewicz is a staff writer at Quanta Magazine who covers biology.

Sunday, 12 July 2020

Slow Walking

Walking



  • 12 October 2019

How fast people walk in their 40s is a sign of how much their brains, as well as their bodies, are ageing, scientists have suggested.
Using a simple test of gait speed, researchers were able to measure the ageing process.
Not only were slower walkers' bodies ageing more quickly - their faces looked older and they had smaller brains.
The international team said the findings were an "amazing surprise".
Doctors often measure gait speed to gauge overall health, particularly in the over-65s, because it is a good indicator of muscle strength, lung function, balance, spine strength and eyesight.
Slower walking speeds in old age have also been linked to a higher risk of dementia and decline.

'Problem sign'

In this study of 1,000 people in New Zealand - born in the 1970s and followed to the age of 45 - the walking speed test was used much earlier, on adults in mid-life.
The study participants also had physical tests, brain function tests and brain scans, and during their childhood they had had cognitive tests every couple of years.
"This study found that a slow walk is a problem sign decades before old age," said Prof Terrie E Moffitt, lead author from King's College London and Duke University in the US.
Even at the age of 45, there was a wide variation in walking speeds with the fastest moving at over 2m/s at top speed (without running).
In general, the slower walkers tended to show signs of "accelerated ageing" with their lungs, teeth and immune systems in worse shape than those who walked faster.
[Image: Researchers tested the walking speed of participants on an 8m-long pad. Credit: Duke University]
The more unexpected finding was that brain scans showed the slower walkers were more likely to have older-looking brains too.
And the researchers found they were able to predict the walking speed of 45-year-olds using the results of intelligence, language and motor skills tests from when they were three.
The children who grew up to be the slowest walkers (with a mean gait speed of 1.2m/s) had, on average, an IQ 12 points lower than those who were the fastest walkers (1.75m/s) 40 years later.

Lifestyle link

The international team of researchers, writing in JAMA Network Open, said the differences in health and IQ could be due to lifestyle choices or a reflection of some people having better health at the start of life.
But they suggest there are already signs in early life of who is going to fare better in health terms in later life.
The researchers said measuring walking speed at a younger age could be a way of testing treatments to slow human ageing.
A number of treatments, from low-calorie diets to taking the drug metformin, are currently being investigated.
It would also be an early indicator of brain and body health so people can make changes to their lifestyle while still young and healthy, the researchers said.


Wednesday, 29 April 2020

This Is Why You Can’t Remember Yesterday

Remembering

Science explains why time is so disorienting and mind-numbing these days.

If the thousands of tweets referencing the movie Groundhog Day are any indication, those Americans under stay-at-home directives are feeling the dull weight of monotony pressing down on their shoulders. Variety may be the spice of life, but it’s also the substance of memory. Without novel experiences to demarcate one day or week from the next, the shape of time can bend and stretch in disorienting ways.
“When we look back at those days and weeks where not much happened — where it’s the same every day — not much is stored in memory and time feels [as though it has] passed very quickly,” says Marc Wittmann, a research fellow at the Institute for Frontier Areas in Psychology and Mental Health in Freiburg, Germany.
Wittmann has written extensively about “felt time.” He says that while monotony can compress the brain’s perception of time over long periods, boredom can slow down the perception of time’s passage “in the here and now” — meaning minutes or hours seem to drag on and on.
Along with boredom, anxiousness can also make time appear to slow to a snail’s pace, he says. While the overlapping Covid-19-related threats of sickness, economic hardship, and social instability are enough to make anyone feel uneasy, experts who study social isolation say that too little face-to-face interaction can be a potent promoter of anxiety in and of itself.

Paranoia, missing routines, and disorientation

“Human beings by their nature are social animals, and when you deprive them of social interaction, that has massive repercussions,” says Dr. Terry Kupers, a psychiatrist at the Wright Institute in Berkeley, California.
Much of Kupers’ work has examined the psychological effects of solitary confinement in U.S. prisons. “The situation of a prisoner in solitary confinement is qualitatively different and much more dire than that of a citizen in shelter-in-place,” he says. “But I think people who are sheltering in place can experience some of the same psychological symptoms as people in solitary confinement.” That may be especially true for those Americans who live alone and are not able to connect face-to-face with friends and loved ones.
“One of the first symptoms to emerge is anxiety,” Kupers says. “People who are isolated have panic attacks and feel very anxious.” Paranoia is another common emotion.
“When you don’t have other people to talk to, thoughts and ideas can get very jumbled.” He says that human beings seem to be somewhat hardwired for paranoid thinking, and that spending time in the company of others tends to moderate this emotion. When that kind of interaction is denied or limited, thoughts can wander into irrational places.
Zoom calls and FaceTime chats — as well as regular phone calls, text exchanges, and other digital interactions — are surely better than nothing, Kupers says. “When that’s the only way [to connect], I think it’s important to do that,” he adds. “But I think [these are] nowhere near the same as the contact we would have if we were together in a room.”
Finally, he says besides missing social interactions, the lack of a regular routine can cause issues. “A disorientation comes from not having markers associated with a daily schedule,” he says. To avoid this disorientation, it’s helpful to get up at the same time each day, and to follow a regular schedule of work, chores, exercise, and other activities.
“Creating a schedule that approximates normal life can help one from falling into disorientation and confusion,” he says. Going to bed and getting up at the same time each day can also help calibrate the body’s internal clocks in ways that promote deep sleep and prevent daytime grogginess and other cognitive symptoms.

The anxious brain craves “flow activities”

While distraction is normally viewed as a bad thing, it can be helpful in certain situations — like when a person is anxious and trying to avoid unconstructive thoughts.
“There are things the anxious brain wants to do, and those are not necessarily helpful things,” says Kate Sweeny, a professor of psychology at the University of California, Riverside. Worrying is one of them, she says. Fretting about Covid-19 or the challenges it presents is useful if a person can take steps to address those concerns. “But if you’ve done what you can, your goal should be to actively engage in activities that distract your brain from those anxious thoughts,” she says.
Some of her research has examined how different forms of distraction can help people weather periods of uncertainty and anxiety — like when someone is awaiting the results of a biopsy. She says that the most helpful activities are ones that induce “flow,” or the experience of complete enjoyment or absorption.
A riveting film or TV show could fit the bill nicely, which helps explain why a lot of Americans fell hard for the misfit intrigue of Tiger King. But Sweeny says that the most flow-y pursuits tend to have elements of personal challenge and feedback.
Baking bread — another activity that seems to have caught the fancy of cooped-up Americans — checks these boxes. So do video games. One of Sweeny’s studies, published last year in the journal Emotion, found that Tetris was broadly effective at inducing flow, and the same is surely true of newer, more advanced video games. (The attention-grabbing, flow-inducing power of video games is so well established that “gamification” is now a popular approach to UX design in apps and online platforms — from language-learning programs to social media sites.)
“I’m not saying everyone should play video games to manage their worry,” she says. “But if you’re feeling overwhelmed with worry, there’s some inherent utility in turning that down, and flow activities can do that.”

Written by Markham Heid

Health and science writer. Father of two. Technoskeptic, though not a technocynic.

Friday, 24 April 2020

How to Train Your Brain to Remember Almost Anything

Remembering





Success is largely based on what you know — everything you know informs the choices you make. And those choices are either getting you closer to what you want or increasing the distance between you and your ultimate goals in life.
Many people want to learn better and faster, retain more information, and be able to apply that knowledge at the right time.
But the reality is that we forget a lot of what we learn. Human forgetting follows a pattern. In fact, research shows that within just one hour, if nothing is done with new information, most people will have forgotten about 50% of what they learned. After 24 hours, this amount increases to 70%, and if a week passes without that information being used, up to 90% of it could be lost.
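As a rough illustration, the figures quoted above can be written down directly and interpolated. The exact percentages and the shape of the curve vary across studies, so treat this as a sketch of the quoted numbers, not a fitted model.

```python
# Illustrative only: encode the retention figures quoted above and interpolate
# between them. The exact percentages vary across studies and individuals.

import bisect

# (hours since learning, approximate fraction forgotten) from the paragraph above
FORGETTING = [(0, 0.0), (1, 0.5), (24, 0.7), (24 * 7, 0.9)]

def fraction_forgotten(hours: float) -> float:
    """Linearly interpolate the quoted forgetting figures."""
    times = [t for t, _ in FORGETTING]
    if hours >= times[-1]:
        return FORGETTING[-1][1]
    i = bisect.bisect_right(times, hours)
    (t0, f0), (t1, f1) = FORGETTING[i - 1], FORGETTING[i]
    return f0 + (f1 - f0) * (hours - t0) / (t1 - t0)

for h in (0.5, 6, 48):
    print(f"after {h} hours without review: ~{fraction_forgotten(h) * 100:.0f}% forgotten")
```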
To improve knowledge acquisition and retention, new information must be consolidated and securely stored in long-term memory.
According to Elizabeth Bjork, PhD, a professor of cognitive psychology at UCLA who worked on a theory of forgetting along with Piotr Wozniak, a Polish researcher best known for his work on SuperMemo (a learning system based on spaced repetition), long-term memory can be characterized by two components: retrieval strength and storage strength. Retrieval strength measures how likely you are to recall something right now, or how close it is to the surface of your mind. Storage strength measures how deeply the memory is rooted.
If we want our learning to stick, we have to do more than just aim to read a book every week or passively listen to an audiobook or podcast. Instead, reread chapters you didn’t comprehend the first time, write down or practice what you learned the previous week before continuing to the next chapter or lesson, or take notes, if that works for you. If you are struggling to remember, refer to the information. By forcing yourself to remember past information, you’re cementing the new knowledge in your mind.
Research indicates that when a memory is first recorded in the brain—specifically in the hippocampus—it’s still “fragile” and easily forgotten.
Our brains are constantly recording information on a temporary basis to separate vital information from the clutter — scraps of conversations you hear on your way to work, things you see, what the person in front of you was wearing, discussions at work, etc. The brain dumps everything that doesn’t come up again in the recent future as soon as possible to make way for new information. If you want to remember or use new information in the future, you have to deliberately work on storing it in your long-term memory.
This process is called encoding — imprinting information into the brain. Without proper encoding, there is nothing to store, and attempts to retrieve the memory later will fail.
In the late 19th century, the psychologist Hermann Ebbinghaus was the first to systematically tackle the analysis of memory. His forgetting curve, which describes the decline of memory retention over time, contributed to the field of memory science by charting how quickly the brain loses information that is not revisited.
Ebbinghaus once said, “With any considerable number of repetitions, a suitable distribution of them over a space of time is decidedly more advantageous than the massing of them at a single time.”
In a University of Waterloo report that looks at how we forget, the authors argue that when you deliberately remember something you’ve learned or seen not long ago, you send a big signal to your brain to hold onto that information. They explain, “When the same thing is repeated, your brain says, ‘Oh — there it is again, I better keep that.’ When you are exposed to the same information repeatedly, it takes less and less time to ‘activate’ the information in your long-term memory and it becomes easier for you to retrieve the information when you need it.”
Most lifelong learning will inevitably involve some reading and listening, but by using a variety of techniques to commit new knowledge to memory, you will cement new information quicker and better.

Spaced repetition

One method is spaced repetition — revisiting what you are trying to retain at intervals over a period of time. For example, when you read a book and really enjoy it, instead of putting it away, reread it after a month, then again after three months, then after six months, and then again after a year. Spaced repetition leverages the spacing effect, a memory phenomenon that describes how our brains learn better when we space information out over time. Learning something new drives out old information if you don’t allow sufficient time for new neural connections to solidify.
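That rereading schedule is easy to turn into a simple reminder list. The sketch below just mirrors the article's example intervals (one, three, six and twelve months); it is not taken from any particular spaced-repetition system.

```python
# Minimal sketch of the expanding review schedule described above. Intervals
# mirror the article's example (1, 3, 6 and 12 months); a month is approximated
# as 30 days to keep the sketch dependency-free.

from datetime import date, timedelta

def review_dates(first_read: date, months=(1, 3, 6, 12)) -> list[date]:
    return [first_read + timedelta(days=30 * m) for m in months]

for d in review_dates(date(2020, 4, 24)):
    print("review again on", d.isoformat())
```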

The 50/50 rule

Dedicate 50% of your time to learning anything new and the rest of your time to sharing or explaining what you have learned to someone or your audience.
Research shows that explaining a concept to someone else is the best way to learn it yourself. The 50/50 rule is a better way to learn, process, retain, and remember information.
For example, instead of completing a book, aim to read half, and try recalling, sharing, or writing down the key ideas you have learned before proceeding. Or better still, share that new knowledge with your audience.
You could even apply the 50/50 rule to individual chapters instead of the whole book. This learning method works really well if you aim to retain most of what you are learning. The ultimate test of your knowledge is your capacity to transfer it to another person.
“The best way to learn something truly is to teach it — not just because explaining it helps you understand it, but also because retrieving it helps you remember it,” says Adam Grant.

Topic demonstrations

Another valuable method is to make the most of topic demonstrations to understand a topic inside out. Unlike simply reading or listening to an explanation, demonstrations show you how something works and help you visualize the concept. When learning photography, design, public speaking, negotiation, or a useful new technology, watching instructional videos that demonstrate what you’re trying to learn can improve your retention rate.

Sleep

Finally, use sleep as a powerful aid between learning sessions. Sleep after learning is a critical part of the memory-creation process, and sleep before learning strengthens your capacity.
Evidence shows that short naps help reinforce learned material. The researchers behind one nap study explain, “We suggest that the mere onset of sleep may initiate active processes of consolidation which — once triggered — remain effective even if sleep is terminated shortly after.” Their results show that even a brief period of sleep is enough to help you remember what you’ve learned. Longer naps (60-plus minutes) are also great for storing new information in our permanent memory. A good night’s sleep is even better for memory recall and clear thinking.
The more the mind is used, the more robust memory can become. Taking control of information storage will not only help you retain new bits of information but also reinforce and refine the knowledge you already have.

Written by

Founder @AllTopStartups | Featured at Business Insider, Forbes, etc. I share practical tools for wealth, health, and happiness at https://postanly.substack.com


Friday, 1 November 2019

How Will Your Thinking and Memory Change with Age?

How Will Your Thinking and Memory Change with Age?:

Click on the link above for the full article!

How well eight-year-olds score on a test of thinking skills may be a predictor of how they will perform on tests of thinking and memory skills when they are 70 years old, according to a study published in the October 30, 2019, online issue of Neurology®, the medical journal of the American Academy of Neurology. The study also found that education level and socioeconomic status were predictors of thinking and memory performance. Socioeconomic status was determined by people’s occupation at age 53.

Friday, 24 May 2019

Early Life Exposure to Nicotine Alters Neurons, Predisposes Brain to Addiction Later in Life

Early Life Exposure to Nicotine Alters Neurons, Predisposes Brain to Addiction Later in Life | Newswise: News for Journalists:

Neonatal exposure to nicotine alters the reward circuitry in the brains of newborn mice, increasing their preference for the drug in later adulthood, report researchers at University of California San Diego School of Medicine in a study published “in press” April 24, 2019, in Biological Psychiatry.

A stained micrograph of a mouse Purkinje neuron, a type of brain cell that releases the neurotransmitter GABA and is affected by nicotine exposure. Credit: Cell Image Library, NCMIR.

A UC San Diego School of Medicine team of scientists, headed by senior author Davide Dulcis, PhD, associate professor in the Department of Psychiatry, with colleagues at Veterans Affairs San Diego Healthcare System and Michigan State University, found that exposure to nicotine in the first few weeks of life (through maternal lactation) induced a variety of long-term neurological changes in young mice.

Specifically, it caused a form of neuroplasticity that resulted in increased numbers of modified neurons in the ventral tegmental area (VTA) of the brain following nicotine re-exposure as adults. These neurons displayed a different biochemistry than other neurons, including greater receptivity to nicotine and a greater likelihood of subsequent addictive behavior.

“Previous studies have already shown that maternal smoking and early postnatal exposure to nicotine are associated with altered children’s behaviors and an increased propensity for drug abuse in humans,” said Dulcis. “This new research in mice helps elucidate the mechanisms of how and why. Neonatal nicotine exposure primes VTA neurons for a fate they normally would not have taken, making them more susceptible to the effects of nicotine when the animals are again exposed to nicotine later in life.”  

When young neurons are exposed to a foreign drug, such as nicotine, they create a molecular “memory,” said first author Ben Romoli, PhD, a postdoctoral fellow in the Dulcis lab. By increasing the expression of nicotine receptors and of the molecular marker Nurr1, a protein that is normally found only in dopaminergic neurons, these GABA- and glutamate-expressing neurons acquire the “readiness” to switch to a dopaminergic program when properly motivated by nicotine in the adult.

“We found that when the same animals are exposed to nicotine in adulthood, a fraction of these ‘primed’ glutamatergic neurons in the reward center begins to express genes required to produce dopamine. More dopamine in the system generates enhanced reward responses that lead to increased nicotine preference.”

Dulcis said uncovering the molecular mechanism and the identity of the neuronal network involved is an important step toward a fuller comprehension of how a complex condition like addiction may work.

“Our pre-clinical work identified new cellular and molecular targets that may guide future clinical studies to refine treatment strategies,” Dulcis said. “Because we found that this form of nicotine-induced neuroplasticity facilitates addiction to other addictive substances, such as ethanol in adults, uncovering the mechanism contributing to increased addiction susceptibility offers the rare opportunity to discover new ways to interfere with the mechanism of drug-mediated plasticity and prevent the negative consequences on reward-seeking behavior in the adult.”

Researchers said the results are highly relevant to tobacco control programs because the neonatal nicotine effects observed in the study were induced by exposure through maternal lactation, and current state and local policies do not regulate this particular type of nicotine intake.

“We are planning to investigate whether early exposure to other commonly used drugs, such as alcohol or recently legalized marijuana or opioids, can induce similar adaptations of the reward center that affects drug preferences in adulthood,” said Dulcis. “It would be also interesting to determine whether this form of neurotransmitter plasticity is inducible or reversible at different stages of life when the brain is still extremely plastic and prone to drug addiction, like in adolescence.”

The scientists are also investigating applications aimed at improving the behavioral performance of animal models for diseases associated with a loss of dopaminergic neurons, such as Parkinson’s disease.

Co-authors of the study include: Adrian F. Lozada and Darwin K. Berg, UC San Diego; Ivette M. Sandoval and Frederic P. Manfredsson, Michigan State University; and Thomas S. Hnasko, UC San Diego and Veterans Affairs San Diego Healthcare System.

Funding for this research came, in part, from the National Institutes of Health, the Kavli Institute for Brain and Mind (grant 2012-18), the Tobacco-Related Disease Research Program (271R-0020) and the National Institute of Neurological Disorders and Stroke (5R21NS098079).

Sunday, 10 June 2018

How To Boost Brain Health

Boost Brain Power


 Vital information on brain health!

Story at-a-glance

  • The less inflammatory your diet is, the faster you’re going to get well, because inflammation is nearly always a contributor to neurological dysfunction. Sugar, damaged omega-6 oils, trans fats and processed vegetable oils need to be avoided
  • A simple stress reduction technique done before meals can improve your digestion and absorption of nutrients, boost your immune function and relax your body
  • Melatonin provides the best protection for your neurons against free radical damage, and you need a healthy release of melatonin through the night to calm and heal your brain
  • As you get older, the enzymatic activity required to produce melatonin becomes impaired. To boost your body’s production, take 5-HTP before bedtime. 5-HTP is converted to serotonin, which is further converted to melatonin
  • Natural alternatives to diuretics, commonly prescribed for hypertension, are discussed, as are the many health benefits of proteolytic enzymes

Thursday, 1 March 2018

The Brain Changing Benefits of Exercise

The Brain Changing Benefits Of Exercise

It has been well documented how beneficial exercise is for the brain and for bodily fitness in general.


What's the most transformative thing that you can do for your brain today? Exercise! says neuroscientist Wendy Suzuki. Get inspired to go to the gym as Suzuki discusses the science of how working out boosts your mood and memory -- and protects your brain against neurodegenerative diseases like Alzheimer's.

This talk was presented at an official TED conference.

About the speaker
Wendy Suzuki · Neuroscientist, author
Wendy Suzuki is researching the science behind the extraordinary, life-changing effects that physical activity can have on the most important organ in your body: your brain.

Friday, 30 October 2015

Study Reveals How Brain Multitasks

 It is interesting to find out how the brain multitasks and what goes on when it does. The brain is such a complex part of the body, where so much of our work and functioning takes place, and being aware of its operations helps us understand how we react to various situations.

 Study Reveals How Brain Multitasks

Friday, 16 October 2015

Can Work Stress Be Linked to Stroke?

 There are so many relatively young people who have strokes today that this article deserves consideration if you have a lot of stress in your life, whether at home or at work. Strokes are serious conditions that can be difficult to recover from; some people never do.

 
Can Work Stress Be Linked to Stroke?

Saturday, 19 September 2015