Test-driving the brain could reveal early signs of Alzheimer’s


By Brian Murphy, Queen’s University Belfast

In some ways the human brain is like a well-oiled car engine, purring along without being noticed, until something goes awry. Neuroimaging techniques such as electroencephalography (EEG) and functional MRI (fMRI) – which measures brain activity by detecting associated changes in blood flow – give us different ways of peering into the working brain, but mostly this happens in the rarefied world of academic research experiments, or in a clinic once we already know something has gone wrong.

We know much less about how the brain works in our day-to-day lives, when people are doing real-life tasks like talking to a friend, working at their desk, or watching TV.

Why does this matter? Well, think about the car again. If you’re buying a used car, you’ll want to know how “healthy” it is. One of the first things you’ll do is listen to the engine. But you can tell much more by how it performs on a test drive – slow around town, fast out on the motorway, and back by some winding country roads. Two important things are happening in neuroscience now that mean that soon you, or your doctor, might be able to “test-drive” your brain regularly, to catch any early signs of destructive diseases like Alzheimer’s.

One development is that EEG or “brainwave” technology is now getting so compact and so cheap that it is being sold as a consumer wearable device. And brainwaves carry a lot of information about a person’s brain health. New research we carried out at Queen’s University Belfast, working with Gabriele Miceli’s lab at the University of Trento in Italy, found that cognitive decline can be detected in older people from just 30 minutes of brainwaves recorded in the lab. We took 40 healthy people ranging in age from 25 to 80, and ten people in their 60s and 70s with various early forms of dementia (age-related cognitive decline), and found we could measure a healthy person’s “brain age” by how quickly and strongly their brain responded to pictures we flashed on a screen.

As with the rest of your body, the brain slows down naturally with ageing. In the healthy people we saw brain responses grow slower and weaker with age, but those with dementia were outliers: their brain activity appeared older than their chronological age would suggest, in a way that could be used in the future for automatic diagnosis.
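To make the idea concrete, here is a minimal sketch – not the study’s actual analysis – of how a “brain age” estimate could work: fit a regression from EEG response features (here, assumed peak latency and amplitude) to chronological age in healthy people, then flag anyone whose estimated brain age is much greater than their real age. The feature names, trends and data below are illustrative assumptions only.

```python
# Illustrative sketch only: estimating "brain age" from EEG response features.
# Latency/amplitude features and all data here are synthetic assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic healthy cohort: responses slow (latency up) and weaken (amplitude down) with age.
age = rng.uniform(25, 80, size=40)
latency_ms = 100 + 0.5 * age + rng.normal(0, 5, size=40)      # assumed trend
amplitude_uv = 12 - 0.08 * age + rng.normal(0, 1, size=40)    # assumed trend
X_healthy = np.column_stack([latency_ms, amplitude_uv])

# Learn to predict chronological age from brain-response features in the healthy group.
model = LinearRegression().fit(X_healthy, age)

# A hypothetical new individual, aged 65, whose responses look unusually slow and weak.
x_new = np.array([[145.0, 5.0]])
brain_age = model.predict(x_new)[0]
gap = brain_age - 65
print(f"Estimated brain age: {brain_age:.1f}, gap vs. chronological age: {gap:+.1f} years")
# A large positive gap is the kind of outlier that could flag possible cognitive decline.
```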

Bringing stories alive

With another team, based at Carnegie Mellon University, I also looked at how the brain reacted while reading a chapter of Harry Potter and the Philosopher’s Stone. When we are immersed in reading a story, a lot is going on. Apart from our emotional engagement with a good yarn, we need to recognise individual words, retrieve their meaning from our “mental dictionary”, sew those words into sentences, and then keep track of how the characters and story develop.

Traditional brain imaging experiments would tackle each of these mental processes separately. But in this work – led by Tom Mitchell and PhD student Leila Wehbe – we were able to observe these processes interacting during a real-world task. Eight people’s brains were scanned in an fMRI research scanner to isolate the parts of the brain involved in the complex and enjoyable process of reading. We were able to track how the brain performed in real time, during a real-world activity – a kind of brain workout.
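One common way to analyse data like this is an “encoding model”: describe each word of the story by simple features and learn a mapping from those features to the fMRI signal in each brain region, so that regions whose activity is well predicted can be linked to particular aspects of reading. The sketch below illustrates that general idea with synthetic data and assumed features; it is not the study’s exact pipeline.

```python
# Illustrative encoding-model sketch (synthetic data, assumed features):
# map simple word features to one voxel's fMRI signal with ridge regression.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

n_words, n_features = 500, 10                      # e.g. word length, frequency, part of speech (assumed)
word_features = rng.normal(size=(n_words, n_features))
true_weights = rng.normal(size=n_features)
voxel_signal = word_features @ true_weights + rng.normal(0, 1.0, size=n_words)  # synthetic fMRI response

X_train, X_test, y_train, y_test = train_test_split(word_features, voxel_signal, random_state=0)
model = Ridge(alpha=1.0).fit(X_train, y_train)

# If held-out predictions track the real signal, that brain region "cares about" these word features.
print(f"Held-out R^2: {model.score(X_test, y_test):.2f}")
```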

We hope that these two developments – the consumerisation of brainwave technologies, and the ability to track brain activity during complex real-world tasks – could be used in detecting cognitive problems, and for measuring the effect of medication and other therapies. Memory and vocabulary are some of the first mental capabilities to go downhill as people get older. Combining slowing or weakening patterns from EEG with the engaging task and “all-round” workout of the story reading could help work out the best way forward.

Similar methods could be used for specific language and learning problems such as dyslexia, and also to measure attentional and emotional involvement as a probe for conditions like ADHD and autism. We are exploring commercial applications of this technology and how it may apply to consumers in their day-to-day life.


This article was originally published on The Conversation.
Read the original article.
