Human attention secrets cracked in bid to avoid digital-age disasters

Researcher
Professor Anina Rich; Dr Hamid Karimi-Rouzbahani; Associate Professor Alexandra Woolgar
Writer
Sarah Maguire
Date
3 June 2021
Faculty
Faculty of Medicine, Health and Human Sciences

With self-driving cars around the corner, breakthrough research at Macquarie University is tackling the dangers that lurk when humans hand their decision-making over to computers.

Researchers at Macquarie University are helping to lay early groundwork for technology that could help avoid potential tragedy when humans fail to notice vital computer errors.

Eyes wide shut: Monitoring is tough for human attention, which can drop off dramatically over time.

In a world first, they mapped brain signals in a way that allowed them to predict when someone had lost attention and would therefore miss a crucial moment in a network monitoring task – similar, say, to an air traffic or train network control system.

Hooked up to a magnetoencephalography (MEG) machine in the KIT-Macquarie Brain Research Laboratory, the 21 participants monitored several dots moving along trajectories towards a central fixed object but deflecting before making contact. Their job was to press a button to deflect a moving dot if it instead violated its trajectory and continued towards a collision with the central object.

The experiment – funded by the Australian Research Council – showed that the less often a dot violated its trajectory, the more likely participants were to miss it: attention dropped off dramatically over time when targets were infrequent.
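
To make the task concrete, here is a minimal sketch, in Python, of the kind of logic involved. The geometry, deflection rule and parameter values are simplifications assumed for illustration; this is not the study's actual stimulus code.

```python
from dataclasses import dataclass

# Illustrative constants only -- not the study's actual parameters.
DEFLECT_RADIUS = 1.0   # dots normally veer away once they get this close to the centre

@dataclass
class Dot:
    x: float
    y: float
    vx: float
    vy: float
    violates: bool     # True if this dot ignores the deflection rule

def step(dot: Dot, dt: float = 0.05) -> None:
    """Advance a dot one time step; non-violating dots deflect at the radius."""
    r = (dot.x ** 2 + dot.y ** 2) ** 0.5
    if r <= DEFLECT_RADIUS and not dot.violates:
        dot.vx, dot.vy = -dot.vx, -dot.vy   # bounce away from the central object
    dot.x += dot.vx * dt
    dot.y += dot.vy * dt

def needs_response(dot: Dot) -> bool:
    """The observer should respond once a dot is inside the deflection radius
    and still closing on the central object -- a trajectory violation."""
    r = (dot.x ** 2 + dot.y ** 2) ** 0.5
    approaching = (dot.x * dot.vx + dot.y * dot.vy) < 0   # heading toward the origin
    return r < DEFLECT_RADIUS and approaching

# Example: a violating dot approaching the centre from the right.
dot = Dot(x=0.9, y=0.0, vx=-1.0, vy=0.0, violates=True)
step(dot)
print(needs_response(dot))   # True: the rare event that attention has to catch
```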

The move towards automated and semi-automated systems in high-risk domains such as power-generation, trains and airplanes, not to mention self-driving vehicles, means humans increasingly must pay sustained attention in order to catch infrequent errors, the researchers point out.

“A computer is most often making the decision about who is going where, and keeping track of where everyone is – but a human has to watch that, and if that computer makes a mistake, a human has to be ready to jump in to fix the error before you get a tragedy,” explains senior author Anina Rich, Professor of Cognitive Science and head of Macquarie’s Perception in Action Research Centre.

“The problem is that monitoring – putting us in a situation where we are not finding a target very often, and when we find one we have to respond accurately and quickly – is something that is really tough for human attention.

“By contrast, if you are in the condition where you have to intervene quite frequently, say 50 per cent of the time, then people can maintain attention.”

How disaster can happen

A decrease in performance over time can have tragic consequences, such as the UK’s Paddington railway disaster in 1999, which killed 31 people and injured more than 400 after a slow response to a stop signal resulted in a train travelling 600 metres too far and into the path of an oncoming train.

Attention seeking: In high-risk domains that have become automated or semi-automated, if a computer makes a mistake, humans have to be ready to jump in and fix it, says Professor Rich.

“These modern environments challenge our attentional systems and make it urgent to understand how monitoring conditions change the way important information about the task is encoded in the human brain,” say the researchers, including lead author Dr Hamid Karimi-Rouzbahani, an Honorary Fellow in Macquarie’s Department of Computing, who was a postdoctoral researcher during the study and is now a Newton Fellow at Cambridge University, UK. Associate Professor of Cognitive Science Alexandra Woolgar was the third collaborator on the project.

While scientists have extensively studied these ‘vigilance decrements’ in humans, Rich’s team went further by recording what was happening in the brain across time as the participants performed their tasks.

“The benefit of the technique we have used is that we have fine-grained temporal information – as the trial is unfolding, we are measuring the neural activity across time,” explains Rich.

“We also developed a new task that includes a lot of features of real-world monitoring scenarios; most previous vigilance experiments have used simple displays with static stimuli.”

Rich says the study has provided proof of concept that the pattern of activity recorded in the brain can be used to decode whether someone has lost crucial information about a task.

“This is a real step forward to be able to start to predict behavioural errors,” Rich says.

“We can tell when somebody’s brain doesn’t have the relevant information about the collision coming up. The next step is to see whether we can do that in real time and give them feedback to prevent that error.”
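
The analysis behind this rests on multivariate decoding of MEG sensor patterns over time. As a rough, generic sketch of how such time-resolved decoding is often done – not the authors' actual pipeline – the snippet below trains and cross-validates a classifier at each time point using scikit-learn; the array shapes, labels and placeholder data are assumed purely for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Assumed shapes for illustration:
#   meg:    (n_trials, n_sensors, n_timepoints) of MEG sensor data
#   labels: (n_trials,) 1 = trial where the observer caught the violation,
#                       0 = trial where the violation was missed
rng = np.random.default_rng(0)
meg = rng.standard_normal((200, 160, 50))    # placeholder data
labels = rng.integers(0, 2, size=200)        # placeholder labels

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# Train and cross-validate a separate classifier at every time point,
# giving a time course of how decodable the task-relevant information is.
accuracy = np.array([
    cross_val_score(clf, meg[:, :, t], labels, cv=5).mean()
    for t in range(meg.shape[2])
])

# Time points where accuracy sits near chance (0.5) suggest the brain is not
# carrying the information needed to catch the upcoming collision.
print(accuracy.round(2))
```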

And then came self-driving cars

Rich says humans’ poor performance in monitoring tasks is becoming an increasing concern as the era of self-driving cars fast approaches.

Warning signs: Self-driving cars of the future could come with a built-in EEG cap to monitor for dangerous lapses of attention, says Professor Rich.

“Semi-automated vehicles are already on the road, where they control lane changes and speed but the driver has to be ready to take control at any point,” Rich says.

“In an attempt to make cars safer by having computers make these judgements, we are setting ourselves up in a situation that is incredibly difficult for human attention.”

The Macquarie team’s methods will be used in a new collaboration with the University of Utah, where researchers in Professor David Strayer’s applied cognition lab are concerned with reducing driver distraction to make roads safer.

Strayer’s experiments have involved monitoring drivers’ attention using electroencephalography (EEG) electrodes embedded in headgear, which record the brain’s electrical activity but provide less detailed and ‘noisier’ information than MEG about what the brain is doing.

But Rich’s team already has good pilot data that suggest their method can also predict attention lapses using EEG, a much cheaper and more available technology than MEG, which measures magnetic signals.

“The difference from what Strayer has been doing in using EEG to detect problematic lapses in attention is the level of specificity we can achieve using this method,” Rich explains.

“The level they have been using so far is trying to detect lapses in attention in general, but that doesn’t distinguish between a moment when you just drift away for a second but it’s not problematic, and when you have lost a certain amount of information about the road in front of you that could be dangerous, which our method has the potential to detect.”

Self-driving cars that were able to detect when brain activity showed a dangerous lapse in attention could set off a warning signal, or an emergency braking system, for instance.

“Certainly the evidence suggests that any drop in speed can reduce the severity of an accident, so the fraction of an instant between when the car detects that something is wrong and works out that you are not paying attention could make the difference between hitting a child or not,” Rich says.
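
As a purely hypothetical sketch of how such a safeguard might act only on a sustained, dangerous lapse rather than a momentary dip, the snippet below thresholds a stream of decoder output. The signal name, threshold and window length are invented for the example and are not drawn from the study.

```python
import numpy as np

# Hypothetical illustration only: 'info_prob' stands for a per-sample estimate
# (from a decoder like the one sketched above) of how strongly the driver's
# brain currently encodes the road information it needs. The threshold and
# window length are made-up values, not figures from the study.
THRESHOLD = 0.55   # below this, the decoder says the information is fading
WINDOW = 10        # consecutive low samples required before acting

def check_lapse(info_prob: np.ndarray) -> str:
    """Return the action a (hypothetical) driver-monitoring system would take."""
    low = info_prob < THRESHOLD
    run = 0
    # A sustained run of low-information samples, not a momentary dip, triggers action.
    for flag in low:
        run = run + 1 if flag else 0
        if run >= WINDOW:
            return "warn_and_prepare_to_brake"
    return "no_action"

stream = np.concatenate([np.full(50, 0.8), np.full(15, 0.4)])  # simulated dip
print(check_lapse(stream))   # -> "warn_and_prepare_to_brake"
```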

“It is not completely out of the question to imagine that when you buy your self-driving car it might come with a built-in EEG cap for you. This sounds futuristic but it’s probably not that far in the future.”

Dr Anina Rich is a Professor in the Department of Cognitive Science and Director of the Perception in Action Research Centre.

Dr Hamid Karimi-Rouzbahani is an Honorary Fellow in the Department of Computing and a Newton Fellow at Cambridge University.

Dr Alexandra Woolgar is an Associate Professor in the Department of Cognitive Science and an Executive Member of the Perception in Action Research Centre.
