Macquarie University is part of a group of seven universities from Australia and the US that has just been awarded a grant under the US-Australia International Multidisciplinary University Research Initiative (AUSMURI) to develop the science behind robust Human-Bot Cybersecurity Teams (HBCTs). The interdisciplinary team brings together expertise spanning computer security, machine learning, psychology, decision sciences, and human-computer interaction under an initiative dubbed CATCH: Cybersecurity Assurance for Teams of Computers and Humans.
To address the growing scale and complexity of cybersecurity defence, machine learning bots have become part of defence teams alongside human analysts. These bots reduce the burden on human analysts by filtering the flood of information involved in a cyber attack, freeing up cognitive resources for tasks related to the high-level mission.
This new research will address the gap in understanding how to manage, observe, and improve hybrid teams made up of humans and autonomous machines, particularly in the face of adversaries.
The project will investigate how to build trust, shared mental models, decision-making processes, and adaptability within HBCTs.
Professor Dali Kaafar, Executive Director of The Optus Macquarie University Cyber Security Hub, and Associate Professor Shlomo Berkovsky from Macquarie University’s Australian Institute of Health Innovation are investigators on the project.
“Macquarie University, through contributions from its experts at the Cyber Security Hub, will help develop solutions that explain machine learning outcomes and build AI techniques that are robust against cybersecurity adversaries,” says Professor Kaafar.
“We will also contribute to building shared mental models that can be used to steer coherent and efficient outcomes for AI and humans as a cyber team, even in the presence of novel cyber attacks.”
AUSMURI Principal Investigator Professor Benjamin Rubinstein of the University of Melbourne says: “Complex problems require innovative solutions and both basic and problem-based research.
“The joint MURI/AUSMURI program is a significant investment in step-change research. We expect a range of outcomes from this ambitious program, from human-friendly explanations of AI decisions, to new defences against attacks on machine learning, and mathematical accounts of human decision-making within human-bot teams.”
Funding is provided under the Next Generation Technologies Fund, led by Defence Science and Technology (DST). Dr David Kershaw, Chief of the Science Engagement and Impact Division, said the AUSMURI program not only helps to grow local skills and expertise, but also supports Australian university staff collaborating with US academics to address high-priority topics in Defence capability.
“The joint project, led in Australia by the University of Melbourne, will explore how cyber bots can learn and form teams, either amongst themselves or with humans, to counter cyber threats,” Dr Kershaw said.
“Improved security through cyber autonomy is critical for Defence’s future in highly challenging and adverse environments. This research aligns with Defence’s Science, Technology and Research programs, known as STaR Shots. It provides another valuable opportunity to work alongside counterparts in the United States,” Dr Kershaw said.
Seven universities comprise the group: University of Wisconsin-Madison, Carnegie Mellon University, University of California San Diego, Penn State, University of Melbourne, Macquarie University, and University of Newcastle.