Contact tracing is one of the best tools we have to flatten the COVID-19 curve and keep people as safe as possible. Mobile tracing apps would be extremely useful in tracing contacts from known sources, such as the Ruby Princess passengers or Bondi backpackers.
Singapore has so far managed to keep COVID-19 under control. Its government has announced it is open-sourcing its mobile app, TraceTogether, making it available to any health authority that wants it.
This is a potential lifesaver. In a pandemic, time is among our biggest enemies. Building a contact-tracing app from scratch is not an option.
The app tracks interactions between those diagnosed with COVID-19 and the wider community. Australia is fast-tracking the review process so TraceTogether can be adopted and deployed.
Even in desperate times, there are privacy concerns. People may be reluctant to download the app, fearing that it could later be used for mass surveillance. With such a large pool of people likely to share their data, we need to take all steps necessary to protect their privacy, and to encourage them to use a tool that might help bring the pandemic under control.
Dr Hassan Asghar and Professor Dali Kaafar from Macquarie University’s Cybersecurity Hub, together with the University of Melbourne’s Farhad Farokhi and Ben Rubinstein, analysed the app’s privacy and security implications. They suggest ways to strengthen its privacy without drastically changing its design, so it can be implemented quickly.
How the app works
TraceTogether uses Bluetooth to exchange information between nearby phones, with signal strength approximating the distance between users. Nearby apps exchange timestamps and temporary user IDs, and each device logs these encounters in encrypted form. When someone installs the app, a centralised authority, such as the Department of Health, stores their mobile number, plus a newly generated permanent user ID, on its server.
The server uses its private key to generate temporary IDs and transmits them to the corresponding user. These temporary IDs are exchanged between users near each other. A user diagnosed with COVID-19 is asked to consent to uploading the app's encrypted data logs to the server. With these, the registry can contact other users who were in close contact with them.
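The server-side scheme described above can be sketched roughly as follows. This is illustrative only, not the actual TraceTogether code: where the real app encrypts the user ID and a validity window under a server key, this sketch uses an HMAC, which gives the same property that matters here — only the key-holder can tie a temporary ID back to a user.

```python
import hmac
import hashlib
import secrets
import time

# Secret held only by the health authority's server (illustrative).
SERVER_KEY = secrets.token_bytes(32)

def temp_id(user_id, window_start):
    # Derive a temporary ID for one 15-minute validity window.
    # Without SERVER_KEY, the output is just an opaque string.
    msg = "{}|{}".format(user_id, window_start).encode()
    return hmac.new(SERVER_KEY, msg, hashlib.sha256).hexdigest()[:16]

def resolve(candidate, registered_users, window_start):
    # Server-side: link a logged temporary ID back to a registered
    # user by recomputing each user's ID for that window.
    for uid in registered_users:
        if hmac.compare_digest(temp_id(uid, window_start), candidate):
            return uid
    return None

window = int(time.time()) // 900 * 900   # 15-minute validity bucket
tid = temp_id("user-42", window)
print(resolve(tid, ["user-41", "user-42"], window))
```

Because the IDs are derived from a server-held key, other users see only random-looking strings, while the central registry can always recover the real identity behind each one — exactly the asymmetry the analysis below focuses on.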
The authors identify potential privacy threats from the central authority, and malicious users (scammers, hackers).
Privacy from other users is built in. The app uses temporary user IDs and refreshes them frequently. Because these IDs are generated by the server from a person's phone number and permanent ID, the central authority can determine users' identities if needed. The temporary nature of the IDs means other users can't track someone for long. But the server can.
Users' data is also safe from snoopers: the data logs on the phone are encrypted, so a hacker who steals the device can't read them. Only the server holds the decryption key, and the logs are sent to the server only for determining close contacts between people.
The app keeps data secure from other users and snoopers, but not from the central registry. The server can retrieve users’ data logs, decrypt and read them. It can also link the temporary IDs to real identities.
There are some privacy features. The server only asks for data logs from infected users or people who have been near one. Data logs only contain relative distance, not precise location.
Data on phones is deleted after 21 days. The server can learn a user's private data, but this is true of most apps backed by a central server. If a user tests positive, they can consent to the server retrieving their data, to identify users who have been in contact with them. At this point, potentially uninfected users lose control over their privacy.
Tweaks for more privacy
The app could be (mis)used for surveillance. The central registry could obtain and decrypt data logs from a large number of users, enabling mass surveillance. And although data logs on the device are deleted after 21 days, there is no guarantee that data logs decrypted at the central server would be deleted too.
The app could be tweaked for more privacy by reconfiguring it so that temporary user IDs are generated on the device rather than by the server. This way, only the user knows their own identity. To support contact tracing, they would have to allow their device to share the list of temporary IDs it has generated.
The server would then find the temporary IDs that had been in contact with the infected user and broadcast them. A user who receives a message containing one of their own temporary IDs can respond by identifying themselves.
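That tweak can be sketched in a few lines. The class and names below are illustrative, not the actual TraceTogether code: each device mints its own random IDs, so the server's broadcast is meaningless to everyone except the device that generated a matching ID.

```python
import secrets

class App:
    """Toy model of one device running the tweaked app."""

    def __init__(self):
        self.my_ids = set()     # temporary IDs this device generated
        self.seen_ids = set()   # IDs heard from nearby devices

    def new_temp_id(self):
        # Random, device-generated: unlinkable without the device itself.
        tid = secrets.token_hex(8)
        self.my_ids.add(tid)
        return tid

    def record_encounter(self, other_tid):
        self.seen_ids.add(other_tid)

    def check_broadcast(self, contacted_ids):
        # Did the server's broadcast mention one of *my* IDs?
        # If so, this user can choose to identify themselves.
        return bool(self.my_ids & set(contacted_ids))

alice, bob, carol = App(), App(), App()
a_id, b_id = alice.new_temp_id(), bob.new_temp_id()
# Alice and Bob meet; Carol meets no one.
alice.record_encounter(b_id)
bob.record_encounter(a_id)
# Bob tests positive and uploads; the server broadcasts the IDs he logged.
broadcast = list(bob.seen_ids)
print(alice.check_broadcast(broadcast), carol.check_broadcast(broadcast))
```

The design choice here is that the mapping from temporary ID to identity never leaves the device; the server can only say "someone holding one of these IDs was exposed", and it is up to that person to come forward.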
Future versions should be decentralised. The server could push temporary IDs of diagnosed users to the apps, allowing other users to determine if they have been in contact with them.
Locally and randomly generated IDs can’t be linked to true identities. The server would not know the identities of the users within infection range of someone who tested positive. This fundamental change in the design might not be possible to do quickly.
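A minimal sketch of that decentralised flow, again with illustrative names and randomly generated device-side IDs: the server only relays the diagnosed user's own IDs, and every exposure decision happens on the receiving device.

```python
import secrets

def new_id():
    # Locally and randomly generated: cannot be linked to a true identity.
    return secrets.token_hex(8)

# Each device keeps the IDs it has broadcast and the IDs it has heard.
dave_ids = {new_id(), new_id()}
erin_heard = set()
frank_heard = set()

# Dave and Erin are near each other; Erin logs one of Dave's IDs.
erin_heard.add(next(iter(dave_ids)))

# Dave is diagnosed and uploads his own IDs; the server pushes them to all apps.
pushed = set(dave_ids)

# Each device checks its own encounter log locally; the server never
# learns who matched, nor who was within range of whom.
erin_exposed = bool(erin_heard & pushed)
frank_exposed = bool(frank_heard & pushed)
print(erin_exposed, frank_exposed)
```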
Dali Kaafar is a Professor in Macquarie's Department of Computing and Executive Director of the Optus Macquarie University Cyber Security Hub.
Dr Hassan Asghar is a Lecturer in Macquarie's Department of Computing and a researcher in cryptography, information security and privacy.