A guide to the terms and conditions of the COVIDSafe app

--

[Image source: AAP]

Governments still haven’t even agreed on how the app data will be shared with health departments across Australia, or who exactly will have access.

From my understanding, the app data you collect and willingly hand over to the Australian Government can be used to prosecute you and others for a “breach of the law in relation to contact tracing under the Biosecurity Determination 2020 and the Biosecurity Act 2015”.

The purpose of the contact tracing, as per the app’s “notice of collection purposes”, is to ensure that, if you test positive, you follow orders, eg self-isolate and hand over the data. Yes, you can (and more than likely will) be forced to give up the data. Failure to do so can land you in prison for five years.

The Biosecurity Determination 2020 states that the data will be used to prosecute a breach under section 479 of the Biosecurity Act 2015. Section 479 can be found on page 535 of the 730-page Act. It states, “A person must comply with emergency requirements under section 477(1) that applies to that person and a person must comply with a direction given under section 478(1) that applies to that person”.

So, as I understand it, if you have the app and you test positive and declare it (under the Act you can be forced to declare it or go to prison), don’t leave the boundaries of your yard: the data in the app can place you at a location, because it is location-trackable. How else would the authorities know you have breached isolation, if not by using ICT to prove it? It would save the police from wasting their time doing random checks to ensure people are self-isolating.

Bluetooth is a location-trackable tool: it pings off beacons just as a phone pings off towers, even when it appears to be switched off. Bluetooth is also easy to attack (bluejacking). Bluetooth is typically turned off by default; once turned on, your phone is susceptible to being bluejacked by attackers who can interfere with or take over your phone. The app requires your Bluetooth to be turned on at all times, particularly in public, but the app, the Act and the Determination do not guarantee your phone is safe from hackers.

Police have always taught us to minimise the information available to be stolen, and that it’s our responsibility to keep our online presence secure. Running Bluetooth 24/7 is neither responsible nor secure. Also, for iPhone users, there has been conjecture that you must leave your phone unlocked for the app to work. Who in their right mind would leave their phone unlocked? Would you give up your security for an app?

Why isn’t the app mandatory? My guess is that it would be a breach of human rights to make ‘tracking’ mandatory. So they’re relying on paranoia and fear to do the tracking legwork. After all, the terms and conditions are there for everyone to read prior to consent, but does anyone read them? In dealing with legalese, it’s what’s not written or said that is important.

From my understanding, you can be forced (if you don’t want to go to prison) to give up the data for any reason, and not just COVID-19-related data. Personally, I don’t want to be ‘guilty by association’ merely for being in the wrong place at the wrong time for too long, given that the app amounts to location and person-to-person tracking. I’ve been in too many wrong places at the wrong times throughout my life, for reasons beyond my control.

We all know the government already holds our data on finances, health and so on; that’s a given through the ATO, myGov etc. It’s our associations and everyday enjoyments of life offline they don’t know about, but want to know. These are the last of our freedoms: the right to enjoy life without interference and without being dictated to. Why give it up? Why be monitored and tracked? Why lay the groundwork for a social credit system?

By the way, under the Determination you are not allowed to guilt or coerce others into getting the app; heavy penalties apply. This is also stated in normal-sized print on the app download page. The COVIDSafe app was just one contact tracing option. The following alternatives offer stronger privacy guarantees (rough code sketches of some of these ideas follow the quoted passage):

“A team at Canada’s McGill University is working on a solution that uses “mix networks” to send cryptographically “hashed” contact tracing location data through multiple, decentralised servers. This process hides the location and time stamps of users, sharing only necessary data.

This would let the government alert those who have been near a diagnosed person, without revealing other identifiers that could be used to trace back to them…

US-based advocacy group The Open Technology Institute has argued in favour of a “differential privacy” method for encrypting contact tracing data. This involves injecting statistical “noise” into datasets, giving individuals plausible deniability if their data are leaked for purposes other than contact tracing.

Zero-knowledge proof is another option. In this computation technique, one party (the prover) proves to another party (the verifier) they know the value of a specific piece of information, without conveying any other information. Thus, it would “prove” necessary information such as who a user has been in proximity with, without revealing details such as their name, phone number, postcode, age, or other apps running on their phone.

Some approaches to contact tracing involve specialised hardware. Simmel is a wearable pen-like contact tracing device. It’s being designed by a Singapore-based team, supported by the European Commission’s Next Generation Internet program. All data are stored in the device itself, so the user has full control of their trace history until they share it.

This provides citizens a tracing beacon they can give to health officials if diagnosed, but is otherwise not linked to them through phone data or personal identifiers… inviting the private sector to help develop solutions (backed by peer review) could have encouraged innovation and provided economic opportunities…

Fundamentally, once you’ve told the government something, it has broad latitude to share that information using legislative exemptions and permissions built up over decades. This is why, when it comes to data security, mathematical guarantees trump legal “guarantees”.”
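
To make the “hashed” idea in the quote concrete, the sketch below (in Python) shows the basic building block: a one-way hash means a server can match contact records without ever seeing the raw identifier or location. The field names, salt handling and device token are my own illustrative assumptions, not the McGill design, and a real mix network adds routing through multiple decentralised servers on top of this.

```
import hashlib
import os

def hash_contact_record(device_token: str, timestamp: str, salt: bytes) -> str:
    """One-way hash of a contact record, so the raw identifier never leaves the device."""
    payload = salt + device_token.encode() + timestamp.encode()
    return hashlib.sha256(payload).hexdigest()

# Hypothetical example: the health authority only ever sees the digest.
salt = os.urandom(16)
digest = hash_contact_record("device-abc123", "2020-04-27T10:15+10:00", salt)
print(digest)
```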
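The “differential privacy” method can be sketched just as simply. Assuming, purely for illustration, that a health authority wants to publish a count of app users who were near a known case, Laplace noise scaled to the query’s sensitivity gives each individual plausible deniability about whether they are in the dataset.

```
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-transform sampling of a Laplace(0, scale) random variable.
    u = 0.0
    while u == 0.0:
        u = random.random()  # avoid exactly 0, which would hit log(0) below
    u -= 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_count(true_count: int, epsilon: float) -> float:
    # A counting query changes by at most 1 when one person is added or removed,
    # so Laplace noise with scale 1/epsilon gives epsilon-differential privacy.
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical example: publish a noisy count instead of the exact figure.
print(noisy_count(42, epsilon=0.5))
```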
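Zero-knowledge proofs are harder to compress, but the classic Schnorr identification protocol gives the flavour: the prover convinces the verifier it knows a secret x behind a public value y, without ever revealing x. The parameters below are toy-sized and completely insecure; this is an illustration of the concept, not anything a contact tracing app actually runs.

```
import random

# Toy group parameters (insecure, for illustration only):
# p is prime, g generates a subgroup of prime order q, and q divides p - 1.
p, q, g = 23, 11, 2

# Prover's secret x and public value y = g^x mod p.
x = random.randrange(1, q)
y = pow(g, x, p)

# Step 1: prover commits to a random nonce r.
r = random.randrange(1, q)
t = pow(g, r, p)

# Step 2: verifier issues a random challenge c.
c = random.randrange(1, q)

# Step 3: prover responds with s = r + c*x (mod q), which reveals nothing about x on its own.
s = (r + c * x) % q

# Verification: g^s == t * y^c (mod p) holds only if the prover really knows x.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof accepted")
```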

Meanwhile in South Korea, for contact tracing, “the authorities relied on mobile-phone GPS data, credit card transaction records and CCTV footage. While this use of personal data is legal in South Korea and proved effective in combating the virus, it also raised significant privacy concerns. Over the past two months, some patients whose detailed travel history was made public have been blamed, as if they had recklessly put others at risk of infection. The country’s National Human Rights Commission and advocacy organisations have called for an appropriate balance between protecting the public and respecting individual rights, and this debate continues today”.

It’s probably more prudent to wait until the government has formally legislated the regulations. That way, legal experts can look for loopholes that may have been included, giving the government access in ways it shouldn’t have. Likewise, I want the source code to be released as promised so it, too, can be thoroughly investigated. I am not happy with the way it has been implemented, whereby the encryption keys are stored on the same servers as the data; this goes against best practice for data security.

Even if the encryption keys were stored separately, I don’t like any one individual or government department having full access to them. I would prefer to see the key broken up into chunks and distributed, using a technology such as blockchain, so that multiple parties would all have to agree and work together, combining their individual chunks, in order to actually decrypt the data.
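
As a rough sketch of the key-splitting half of that idea, the snippet below uses a simple all-or-nothing XOR scheme: every share is needed to rebuild the key, so no single holder can decrypt anything alone. A real system would more likely use Shamir’s secret sharing or threshold cryptography, and the blockchain distribution layer mentioned above is out of scope here; the three share-holders named in the comment are purely hypothetical.

```
import os

def split_key(key: bytes, n_shares: int) -> list:
    """Split a key into n_shares pieces; every piece is needed to recover the key."""
    shares = [os.urandom(len(key)) for _ in range(n_shares - 1)]
    # The final share is the key XORed with all of the random shares.
    last = key
    for share in shares:
        last = bytes(a ^ b for a, b in zip(last, share))
    shares.append(last)
    return shares

def combine_shares(shares: list) -> bytes:
    """XOR every share together to reconstruct the original key."""
    key = bytes(len(shares[0]))
    for share in shares:
        key = bytes(a ^ b for a, b in zip(key, share))
    return key

# Hypothetical: one share each for, say, a health department, an agency and an auditor.
master_key = os.urandom(32)
shares = split_key(master_key, 3)
assert combine_shares(shares) == master_key
print("key recovered only when all three shares are combined")
```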

Scope creep is a big concern. My understanding is that the app already asks for location services permission, so with a stroke of a pen the app can be changed: what it does and how it works changes, those who have already installed it receive an update, and suddenly it is not just Bluetooth handshakes and encrypted data you have to consent to upload, but GPS tracking and live data uploads, with the data used to prosecute you for absolutely anything. That’s an extreme example, but scope always creeps with this sort of thing; the metadata retention scheme is a prime example.

--

Written by Dana Pham CPHR (pronouns: who/cares)

Trans-inclusionary radical feminist (TIRF) | Liberal Arts phenomenologist from @notredameaus | Anglo-catholic 🇦🇺 | all opinions expressed here are my own
