Apple, Google, coronavirus and privacy: a perfect storm that raises the question of whether the remedy is worse than the disease
Last Friday Apple and Google announced an ambitious joint initiative: a Bluetooth-based contagion tracking system that would be integrated into iOS and Android, effectively turning our phones into potential "snitches" that reveal whether we have been in contact with a person infected with the coronavirus.
Both giants accompanied the proposal with a declaration of intent stating that the system would be developed "fully respecting the privacy and security of users". We have heard that message before: the problem is whether we can believe it, and despite the theoretical good intentions of Google and Apple, several experts in this field say we cannot.
How contact tracing works
Although we have explained how the system works before, a brief recap does not hurt. The technical documents published by both Apple and Google reveal the foundations of this "contact tracing", which according to some experts may be crucial in avoiding future resurgences of the coronavirus pandemic that has confined us for weeks.
The idea is to propose a contact tracing system that allows infections to be monitored through our mobile devices. The first implementation could be a mobile application jointly developed by Apple and Google that users would have to install, but everything indicates that the goal is to integrate the system into iOS and Android through future updates of these operating systems.
The system makes our phone record all the phones that come close to it over time. Each phone transmits via Bluetooth a unique code that identifies it, but that theoretically cannot be traced back to its owner, and the phones close to it register that code (while broadcasting their own).
This information feeds a database that goes into action when a person tests positive for coronavirus. If that happens, the user can choose to report it through the contact tracing system. If they do, the other phones that were at some point close to that phone will receive a notification indicating when the contact occurred, alerting those people to the potential risk that they are also infected and offering them information on how to act in that case.
It is not clear who would be responsible for disseminating the confirmation of a contagion: most likely it would not be the affected person but a legitimate health services provider (in Spain that could be Social Security), thus avoiding false diagnoses issued by bad actors, for example.
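The flow described above can be sketched in a few lines of Python. This is a deliberately simplified toy model (the class and function names are illustrative, not the real Apple/Google API, and real devices broadcast rotating identifiers rather than a single fixed code):

```python
import secrets

class Phone:
    """Toy model of one device in the contact-tracing flow described above."""

    def __init__(self):
        self.my_id = secrets.token_hex(16)   # random code broadcast over Bluetooth
        self.seen_ids = set()                # codes heard from nearby phones

    def encounter(self, other: "Phone") -> None:
        # When two phones are close, each records the other's broadcast code.
        self.seen_ids.add(other.my_id)
        other.seen_ids.add(self.my_id)

def notify_exposures(infected: "Phone", all_phones: list) -> list:
    """When a user reports a positive test, their code is published and every
    phone checks it against the codes it has recorded locally."""
    return [p for p in all_phones if infected.my_id in p.seen_ids]

a, b, c = Phone(), Phone(), Phone()
a.encounter(b)                        # a and b were close; c never met a
exposed = notify_exposures(a, [b, c])
assert exposed == [b]                 # only b is warned of potential exposure
```

Note that the matching happens on each phone, against its own locally stored contact list: that is the "decentralized" property the later criticism in this article revolves around.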
First problem: security
The idea is certainly interesting, but it poses clear threats to the security and privacy of users. The technical documents published by Apple and Google attempt to reassure us on the first point. Three levels of keys would be created to encrypt and protect all this exchange of information, each intended for a different scope:
The proximity identifiers (the lowest of these three key levels) would be kept on our phone, and if we end up testing positive for coronavirus, the daily keys for all the days on which we may have been contagious would be shared so that everyone who was nearby can receive a notification.
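The draft cryptography specification published alongside the announcement describes, roughly, how the three key levels derive from one another. The following Python sketch follows that structure using only the standard library; the `"CT-DTK"`/`"CT-RPI"` labels and 16-byte sizes are taken from the draft, but the HKDF shortcut here is simplified, so treat this as an illustration rather than a conformant implementation:

```python
import hmac, hashlib, secrets, struct

def hkdf(key: bytes, info: bytes, length: int = 16) -> bytes:
    """Minimal HKDF-SHA256 (empty salt, single expand step); illustrative only."""
    prk = hmac.new(b"\x00" * 32, key, hashlib.sha256).digest()
    return hmac.new(prk, info + b"\x01", hashlib.sha256).digest()[:length]

# Level 1: a per-device tracing key, generated once and never leaving the phone.
tracing_key = secrets.token_bytes(32)

def daily_key(day_number: int) -> bytes:
    # Level 2: one 16-byte key per day, derived from the tracing key.
    return hkdf(tracing_key, b"CT-DTK" + struct.pack("<I", day_number))

def proximity_id(dtk: bytes, interval: int) -> bytes:
    # Level 3: the short-lived identifier actually broadcast over Bluetooth,
    # rotated every few minutes (interval = position within the day).
    mac = hmac.new(dtk, b"CT-RPI" + struct.pack("<B", interval), hashlib.sha256)
    return mac.digest()[:16]

dtk = daily_key(18365)          # arbitrary day number
rpi = proximity_id(dtk, 7)
assert len(dtk) == 16 and len(rpi) == 16
```

The design consequence matters for what follows: anyone holding a published daily key can regenerate every proximity ID derived from it, which is exactly how other phones confirm an exposure, and also the root of the linkability concern below.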
Combining those two keys raises some questions about the security of the system. Those doubts were raised by cryptographer Matt Tait, who noted that once the daily keys are public, it becomes possible to link all the proximity IDs broadcast on a given day to a single device, which is precisely what the app will end up doing to confirm to other people that they could have been exposed.
"If you *do* get COVID, there's a nasty edge-case that needs to be (and can be) addressed, which is that a daily key can be used to correlate all of your proximity IDs for the corresponding day." - Pwn All The Things (@pwnallthethings), April 10, 2020
Moxie Marlinspike also did not seem convinced that the system was as promising as Apple and Google claimed. This hacker and developer knows what he is talking about: he is the co-creator of the Signal protocol, the cryptographic protocol that enables end-to-end encryption of voice, video and text (instant messaging) communications and that is used not only in the application of the same name but also in WhatsApp, Facebook Messenger and Skype.
This developer also noted that everything seems safe and private until you test positive for coronavirus. In that case, the MAC address of your device's Bluetooth chip (something like its national ID number) becomes linkable, and there is therefore a certain risk that this data will be used for purposes other than those Google and Apple propose for the system.
"That seems untenable. So to be usable, published keys would likely need to be delivered in a more 'targeted' way, which probably means ... location data." - Moxie Marlinspike (@moxie), April 10, 2020
He was not only pointing to that problem but also to the enormous amount of information such a system can generate. Although each daily key occupies only 16 bytes, the number of infections would mean "hundreds of MB to be downloaded by all the phones". That, in his view, "seems untenable", which would mean that "to be usable, published keys would likely need to be delivered in a more 'targeted' way, which probably means ... location data."
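The scale of this concern can be reproduced with back-of-the-envelope arithmetic. The case count and the 14-day upload window below are illustrative assumptions, not Marlinspike's own figures; his larger estimates presumably account for longer accumulation periods and per-key metadata on top of the raw keys:

```python
KEY_BYTES = 16     # size of one daily key, per the draft specification
DAYS_SHARED = 14   # days of keys uploaded per confirmed case (assumed window)

def daily_download_mb(new_cases_per_day: int) -> float:
    """MB every participating phone must fetch per day to check for exposure."""
    return new_cases_per_day * DAYS_SHARED * KEY_BYTES / 1_000_000

# With on the order of 100,000 new worldwide cases reported daily in spring 2020:
print(f"{daily_download_mb(100_000):.1f} MB/day")   # prints "22.4 MB/day"
```

Even under these conservative assumptions, every phone downloads tens of MB of keys per day, and the figure grows linearly with case counts, which is what drives the suggestion of "targeted" (location-based) delivery.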
But wait, there's more: the privacy problem
But it does not end there. Marlinspike himself described how this entire system could be poisoned by trolls who roam around certain areas and then report an infection that does not actually exist, leading many people to believe they have been exposed when in fact they have not.
"In my opinion, these types of data are poor proxies for the ground truth we really seek: actual #COVID19 infection rates, which can only be truly known by widespread testing. If we had testing in place, it would make the need to pursue these privacy-invasive techniques moot." - ashkan soltani (@ashk4n), April 10, 2020
That, according to other experts such as Ashkan Soltani, who served at the FTC and was an adviser to Obama, can only be remedied with mass coronavirus testing, something that "would make the need to pursue these privacy-invasive techniques moot".
The Security Group at the University of Cambridge Computer Laboratory also raised serious doubts about the validity of this Apple and Google proposal in a detailed and thorough article.
For starters, they cited the work of Serge Vaudenay (PDF), a French cryptographer at the prestigious Swiss institution EPFL, who argued that "proximity tracing systems are of paramount importance in controlling the COVID-19 pandemic. At the same time, they come with serious privacy threats." "In fact," he noted, "it is surprising that decentralization creates more privacy threats than it solves. Sick people who report anonymously can be deanonymized, private encounters can be discovered, and people can be coerced into disclosing their private data."
They also cited systems of this type that are already in operation, such as TraceTogether, whose code was recently released as open source and which is used in Singapore, for example. This type of system, they indicated, "is not anonymous. [...] It is not a matter of consent or anonymity, but of being persuasive and having a good relationship with patients."
Relying on diagnostic tests to help control the pandemic is currently impractical given the limited number of tests that can be run per day and the time it takes to obtain results, but health authorities also need location data. Not just to locate potential infections, or to reduce the system's massive data traffic as Marlinspike pointed out, but to know where to build new field hospitals, for example, or where to send more equipment and medical personnel.
Added to all this is the fact that this type of system is catnip for trolls. As the article pointed out, it is easy to imagine a malicious user strapping a phone to a dog and letting it run around the park and the street, or a group of bad actors (the authors specifically mention Russia) using the application to carry out denial-of-service attacks and thus spread panic. And of course it would help "little Johnny" self-report symptoms so that everyone at his school has to go home: instant vacation.
Nor does it help that the application or system would be "opt-in", with each user deciding whether or not to participate in the monitoring. That, as the experts in this group indicate, would mean that "nobody has an incentive to use it, except those who want to try it out and people who religiously comply with everything the government asks. If adoption is 10-15%, as in Singapore, it won't be very useful."
For the authors of this analysis, tracing applications "are simply part of the do-something-itis". In their view, we do not need these tracing systems so much as people in charge of managing this type of information, as has been done so far with consultation services and telephone diagnosis, for example. The final paragraph is blunt:
"Our efforts should go into expanding contagion testing, manufacturing ventilators, retraining everyone with a medical background, from veterinary nurses to physiotherapists, to use them, and building field hospitals. We must call out nonsense when we see it, and not give politicians the false hope that 'techno-magic' will let them avoid the difficult decisions. Otherwise we had better stay out of the way. The response should not be led by cryptographers but by epidemiologists, and we should learn what we can from the countries that have managed best so far, such as South Korea and Taiwan."
Jason Bay, who leads TraceTogether and is also senior director at Singapore's Government Technology Agency, was likewise not in favor of treating this solution as a "panacea against the coronavirus". For him, conceiving of these systems as the solution to the problem "is an exercise in technological triumphalism. There are lives at stake. False positives and false negatives have consequences in real life (and death). We use TraceTogether to supplement contact tracing [by the health authorities], not to replace it."
Jaap-Henk Hoepman, professor of Digital Security at Radboud University Nijmegen in the Netherlands and head of the Privacy & Identity Lab, also denounced the problems with such a system.
"We must stop Apple and Google in this initiative," he said, "or else get rid of our smartphones, because they will truly become Stasi agents in our pockets." Like the other experts, he pointed out that this theoretically decentralized monitoring scheme becomes centralized as soon as the phone is required to inform the authorities of a potential contagion.
That "effectively turns our smartphones into a global mass surveillance tool". In fact, this expert was especially critical of the Cupertino company: "Any illusion we had that we could somehow tame the Stasi agent in our pocket, by buying more expensive iPhones because Apple promised to take our privacy seriously, or by being careful about which applications we install on our phones, is just that: an illusion."
For him, the danger of such a system lies not in whether it could actually help in this future phase of returning to normality, but in everything that could come after this type of system has been implemented on our phones. Hoepman mentioned some examples:
- Police could quickly see who has been near the victim of a crime by marking the victim's phone as "infected".
- The same could be done to uncover journalists' sources or the whistleblowers who leak information.
- A company could install Bluetooth beacons running this software at points of interest and mark certain beacons as "infected" to identify everyone who passed by.
- If you have a Google Home at home, Google could use this system to identify everyone who visits your home.
- A jealous partner could install a hidden application on your phone to track you and check whom you have been in contact with, and parents could do the same to spy on their children.
Many of these scenarios do not actually need this technology to become reality, since other ways of achieving the same thing already exist, but the truth is that the technology proposed by Apple and Google, however well-intentioned, raises serious questions about whether the remedy may be, as the saying goes, worse than the disease.
The debate is open, and for many it may well be summed up by a phrase from an interesting (and somewhat pessimistic) Vox piece on what the return to normality will look like. There they argued that it is at least worth trying this, even if it means "building a huge digital surveillance state. I care about my privacy, but not as much as I care about my mother."
Image | Unsplash