by Socrates Vertellis (LLM-DEA) – Attorney at law, Co-founder of Intel-Lex
The Covid-19 pandemic has achieved in a few months what competition and the struggle for market share had made seem inconceivable until recently: the cooperation and collaboration of two of the fiercest rivals in the technology and software industry, Google and Apple. The two tech giants announced last Friday that they will stand shoulder to shoulder to deploy a mechanism enhancing contact tracing, in an attempt to warn smartphone users that they have socialised with people infected by the virus.
In a first stage, the two companies will release APIs that enable interoperability between Android and iOS devices using apps from public health authorities. The apps will be available to download via Google Play or the App Store. In a second stage, the companies will work to enable a broader Bluetooth-based contact tracing platform by building this functionality into the underlying platforms. The system will use the on-board radios of smart devices to transmit an anonymous ID over short ranges, through Bluetooth beaconing. Servers relay a user's last 14 days of rotating IDs to other devices, which search for a match. A match is determined based on a threshold of time spent and distance maintained between two devices. If a match is found with another user who has told the system that they have tested positive, one is notified and can take steps to be tested and to self-isolate. Google and Apple claim that their mechanism is fully compliant with the privacy requirements of transparency and consent: on the one hand, users will have to opt in to participate in this hybrid public health record; on the other, the system respects privacy by design, anonymising users' IDs at the outset.
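The matching flow described above can be sketched in a few lines of code. This is a deliberately simplified illustration, not the actual Apple/Google specification (which derives rolling proximity identifiers with HKDF and AES): the key sizes, the SHA-256 derivation, and all names below are assumptions made for clarity.

```python
import hashlib
import os

def rotating_ids(daily_key: bytes, intervals: int = 144) -> list:
    """Derive short-lived rotating IDs from a secret daily key,
    one per ~10-minute interval (144 per day). The daily key never
    leaves the device unless the user tests positive and consents."""
    return [hashlib.sha256(daily_key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(intervals)]

# Each phone holds its own secret daily key and broadcasts the derived IDs.
alice_key = os.urandom(16)                 # Alice's secret key for one day
alice_broadcast = rotating_ids(alice_key)

# Bob's phone overhears some of Alice's beacons during a nearby contact
# and stores them locally (it cannot link them back to Alice).
bob_observed = set(alice_broadcast[30:40])

# Alice tests positive and consents to publishing her daily keys.
published_keys = [alice_key]

# Bob's phone re-derives the rotating IDs from the published keys and
# checks for an intersection with what it observed -- the match happens
# on the device, not on the server.
matches = {rid for key in published_keys for rid in rotating_ids(key)} & bob_observed
exposed = len(matches) > 0
```

The design choice worth noting is that the server only ever sees keys of users who voluntarily reported a positive test; the comparison against locally observed beacons runs entirely on each phone, which is what the companies' privacy-by-design claim rests on.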
However, such reassurance is not enough to alleviate privacy concerns, which are not necessarily related solely to Covid-19. First, studies have demonstrated that, given the evolution of algorithms, one can today hardly speak of real anonymisation. In addition, records comprising Covid-19 victims could be further enriched with other sensitive data thanks to big data, data analytics and the IoT, creating huge datasets of personal data of extreme importance to the individual. Who can guarantee that this delicate, personal information will not be used to discriminate against its owner? Who can guarantee that, despite the good will of the developers, their system will not also collect other information for which the user has not opted in, all the more so since cutting-edge machine learning is programmed to evolve on its own? Besides, who can exclude that in the future even governments in the EU, where privacy has strong foundations, will allow this or similar sorts of contact tracing on the pretext of averting crime or terrorism? Covid-19 could be the perfect experiment to assess people's tolerance. Finally, what if those datasets are hacked and the information leaked to the dark net? Undertakings and governments could face ransom demands, or compensation claims by individuals, amounting to millions of euros.
The above comments are just food for thought. By no means do they intend to excommunicate technology, of which the writer is, by the way, a great enthusiast. We simply want to emphasise the serious ramifications and negative impact that solutions which seem prima facie very promising and innocuous might have on our lives.