Second dialogue:
Protecting data against vulnerabilities:
Questions of trust, security, and privacy of data
Our second discussion on the Road to Bern, dedicated to data security (22–23 April 2020), comes at a time when the Coronavirus crisis is raising many societal and digital questions. The more our daily routines shift online, the more exposed our data and digital infrastructure become.
This text maps some of the current data protection and security issues in the wider context of future digital developments.
Understanding digital risks and being able to make informed trade-offs increasingly shape our decisions on data security. We make these decisions and trade-offs all the time, in our personal and professional lives and everything in between. Some data security decisions, in particular those pertaining to health, trade, and intellectual property, need to be made at the global level.
Data security is becoming a vivid and tangible aspect of our digital life. To give an example, our decision whether or not to use the Zoom platform involves a complex risk assessment and trade-off. This analysis starts with an overview of the Zoom controversies and then ‘zooms out’ to wider issues of data security and privacy protection.
Data and Security Trade-Offs: Zoom Controversies
As the Coronavirus crisis unfolds, Zoom has become one of the main online meeting platforms. Recent reports show that in the span of a few months the platform went from 10 million to 200 million users. High usability and a simple interface made Zoom the go-to solution for many. As the platform gained ground, data exposure and security vulnerabilities came into focus. The issue of Zoom’s security and privacy became a popular media topic overnight, and fell prey to simplified coverage. Headlines reported on bans imposed on the platform by governments and private sector companies. To fix privacy and security issues, the company promptly issued a number of patches, formed a security board of well-known professionals from the security community, and announced a 90-day feature freeze.
The table below illustrates how a comprehensive understanding of usability, risks, and trade-offs is a precondition for making informed decisions online. In this case we refer to the use of Zoom, but the same analysis is applicable to any data and security risk on the Internet.
| Risk | Solution | Comment |
|---|---|---|
| Privacy of the meeting: unauthorised access, intrusion, or hijacking of online sessions by uninvited parties (which became known as ‘Zoom-bombing’), who have been known to post indecent content during meetings | | This has been the most reported problem, and the cause of policy action by quite a few governments; however, in essence it relates more to user inconvenience (interruption of meetings, exposure to indecent material) than to risks to the data generated by online meeting sessions. It is a generic problem, not linked only to Zoom (though it became visible mostly on Zoom due to the surge in its popularity and number of meetings). More importantly, it can be solved by simple security measures, or rather ‘digital hygiene’ for online meetings. |
| Zoom’s sharing of users’ personal data with Facebook via its iOS (Apple) app | Soon after the report, Zoom issued updates to its apps and stopped sharing data with Facebook. | This is a more structural problem, triggered by the data-driven business model of many tech industries. Companies generating revenue through subscriptions to their products tend not to sell their data to other tech platforms. Given its structural nature, this aspect has to be closely monitored by consumer watchdogs, tech communities, and users. Of major concern in such cases is the transparency of privacy policies: what data are being collected (application-related data such as the hardware, operating system, or browser run on the device, needed to make the user experience as good as possible, but also personal data, which are not necessary for the proper running of the service), and how these data are being used (e.g. for user experience, statistics, or third parties). In addition, access rights demanded by mobile apps (such as access to the microphone, camera, or user contacts) should be clear and minimised to what is necessary for the service to work. This is, however, a general issue that applies to many services and applications. |
| Confidentiality of users and communications, in particular the sharing of personal data and access to communications (voice, video, chat, and files) or recordings by third parties (either intruders or law enforcement authorities) | Zoom specifies that it uses encryption for data in transit (as most other platforms do). Users may also need to install encryption tools on their own devices to protect data at the sending and receiving ends, before transmission and after receipt (see the sketch after this table). Zoom admits that it does not encrypt data on its servers (again, like most other platforms). Unless Zoom decides to introduce encryption on its servers (which will likely be market-driven), other elements may also influence the protection of data on servers. | Zoom is not different from other platforms when it comes to privacy and data protection (consult the survey). While most services declare the highest standards of data security (e.g. GDPR compliance, encrypted communication in transit, special protection of hardware), the level of implementation of their declared policies and legal obligations remains to be seen. Again, transparency about the security measures undertaken by the provider is of critical importance. Where encryption is used, concerns have also been raised about Zoom’s encryption keys being generated by servers in China (which is, however, linked to a broader issue of weak points in the global issuing of security certificates and keys). |
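As an illustration of what ‘encryption tools on users’ own devices’ can look like in practice, here is a minimal sketch of client-side encryption using the Python `cryptography` package. The content and key handling are purely illustrative and are not tied to Zoom or any specific platform; in a real deployment, secure key management and exchange would be the hard part.

```python
from cryptography.fernet import Fernet

# Generate a symmetric key that stays on the user's device (illustrative only;
# in practice the key must be stored and shared securely out of band).
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt a document locally before it is uploaded or shared through a platform
# whose servers do not encrypt stored content.
plaintext = b"Meeting minutes: draft budget figures"
ciphertext = cipher.encrypt(plaintext)

# The platform only ever sees the ciphertext; a recipient who holds the key
# can decrypt it after download.
assert cipher.decrypt(ciphertext) == plaintext
print("encrypted blob:", ciphertext[:40], b"...")
```

The trade-off of such client-side protection is usability: a provider that never sees the plaintext cannot offer features such as server-side search, transcription, or cloud recording of that content.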
Based on the survey above, when we make a decision on whether or not to use Zoom, we have to make two types of informed trade-offs. The first relates to the risks we are ready to tolerate while using the platform versus the usability we are looking for. The second pertains to our choice between Zoom and other platforms: do we gain more data security or usability by using alternatives such as Cisco Webex, Adobe Connect, Microsoft Teams, or Tencent VooV? (see a comprehensive overview of platforms and their features)
National Security and Privacy Protection
National security versus privacy protection is another area where informed trade-offs need to be made on the basis of complex risk analysis. While this issue frequently comes up in the context of the fight against terrorism, the outbreak of a global health crisis has brought the interplay between national security and privacy protection into sharper focus. In order to contain COVID-19, tech companies and governments worldwide started developing contact-tracing applications and systems that gather data on individuals’ whereabouts, contacts, and health. While battling the pandemic takes priority, the emergency measures adopted by governments have laid bare the complexity of maintaining a balance between privacy and data protection on the one hand, and national security on the other. Moreover, they have raised the question of how to ensure that such emergency measures do not remain default practice even after the pandemic is over.
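To make the trade-off more concrete, the sketch below illustrates, in Python, the kind of decentralised, data-minimising design proposed for contact tracing (for example by the DP-3T project and the Apple/Google exposure-notification framework). It is a deliberate simplification of those proposals, not their actual protocol: devices exchange rotating pseudonymous identifiers instead of location data, and matching happens on the phone.

```python
import hmac, hashlib, os

def daily_key() -> bytes:
    # A fresh random key generated on the phone each day; never tied to identity
    # or location (simplified stand-in for the real schemes' key schedules).
    return os.urandom(32)

def ephemeral_id(day_key: bytes, interval: int) -> bytes:
    # Rotating identifier broadcast over Bluetooth for a short time window.
    return hmac.new(day_key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

# Device A broadcasts its rotating IDs; device B only stores the IDs it hears.
key_a = daily_key()
heard_by_b = {ephemeral_id(key_a, i) for i in range(3)}

# If A later tests positive, only key_a is published; B re-derives the IDs locally
# and checks for a match -- no location data or identity ever leaves the phones.
published_key = key_a
possible_exposure = any(ephemeral_id(published_key, i) in heard_by_b for i in range(96))
print("possible exposure:", possible_exposure)
```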
Shifting data risk landscape
The Coronavirus crisis is reshaping the data risk landscape. For example, with the shift to working from home, data has moved from the ‘safe environments’ of companies, banks, and governments to less secure home networks. In addition, services previously regarded as less risky, such as online grocery shopping or mask supplies, have become increasingly critical for society. Most vulnerable of all are the data and security of hospitals and health institutions. For example, a number of health facilities fell victim to ransomware attacks, including the second largest hospital in the Czech Republic, hospitals in the US and Spain, and Hammersmith Medicines Research, a company which tested for Ebola and is waiting to conduct medical trials of a possible COVID-19 vaccine.
Generally speaking, personal data requires a high level of protection and safety, as many regulatory frameworks have outlined. Among them is the EU’s General Data Protection Regulation, which tackles the various stages of data handling – from data collection and user consent, to protection of such data and retention periods.
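The practical effect of such rules can be illustrated with a short sketch: a hypothetical service that keeps a record only while there is valid consent and the retention period has not expired. The field names and the one-year period are invented for illustration; they are not requirements taken from the GDPR itself.

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=365)  # illustrative retention period, not a figure from the GDPR

records = [
    {"user": "u1", "consent": True,  "collected": datetime(2019, 1, 10, tzinfo=timezone.utc)},
    {"user": "u2", "consent": False, "collected": datetime(2020, 3, 15, tzinfo=timezone.utc)},
]

now = datetime.now(timezone.utc)
# Keep only records that have valid consent and are still within the retention period;
# everything else is purged.
kept = [r for r in records if r["consent"] and now - r["collected"] < RETENTION]
print(f"kept {len(kept)}, purged {len(records) - len(kept)}")
```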
For many other types of data that are in the public domain, including scientific data from CERN or weather data from the WMO, protection measures focus on ensuring data confidentiality, integrity, availability and usability. The participating organisations in the Road to Bern initiative have balanced policies that address a wide spectrum of data and their uses, ranging from highly sensitive information to data that can be freely accessed. The ICRC and WIPO, co-hosts of the second dialogue of the Road to Bern, are at the forefront of developing and delivering data policies that reflect this data diversity.
Higher exposure of health data
Personal health data gathered through patient registries has become essential for the development of AI and big data systems that can identify viruses early, predict the development of other medical conditions, and create potential new treatments for the Coronavirus. While certain general data protection policies do exist, as mentioned in the section above, new questions are being raised on the role of data anonymisation, as well as on the privacy protection of medical data across different regions and countries.
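One of the techniques behind these questions is pseudonymisation of patient identifiers. The sketch below, using only Python’s standard library and an invented patient record, shows the basic idea; it also hints at why pseudonymisation alone does not amount to full anonymisation.

```python
import hmac, hashlib

SECRET = b"key-held-only-by-the-data-controller"  # illustrative only

def pseudonymise(patient_id: str) -> str:
    # Keyed hashing replaces the identifier with a stable pseudonym, so records can
    # still be linked across datasets without exposing the original ID.
    return hmac.new(SECRET, patient_id.encode(), hashlib.sha256).hexdigest()

record = {"patient_id": "CH-1974-0042", "diagnosis": "COVID-19", "age": 54}
record["patient_id"] = pseudonymise(record["patient_id"])
print(record)

# Caveat: pseudonymisation is reversible by whoever holds the key, and quasi-identifiers
# such as age or location can still allow re-identification; it is not full anonymisation.
```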
Do companies compromise on users’ privacy?
Data encryption is one of the most effective measures used by companies and organisations to secure data. It adds another layer of protection, as it makes it more difficult for sensitive data to be breached, sold, de-anonymised, or compromised, and is as such crucial in terms of data ownership. In recent years, in order to protect users’ privacy and confidentiality, tech companies have also started encrypting user-to-user communications. In practice, this means encrypting communications between users and the companies’ servers, with only some companies turning to encryption of content on their servers as well, or ideally end-to-end encryption: all the way from one end (one user), via the servers, to the other end (the other user). Zoom, like most other online meeting platforms (with the exception of Webex), is not using end-to-end encryption for the time being.
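The difference between these layers can be shown with a minimal sketch of the end-to-end idea, using the PyNaCl library. This is a generic illustration, not how Zoom, Webex, or any particular platform actually implements encryption: each user keeps a private key on their own device, and the server only relays ciphertext it cannot read.

```python
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device; private keys never leave it.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts directly for Bob using her private key and Bob's public key.
alice_box = Box(alice_private, bob_private.public_key)
ciphertext = alice_box.encrypt(b"See you at the 14:00 call")

# The meeting platform's server only ever relays this ciphertext; without a
# private key it cannot decrypt the content (unlike encryption that terminates
# at the server).
relayed = ciphertext

# Bob decrypts on his device with his private key and Alice's public key.
bob_box = Box(bob_private, alice_private.public_key)
print(bob_box.decrypt(relayed).decode())
```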
However, the outbreak of the Coronavirus has shown that global crises such as health pandemics can impact users’ right to privacy. Amid COVID-19, tech giants such as Google, Apple, and Facebook have started sharing anonymised information, including location data, so as to prevent the spread of the pandemic and allow for contact tracing. While they claim that such data will not be accessible for any other purpose, concerns have been raised over the potential misuse of such technologies and measures in the future.
Data and Cybercrime
The Coronavirus crisis further accelerated the trend from 2019, when over 4 billion records were breached by cybercriminals. As fear of the spread of the pandemic takes hold, security breaches centre on the spread of malicious PDF, Word, and MP4 files containing misleading information about the Coronavirus. Financial institutions have been encouraged to prepare for an increase in cyberattacks as criminals seek to take advantage of the potential chaos caused by the Coronavirus. The implications of the erosion of data security will extend beyond the current crisis as users’ trust in digital services and the Internet dwindles.
Protection of data and intellectual property
Intellectual property rights (IPRs) are an important legal mechanism for data protection. WIPO has a wide range of IPR instruments and policies that can help increase data safety. For example, its experience and expertise in developing digital rights management (DRM) approaches could be used for the protection of other types of sensitive data, such as location-sharing data on mobile devices.