The most important things in a pandemic are to maintain reasonable social
distancing and to apply preventive measures. Contact-tracing apps are designed
to automatically alert people when they are at high risk of having contracted
Covid-19. To detect a possible risk of infection, a contact-tracing app is a
mobile phone application that uses Bluetooth to measure the proximity of
individuals; exposure alerts are based on proximity to people who have later
been diagnosed positive for Covid-19. The app brings no direct individual
benefit, but in return for using it, people can contribute to a public health
outcome. The value of that public outcome needs to be made clear to the public
to facilitate an important and informed debate.
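As a rough illustration of the proximity-based detection described above, the
following Python sketch scores a single Bluetooth contact from signal strength
(RSSI) and duration. The threshold values are hypothetical examples chosen for
illustration; they are not the parameters used by the NHSX app or any real
exposure notification framework.

```python
# Illustrative sketch only: scores a Bluetooth contact as a possible exposure.
# The RSSI and duration thresholds are hypothetical, not taken from any real app.

from dataclasses import dataclass


@dataclass
class Contact:
    rssi_dbm: float          # received Bluetooth signal strength (stronger = closer)
    duration_minutes: float  # how long the two handsets stayed in range


def is_high_risk(contact: Contact,
                 rssi_threshold_dbm: float = -65.0,
                 min_duration_minutes: float = 15.0) -> bool:
    """Flag a contact as high risk if the phones were close enough, for long enough."""
    close_enough = contact.rssi_dbm >= rssi_threshold_dbm
    long_enough = contact.duration_minutes >= min_duration_minutes
    return close_enough and long_enough


if __name__ == "__main__":
    print(is_high_risk(Contact(rssi_dbm=-60.0, duration_minutes=20.0)))  # True
    print(is_high_risk(Contact(rssi_dbm=-80.0, duration_minutes=20.0)))  # False: too far apart
```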
Apple and Google propose a "decentralised" approach to contact-tracing
apps, in which the matching of contacts takes place on users' handsets. The
tech giants believe this provides greater privacy than the centralised model,
because it limits the ability of either the authorities or a hacker to use the
central server's logs to track specific individuals and identify their social
contacts.
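A minimal sketch of the decentralised idea, under simplifying assumptions:
each handset broadcasts rotating random identifiers, remembers the identifiers
it has heard nearby, and later downloads the identifiers published by diagnosed
users so the matching can happen locally. Real protocols such as the
Apple/Google exposure notification scheme or DP3T derive identifiers
cryptographically from daily keys; plain random tokens stand in for them here.

```python
# Simplified sketch of decentralised contact matching, performed on the handset.
# Real schemes (Apple/Google, DP3T) derive rotating identifiers from secret daily
# keys; random tokens are used here purely for illustration.

import secrets


class Handset:
    def __init__(self) -> None:
        self.broadcast_ids: list[str] = []   # identifiers this phone has broadcast
        self.observed_ids: set[str] = set()  # identifiers heard from nearby phones

    def new_broadcast_id(self) -> str:
        token = secrets.token_hex(16)
        self.broadcast_ids.append(token)
        return token

    def observe(self, token: str) -> None:
        self.observed_ids.add(token)

    def check_exposure(self, published_ids_of_diagnosed_users: set[str]) -> bool:
        # Matching happens locally: the server only distributes identifiers of
        # people who tested positive and never learns who met whom.
        return bool(self.observed_ids & published_ids_of_diagnosed_users)


# Usage: Alice and Bob exchange identifiers over Bluetooth; Bob later tests positive.
alice, bob = Handset(), Handset()
alice.observe(bob.new_broadcast_id())
bob.observe(alice.new_broadcast_id())
print(alice.check_exposure(set(bob.broadcast_ids)))  # True: Alice is alerted on-device
```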
NHSX, the unit responsible for setting national policy and
developing best practice for National Health Service (NHS) technology,
digital and data, including data sharing and transparency, in Great Britain,
has, however, decided to create its own app rather than rely on the APIs from
Google and Apple, the strategy adopted by other European countries such as
Switzerland and Estonia, by Austria's Red Cross, and by a pan-European group
called DP3T. NHSX believes a centralised system will give it more insight into
Covid-19's spread, and hence allow it to improve the app accordingly. According
to Prof. Christophe Fraser, one of the epidemiologists advising NHSX, speaking
to the BBC, "one of the advantages is that it's easier to audit the system and
adapt it more quickly as scientific evidence accumulates." For its part, the
European Commission has indicated that either model is acceptable. Dr. Michael
Veale of the DP3T project has commented that "All countries deploying an app
must put adoption at the front of their mind, and if it doesn't work well or
significantly depletes battery life, then that may act as a deterrent,
particularly for those with older phones."
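By contrast, in a centralised design of the kind NHSX favours, handsets upload
their encounter records and the matching, and any epidemiological analysis,
happens on the server. The sketch below is a hypothetical illustration of that
flow under simplifying assumptions; it is not a description of the actual NHSX
implementation.

```python
# Hypothetical illustration of a centralised design: handsets upload encounter
# records, and the server both performs the matching and can compute aggregate
# statistics about the spread. Not based on the actual NHSX implementation.

from collections import defaultdict


class CentralServer:
    def __init__(self) -> None:
        # user_id -> set of user_ids they have been near
        self.encounters: dict[str, set[str]] = defaultdict(set)
        self.diagnosed: set[str] = set()

    def upload_encounters(self, user_id: str, met_user_ids: set[str]) -> None:
        self.encounters[user_id] |= met_user_ids

    def report_positive(self, user_id: str) -> list[str]:
        self.diagnosed.add(user_id)
        # The server decides who to notify, because it holds everyone's contacts.
        return [uid for uid, met in self.encounters.items() if user_id in met]

    def average_contacts_per_user(self) -> float:
        # The kind of epidemiological insight a central operator can derive.
        if not self.encounters:
            return 0.0
        return sum(len(met) for met in self.encounters.values()) / len(self.encounters)


server = CentralServer()
server.upload_encounters("alice", {"bob", "carol"})
server.upload_encounters("bob", {"alice"})
print(server.report_positive("bob"))       # ['alice'] - the server picks who to alert
print(server.average_contacts_per_user())  # 1.5
```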
Against the background of this technological debate, British privacy
experts and academics have expressed grave concerns about the proposed NHSX
COVID-19 contact-tracing app. A public letter to the government, signed by
117 experts and organised in part by Eerke Boiten, professor of cyber security
at De Montfort University, warned against the government's plan. The scientists
and researchers, working in the fields of information security and privacy,
urged specialists from all relevant academic fields to analyse comprehensively
the health benefits of such a digital solution and to establish whether there
is sufficient evidence of its value to justify the dangers it entails.
One of the major concerns is that the new technology would enable (via
mission creep) a form of surveillance. Scientists insist that experts have to
be sure they have not created and installed a tool that enables data collection
and surveillance on the population, or on targeted sections of society.
Scientific experts have stated that solutions which allow reconstructing
invasive information about individuals must be fully justified. Such invasive
information could include the "social graph" of individuals who have
physically met over a period of time. With access to the social graph, a bad
actor (state, private sector, or hacker) could spy on citizens' real-world
activities. In addition, signatories of the letter hold that the usual data
protection principles should apply: to collect the minimum data necessary to
achieve the objective of the application. They furthermore argue that the data
protection impact assessment (DPIA) for the contact-tracing application should
be published immediately, rather than just before deployment, in order to
enable a public debate about its implications and to allow public scrutiny of
the security and privacy safeguards before they are put in place.
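To make the "social graph" concern above concrete, the following hypothetical
sketch shows how anyone holding the raw encounter logs of a centralised system
could reconstruct who has met whom, simply by treating each logged encounter as
an edge in a graph. The names and log entries are invented purely for
illustration.

```python
# Hypothetical illustration of the "social graph" risk: an actor with access to
# centrally stored encounter logs can reconstruct who has physically met whom.

from collections import defaultdict

# Example encounter log as it might sit on a central server (invented data).
encounter_log = [
    ("alice", "bob"),
    ("alice", "carol"),
    ("bob", "dave"),
]

# Build an adjacency list: each logged encounter becomes an edge between two people.
social_graph: dict[str, set[str]] = defaultdict(set)
for person_a, person_b in encounter_log:
    social_graph[person_a].add(person_b)
    social_graph[person_b].add(person_a)

print(dict(social_graph))
# e.g. {'alice': {'bob', 'carol'}, 'bob': {'alice', 'dave'}, 'carol': {'alice'}, 'dave': {'bob'}}
```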
The system must show it will do what it is supposed to do in order for
people to trust it. By this, experts mean the reliability of the whole system,
including the people within it and not just the technological element.
Trustworthiness requires specialists to anticipate the risks in advance, in
order to prevent unintended uses and harms that could undermine the good
idea behind any model. Thus, a combination of law, regulation, oversight,
enforcement and technical design should be put in place.
Compiled by Media 21
Foundation from