
Fighting Covid-19 Shouldn’t Mean Abandoning Human Rights

by Allie Funk | Apr 9, 2020 | News

If I cannot do great things, I can do small things in a great way.

— Martin Luther King Jr.

Governments around the world are racing to adopt new surveillance tools in response to the Covid-19 pandemic. Many are green-lighting dragnet monitoring systems, seeking real-time location data from mobile providers or deploying facial recognition and other emerging technologies.

These steps may usher in a long-term expansion of the surveillance state. Covid-19 has arrived after two decades of rapid technological change, in which both the public and private sectors exponentially increased their capacity to incorporate surveillance into various aspects of governance and commercial activity. Many democracies have tried, not always with success, to build legal barriers that constrain authorities’ ability to access and exploit the personal information collected by private companies. Coronavirus surveillance could dismantle these structures.

About the author

Allie Funk is a research analyst for technology and democracy at Freedom House. She is an expert on human rights in the digital age, focusing on the organization's Freedom on the Net project.

To avoid such an uncontrolled shift, policymakers must ensure that any new surveillance program complies with human rights principles, like those outlined by Freedom House, which safeguard basic freedoms while allowing the government to do what is necessary to protect public health.

Testing for necessity and proportionality

International human rights standards give states some leeway to adopt surveillance measures in the current crisis, but the programs must first be proven necessary for significantly limiting the spread of the disease. If public health experts can verify the monitoring’s effectiveness, then any program must next be narrowly tailored, minimizing what data is collected and using the least intrusive options to accomplish legitimate goals.

South Korea has been comparatively effective at containing its coronavirus outbreak, but its Infectious Disease Control and Prevention Act (IDCPA) allows authorities to tap into broad surveillance powers, raising questions about epidemiological necessity and proportionality. For example, officials have pulled information from credit card records, phone location tracking, and security cameras—all without court orders—and combined it with personal interviews for rapid contact tracing and monitoring of actual and potential infections. Importantly, the IDCPA requires that collected data “be destroyed without delay when the relevant tasks have been completed.”

Credit card histories reveal intimate details about people’s lives that go far beyond basic information for contact tracing, including sexual orientation and religious beliefs. Mobile phone location data is also personal information, and some South Korean officials have publicized patients’ gender, age range, and where and when they have been in public in order to notify other residents about potential exposure. These disclosures have meant that some South Koreans’ personal movements have been laid bare for public consumption, at times fueling online ridicule, scrutiny, and social stigma. Yet the data may not be precise enough to discern whether two people were at least 6 feet apart. This ambiguity is especially problematic if the records are cited to penalize people for not complying with quarantine or social-distancing rules.

Instituting independent oversight

Surveillance programs need robust and independent oversight that can assess what types of data are collected, who manages the collection, and how and by whom that information is used. As the pandemic evolves, an independent legislative review process should routinely keep tabs on programs to ensure they remain necessary and proportionate. An avenue for judicial review should also be available so that affected individuals can appeal disproportionate restrictions and seek redress for any abuses.


Worryingly, this essential oversight is lacking in some surveillance initiatives. Israel’s caretaker government, for example, used emergency regulations to grant police and security officials access to a secretly obtained trove of sensitive smartphone metadata, including geolocation data, without parliamentary approval. The existence of this database and its underlying legal framework was previously undisclosed. After the government’s unilateral move, the High Court intervened to impose a temporary injunction and require some legislative involvement. Security officials have since been allowed to continue the monitoring after a new parliamentary subcommittee was established, but these controls do not appear to be sufficiently robust.

Ensuring openness and transparency

Openness and transparency are crucial not only for keeping citizens safe during a health crisis, but also for helping them understand how and why their privacy is being affected. This builds public trust in the institutions tasked with curtailing the outbreak, while ensuring that surveillance programs and the officials running them remain accountable.

Many mobile applications that claim to track individuals’ movements and quarantine compliance fall short on transparency. They are generally opaque regarding how they collect and process data, and how and with whom they share that information.

In Poland, some residents are using the government’s new Home Quarantine app to prove they’re complying with isolation orders. Users first upload a profile image and are then sent periodic requests to upload a “selfie” for authorities. The app pulls a geolocation stamp and time stamp from the selfie to confirm where and when it was taken, while using facial recognition to match the image to the user’s original picture. It remains unclear how much information the app collects, and whether the data can be retained or made available for other private or public facial recognition initiatives.

Sunsetting and limiting data collection, access, and use

Surveillance programs should have unambiguous sunset clauses so that they cannot continue once the pandemic ends. Information collected during the outbreak should be firewalled from other governmental or commercial uses and then should generally be destroyed after the virus is brought under control.

Emergencies provide governments a shortcut to access people’s personal information or roll out emerging surveillance technology that under normal circumstances would either not be allowed or would require significantly stronger judicial or legislative review. Moreover, indiscriminate monitoring and mass collection of sensitive information sidestep due process standards, making everyone a suspect of potential wrongdoing.

Authorities could collect sensitive information or deploy facial recognition systems under the guise of countering the outbreak, only to use them later for political purposes, such as repression of minority populations. Phone records can be, and have previously been, weaponized to track down and arrest journalists. Geolocation data could be used to identify and detain undocumented people for deportation. And police could repurpose data about people’s movement to identify civic organizing efforts and disrupt protests. Private entities such as insurance companies or advertising agencies may also seek to exploit such data for their own commercial ends.

In practice, many programs lack or are ambiguous about sunset and firewall provisions. Certain mobile providers in Belgium, Germany, and Italy have supplied aggregated and “anonymous” location data to authorities. South African mobile carriers have also agreed to hand over location data. In the United States, mobile advertising companies, not mobile service providers, are reportedly providing government agencies with similar information. It is unclear how and by whom this third-party data could be used during and after the outbreak, whether for law enforcement, immigration, or intelligence purposes.

Stopping the spread

While certain forms of monitoring—such as contact tracing—can be indispensable to containing Covid-19, they should remain in compliance with human rights standards. Aggressive expansion of surveillance programs without adequate checks could normalize privacy intrusions and create systems that may later be used for various forms of political and social repression.

Surveillance tools alone cannot solve a public health crisis. Enhanced technical monitoring does not provide rapid tests to patients, protective equipment to medical workers, or ventilators and staffing to hospitals. As democracies build out their responses to the pandemic, they should ensure that their efforts do not also institute a lasting deterioration in human rights.

WIRED Opinion publishes articles by outside contributors representing a wide range of viewpoints. Read more opinions here. Submit an op-ed at opinion@wired.com.

