Digital contact tracing is swiftly emerging as a hopeful tool for containing the spread of COVID-19 in Canada. However, it is also a topic of heated debate among health authorities, governments, tech experts, and potential app users. The projected benefits, faster exposure notifications chief among them, are certainly appealing, but the privacy implications and risks are plentiful. For marginalized groups and survivors of domestic violence, those risks weigh even more heavily.
What is contact tracing?
Contact tracing is a public health tool for containing the spread of infectious diseases. To do so, contact tracers map a patient’s whereabouts prior to a positive diagnosis for a highly communicable illness or infection. Their primary purpose: identify who the patient was in contact with and notify those people of a possible exposure. Aside from being time-consuming, the process relies heavily on the patient’s memory for much of that information. This is not an easy ask for the average person.
In the case of COVID-19, a person can carry and transmit the virus for up to 14 days before showing any symptoms, and that lag time contributes to higher infection rates. Consequently, governments across the world have been seeking solutions to slow the spread alongside local contact tracing authorities.
Enter: contact tracing apps.
Complementing manual contact tracing, digital contact tracing apps use phones to track proximity to, and duration of contact with, other phones. If someone with the app tests positive, that result is uploaded, and other users whose phones came into contact with theirs receive a notification about a possible exposure and how to proceed.
In Canada, COVID Alert is being tested in Ontario and is slowly rolling out nationwide with the cooperation of provincial governments. As part of the plan to protect Canadians' privacy, no GPS or personal data is stored or shared. Instead, user phones are assigned randomized codes every 14 days, and Bluetooth signals collect and encrypt the codes from other nearby users. The point is to prevent any personal information from being shared between the phones, the government, and third-party app developers. Although the federal government is prioritizing privacy, privacy cannot be guaranteed, and unintended consequences are a reality for digital contact tracing apps.
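To make the mechanism concrete, here is a heavily simplified sketch of how random-code exchange and on-device matching can work. This is not COVID Alert's actual implementation (which is built on the Apple/Google Exposure Notification framework, with cryptographically derived, frequently rotating identifiers); the `Phone` class, method names, and rotation scheme below are illustrative assumptions only:

```python
import secrets

class Phone:
    """Toy model of a contact tracing app participant.

    Hypothetical sketch only: real apps use cryptographically
    derived rotating identifiers, not this simplified scheme.
    """

    def __init__(self):
        # Each phone generates its own random codes; no name, GPS
        # location, or other personal data is ever involved.
        self.my_codes = []      # codes this phone has broadcast
        self.heard_codes = []   # codes received from nearby phones

    def rotate_code(self):
        # Periodically replace the broadcast code with a fresh random
        # value so codes cannot be linked back to a person over time.
        code = secrets.token_hex(16)
        self.my_codes.append(code)
        return code

    def receive(self, code):
        # Bluetooth proximity event: remember the other phone's code.
        self.heard_codes.append(code)

    def check_exposure(self, published_codes):
        # When a user tests positive, their codes are published.
        # Each phone checks locally whether it heard any of them,
        # so the server never learns who met whom.
        return any(c in published_codes for c in self.heard_codes)

# Two phones come into Bluetooth range and exchange current codes.
alice, bob = Phone(), Phone()
bob.receive(alice.rotate_code())
alice.receive(bob.rotate_code())

# Alice later tests positive and uploads (publishes) her codes.
published = set(alice.my_codes)

print(bob.check_exposure(published))    # Bob heard one of Alice's codes
print(alice.check_exposure(published))  # Alice only heard Bob's codes
```

The key design point the sketch illustrates is that matching happens on the user's own device against a public list of anonymous codes, which is why no central party needs to see the contact graph.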
Privacy Risks and The State of Ethics
There is no standard model for developing or implementing contact tracing apps, and the implications of this variation are riskier in some contexts than in others. As such, transparency, accountability, and ethics are all vulnerable to being placed on the back burner while governments prioritize reducing caseloads and reopening economies.
Security and data breaches, as well as the misuse of collected data, are also possible scenarios, and app users could find their personal information compromised. In worst-case scenarios, misuse could include the government mapping out an individual’s routine and contacts, or that information being stolen by an unknown third party. For activists, migrant or undocumented workers, and refugees, that risk is palpable.
Ethics-wise, there is still much to be desired. Even if present motivations to stop the spread of the coronavirus are genuine, there are no guarantees that future uses of the data (if it is not deleted) or future data collection will not cross a line. What is more, that line is being defined as these apps are developed. The non-profit Canadian Civil Liberties Association voiced these concerns to Radio Canada International and warned how unprecedented and large an “ask” it is to participate in a contact tracing app. For the CCLA, the problem is not necessarily the app itself but the precedent it sets for using surveillance technologies in the future.
Unintended Consequences of Notifying Users
Despite good intentions, digital contact tracing apps do not benefit everyone equally. Health and Human Rights Journal contributor Sara L.M. Davis identifies overarching risks for marginalized groups using international examples. Below are those risks adapted to the Canadian context:
In Domestic Violence Contexts
For survivors of domestic violence who are still living in that situation, and where coronavirus information is accessible, a spontaneous notification may trigger an abusive incident. This could happen by raising speculation about where and how the exposure occurred, or as an act of punishment against the survivor for potentially bringing COVID-19 into the home.
In these cases, and where COVID Alert cannot remove the element of uncertainty from its notifications, survivors may face an unfair choice between their immediate safety and protecting their social circle and wider community from the coronavirus.
For Groups Historically Targeted in Public Health Interventions or Crises
In Canada, there is a historical and ongoing context of mistrust, abuse of power, and violence against Indigenous peoples and other marginalized groups in healthcare contexts.
Uptake may remain low, and the benefits of the app for these users and their communities will not match, in degree or in scale, those enjoyed by more privileged users and communities.
Narratives blaming marginalized groups for the spread of COVID-19 are already circulating, and increased incidents of racism have been reported. Although COVID Alert does not share personal information with other users, that may not stop users from suspecting or accusing someone. Sara L.M. Davis speaks more to this danger and its very real risks in her article in the Health and Human Rights Journal.
Before Clicking ‘Download’
Ultimately, the decision to participate in digital contact tracing is a personal one. Whether as an individual or as a wider community, it is not selfish to ask for more information when it comes to your privacy: how your data will be collected, used, stored, and deleted.
As radiologists practicing during the pandemic, Georgios Kaissis and Rickmer Braren point out: “We own our data, but we don’t own our patients’ data. Data governance does not equal data ownership.” This distinction is crucial for setting expectations and safeguarding the populations most vulnerable to data abuse.