In response to the COVID-19 pandemic, universities are pursuing a broad range of technology-assisted public health programs on campus. However, there is a troubling trend of mandating the use of these new and often experimental technologies as a condition of returning to campus to work or study.
Contact tracing is a complex undertaking. Its primary public health purpose is to track and slow the spread of the disease. But it is intrusive to ask people who they have been near, when, and where, so it requires cooperation, trust, and help from the person who is ill or who has been exposed to someone who is. Contact tracing is also a social service: it gives a person who is ill advice and connections to resources, often as basic as how to get food while staying home. A new application or device might add value to a robust public health plan, but it cannot replace one. At the core of the effort there must be trust in public health officials; without it, people will not follow their guidance, even for something as simple as staying home. That trust requires openness, transparency, and consent from all of the individuals involved. Technology might aid this effort, but it cannot be the only response to the crisis, and it cannot be mandated.
University leaders need to keep in mind that all of these apps were created as a rapid response to a pandemic, and thus have not received the testing for flaws and security problems that a normal development process would provide. The bugs that result from these rushed efforts have the potential to undermine any planned data collection and security protocols. If community members are compelled to use these programs despite the personal and technical risks, universities will disproportionately harm the most vulnerable individuals among them.
These novel apps, especially if mandated, may also exacerbate the digital divide on campus. They require a person to own an up-to-date smartphone that is always charged and kept close to their body. Students or faculty who do not have such phones would be unable to meet the app requirements to attend classes in person. Some of those same students or faculty may also lack reliable home broadband or work space, and thus be unable to attend classes remotely, either.
There are many reasons why someone might choose not to participate in these programs. Automated contact-tracing and notification programs raise a wide range of potential privacy and security concerns. The least worrisome programs collect no data at all by default, and simply allow a user to stay informed about campus news or to track their symptoms and share them at their own discretion. At the other end of the spectrum, some of the technologies being adopted collect and handle sensitive biometrics, personal medical data, and/or precise location data.
Given the sensitivity of this data, how these programs work needs to be transparent. The specific methods a program implements have a great impact on both its efficacy and its privacy risks. For example, location tracking using GPS and cell site information is not suited to contact tracing, because it will not reliably reveal the close physical interactions that experts say are likely to spread the disease. It is, however, precise enough to expose sensitive information like whether we’ve been to a church or a bar. On the other hand, programs that establish a proximity history by means of Bluetooth, typically through Google-Apple Exposure Notification (GAEN) apps, are better suited to detecting those close interactions and are much more effective at preserving privacy.
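To make that distinction concrete, here is a minimal, hypothetical sketch in Python of what each approach ends up storing. The class names and the rotating-identifier logic are our own illustration, greatly simplified from the real GAEN protocol, but they show why a proximity log is far less revealing than a location log.

```python
# Illustrative sketch only: a toy comparison of what a GPS-based log records
# versus what a GAEN-style rotating-identifier proximity log records.
# LocationLog, ProximityLog, and ROTATION_MINUTES are hypothetical names,
# not part of any real contact-tracing app or the actual GAEN API.
import os
import time

ROTATION_MINUTES = 15  # GAEN-style apps rotate identifiers roughly this often


class LocationLog:
    """GPS/cell-site approach: stores where the person actually was."""

    def __init__(self):
        self.entries = []

    def record(self, latitude, longitude):
        # Every entry ties the user to a specific place and time --
        # enough to reveal visits to a church, a clinic, or a bar.
        self.entries.append((time.time(), latitude, longitude))


class ProximityLog:
    """Bluetooth proximity approach: stores only which rotating random
    identifiers were heard nearby, never where the encounter happened."""

    def __init__(self):
        self.heard = []
        self._current_id = os.urandom(16)
        self._rotated_at = time.time()

    def my_identifier(self):
        # Broadcast a fresh random identifier every ROTATION_MINUTES so that
        # observers cannot link broadcasts into a long-term track of one phone.
        if time.time() - self._rotated_at > ROTATION_MINUTES * 60:
            self._current_id = os.urandom(16)
            self._rotated_at = time.time()
        return self._current_id

    def record_nearby(self, other_identifier):
        # Only the opaque identifier and a coarse timestamp are kept;
        # there is no record of where the two phones met.
        self.heard.append((int(time.time() // 60), other_identifier))
```

In the actual GAEN design, the daily keys behind these rotating identifiers are published only if a person chooses to report a positive test, and the matching against a phone's proximity log happens locally on the device, so the server never learns who was near whom, let alone where.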
In addition to being transparent about what data is collected, it is essential that universities also be transparent about how that data is handled and what rights users have to access and delete their own information. To foster more trust, universities should also disclose the details of any agreements they have with external vendors.
By mandating these applications and devices, institutions are building a public health response based on discipline and coercion rather than trust. Operating in the dark, students and workers may fear the worst; students who violate the rules may face disciplinary proceedings and sanctions. Tracking mandates, particularly by government entities like public universities, also have the potential to chill constitutionally protected speech. Students may be afraid to exercise their right to speak out about university policies if they fear the university or law enforcement could misuse these public health technologies for mass surveillance.
CALL ON YOUR UNIVERSITY TO TAKE THE PLEDGE
Universities must strike any app mandates from their existing student commitments and publicly commit to the University App Mandate Pledge (UAMP). If a university identifies a specific technology it would like students to use, it is the university’s responsibility to present it to students and demonstrate that it is effective and respects their privacy. It must do so by sharing privacy policies, by explaining how and by whom student data will be used and shared, by making commitments regarding how the institution will protect students’ privacy, and by offering avenues for feedback before and during decision-making. Anything less will infringe on the rights of students, faculty, staff, and community members.