In the early days of the pandemic, the term "contact tracing" vaulted into the public consciousness: that's the shoe-leather- and labor-intensive process whereby skilled health experts establish a personal rapport with infected people in order to learn whom they had contact with.
For both good reasons (the scale of the pandemic) and bad ones (tech's epistemological blindness, which insists that all social factors can be ignored in favor of quantifiable ones), there was interest in automating this process and "exposure notification" was born.
The difference is that exposure notification tells you whether your device was near another device whose owner is sick. It doesn't tell you about the circumstances - like, was it one of the people at that eyeball-licking party? Or someone in the next car in a traffic jam?
Exposure notification vaporizes the qualitative elements of contact tracing, leaving behind just a quantitative residue of unknown value. There are two big problems with this: first, it might just not be very useful (that's what they learned in Iceland).
Second: people might be so distrustful of your data-handling processes that they actively subvert the app, meaning there are so many holes in your data that the data-set is useless. That's what happened in Norway.
The thing is, contact tracing is high-touch/low-tech because it is a social science intervention. Social scientists have always understood that if you only gather the data that's easy to reach, you'll come to bad conclusions skewed by defects in your collection.
A canonical text on this is Clifford Geertz's "Thick Description," where he describes an anthropologist trying to figure out why a subject just winked: is it flirting? Dust in the eye? Something else? The only way to know is to ask: you can't solve this with measurement.
To a first approximation, all the important stuff in our world has an irreducible, vital qualitative dimension. Take copyright exemptions: fair use rules are deliberately qualitative ("Is your use transformative in a way that comments on or criticizes the work it uses?").
These are questions that reflect policy priorities: in the words of the Supreme Court, fair use is the "escape valve" for the First Amendment, the thing that squares exclusive rights for authors with the public's right to free expression.
But the tech and entertainment industries have spent decades trying to jettison this in favor of a purely quantitative measure: it's not fair use if your image incorporates more than X pixels from another, or if your video or sound has more than Y seconds from another work.
This is idiotic. Solving automation challenges by declaring the non-automatable parts to be unimportant is how we get self-driving car assholes saying, "We just need to tell people that they're not allowed to act unpredictably in public."
(BTW, this is all said much better than I can in a superb Communications of the ACM article by Randy Connolly: "Why Computing Belongs Within the Social Sciences.")
All of this is a leadup to the story of @Q3w3e3, an anonymous student at Michigan's Albion College, a private uni that reopened after insisting that all students must install a proprietary exposure notification app before returning to campus to lick each other's eyeballs.
Albion paid some grifters to develop this app. Because of course they did. The app is called Aura, and it was created by a company called "Nucleus Careers."
If you're thinking that's a weird name for a public health development company, you're right. They're a recruiting firm, founded this year, "with no apparent history or experience in building or developing healthcare apps."
Aura is predictably terrible. As @Q3w3e3 discovered when they audited it, the app stores all the students' location data in an Amazon storage bucket, and comes with the keys to access that data hard-coded into the app.
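Hard-coded cloud credentials are trivial to recover from a shipped app binary, which is why this class of mistake is so serious. A minimal sketch of how an auditor might spot them (the regex targets the well-known AWS access key ID format; the "app" bytes here are fake, and the key is AWS's documented example key, not a real credential):

```python
import re

# AWS access key IDs follow a well-known pattern: "AKIA" followed by
# 16 uppercase letters or digits. Grepping a decompiled app's files
# for this pattern is a common first step in credential audits.
AWS_KEY_RE = re.compile(rb"AKIA[0-9A-Z]{16}")

def find_embedded_keys(blob: bytes) -> list:
    """Return any strings in `blob` that look like AWS access key IDs."""
    return [m.decode() for m in AWS_KEY_RE.findall(blob)]

# Illustrative only: a fake key planted in fake app bytes.
fake_app = b"\x00\x01config=AKIAIOSFODNN7EXAMPLE\x00more-bytes"
print(find_embedded_keys(fake_app))  # ['AKIAIOSFODNN7EXAMPLE']
```

Anyone who downloads the app can run a scan like this, which is why secrets belong on the server side, never baked into a client.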
The app also allows attackers to trivially discover the test status of any registered user. TechCrunch discovered this bug and hypothesizes that an attacker could get the health data of 15,000 people this way. Did someone say HIPAA?
Nucleus Careers refused to talk with TechCrunch's Zack Whittaker about this beyond a few glomarish nonstatements. But the school administration is standing behind the app, threatening to expel students who don't use it.
And this brings us back to the disutility of the denatured quantitative residue of the thick, qualitative process of contact tracing. Many of the students who have the most to lose from using the app are also at the highest risk of contracting the disease.
People struggling with addiction, queer kids who aren't out and have secret partners, people engaged in survival sex-work are all at higher risk of exposure, and they also have the biggest reason NOT to use the app, lest it leak their secrets.
These are the people who you absolutely WANT to include in public health efforts, but that can only happen through noncoercive, personal, high-trust, low-tech interventions.
In other words, Aura isn't just technologically inept, it's also epidemiologically inept. The cliche that "you treasure what you measure" could not be more applicable here.
Look, these students shouldn't even be on campus. Obviously. And even a good contact tracing system would probably mostly serve as a postmortem for analyzing the inevitable conflagration of infection incoming in 3...2...1
But Albion is still a fascinating case-study in the lethal incoherence of the contempt of both managerial and technology circles for "human factors."
At the very least, we should ensure that the lives they will squander through their hubris aren't totally wasted.
@pluralistic Seriously? Is that what these applications do? That is, as you say, completely useless.
The system they use here in Singapore makes sense though, in my opinion. The application runs in the background, logs contacts but never informs you of anything other than the number of people it's interacted with (earlier versions didn't, but I guess it's just kind of neat to see that it's actually doing something).
If the contact tracing team (which does a lot of manual work) learns that, for example, an infected individual was at a restaurant at a certain time, it can ask for the log of which devices that person's phone interacted with to be released (the person has to confirm this on their device before it's sent to the contact tracing team). The team can then work on contacting the people who were near them at the restaurant to ensure they get tested.
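The key property of the flow described above is that the log stays on the device until the owner explicitly consents to a release. A hypothetical sketch of that consent gate (names and structure are illustrative, not the real Singaporean implementation):

```python
from dataclasses import dataclass, field

@dataclass
class ContactLog:
    """Device-local log of anonymized contact tokens."""
    tokens: list = field(default_factory=list)
    release_approved: bool = False  # owner's explicit, per-request consent

    def record(self, token: str) -> None:
        # Logging happens silently in the background.
        self.tokens.append(token)

    def release_to_tracers(self) -> list:
        # Nothing leaves the device without the owner's confirmation.
        if not self.release_approved:
            raise PermissionError("owner has not consented to release")
        return list(self.tokens)

log = ContactLog()
log.record("token-a1")
log.record("token-b2")
# The tracing team can only read the log after the owner confirms:
log.release_approved = True
print(log.release_to_tracers())  # ['token-a1', 'token-b2']
```

The design choice worth noting is that the consent check lives on the client, and the automated part (token logging) is narrow; the actual tracing remains human work.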
That's how it has to work, and I'm baffled that there are people who think that this kind of work can be automated.
@pluralistic it's almost like these app developers could have done better by working with public health workers to develop solutions to the problems they actually have.