This got automated. In 2014, Eric Meyer coined the term "inadvertent algorithmic cruelty" to describe his experience of Facebook's "memories" feature, which bombarded him with pictures of his young daughter on the anniversary of her death.
Meyer called it "inadvertent," but there's a strong argument to drop that and simply call it "algorithmic cruelty."
Facebook *should* have known that promoting "high-engagement" posts would end up retraumatizing people on the anniversaries of the worst moments in their lives.
And if the company didn't realize it in 2014, they certainly knew about it after, and did not stop. In 2018, Patrick Gerard wrote about how Facebook commemorated his mother's death with a video of animated characters literally dancing on her grave.
Algorithmic cruelty spread to other platforms: for example, Google's smart address book began adding women's stalkers to their speed-dials, sensing a high degree of mutual interactivity:
The problems of algorithmic cruelty - the predictable ghastliness of a fire-and-forget system of idiotic, automatic cheer - have long been a feature of science fiction.
Or Sarah Gailey's instant classic 2018 short story STET, which recounts a particularly horrific sort of algorithmic cruelty in the editorial notes on a scholarly paper about a self-driving car wreck:
None of these warnings were heeded. Indeed, algorithmic cruelty - incubated in primitive direct marketing, supercharged by social media - made the jump BACK to ad-tech, in a form that is thousands of times more virulent than its prehistoric paper-based ancestor.
Writing in Wired, Lauren Goode describes the ad-tech algorithmic cruelty trap she found herself in: eight years ago, she called off her wedding. Today, she is still bombarded with messages that track the progress of a marriage that never happened.
Goode quotes Kate Eichhorn, whose book THE END OF FORGETTING describes how this nonconsensual external memory system disrupts the "memory editing" that is key to overcoming trauma for the most marginalized among us:
Reading it I was struck by the distance between the algorithmic cruelty of nonconsensual memory-surfacing, and my own powerful, hugely beneficial practice of combing through my own digital history, which is in a database under *my* control - my 20-year blog archive.
For a decade, I've started each day by looking at my posts from this day in the past - at first, it was #1yrago and #5yrsago - now I look back at #15yrsago and #20yrsago, and republish the elements that seem significant today.
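The date arithmetic behind that daily ritual is trivial but has one gotcha: "this day, N years ago" is undefined when today is February 29 and the target year isn't a leap year. A minimal sketch of the lookup (the function names and the fallback to February 28 are my own choices, not anything in particular that my scripts use):

```python
from datetime import date

def same_day_years_ago(today: date, years: int) -> date:
    """Return the same calendar day, `years` earlier.
    Feb 29 falls back to Feb 28 when the target year isn't a leap year."""
    try:
        return today.replace(year=today.year - years)
    except ValueError:  # only raised for Feb 29 in a non-leap target year
        return today.replace(year=today.year - years, day=28)

def lookback_dates(today: date, spans=(1, 5, 10, 15, 20)) -> dict:
    """Map each retrospective tag (#1yrago, #5yrsago, ...) to its date."""
    return {
        f"#{n}yrago" if n == 1 else f"#{n}yrsago": same_day_years_ago(today, n)
        for n in spans
    }
```

With those dates in hand, pulling the matching posts out of any archive you control is a simple filter on the post's date.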
I can't overstate how beneficial this is: tracking my own predictions, concerns and aspirations over time is an incredible tonic for anxiety, a tool to refine and improve my goals, an empirical, external check on my memories and feelings about where I am and where I've been.
It's a subspecies of Cognitive Behavioral Therapy, writing down your worries and aspirations, then revisiting them after the fact to refine your understanding of when your intuition leads you true...or astray.
The difference between what I do and algorithmic cruelty isn't technology - it's control. I'm in charge, not an unaccountable, nonconsensual algorithm.
As is often the case with tech issues, the important thing isn't what the tech does, it's who it does it *to* and *for*.
Indeed, thinking this through this morning made me realize how much I'd like to revisit my photos every day; I've got 20 years' worth of them stashed on Flickr, where I was literally one of the first users:
I tried it this morning, but Flickr's tools remain incredibly primitive thanks to years of neglect under Yahoo's ownership. Its new owners, SmugMug, have been making great strides, but they have a LOT of technical debt to pay off.
But having manually pulled up photos from this day 5, 10, 15 and 20 years ago, I was absolutely delighted. I would welcome a Flickr change that made it simple to see pics from a given date - maybe by editing the URL itself (currently a mess!):
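Until the URLs get friendlier, Flickr's public API can do the date lookup: its `flickr.photos.search` method accepts `min_taken_date` and `max_taken_date` bounds. A sketch using only the standard library - the key and user ID are placeholders you'd have to supply yourself:

```python
import json
import urllib.parse
import urllib.request
from datetime import date

FLICKR_REST = "https://api.flickr.com/services/rest/"

def taken_on_params(api_key: str, user_id: str, day: date) -> dict:
    """Build flickr.photos.search parameters bounding 'date taken' to one day."""
    return {
        "method": "flickr.photos.search",
        "api_key": api_key,   # your Flickr API key (placeholder)
        "user_id": user_id,   # the account's NSID (placeholder)
        "min_taken_date": f"{day.isoformat()} 00:00:00",
        "max_taken_date": f"{day.isoformat()} 23:59:59",
        "format": "json",
        "nojsoncallback": 1,
    }

def photos_taken_on(api_key: str, user_id: str, day: date) -> dict:
    """Fetch the photos taken on `day` as parsed JSON."""
    url = FLICKR_REST + "?" + urllib.parse.urlencode(
        taken_on_params(api_key, user_id, day)
    )
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)
```

Loop that over the 5/10/15/20-year lookback dates and you've got a homemade "on this day" view - running on your schedule, not an engagement algorithm's.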
The point I'm trying to make here is that we shouldn't mistake the ability to revisit your past experiences and thoughts for algorithmic cruelty - the answer to this cruelty isn't to destroy our digital time-machines; it's to seize the means of computation.