The wife of one of my elementary school teachers once delivered a full-term, stillborn baby. It was a great tragedy, but far worse came in the months and years that followed, as direct-marketers bombarded them with pitches that tracked the progress of their dead child.


College-savings plan ads, ads for baby food, annual birthday notices - the whole thing running on autopilot as marketers pursued the Procter & Gamble "lifecycle marketing" playbook that targets the turning points in customers' lives, like parenthood.


This got automated. In 2014, Eric Meyer coined the term "inadvertent algorithmic cruelty" to describe his experience of Facebook's "memories" feature, which bombarded him with pictures of his young daughter on the anniversary of her death.

Meyer called it "inadvertent," but there's a strong argument to drop that and simply call it "algorithmic cruelty."


Facebook *should* have known that promoting "high-engagement" posts would end up retraumatizing people on the anniversaries of the worst moments in their lives.

And if the company didn't realize it in 2014, they certainly knew about it after, and did not stop. In 2018, Patrick Gerard wrote about how Facebook commemorated his mother's death with a video of animated characters literally dancing on her grave.


Algorithmic cruelty spread to other platforms: for example, Google's smart address book began adding women's stalkers to their speed-dials, sensing a high degree of mutual interactivity.

The problems of algorithmic cruelty - the predictable ghastliness of a fire-and-forget system of idiotic, automatic cheer - have long been a feature of science fiction.


Think of Bradbury's classic "There Will Come Soft Rains," where an empty house cheerfully greets its dead owners with their daily routine after a nuclear war has killed nearly every living thing.

Or David Marusek's pioneering, haunting story "The Wedding Album," about the AI avatars of a couple, created to commemorate their wedding day, outliving the couple and haunting virtual spaces for thousands of years.


Or Sarah Gailey's instant classic 2018 short story STET, which recounts a particularly horrific sort of algorithmic cruelty in the editorial notes on a scholarly paper about a self-driving car wreck.

None of these warnings were heeded. Indeed, algorithmic cruelty - incubated in primitive direct marketing, supercharged by social media - made the jump BACK to ad-tech, in a form that is thousands of times more virulent than its prehistoric paper-based ancestor.


Writing in Wired, Lauren Goode describes the ad-tech algorithmic cruelty trap she found herself in: eight years ago, she called off her wedding. Today, she is still bombarded with messages that track the progress of a marriage that never happened.


These are the product of the "memory monetization machine," which surfaces your old social-media breadcrumbs as inventory for spot-market advertising auctions: "This user got married eight years ago, who will pay me top dollar to show them an ad?"



Naturally, this has all the failure modes of social memory monetization - the dead children and parents, and commemorations of other traumas - but with ad-tech's nonconsensual, eternal torture: you can quit Facebook, but you can't control these background processes.



Goode quotes Kate Eichhorn, whose book THE END OF FORGETTING describes how this nonconsensual external memory system disrupts the "memory editing" that is key to overcoming trauma for the most marginalized among us.


Reading it I was struck by the distance between the algorithmic cruelty of nonconsensual memory-surfacing, and my own powerful, hugely beneficial practice of combing through my own digital history, which is in a database under *my* control - my 20-year blog archive.

For a decade, I've started each day by looking back at my posts from this day in past years, and republishing the elements that seem significant today.
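The date arithmetic behind this kind of daily lookback is simple but has one wrinkle worth handling. Here's a minimal sketch (the function name `lookback_dates` is my own, not anything from the original post) that computes "this day, N years ago," shifting Feb 29 to Feb 28 when the target year isn't a leap year:

```python
from datetime import date

def lookback_dates(today: date, years=(1, 5, 10, 20)):
    """Return the same calendar day N years ago for each N in `years`,
    falling back to Feb 28 when Feb 29 doesn't exist in the target year."""
    results = []
    for n in years:
        try:
            results.append(today.replace(year=today.year - n))
        except ValueError:  # today is Feb 29 and the target year isn't a leap year
            results.append(today.replace(year=today.year - n, day=28))
    return results

# e.g. lookback_dates(date(2024, 3, 1)) -> the same date in 2023, 2019, 2014, 2004
```

Each returned date can then be used to query whatever archive you control - a blog database, an export, a photo library.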



I can't overstate how beneficial this is: tracking my own predictions, concerns and aspirations over time is an incredible tonic for anxiety, a tool to refine and improve my goals, an empirical, external check on my memories and feelings about where I am and where I've been.


It's a subspecies of Cognitive Behavioral Therapy: writing down your worries and aspirations, then revisiting them after the fact to refine your understanding of when your intuition leads you true... or astray.

The difference between what I do and algorithmic cruelty isn't technology - it's control. I'm in charge, not an unaccountable, nonconsensual algorithm.

As is often the case with tech issues, the important thing isn't what the tech does, it's who it does it *to* and *for*.


Indeed, thinking this through this morning made me realize how much I'd like to revisit my photos every day; I've got 20 years' worth of them stashed on Flickr, where I was literally one of the first users.

I tried it this morning, but Flickr's tools remain incredibly primitive thanks to years of neglect under Yahoo's ownership. Its new owners, SmugMug, have been making great strides, but they have a LOT of technology debt to pay off.


But having manually pulled up photos from this day 5, 10, 15 and 20 years ago, I was absolutely delighted. I would welcome a Flickr change to make it simple to see pics from a given date - maybe by editing the URL itself (currently a mess!).


The point I'm trying to make here is that we shouldn't mistake the ability to revisit your past experiences and thoughts for algorithmic cruelty - the answer to this cruelty isn't to destroy our digital time-machines; it's to seize the means of computation.





@pluralistic Boost thread with CW.

@pluralistic "seize the means of computation." excellent phrasing
