The Facebook whistleblower's testimony was refreshing and frightening by turns, revealing the company's awful internal culture, where product design decisions that benefited its users were sidelined if they were bad for its shareholders.

The big question now is, what do we do about it? The whistleblower, Frances Haugen, rejected the idea that Facebook's power should be diminished; rather, she argued that it should be harnessed - put under the supervision of a new digital regulator.

1/


Regulating tech is a great idea (assuming the regulations are thoughtful and productive, of course), but even if we can agree on what rules tech should follow, there's still a huge debate over how the tech sector should be structured.

Like, should we leave monopolies intact so that we only have to keep track of a few companies to make sure they're following the rules? Or should we smash them up - through breakups, unwinding anticompetitive mergers, and scrutinizing future mergers?

2/


For me, the answer is self-evident: if we don't make Big Tech weak, we'll never bring them to heel. Giant companies can extract "monopoly rents" - huge profits - and cartels can agree on how to spend those profits to subvert regulation.

eff.org/deeplinks/2021/08/star

We need to fix the internet, not the tech giants.

3/

The problem isn't just that Zuck is really bad at being the unelected pope-emperor of the digital lives of 3,000,000,000 people - it's that the job of "pope-emperor of 3,000,000,000 people" should be abolished.

I believe that people who rely on digital tools should have the final say in how those tools serve them. That's the proposition at the core of the "nothing about us without us" movement for accessible tech, and the ethos of Free Software.

4/

Technologists should take reasonable steps to make their products suitable for users, and regulators should step in to ban certain design choices: for example, algorithms that result in racial discrimination in housing, finance and beyond.

The law should step in when sites or apps are deceptive or fraudulent or otherwise harmful; people hurt by negligent security and other choices should have remedies in law, both as private individuals and through their law enforcement officials.

5/

But even if we did all that - and to be clear, we don't - it wouldn't be enough to deliver technological self-determination, the right to decide how the technology you use works.

For example, when the W3C was standardizing EME - a shameful incident in which they created a standard for video DRM - there was a lot of work put into accessibility, including safeguarding closed captions and audio description tracks.

6/

But even the most inclusive design process can't contemplate all of the ways in which users will need to adapt their tools. My friend Jennifer has photosensitive epilepsy and was hospitalized after a strobe effect in a Netflix stream triggered a series of grand mal seizures.

EME could have accommodated that use-case by implementing a lookahead algorithm that checked for upcoming strobes and skipped past them or altered their gamma curves so that they didn't flash.
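To make the idea concrete, here's a minimal sketch of what such a lookahead could look like. Everything in it is an illustrative assumption, not part of the EME spec: the function names, the luminance-swing threshold, and the use of WCAG's three-flashes-per-second danger line are all mine.

```python
# Hypothetical lookahead sketch: scan decoded frames ahead of the playhead,
# compute each frame's average luminance, and flag spans where brightness
# swings repeat faster than a seizure-risk frequency (WCAG draws the line at
# 3 flashes per second). A player could skip those spans or flatten their
# gamma. Thresholds and names here are illustrative, not EME APIs.

def mean_luminance(frame):
    """Average Rec. 601 luma of a frame given as a list of (r, g, b) pixels."""
    total = sum(0.299 * r + 0.587 * g + 0.114 * b for r, g, b in frame)
    return total / len(frame)

def find_strobe_spans(frames, fps, delta=40.0, max_flashes_per_sec=3):
    """Return (start, end) frame indices of spans flashing faster than
    max_flashes_per_sec within any one-second window."""
    lum = [mean_luminance(f) for f in frames]
    # Treat any large brightness swing between adjacent frames as a "flash".
    flashes = [i for i in range(1, len(lum)) if abs(lum[i] - lum[i - 1]) > delta]
    spans, window, i = [], int(fps), 0
    while i < len(flashes):
        j = i
        # Grow the group while flashes stay inside one second of the first.
        while j + 1 < len(flashes) and flashes[j + 1] - flashes[i] <= window:
            j += 1
        if j - i + 1 > max_flashes_per_sec:
            spans.append((flashes[i], flashes[j]))
        i = j + 1
    return spans
```

A real implementation would work on decoded video planes and weight red flashes more heavily (they're more dangerous), but the shape of the algorithm - look ahead, detect, then skip or attenuate - is the whole proposal.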

7/

The committee rejected this proposal, though.

But that wasn't all they did. They also rejected a proposal to extract a promise from the companies involved in EME's creation to refrain from threatening toolsmiths who added this feature on their own, either to help themselves or on behalf of other users.

8/

The reason such a promise was necessary is that DRM enjoys special legal protection: distributing a tool that bypasses DRM - even to prevent grand mal seizures - can be prosecuted as a felony under Sec 1201 of the DMCA, with 5 years in prison and a $500k fine for a first offense.

9/

The companies making W3C DRM said they didn't need to promise not to destroy the lives of toolsmiths who added accessibility features to their product because they would add every necessary accessibility feature themselves.

Except they wouldn't. They blocked an anti-seizure tool, and Dan Kaminsky's proposal to shift color palettes to compensate for color-blindness, and a proposal for captioning tools to bypass DRM to ingest videos and run multiple rounds of speech-to-text analysis.

10/

Even if they *had* accepted all of this, it wouldn't have been enough. No one can anticipate all the ways that people need to adapt their tools. "Nothing about us without us" can't just mean, "Some people with disabilities helped design this."

It also has to mean, "I, a person using this tool, get a veto over how it works. When my interests conflict with the manufacturer's choices, I win. It's my tool. Nothing about me without me." That's the soul of technological self-determination.

11/

Not only is Zuck a bad custodian of 3b lives, but every company is a bad custodian - or at least, an imperfect one - when it comes to its users' lives. Companies often do good things for their users, but when user interests conflict with shareholder priorities, users lose.

12/

Companies *should* try to do good things, and we *should* have minimum standards, and *also* users should have ways to adapt those tools to suit their own needs - first, because companies can't anticipate all those needs, and second, because companies have conflicts of interest.

13/

In response to the whistleblower's testimony, Slate's Future Tense has an article by Louis Barclay describing his misadventures with a Facebook add-on he created called "Unfollow Everything" - a tool that helped people use Facebook less and enjoy it more.

slate.com/technology/2021/10/f

14/

Unfollow Everything unfollowed all the friends, groups and pages you followed on Facebook. This eliminated Facebook's News Feed entirely, and then users could either manually check in on friends (unfollowing isn't the same as unfriending) or selectively follow them again.

Users who tried Unfollow Everything really liked it.

15/

They found themselves spending less time on Facebook, but enjoying the time they spent there a lot more. Both Barclay and his users felt "addicted" to Facebook and this helped them "control the addiction."

I'm not a fan of analogizing the habitual use of digital tools to "addiction," but that's not the point here - before Unfollow Everything, these users didn't like Facebook but kept returning to it, and after, they kept using it, and liked it more.

16/

It would be interesting to understand this phenomenon. After all, Facebook has spent a lot of money on internal experiments where social scientists and designers collaborated to increase the time users spend on Facebook (AKA "engagement").

In his spittle-flecked rebuttal to the WSJ's Facebook Files, Mark Zuckerberg insisted that the point of this research was to increase user satisfaction, and the extra ad revenue from additional pageviews was just a coincidence.

mashable.com/article/mark-zuck

17/

I don't think this is true. But it's a claim we can empirically investigate, and Unfollow Everything would be a great tool for such an investigation. That's why a group of academics at Switzerland's University of Neuchâtel sought a collaboration with Barclay for a study.

18/

That study never happened and never will, because Facebook had Unfollow Everything deleted from the Chrome store, kicked Barclay off the service and threatened to rain legal hell upon Barclay's head if he ever made *any* Facebook tools, *ever* again.

Like the companies that standardized DRM at the W3C, Facebook says its users' wellbeing and satisfaction are primary factors in its product design and demands a veto over any modifications so it can deliver the best product to those users.

19/

Like those companies, Facebook opposes - and wields its legal might against - toolsmiths who adapt its products and services to reflect the needs of users when those needs conflict with its shareholders' interests.

Time and again, Facebook demonstrates that it cannot be trusted to wield a veto over how we use Facebook. Think of how the company is attacking Ad Observatory, an academic project that recruits volunteer Facebook users to track paid political disinformation on the platform.

20/

Facebook falsely claimed that Ad Observatory compromised users' privacy, and directed researchers to its own tools for analyzing content on the platform - tools that are riddled with errors and omissions and flat-out misleading distortions:

pluralistic.net/2021/08/06/get


@pluralistic hypothetical regulators: "but we can't work with innumerable small platforms, only a pope-emperor"…
