The biggest fallacy in online privacy is the belief that there is a difference between "state surveillance" and "commercial surveillance." Bizarrely, it's a fallacy that is widely held by both government snoops and Big Tech snoops.

1/

Many's the time I've spoken to a DC audience about privacy, only to have an audience member say, "I'm OK with Uncle Sam spying on me - after all, I've already given up every sensitive scrap of information about my personal life to the Office of Personnel Management when I applied for security clearance. But I don't want my money going to *Google* - those bastards would sell their mothers out for a nickel."

2/

Meanwhile, in Silicon Valley, I hear, "I don't care if Google has my data - they just want to show me better ads. But the US government? Hell no! Those govies and their profiteering private contractor pals are all too stupid to get jobs at *real* tech companies and who knows what they're going to do with my data?"

3/

Both groups are gripped by the delusion that state surveillance can be disentangled from commercial surveillance. In a just world, companies would be barred from mass-scale surveillance for their private gain. After all, this is a practice that imposes vast risks on the public - humiliation, identity theft, extortion, and more - and is only profitable because the companies that create this risk can privatize the benefits of spying and socialize the costs of leaks:

locusmag.com/2018/07/cory-doct

4/

How is it that the government hasn't stepped in to force companies to end the practice of spying? Worse, how is it that the government *abets* spying - for example, by reinforcing the risible fiction that clicking "I agree" on a meandering, multi-thousand-word garbage legalese novella constitutes "consent"?

pluralistic.net/2022/08/10/be-

5/

It's because the project of mass state surveillance *depends* on mass commercial surveillance. Remember the Snowden revelations? Remember how they started with Prism, a program whereby Big Tech had secretly colluded with the NSA to conduct illegal, mass surveillance?

theguardian.com/world/2013/jun

6/

The companies denied it at first, but they changed their tunes - and squealed like stuck pigs - when another NSA program called "Upstream" was revealed. "Upstream" was the NSA's practice of wiretapping the fiber lines between Big Tech's data-centers.

washingtonpost.com/world/natio

7/

Prism turned out to be a way to trick the tech giants into thinking that they were in control of the NSA's harvesting of their users' data. But what was really going on was that the NSA was capturing *everything*, picking out the stuff they wanted, and requesting it via Prism (this is called "parallel construction" and it's used when an agency does not want to reveal its methods to its partners or adversaries).

8/

The NSA *depended* on Big Tech collecting and retaining everything, and it *depended* on the companies recklessly transmitting data between their data-centers without encrypting it. The NSA is also the agency charged with defending Americans from foreign surveillance, the risk of which *also* increased thanks to Big Tech's overcollection and sloppy storage.

9/

If the NSA took its defensive mission seriously, it would have been screaming its head off, demanding an end to commercial surveillance and hardening of internal communications. Instead, it exploited both.

The public-private surveillance partnership is very old, and it's key to monopolists' strategy.

10/

It took 69 years to break up AT&T, because every time trustbusters came close, America's cops and spies and military would spring into action, insisting that the Bell System was America's "national champion," needed to defend the country from foreign enemies. The Pentagon rescued Ma Bell from breakup in the 50s by claiming that the Korean War couldn't be won without AT&T's help:

onezero.medium.com/jam-to-day-

11/

But it's not just powerful federal agencies that rely on commercial surveillance - and aggressively cape for the tech surveillance industry. Local cops rely on Amazon's Ring doorbells to conduct off-the-books, mass-scale street surveillance. Despite Amazon's repeated false claims, police can do this without Ring owners' knowledge or consent:

politico.com/news/2022/07/13/a

12/

Hard to overstate how sleazy this is, even leaving aside the creepy public surveillance part. Amazon sells you networked surveillance cameras, encourages you to put them inside and outside of your house, promises that you will have control over the footage they capture, then secretly hands it out to cops. In a just world, Amazon would face stiff penalties for lying to its customers about a matter this sensitive.

13/

In our world, nothing happens - because local cops across America go to bat for Amazon every time the issue comes up.

Google deceptively captures your location data. It is effectively impossible to opt out of Google's location collection: you have to uncheck a dozen or so boxes in different places. Even the senior Googlers who ran Google Maps couldn't figure it out - they thought they'd opted out, but hadn't.

pluralistic.net/2021/06/01/you

14/

In a just world, Google would face stiff penalties for deceiving billions of people who thought they had explicitly told the company *not* to track their location - but in our world, Google is left alone to do so. I mean, of course - why not? Without Google's mass harvesting and indefinite storage of surveillance data, cops wouldn't be able to use "reverse warrants" to go after Black Lives Matter protesters:

theguardian.com/us-news/2021/s

15/

(If you think that reverse warrants are good because they were used to prosecute the 1/6 insurrectionists, please consider that the vast majority of reverse warrants are used against progressive protesters).

16/

Facebook deceptively captures your personal communications. You may think your private messages are private, but actually Facebook collects this data and retains it forever. In a just world, Facebook would be punished for this. In our world, Facebook turns over teens' private chats about procuring a medication abortion to cops seeking to charge an underage girl as an adult with multiple felonies:

vice.com/en/article/n7zevd/thi

17/

Republicans talk a big game about tech companies being too powerful - but they mean that tech companies shouldn't be able to do content moderation.

eff.org/deeplinks/2021/07/righ

They *don't* mean that tech companies should stop collaborating with latter-day Witchfinders General in their hunt for formerly pregnant children to imprison on behalf of the forced birth movement.

18/

A federal privacy bill has been working its way through Congress all year, but it keeps getting watered down to the point of uselessness - or worse, because the bill will preempt *good* state privacy laws and replace them with a weak federal rule. But that might be moot, because I hear there's no chance of the bill passing.

19/

This isn't regulatory capture - it's *legislative* capture. The House and the Senate are thoroughly dependent on the big tech companies, as well as other surveillance giants like the credit reporting bureaux and the military contractors who build and maintain government surveillance systems.

doctorow.medium.com/regulatory

20/

All that might piss you off. It should. But here's the good news. The *great* news. When it comes to digital surveillance, America no longer has a regulatory capture problem. That's because personnel are policy, and the brilliant, fearless Lina Khan is running the FTC.

pluralistic.net/2022/05/09/res

21/

Khan rose to prominence just five years ago, when, as a law student, she published the earth-shaking law review article "Amazon's Antitrust Paradox," which demolished 40 years of right-wing orthodoxy that insisted that monopolies were efficient and beneficial and should be *encouraged* by governments:

yalelawjournal.org/note/amazon

22/

Today, she is chair of the FTC, and she's taking no prisoners. She's instituting stringent new merger guidelines, aggressively pursuing monopolists, and proposing sweeping new regulation that would allow the FTC to step in on privacy where Congress has failed us.

The FTC's just given notice of a future rulemaking on digital privacy, called the "Commercial Surveillance and Data Security Rulemaking":

ftc.gov/legal-library/browse/f

23/

They want to hear from you on a series of hard-hitting questions, including:

* Are there some harms that consumers may not easily discern or identify? Which are they?

* How should the Commission identify and evaluate these commercial surveillance harms or potential harms? On which evidence or measures should the Commission rely to substantiate its claims of harm or risk of harm?

* Which areas or kinds of harm, if any, has the Commission failed to address through its enforcement actions?

24/

* Has the Commission adequately addressed indirect pecuniary harms, including potential physical harms, psychological harms, reputational injuries, and unwanted intrusions?

* Which kinds of data should be subject to a potential trade regulation rule?

* Which, if any, commercial incentives and business models lead to lax data security measures or harmful commercial surveillance practices? Are some commercial incentives and business models more likely to protect consumers than others?

25/

* How, if at all, should potential new trade regulation rules address harms to different consumers across different sectors? Which commercial surveillance practices, if any, are unlawful such that new trade regulation rules should set out clear limitations or prohibitions on them? To what extent, if any, is a comprehensive regulatory approach better than a sectoral one for any given harm?

26/
