Yesterday, Canadian Innovation Minister Navdeep Bains introduced the Digital Charter Implementation Act, which proposes a national privacy standard for Canada akin to Europe's GDPR.
The law is complex and will undergo many changes, but its two most salient features are:
I. The right to refuse to have your data collected and used; and
II. The right to have your data deleted if you change your mind.
Both backed by stiff penalties for companies that don't comply.
The latter is self-explanatory, but the former is really interesting. Since the early days of packaged software, the tech industry has operated on the basis of a fictional consent: "By being stupid enough to be my customer (opening this box, clicking this link, etc), you agree that I'm allowed to come over to your house, punch your grandmother, make long distance calls, wear your underwear, and eat all the food in your fridge."
Once a company decides it can declare that its customers have given consent to non-negotiated, unconscionable contracts, the product-design equilibrium shifts dramatically.
Features that benefit shareholders (but harm customers) get greenlit, with a note to get legal to add more text to the sprawling novella of garbage legalese that no one reads before the new version is released.
Think of the original iPod, a stevejobsian curve-cornered slab of plastic and chrome, stripped of ornamentation in favor of one button, two ports, and a wheel - whose packaging included the world's shittiest zine: its unreadable terms and conditions.
What the GDPR did, and what the new Canadian rule proposes to do, is replace the fiction of consent with true consent. If you want permission to do one million things with my data, then you have to ask me one million separate, plain-language questions.
There can't be an "Accept All" button. The default has to be "no" and this can only be changed to "Yes" if I manually toggle it. You can't deny me access if I don't change to a "Yes," so your product needs a million contingencies for how it interacts with me.
If you think about this for half a second, you'll realize that its purpose isn't to allow companies to continue producing the kinds of products you can only field if you can maintain the sham of consent. It is to prohibit those products by raising the bar on consent.
It's the state saying, "You tell us that all the shady stuff is undertaken with consent. OK, let's see if anyone actually consents to this. If not, you gotta cut it out."
The idea is to shift the product-design equilibrium: "If we do this terrible thing, we're going to have to add 15 more consent questions to the onboarding process. We predict that 25% of potential users will bail if we do this.
"What's more, we predict that 85% of the customers who do finish onboarding will say no to five or more of these questions, which means an extra year of development time to ensure compliance with their preferences."
That's the real purpose of these explicit consent rules: to annihilate the fiction of consent and expose the underlying reality - no one has ever agreed to these terms and no rational person ever would.