For a Markup feature, Leon Yin and Aaron Sankin compiled a list of "social and racial justice terms" with help from Color of Change, Media Justice, Mijente and Muslim Advocates, then checked if YouTube would let them target those terms for ads.
Even worse: when the reporters asked Youtube for comment on these blocks, the company stonewalled them, then added even more terms to the blocklist, including Black excellence, LGBTQ, antiracism, civil rights, Black is beautiful, abolish ICE, believe Black women, queer, Black trans lives matter, Muslim fashion and many, many more. The full dataset is on Github:
As if that wasn't enough, there's the list of terms that Youtube *does* allow ad-targeting on, including white power, white lives matter, etc.
The contradictions go further: you can advertise to "Christian parenting" and "Jewish parenting" but not "Muslim parenting." Racist terms like "white sharia" and "civilizational jihad" are in, too.
After Youtube was contacted for comment, it started blocking "Christian" and "Jewish" as prefixes on the same keywords that were blocked when associated with "Muslim."
Youtube's policies offer two explanations for this. The first ("[ads should] reflect a user’s interests rather than more personal interpretations of their fundamental identity") is thoroughly unconvincing; it's literally nonsense.
The second, though ("[targeting categories could be] used to stigmatize an individual") is both hugely revealing and hugely incomplete, and therein lies the tale.
Youtube is caught in an unresolvable contradiction. On the one hand, you have the company's statement that "At YouTube, we believe Black lives matter and we all need to do more to dismantle systemic racism."
On the other hand, you have the platform's utility to reactionary, racist, genocidal and eugenic communities that stand in total opposition to Youtube's claimed support for racial justice.
Some of that is unwitting - the company can't possibly know what's in all the videos published on its platform - and some is deliberate: Youtube doesn't want to face the reputational, political and financial consequences of cutting off superstars like Prageru.
They know that if they allow advertisers to target "Black Lives Matter," some of those ads will show up alongside Prageru's racist video, "'Black Lives Matter' Is Not Helping Blacks."
That's the heart of the contradiction. Sometimes, Youtube wants us to think of its self-serve, algorithmic ad/publishing system as untouched by human hands, an interplay of pure math, initiated and steered by third parties whose choices are not Youtube's responsibility.
Other times, Youtube wants us to think of it as a corporate person, with identities and values, priorities and ethics. The selective demand that Youtube be considered a moral actor - but only for the outcomes that reflect well on the company - leads to this contradiction.
To be clear, I don't think there's any way Youtube *could* operate a self-serve ad platform or a self-serve video program that could proactively identify racist outcomes.
It's not enough to vet every ad to make sure it's not racist - they'd also have to vet every possible ad *placement* and make sure that it doesn't violate its ethics; that is, they'd have to use reliable human judgment to evaluate every single combination of ads and videos.
There isn't enough human judgment - let alone sound human judgment - in existence to cover that combinatorial explosion.
What's more, Youtube is so central to our discourse that its errors would be - and are - hugely consequential.
That's why all this matters: Youtube's editorial choice has the foreseeable (and, evidently, acceptable to Youtube) outcome of producing an economic boycott of the creators it says it wants to uplift and support.
Youtube's monopolistic dominance has the effect of making its contradictions matters of civilizational importance.
It wants to be both: a neutral, self-serve platform untouched by human hands, and a corporate person with values and ethics.

It can't have both of those. It just can't.
And to be perfectly honest, I don't know what I want it to do here. I mean, it could stop spinning idiotic tales about "[ads that] reflect a user’s interests rather than more personal interpretations of their fundamental identity," but that wouldn't fix things.
Likewise, it could ban the words "white" and "Christian" in association with all the same keywords it blocks in connection with "Black" and "Muslim," producing a kind of evenhanded idiocy, which is preferable to a biased idiocy.
And it could be more transparent in its "brand safety" tactics, and have some process for appealing bad choices, as Nandini Jammi - who cofounded Check My Ads - sensibly calls for.
Youtube should do all of this, but it would still leave the contradiction - and its consequences - intact.
Thinking about this stuff gives me a headache. On the other hand, it reminded me to order a copy of SILICON VALUES, the new book from my EFF colleague Jillian C. York, who is far and away the content moderation expert I trust most in this world.