
YouTube CEO calls for global coalitions to address content that’s “legal but could be harmful”

A push for unelected corporations to set the global content moderation standards for legal content.


During an appearance at the World Economic Forum's Global Technology Governance Summit 2021, an event where more than 40 governments and 150 companies met to ensure “the responsible design and deployment of emerging technologies,” YouTube CEO Susan Wojcicki expressed her support for tech platforms moderating content that's “technically legal but could be harmful.” She also praised global coalitions that help Big Tech coordinate and automate their censorship efforts.

Wojcicki said that even when tech companies comply with the law, there are still “issues around speech,” and suggested that these issues should be addressed by private corporations.

“I see a lot of issues around speech and what should or should not be allowed on platforms for example,” Wojcicki said. “And that’s a really tough area. Now, certainly countries pass certain laws and we comply with all the laws that the different countries pass but a lot of times, there’s content that’s legal but could be seen as harmful. And it’s hard for governments to necessarily find the right way to regulate it.”

She then proposed YouTube’s model of privately policing what the platform deems to be COVID-19 “misinformation” as an effective way to handle this content that’s “legal but could be harmful.”

“With COVID-19, with a number of different types of misinformation, it would be hard for governments all around the world to all pass different regulations about that and have compliance,” Wojcicki said. “So, there’s this category of content that I would say is content that is technically legal but could be harmful and that’s where we’ve put a lot of time to try to make sure we’ve put the right policies in place.”

Wojcicki continued by noting that she finds it “challenging” when governments pass different content moderation laws.

“It is challenging when governments all pass different rules and we have a patchwork of different products,” Wojcicki said. “I think it would be strange if YouTube operated differently in every country depending on the different policies there.”

Wojcicki then held up coalitions of global organizations that set worldwide content moderation standards as something that's “really effective.”

“GIFCT, for example, which is an organization that works to fight violent extremism, that’s funded by governments, it has a lot of experts, that’s an example of where you really can get a good coalition to be able to come up with how do we handle this tough topic but do so globally and do it in a consistent way,” Wojcicki said.

She added: “I’m very supportive of coming up with organizations that can be global, that can span industry as well as governments, have experts and come up with the ways for us to better manage some of these tough questions and so I’m looking forward to more collaboration in the future and hopefully setting up more organizations like these that can help us address some of the toughest issues that we face.”


For context on the implications of private companies censoring content they deem “technically legal but could be harmful”: YouTube has already deleted more than 800,000 videos for violating its far-reaching COVID-19 misinformation rules.

Global coalitions such as GIFCT (the Global Internet Forum to Counter Terrorism) amplify this mass unaccountable censorship of legal content by allowing multiple tech platforms to coordinate and automate their censorship efforts.

Daphne Keller, director of the Program on Platform Regulation at Stanford’s Cyber Policy Center, has noted that hashes of content get added to GIFCT’s database because “they violate a platform’s TOS” and that the underlying images or videos “don’t necessarily violate any law, or may violate some countries’ laws but not others.”

Once these hashes have been added to GIFCT’s database, member companies can automatically detect and block the underlying content.
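To make that mechanism concrete, below is a minimal, hypothetical sketch (in Python) of how a shared hash database enables this kind of automated blocking. The database contents, function names, and the use of an exact SHA-256 digest are all illustrative assumptions; GIFCT's actual system is not public code, and real deployments rely on perceptual hashing so that near-duplicate or re-encoded copies also match.

```python
import hashlib

# Hypothetical, simplified sketch of hash-database matching -- not GIFCT's
# actual code or API. Real systems typically use perceptual hashes
# (e.g. PDQ for images) so that altered copies still match; the plain
# SHA-256 digest used here only catches byte-identical files.

# Shared blocklist: hex digests contributed by member platforms (example entry).
SHARED_HASH_DB = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_digest(path: str) -> str:
    """Return the SHA-256 hex digest of a file's bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def should_block(path: str) -> bool:
    """Flag an upload whose digest appears in the shared database."""
    return file_digest(path) in SHARED_HASH_DB
```

The key design point is that member platforms never exchange the underlying videos or images, only their fingerprints, which is what allows a removal decision made on one platform to propagate automatically to the others.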

Not only does GIFCT facilitate this coordinated censorship, but according to more than 15 human rights and digital rights organizations, GIFCT members' efforts to block or limit content they deem “terrorist and violent extremist” have resulted in “the removal of content opposing terrorism, as well as satire, media reports, and other content that constitutes legitimate free speech under international law.”

If you're tired of censorship and dystopian threats against civil liberties, subscribe to Reclaim The Net.

