WEF Digital Safety Partner Calls For Online Crackdown on “Harmful” Content

France-founded Teleperformance, a multinational digital services company and one of the World Economic Forum’s partners in the Global Coalition for Digital Safety (set up to tackle “harmful content and conduct online”), has come out with new recommendations for platforms seeking to censor a certain group of online creators.

These would be the creators flagged as causing harm, and a write-up on the WEF website by two Teleperformance execs urges platforms to change their censorship policies frequently, as well as to track and seek to demonetize these, so to speak, “demonized” creators.

Teleperformance, which is in the business of telemarketing, customer relationship management, content moderation, and communication, would also like to see more collaboration at the “multi-stakeholder” level; needless to say, all to combat whatever these stakeholders decide constitutes “harm.”

The authors describe this as a “multi-faceted” approach to what they call “real-world harm,” and are looking to rally the industry to “collaborate.”

Teleperformance has no doubt that “harm” coming from online content is growing at “an alarming rate,” and “AI”-powered capabilities are singled out as particularly dangerous.

The proposed solution comes in three parts, each stressing the collaborative nature of the undertaking.

First, this WEF collaborator addresses content policies and how effectively they are enforced. Here, while reasonably happy with how platforms have been combining machine learning and human “moderation” to produce the onslaught of censorship we’ve seen over the past few years, the idea is to focus on “pre-crime”: be more proactive, which in this case means identifying and reducing “harmful content” before it is reported.

“Platforms must be nimble and modify/augment their policies frequently given the velocity of changes we have seen in the field in such a short timespan,” Teleperformance advises.

Next, there’s what the corporation calls “signals and enablers beyond content.”

This is where we come to the need to demonetize creators marked as producing “harmful content,” in order to discourage them from participating on platforms. And, Teleperformance believes, they should be targeted whether they make money directly or via ads.

“When it comes to payment mechanisms, while the use of credit cards to purchase illegal content have been hampered based on efforts by financial institutions, bad actors have found other payment mechanisms to use, including cryptocurrency,” continues the article.

And it’s clear where those behind it think the “censorship gun” should be pointed next.

Finally, what does multi-stakeholder collaboration mean in this context?

In short, it’s a push to turn these issues into an industry-level solution in search of a threat, and always a handy excuse to take down not only actually harmful content but also content that is simply “(politically) disfavored.”

“Initiatives that aim to tackle these harms at an industry-level will become increasingly important given that bad actors move across platforms to propagate harm,” according to Teleperformance.
