Facebook’s parent company Meta has launched a program to “support” journalism in New Zealand and has made a new commitment to managing “defamation.”
The aim of the program is presented as a way to help media organizations in New Zealand create sustainable business models.
The program includes a Grant Fund, Audience Development Accelerator, the creation of a News Innovation Advisory Group, and the provision of digital training focused on engagement for news organizations.
According to Facebook’s head of public policy for Australia and New Zealand, Mia Garlick, the program will involve 12 media organizations spanning regionally, culturally, and digitally diverse publications. The idea is that these organizations will “come together and try to innovate and learn from experts and really collaborate on new strategies to drive business growth both on and off Facebook.”
Garlick said that the International Center for Journalists would help establish the advisory group.
“What we’re really trying to do now is trying to recognize that the goal in New Zealand is to support a sustainable and diverse and robust ecosystem, and so we want to make sure that we’re offering different solutions for publishers, no matter where they are in their digital transformation journey,” Garlick said.
Garlick denied that the New Zealand journalism program is a way to keep media organizations from demanding deals such as those afforded to Australian media houses, and she also made a commitment to helping news outlets censor “potentially litigious” comments.
There were several strategies to prevent “false comments” being posted, Garlick said, according to RNZ.
The other aspect mentioned in the announcement was that Facebook was committed to helping all page administrators “better moderate” their comments.
“So what we did in March this year is that we released a tool that allows page owners, including publishers, to be able to [turn] off comments on a per-post basis,” Garlick said.
“Facebook removed harmful health information relating to Covid-19 and vaccine issues, plus page administrators had the ability to set their own custom filters,” the report states, adding that “Facebook had been working with health experts around the world and in New Zealand to find out what were the claims that were harmful and should be removed in relation to Covid-19.”
“We invest significant resources in making sure that we’re removing those and then we invest in fact checkers and we have fact checkers working across New Zealand as well, and when they rank content as false, including Covid and vaccine misinformation, then that will reduce in its distribution and people are notified that they’ve shared misinformation,” Garlick said.