Facebook has made another decision that favors news outlets it deems “credible” – such as CNN, NPR, and the New York Times – over smaller, independent media.
The Big Tech giant has introduced a “news ecosystem quality” metric, used internally to judge news outlets and decide whether they should be favored over the rest. The change is a response to the surge of what Facebook considers “misinformation” on the platform following the US presidential election, as reported by the Times.
Simply put, this metric, abbreviated as “NEQ,” favors established, dominant outlets such as CNN over less prominent outlets that nonetheless publish information their readers want to stay updated on. After the metric came into play, outlets with a high NEQ score (establishment media) benefited significantly.
According to the Times, three people close to Facebook CEO Mark Zuckerberg revealed that he personally signed off on the change. What's more, the move has apparently been welcomed by some of the company's employees, with some asking whether the changes can be made permanent. Employees have reportedly been referring to the update as a “nicer new feed.”
The Times has also reported that Facebook conducted research in early November to determine whether the content being shared on its platform is “good for the world” or “bad for the world.”
Users were asked to fill out surveys sorting a selection of posts into the two categories. Facebook reportedly found that several popular posts across its platform were rated “bad for the world.” That finding may have fueled Facebook's efforts to develop an algorithm that shows posts categorized as “bad for the world” less frequently in user feeds.
It is, however, worth noting that the Times hasn't made clear whether Facebook is still following the NEQ approach. The company has also revealed that the change led to a decrease in user sessions. “The results were good except that it led to a decrease in sessions, which motivated us to try a different approach,” read a summary of the report viewed by the Times.
Facebook subsequently tweaked its algorithms to demote “bad for the world” content less aggressively.