The Senate Homeland Security Committee questioned executives from social media companies about allowing “disinformation” to go viral.
Former executives from these companies appeared during the hearings and accused their former employers of allowing misinformation to spread because it has more user engagement.
Committee chair Senator Gary Peters (a Democrat from Michigan) told Twitter, Meta, YouTube, and TikTok that by pushing “the most engaging posts to more users, they end up amplifying extremist, dangerous, and radicalizing content. This includes QAnon, Stop the Steal, and other conspiracy theories, as well as white supremacist and anti-Semitic rhetoric.”
Last September, a former Facebook employee turned “whistleblower” claimed that the company allows “disinformation” to spread to boost growth and called for more censorship.
During the hearing, Alex Roetter, Twitter’s former head of engineering, said that social media companies do not want to rein in disinformation because it is profitable.
“Regulators must understand these companies’ incentives, culture, and internal processes to fully appreciate how resistant they will be to changing the status quo that has been so lucrative for them,” he said.
Roetter went on to say that Twitter uses an experimental system to test how to get the most engagement from users.
“This system logs a slew of data for every live experiment,” he said. “Teams use this data to show per-experiment effects on various user and revenue metrics. Noticeably absent were any values tracking impacts on trust and safety metrics.”
Brian Boland, Facebook’s former vice president for product engineering, marketing, strategic operations, and analytics, testified that his former employer prioritized user engagement. He said that Facebook acquired CrowdTangle, a company that provided “industry-leading transparency” into the platform’s newsfeed content. CrowdTangle’s data showed that Facebook was amplifying political and racial divisions in 2020. According to Boland, Meta “attempted to delegitimize the CrowdTangle-generated data.”
“What finally convinced me that it was time to leave was that despite growing evidence that the newsfeed may be causing harm globally, the focus on and investments in safety remained small and siloed,” Boland said. “Rather than address the serious issues raised by its own research, Meta leadership chooses growing the company over keeping more people safe.”
Boland also noted that Facebook had disbanded its Responsible Innovation team the previous week. He added that social media companies should be regulated because their algorithms will only get better at targeting vulnerable users.