As Hao writes, a New York University study of Facebook pages run by partisan publishers found that “those who regularly published political disinformation were most engaged in the run-up to the 2020 US presidential election and the Capitol riots.”
Zuckerberg, after saying that members of Congress had shared “a bunch of inaccurate things” during the hearing about Facebook’s incentives to allow and amplify misinformation and polarizing content, added:
“People don’t want to see misinformation or divisive content on our services. People don’t want to see clickbait and things like that. While it’s true that people are more likely to click on it in the short term, it’s not good for our business, product, or community that it’s there.”
His response is a common Facebook talking point, and it ignores the fact that the company has not undertaken a centralized, coordinated effort to examine and reduce how its recommendation systems amplify misinformation. To learn more, read Hao’s report.
Zuckerberg’s comments came during the House Energy and Commerce Committee hearing on disinformation, in which members of Congress questioned Zuckerberg, Google CEO Sundar Pichai, and Twitter CEO Jack Dorsey on the spread of misinformation about the November US election, the January 6 attack on the Capitol Building, and covid vaccines, among other topics.
As has become common in these hearings, Republican representatives also asked the CEOs about perceived anti-conservative bias on their platforms, a long-standing right-wing claim that the data does not support.