The WSJ’s “Facebook Files” is the biggest scoop in the company’s history. Internal documents prove:
- Facebook knew its algorithm incentivized outrage,
- Instagram knew it hurt teen girls,
- Facebook has been shielding VIPs from moderation.
Facebook changed its algorithm in 2018 to promote friends & family content, ostensibly to “improve well-being”. In actuality, it was an attempt to stop a multi-year decline in Likes and shares.
Facebook’s algorithm change incentivized hateful content, so political parties & news outlets made their posts angrier, driving polarization. Some shifted to making 80% of their posts negative, Fb’s research found.
But execs refused to change back because it would hurt usage & revenue.
Instagram told the public and Congress that its impact on teen well-being was “quite small”, or that there was no consensus.
Internally, research showed that “32% of teen girls said that when they felt bad about their bodies, Instagram made them feel worse”.
Instagram’s own research found:
- 13% of British users and 6% of American users traced the desire to kill themselves to Instagram
- 14% of boys in the U.S. said Instagram made them feel worse
- 40% of users said Instagram made them feel unattractive and poor
Facebook’s ‘XCheck’ system protects 5.8 million VIPs from having their policy-violating content removed.
XCheck’s purpose? To prevent “PR fires”. This special treatment caused posts that violated Fb’s rules to be viewed 16.4 billion times in 2020, including posts by Trump.
When soccer star Neymar posted non-consensual intimate imagery (revenge porn) of a woman who accused him of sexual assault, XCheck delayed its removal.
That delay let the video be viewed by 56 million people and reposted 6,000 times, and it led to bullying of the accuser.
I spent 10 years reporting on Facebook at TechCrunch, and these Files finally prove it knew it was harmful, hid that knowledge from the public, and refused safeguards that would hurt its profits.
Safely enabling communication at scale is hard, but society deserves better.