Social media giants made decisions that allowed more harmful content on people's feeds, after internal research into their algorithms showed how outrage fueled engagement, whistleblowers told the BBC.
More than a dozen whistleblowers and insiders have laid bare how the companies took risks with safety on issues including violence, sexual blackmail, and terrorism as they battled for users' attention.
An engineer at Meta, which owns Facebook and Instagram, described how he had been told by senior management to allow more borderline harmful content - which includes misogyny and conspiracy theories - in users' feeds to compete with TikTok.
"They sort of told us that it's because the stock price is down," the engineer said.
A TikTok employee provided the BBC with rare access to the company's internal dashboards of user complaints, which showed that staff had been instructed to prioritize cases involving politicians over reports of harmful posts involving children.
"Decisions were being made to maintain a strong relationship with political figures to avoid threats of regulation or bans, not because of the risks to users," the TikTok staffer said.
According to whistleblower accounts collected by the BBC documentary Inside the Rage Machine, the social media landscape changed drastically after TikTok's rapid rise. This led competitors to prioritize engagement over user safety, allowing harmful content to proliferate.
Matt Motyl, a senior Meta researcher, said that Instagram Reels was launched without sufficient safety measures, resulting in a higher prevalence of harassment, hate speech, and violence in comments compared with other parts of Instagram.
The company poured resources into growing Reels while safety teams struggled to secure additional support, leading to troubling situations in which the safety of children was not a priority.
In response to the whistleblower claims, Meta denied deliberately amplifying harmful content, saying the allegations were inaccurate. TikTok described them as fabricated, emphasizing its investment in technology to minimize users' exposure to harmful content.



















