In 2021, when former Meta employee Frances Haugen blew the whistle on dangers that the company’s platforms posed to kids, Meta realized it needed to change.
“I’m here to tell you today that Meta has changed,” said one of a new set of whistleblowers — former Meta user experience researcher Cayce Savage — before the Senate Judiciary Subcommittee on Privacy, Technology, and the Law, “for the worse.”
Savage and another former Meta researcher, Jason Sattizahn, appeared before the subcommittee on September 9th. Their testimonies built on an account that they and several other former and current employees shared with The Washington Post, which recently detailed allegations that Meta unleashed its legal team on its own researchers to suppress findings that its virtual reality services harmed kids. As Congress struggled to pass tech regulation spurred by Haugen’s revelations, lawmakers contended, Meta has simply learned to hide its problems better.
The former researchers testified that children under 13 are rampant on Meta’s VR social platforms, despite having their access officially restricted. These spaces pose the same dangers as the rest of the internet, including sexual predators, but the immersive nature of VR, the whistleblowers said, could make interactions more potent. “In VR, someone can stand behind your child and whisper in their ear, and your child will feel their presence as though it’s real,” Savage testified. “VR is tracking a user’s real life movements, so assault in VR requires those movements to happen in real life. What happens in virtual reality is reality.”
But Savage and Sattizahn said Meta lawyers discouraged and even threatened researchers against collecting information that would confirm the company had a problem, fearing a paper trail that could create legal liability unless Meta removed a large group of engaged users.
“The research they’re doing is being pruned and manipulated”
In a statement on the Washington Post story, Meta spokesperson Dani Lever said the whistleblowers’ examples were cherry-picked “to fit a predetermined and false narrative” and that the company has “approved nearly 180 Reality Labs-related studies on social issues, including youth safety and well-being.” At the hearing, Sattizahn called this stat a “lie by avoidance,” since “the whole point of this testimony is that the research they’re doing is being pruned and manipulated.”
Haugen’s momentous 2021 report revealed a trove of internal research documents demonstrating that Meta was aware products like Instagram had harmful effects on some teens, including negative body image issues. Rather than adjust its protocols to better protect kids and teens, testified Savage and Sattizahn, Meta learned to stop creating those documents. “It was the wrong lesson,” Sen. Richard Blumenthal (D-CT) said at a press conference ahead of the hearing.
The company created a regime of “legal surveillance,” Sattizahn said, where lawyers would monitor researchers’ work, “limiting the topics, the questions, the methods that you can use before you even collect data.” He testified that Meta executives threatened his job should he not comply and recalled that the company’s lawyers would ask him to delete or stop collecting data about emotional and psychological harm. “Legal’s repeated, explicit statements to me, was that we did not want this data because it was too risky for us to have, because if there was an outside audit, it would be discovered that Meta knew about these harms,” he testified.
Savage said that the issue of young kids on Meta’s VR platform was so prevalent that every time she personally used the product, the majority of people she interacted with were “audibly under the age of 13.” Both whistleblowers believe Meta CEO Mark Zuckerberg is aware of the issue. “The only way that he would not be aware is if he had never used his own headset,” Savage said.
When Haugen came forward, Congress responded with the largest push for kids online safety legislation in decades. Sen. Marsha Blackburn (R-TN) and Blumenthal introduced the Kids Online Safety Act (KOSA) in early 2022, aiming to make platforms legally responsible for protecting kids. But the efforts have fizzled. KOSA passed the Senate last year with a 91-3 vote, but it never reached the House floor. “I could have given the same talking points” about child safety four years ago, Blumenthal said. “Nothing has changed.”
Parent advocates like Maurine Molak, whose teen son David died by suicide after compulsive use of online platforms and cyberbullying, showed up once again for the hearing. On a KOSA advocacy trip in December, shortly before the bill withered in the House, Molak wasn’t sure if she’d be back if it failed. In DC on Tuesday, she said she decided to keep pushing forward after Senate Commerce Committee Chair Ted Cruz (R-TX) committed to continue championing the bill.
Savage said she “deliberated for a long time about whether to come forward” after seeing how little positive impact Haugen had. “Meta responded to Frances Haugen’s disclosure in 2021 by cracking down on research internally,” Savage reflected. “Researchers across the company were subjected to sudden censorship, and were told it was for our own protection so we wouldn’t be part of any future leaks. Candidly, I am worried that speaking to you today will put my former colleagues, as well as the field of user research within Meta at risk.”