As Meta heads to trial in New Mexico for allegedly failing to protect minors from sexual exploitation, the company is taking aggressive action to exclude certain information from the legal proceedings.
The company asked a judge to exclude certain studies and articles relating to social media and youth mental health; any mention of a recent high-profile case involving teen suicide and social media content; and any references to Meta’s financial resources, employees’ personal activities, and Mark Zuckerberg’s time as a student at Harvard University.
Meta’s motions to exclude information, called motions in limine, are a standard part of pretrial proceedings in which a party can ask a judge to predetermine what evidence or arguments are admissible in court. This is meant to ensure that the jury is presented with relevant facts, rather than irrelevant or prejudicial information, and that the defendant receives a fair trial.
Meta emphasized in pretrial motions that the only question the jury should consider is whether Meta violated the New Mexico Unfair Practices Act in the way it allegedly handled child safety and youth mental health, and that other matters, such as Meta’s alleged election interference, misinformation, or privacy violations, should not be considered.
But some of the requests appear unusually aggressive, two lawyers tell WIRED, including demands that the court exclude any mention of the company’s artificial intelligence chatbots, and the broad reputational protections Meta is seeking. WIRED obtained Meta’s motions in limine through a public records request filed with New Mexico courts.
These motions are part of a landmark case brought by New Mexico Attorney General Raúl Torrez in late 2023. The state alleges that Meta failed to protect minors from online solicitation, human trafficking, and sexual exploitation on its platforms. It alleges that the company actively served pornographic content to minors on its apps and failed to implement certain child safety measures.
The state’s complaint details how investigators could easily set up bogus Facebook and Instagram accounts posing as underage girls, and how these accounts soon began receiving vulgar messages and being shown algorithmically recommended pornographic content. In another test case cited in the complaint, investigators created a bogus account posing as a mother looking to traffic her adolescent daughter. According to the complaint, Meta did not flag suggestive remarks that other users left on her posts or terminate certain accounts that were reported for violating Meta’s policies.
Meta spokesperson Aaron Simpson told WIRED via email that the company has been listening to parents, experts, and law enforcement for more than a decade and has conducted extensive research to “understand the issues that matter” and “use those insights to make significant changes, such as introducing Teen Accounts with built-in security and providing parents with tools to manage their teens’ experiences.”
“While New Mexico presents sensational, irrelevant and distracting arguments, our focus is on demonstrating our long-standing commitment to supporting young people,” Simpson said. “We are proud of the progress we have made and are always working to do better.”
In its pretrial motions in New Mexico, Meta asked the court to exclude any reference to a public advisory on social media and youth mental health published by Vivek Murthy, the former US surgeon general. It also asked the court to exclude Murthy’s article calling for a warning label on social media. Meta argues that the former surgeon general’s statements treat social media companies as a monolith and are “irrelevant, inadmissible hearsay, and unduly prejudicial.”
