A court in New Mexico has ordered Meta to pay $375 million after finding that the company misled users about how safe its platforms are for children.
The jury concluded that Meta failed to protect young users on platforms like Facebook, Instagram, and WhatsApp. It found that children faced exposure to sexual content and potential contact with predators.
New Mexico Attorney General Raúl Torrez called the decision a historic moment. He said it marks the first time a state has successfully held Meta accountable for child safety failures.
Meta, led by CEO Mark Zuckerberg, disagrees with the verdict and plans to appeal. A company spokesperson said Meta continues to invest in safety tools and works actively to remove harmful content. The company insists it remains committed to protecting teenagers online.
However, the jury found that Meta violated New Mexico’s Unfair Practices Act. It ruled that the company gave a misleading impression about the safety of its platforms for younger users.
During the seven-week trial, prosecutors presented internal company documents. Former employees also testified about ongoing issues with child safety. One former engineering leader, Arturo Béjar, revealed that Meta knew about harmful activity on its platforms.
Béjar shared results from experiments showing that Instagram exposed underage users to sexualized content. He also stated that a stranger approached his own daughter with inappropriate messages on the platform.
Prosecutors also highlighted internal research from Meta. The data showed that 16% of Instagram users reported seeing unwanted sexual content within a single week.
Meta defended its efforts by pointing to safety updates. In 2024, Instagram introduced Teen Accounts to give young users more control. The platform also added features that notify parents if children search for self-harm-related content.
Despite these measures, the jury identified tens of thousands of violations. Each violation carried a maximum penalty of $5,000, and at that rate the $375 million total implies at least 75,000 separate violations.
Meta now faces additional legal challenges. A separate case in Los Angeles involves claims that social media platforms like Instagram and YouTube contributed to addiction among young users.
Thousands of similar lawsuits continue across the United States. Many accuse tech companies of designing platforms that expose children to harmful content.
New Mexico filed its lawsuit in 2023. The state argued that Meta’s algorithms pushed explicit and dangerous material toward young users. These systems automatically select and recommend content based on user behavior.
Attorney General Torrez stated that Meta ignored internal warnings and failed to act responsibly. He said the verdict sends a strong message from families, educators, and safety experts that protecting children must come first.
