
New Mexico Attorney General Raúl Torrez Slams Meta Platforms Over Threat to Withdraw Facebook, Instagram and WhatsApp Amid Child Safety Legal Battle

New Mexico Attorney General Raúl Torrez speaks about Meta’s threat to remove Facebook, Instagram and WhatsApp during a child safety legal dispute.

In a sharply worded response that underscores escalating tensions between regulators and Big Tech, New Mexico Attorney General Raúl Torrez has publicly criticized Meta for warning it may withdraw access to its major platforms across the state. The dispute, rooted in child safety concerns and regulatory enforcement, has quickly evolved into one of the most closely watched legal confrontations over social media governance in the United States.

Torrez accused Meta of prioritizing engagement metrics and profit over the safety of children, stating that the company’s stance reveals “how little it cares about child safety.” His comments follow Meta’s indication that it could block residents in New Mexico from accessing Facebook, Instagram, and WhatsApp if ongoing legal demands are enforced without modification.

Legal Confrontation Intensifies Over Child Safety Measures

At the heart of the dispute is a lawsuit filed by the State of New Mexico in 2023, which seeks sweeping reforms in how Meta manages its platforms for younger users. The case is set to proceed to a bench trial beginning May 4, where a judge will determine whether the proposed measures should be implemented.

The state’s legal demands are extensive and reflect growing national concern over the impact of social media on minors. Among the proposed changes are stricter age verification systems, mandatory blocking of users under the age of 13, and requirements to link minor accounts directly to verified guardians. The state has also pushed for limitations on how minors interact with adult users, a move aimed at curbing exposure to harmful or inappropriate content.

In addition, regulators are seeking to impose structural changes to platform design. These include restricting algorithm-driven recommendations for minors, disabling autoplay features, and limiting push notifications during school hours and typical sleep periods. The proposed measures also include caps on daily usage time, signaling a broader attempt to reshape how young users engage with social media.

Meta Pushes Back on “Impractical” Demands

Meta has strongly contested the feasibility of these requirements. In its response, the company described the proposed rules as technically impractical and argued that they fail to account for the complexities of the modern internet ecosystem.

According to Meta, targeting a single company does little to address broader concerns about online safety, as teenagers today engage across hundreds of apps and platforms. The company warned that if a workable compromise is not reached, it may have no option but to withdraw its services from New Mexico entirely, a move that would affect millions of users.

Meta further emphasized that it has already implemented various safety tools designed to protect younger users, including parental controls and content moderation systems. The company maintains that it remains committed to providing age-appropriate experiences while balancing user privacy and freedom of expression.

Attorney General Dismisses Threat as Strategic Pressure

Torrez has dismissed Meta’s warning as a calculated public relations maneuver rather than a genuine technical limitation. He argued that the company has historically adapted its systems when required by law or regulatory pressure, suggesting that compliance is a matter of corporate priorities rather than capability.

“This is not about what Meta can or cannot do,” Torrez said, emphasizing that the company has the technical resources to implement stronger safeguards. “It is about what it chooses to prioritize.”

His remarks reflect a broader frustration among regulators who believe that social media companies have been slow to address longstanding concerns related to child safety, online exploitation, and harmful content exposure.

Undercover Investigation Sparked Legal Action

The origins of the lawsuit trace back to an undercover investigation conducted by the New Mexico Department of Justice. Investigators created a fake profile posing as a 13-year-old user to assess how Meta's platforms handle interactions involving minors.

According to findings presented by the state, the account quickly received unsolicited messages and explicit content from adult users. These interactions, officials argue, demonstrate systemic failures in Meta’s safety mechanisms and highlight gaps in enforcement.

Meta has disputed aspects of the investigation, suggesting that the investigators' account did not make use of available safety tools. However, the state maintains that the findings reveal fundamental weaknesses in how the platforms are designed and monitored.

Landmark Jury Verdict and Financial Penalty

In March 2026, the case reached a significant milestone when a jury in Santa Fe found Meta liable for violating the state's Unfair Practices Act on 75,000 separate counts. The verdict carried a substantial financial penalty of $375 million, marking one of the most notable legal victories against a major technology company on child safety grounds.

Legal experts have described the verdict as potentially precedent-setting, signaling increased willingness among courts to hold social media platforms accountable for user safety, particularly when minors are involved.

The upcoming trial will determine whether additional structural reforms will be mandated. These could include the appointment of a court-monitored oversight mechanism to ensure compliance, as well as enforceable changes to the algorithmic systems that shape user experiences.

Broader Implications for Technology Regulation

The dispute between New Mexico and Meta is not an isolated case but part of a broader national trend. State attorneys general across the United States are intensifying scrutiny of social media platforms, focusing on issues ranging from data privacy to mental health impacts on young users.

This case, however, stands out due to the scale of the proposed reforms and the potential consequences of non-compliance. If Meta follows through on its threat to withdraw services, it would mark an unprecedented move by a major technology company in response to state-level regulation.

Such a scenario could raise complex questions about digital access, interstate commerce, and the balance of power between governments and global technology firms. It may also influence how other states approach similar regulatory efforts, potentially accelerating the push for federal-level legislation.

Free Speech and Platform Responsibility Debate

Meta has also raised concerns about the broader implications of the proposed rules, particularly in relation to free speech and parental authority. The company argues that overly restrictive measures could limit user expression and shift decision making power away from families.

Critics, however, contend that these arguments often serve to deflect from core safety issues. They argue that meaningful protections for minors can coexist with free expression, provided platforms adopt responsible design practices.

The debate highlights a fundamental tension in the digital age: how to ensure user safety without undermining the open nature of the internet. As platforms continue to evolve, finding this balance remains one of the most pressing challenges for policymakers and industry leaders alike.

What Comes Next

As the May 4 trial approaches, all eyes are on the courtroom where the future of Meta’s operations in New Mexico could be decided. The outcome may not only determine the company’s obligations within the state but also set a benchmark for how child safety laws are enforced nationwide.

For now, the standoff between New Mexico and Meta reflects a deeper shift in how governments are confronting the influence of social media. It signals a growing expectation that technology companies must align innovation with accountability, especially when vulnerable users are at stake.

Whether this case leads to meaningful reform or further conflict, it is clear that the era of minimal oversight for social media platforms is steadily coming to an end.

Frequently Asked Questions

Why is New Mexico taking legal action against Meta?

The state filed a lawsuit alleging that Meta failed to adequately protect minors on its platforms, exposing them to harmful content and unsafe interactions.

What changes is New Mexico demanding from Meta?

The state is seeking stricter age verification, blocking users under 13, linking minor accounts to guardians, limiting interactions with adults, and restricting features such as algorithmic recommendations, autoplay, and push notifications.

Why did Meta threaten to remove Facebook, Instagram and WhatsApp from New Mexico?

Meta said the state’s proposed requirements are technically impractical and difficult to implement, warning it may withdraw services if no workable solution is reached.

How did Attorney General Raúl Torrez respond to Meta’s warning?

Torrez dismissed the threat as a public relations tactic and accused Meta of prioritizing profit and engagement over child safety.

What evidence led to the lawsuit against Meta?

An undercover investigation created a fake 13-year-old profile that reportedly received unsolicited messages and explicit content from adults on Meta platforms.

What was the outcome of the March 2026 jury decision?

A Santa Fe jury found Meta liable for violating the Unfair Practices Act on 75,000 counts, resulting in a $375 million penalty.

What will the upcoming trial decide?

The bench trial will determine whether Meta must implement structural changes, including stricter safeguards for minors and possible court-monitored oversight.

What concerns has Meta raised about the proposed rules?

Meta argues the rules could negatively affect user experience, raise free speech concerns, and fail to address the broader ecosystem where teens use multiple apps.

How could this case impact social media regulation in the US?

The case may set a precedent for stricter state-level regulation and increase pressure on tech companies to enhance child safety measures nationwide.

Why is this issue significant for users and parents?

It highlights growing concerns about online safety for minors and could lead to major changes in how social media platforms are designed and regulated.

About the Author

Khogendra Rupini

Khogendra Rupini is a full-stack developer and independent news writer, and the founder and CEO of Levoric Learn. His journalism is grounded in verified information and factual accuracy, with reporting informed by reputable sources and careful analysis rather than live or speculative updates. He covers technology, artificial intelligence, cybersecurity, and global affairs, producing clear, well-contextualized articles that emphasize credibility, precision, and public relevance.

Founder & CEO, Levoric Learn