Meta is rolling out new age-detection technologies and parent notifications across Facebook and Instagram, a move that coincides with the second phase of a high-profile legal battle in New Mexico. As the company faces billions in potential damages and demands for stricter safety protocols, these updates represent both a technical effort to protect minors and a strategic response to intense regulatory scrutiny.

Direct Outreach to Parents

In a significant shift toward transparency, Meta announced that parents in the United States will receive direct notifications about age verification. The outreach extends beyond parents supervising “Teen Accounts” to any user Meta identifies as a parent.

The notification includes resources on how to discuss age accuracy with teenagers, linking to guidance published by the company a year ago. This approach attempts to shift some responsibility to families while highlighting Meta’s commitment to age-appropriate environments.

Expanding AI Detection Globally

Meta is expanding its use of artificial intelligence to detect underage users who may have listed false adult ages. Key developments include:

  • Global Rollout: Age-detection technology is now being deployed in 27 European Union countries and Brazil.
  • U.S. Expansion: For the first time, this technology is being applied to Facebook users in the United States.
  • Enhanced AI Capabilities: The system now analyzes “contextual clues” in user profiles to identify teens, simplifies the reporting process for suspected underage accounts, and strengthens barriers against new account creation by minors.

Since April 2025, Meta has used AI to reassign users identified as teens to its Teen Account product, which promises stricter safety guardrails. However, the efficacy of these protections remains under debate. Independent experts recently reported that Teen Accounts sometimes fail to prevent inappropriate contact with strangers, raising questions about the reliability of automated safety measures.

The Legal Pressure: New Mexico Trial

These technical updates come at a critical juncture in Meta’s legal troubles. In March, a jury found Meta liable for misleading consumers about platform safety and endangering children, ordering the company to pay $375 million in penalties. Meta has vowed to appeal this decision.

The current phase of the trial is a bench trial in which New Mexico’s Department of Justice is seeking:
1. $3.75 billion in additional damages.
2. Injunctive relief requiring specific operational changes, including:
  • Effective age verification systems.
  • Blocking children under 13 from accessing the platforms.
  • Limits on end-to-end encryption for minors.
  • Permanent bans for adults involved in child exploitation.

A Threat to Withdraw

The tension between regulatory demands and technical feasibility has reached a breaking point. Meta has argued that many of the state’s requests are “technologically or practically infeasible,” suggesting they would require building entirely separate apps for New Mexico alone.

Consequently, Meta has threatened to shut down Facebook, Instagram, and WhatsApp in the state if the court orders these changes. Company counsel Alex Parkinson stated that complying with the full scope of the state’s demands would make it “untenable” to continue offering services in New Mexico.

“Many of the requests are technologically or practically infeasible… granting onerous relief could compel Meta to entirely withdraw Facebook, Instagram and WhatsApp from the state.” — Meta’s court filing

State Attorney General Raúl Torrez dismissed these concerns, asserting that the issue is not one of technological capability but of corporate priorities. He accused Meta of putting advertising revenue and profit ahead of children’s safety.

Conclusion

Meta’s new age-verification tools are a dual-purpose strategy: an attempt to improve child safety and a defensive maneuver in a costly legal battle. However, with independent experts questioning the effectiveness of current safeguards and regulators demanding stricter controls, the company faces a complex challenge. The outcome of the New Mexico trial could set a precedent for how social media platforms are regulated globally, forcing a reckoning on whether current technologies can meet the high standards demanded by lawmakers.