Let’s talk numbers. New Mexico’s Attorney General Raúl Torrez extracted $375 million from Meta earlier this year. A substantial sum, no doubt. But here’s the kicker: that verdict, significant as it was, may well be pocket change compared to the potential fallout from the next stage of this legal drama.
This isn’t just about money anymore. Beginning Monday, Meta finds itself back in a Santa Fe courthouse, this time for a three-week public nuisance trial. The state’s argument? That Meta’s business practices, as they currently stand, constitute a public health hazard. And the remedies sought aren’t minor tweaks; they’re seismic shifts.
Think mandatory age verification across its platforms (Facebook, Instagram, WhatsApp), a ban on end-to-end encryption for users under 18, and a strict 90-hour monthly cap on usage for minors. Not to mention limiting addictive engagement features like infinite scroll and autoplay, and a mandate to detect 99 percent of new child sexual abuse material (CSAM). This is Meta’s operating model under the microscope.
“From the outset, our goal was to try and change the way the company’s doing business,” Torrez told The Verge. His point is sharp: $375 million, for a company of Meta’s scale, might simply be factored into the cost of doing business. Real change, he implies, comes from altering the very DNA of their operations.
“I recognize that even at $375 million for a company this big and this profitable, it’s not enough in and of itself to change the way they’re doing business. In fact, there’s probably some folks in that company who think of it as the cost of doing business.”
The implications here are vast. While any court-ordered changes would technically only apply to Meta’s operations within New Mexico, the company’s history suggests a tendency to apply such mandates broadly for operational simplicity. Or, as they’ve threatened, they could simply exit the state entirely. But the more compelling outcome is the precedent. A ruling in New Mexico’s favor could signal to other jurisdictions that courts are indeed willing to step in and fundamentally alter how tech giants operate.
The Public Nuisance Argument: A New Front?
The state’s legal strategy hinges on framing Meta as a public nuisance, a creator of a public health crisis. They’re bringing in a battery of experts and fact witnesses to detail the alleged harms. Meta’s defense will follow, and then Judge Bryan Biedscheid will weigh the feasibility and relevance of the proposed remedies. Unlike the swift jury verdict on financial penalties, this phase promises a more drawn-out deliberation.
A sweeping victory for New Mexico could embolden plaintiffs in thousands of other ongoing lawsuits against tech companies. Conversely, a limited order might offer a brief respite for Meta but could still complicate future settlement negotiations. The legal landscape for Big Tech is on a knife’s edge.
Unintended Consequences and Uncharted Territory
Some of Torrez’s proposed changes are already sparking debate among privacy advocates and tech policy experts. Mandating age verification, for instance, inevitably means collecting more personal data—a move that could ironically make users, including children, less safe by creating a richer target for data breaches. Then there’s the encryption issue. As Don McGowan, formerly on the board of the National Center for Missing and Exploited Children, points out, prohibiting encryption on platforms like Facebook Messenger could simply push users to less regulated channels.
This isn’t lost on Meta. They recently announced the phasing out of end-to-end encrypted messaging on Instagram, citing low user adoption. But Peter Chapman of the Knight-Georgetown Institute highlights that there might be more effective interventions. Evidence presented by the state already suggested that Meta’s own recommendation algorithms were connecting adults with minors, a design flaw that poses direct potential for harm and one Torrez also seeks to address. “There’s an opportunity to intervene at that level and try to prevent more of these harmful interactions from taking place without having to tackle encryption,” Chapman notes.
It’s clear that no single fix will eradicate the complex problem of child and teen safety online. Torrez’s multi-pronged approach acknowledges this. However, the ultimate effectiveness of any mandated changes will hinge on their implementation and oversight. How will Meta’s 99 percent CSAM detection rate be verified? What criteria will define “new” child sexual abuse material? The accuracy and reliability of age verification systems are equally significant open questions.
Meta, for its part, is quick to point out the complexities and potential downsides of these proposed mandates. Their defense will likely lean heavily on the practical and privacy implications of the state’s demands, arguing that some solutions could create more problems than they solve. This trial isn’t just a local New Mexico affair; it’s a vital moment that could set a powerful precedent for how the entire tech industry is regulated going forward.
Frequently Asked Questions
What is the main argument of New Mexico against Meta? New Mexico is arguing that Meta’s business practices constitute a public nuisance and a public health hazard, particularly concerning child safety on its platforms.
Will the changes ordered in New Mexico apply nationwide? While the court order would technically only apply to Meta’s operations in New Mexico, the company might implement them across its platforms for simplicity, or similar demands could arise in other states and jurisdictions.
Could Meta shut down its services in New Mexico? Yes, Meta has threatened to cease operations in New Mexico if the court imposes changes it deems unworkable. However, the broader implications of a court-ordered change are likely a more significant concern for the company.