We’ve all been there. It starts innocently enough—you pick up your phone just to clear a quick notification or check the weather, and then, suddenly, it’s forty-five minutes later. You’re somehow three years deep into the vacation photos of a person you haven’t spoken to since middle school, wondering how you got there. It’s that strange, hollow itch in the back of your brain—the one that keeps your thumb flicking upward even when you’re bored, exhausted, or, if we’re being honest, a little bit disgusted with yourself. Most of us just call that “social media addiction” and move on with our day. But if you listen to the legal team over at Meta, they’ll tell you that you’re basically just imagining things. According to them, it’s not a real problem because a specific medical manual hasn’t given it a shiny enough sticker yet.
As recently reported by Engadget—a site that usually spends its time obsessing over the latest gadgets but is now tracking this legal drama—Meta has spent the last week fighting for its life in two major trials. One is in New Mexico and the other is in Los Angeles. In both courtrooms, they are desperately trying to rebut the claim that their platforms are specifically engineered to hook us like digital slot machines. It’s a fascinating, if deeply frustrating, masterclass in how a tech giant handles a massive PR and legal crisis: by bickering over word definitions while the house is clearly on fire. Their main defense? If the “official” manual of mental disorders hasn’t slapped a label on it yet, then for all intents and purposes, it doesn’t exist.
It’s a bold strategy, as the old meme goes. Let’s see if it actually works out for them in front of a jury.
A Dictionary Isn’t a Get-Out-of-Jail-Free Card: Meta’s Semantic Shield
In the New Mexico trial, Meta’s lawyer, Kevin Huff, stood before a jury and essentially told them that “social media addiction is not a thing.” No, really. His evidence for this claim? The Diagnostic and Statistical Manual of Mental Disorders (the DSM), the gold-standard handbook that mental health professionals in the U.S. use to diagnose patients. Because the American Psychiatric Association (APA) hasn’t officially categorized “Instagram Addiction” alongside things like substance abuse or gambling disorders, Huff argued that the company simply can’t be held responsible for “addicting” its users. He even went so far as to claim the APA “studied this and decided that social media addiction is not a thing.”
But here’s the kicker—and it’s a big one: the APA actually fired back. They released a statement clarifying that just because something isn’t in the DSM-5-TR yet doesn’t mean it isn’t real, harmful, or happening right now. It just means the scientific consensus-building process—which, let’s be honest, is notoriously slow and methodical—is still trying to catch up to the lightning-fast, “move fast and break things” reality of Silicon Valley’s algorithms. Think about it this way: the DSM is a lot like a dictionary. New words are used in the real world for years before they’re officially added to the Oxford English Dictionary. That doesn’t mean the words didn’t have a very real meaning before they got that official stamp of approval from the linguists. The same logic applies to mental health.
“The absence of a DSM classification does not mean that a behavior cannot be addictive, maladaptive or clinically significant. Diagnostic manuals formalize scientific consensus; they do not define the boundaries of legitimate scientific inquiry.”
Dr. Tania Moretta, Clinical Psychophysiology Researcher
Meta is essentially trying to win a heavyweight fight by arguing about the technical definition of the word “punch.” While they’re standing there debating the terminology, people are still getting hit. It feels like a classic corporate stall tactic, and it’s eerily reminiscent of how the tobacco industry spent decades arguing that “habituation” wasn’t the same thing as “addiction.” We all know how that turned out.
Binging ‘Stranger Things’ Isn’t the Same as a Casino in Your Pocket
Then we have Adam Mosseri, the head of Instagram. On the stand this week, he tried to play down the severity of the whole situation by comparing social media use to being “addicted” to a Netflix show. It’s a clever bit of wordplay, isn’t it? It’s designed to make the whole thing sound harmless, cozy, and relatable. Who hasn’t binged a whole season of a show on a rainy Sunday afternoon? We use the word “addictive” to describe a good thriller all the time, but we don’t turn around and sue Netflix for it, do we?
But let’s be real for a second: binging a scripted show with a beginning, a middle, and an end is fundamentally different from an infinite scroll powered by a black-box algorithm that knows your deepest insecurities. When you watch a show, you’re a passive consumer of a story. When you’re on Instagram, you’re trapped in a high-stakes feedback loop of social validation. You aren’t just watching; you’re performing. You’re waiting for that “like” notification that triggers a tiny, fleeting hit of dopamine in your brain. According to a 2023 Pew Research Center study, about 35% of U.S. teens say they are using at least one of the top five social media platforms “almost constantly.” You simply don’t see 35% of teens watching Netflix “almost constantly” while they’re at the dinner table, in the middle of class, or even in the bathroom. It’s just not the same thing.
The “Netflix defense” completely ignores the interactive, social, and competitive nature of these platforms. It’s not just about the content; it’s about the “variable reward” schedule. This is the exact same psychological trick used by slot machines. You keep scrolling because the *next* post—the one you haven’t seen yet—might be the one that makes you laugh or makes you feel seen. That’s not a TV show; that’s a casino that lives in your pocket 24/7.
The Human Toll Behind the Code
While the lawyers are busy arguing over the nuances of the DSM, the actual cases being presented are absolutely heartbreaking. In Los Angeles, a woman is suing because of the severe mental health harms she suffered, which she attributes directly to Meta’s addictive design choices. In New Mexico, the Attorney General is accusing the company of facilitating child exploitation. These aren’t just abstract “screen time” issues or grumpy parents complaining about their kids’ phones; these are real-world, devastating consequences of a platform that, by its own internal admission, has known about its negative impacts for years.
And let’s not forget, we’ve seen the internal documents before, thanks to brave whistleblowers like Frances Haugen and, more recently, Arturo Bejar. Bejar, who testified this week, has been incredibly vocal about how Meta’s leadership repeatedly ignored warnings about how the platform’s design was actively harming young users. When your own internal experts are telling you there’s a problem and you choose to prioritize “engagement” metrics and ad revenue over the safety of children, you’re moving way past a simple “oops” and straight into “negligence” territory. It’s hard to argue you didn’t know when the receipts are right there.
The Science Is Moving Faster Than the Lawyers
The scientific community isn’t exactly waiting for the DSM to catch up before they start sounding the alarm. Researchers like Dr. Tania Moretta point out that there is already a mountain of documented evidence for what they call “social media use disorder.” This isn’t just a “kids these days” complaint from people who miss the era of landlines; it’s a measurable, physiological change in the human brain. We’re talking about actual alterations in the brain’s reward and inhibitory systems. Basically, the part of your brain that screams “one more scroll” gets stronger, while the part of your brain that says “maybe we should actually go to sleep now” gets significantly weaker.
A 2024 report from Common Sense Media found that average screen time for tweens and teens has surged to staggering levels, with many spending upwards of 8 hours a day on entertainment media. When you spend literally half of your waking life in an environment that was specifically designed to keep you there at any cost, “addiction” feels like a pretty accurate word to use, regardless of whether it’s printed in a specific medical manual yet. The impact on sleep, academic performance, and general psychological distress is well-documented and, frankly, it’s obvious to anyone who spends five minutes with a teenager today. You don’t need a medical degree to see the change in behavior.
Zuckerberg’s Date with Reality
These trials are expected to drag on for several more weeks, and the “main event” is coming up soon: Mark Zuckerberg himself is expected to testify. This is going to be a massive, high-stakes moment for the company. Zuckerberg has a long history of being “composed” (which most people read as “robotic” and detached) during these types of hearings. But this time around, he’s facing internal documents that show Meta knew exactly what was happening behind the scenes. It’s much harder to play the “we’re just a neutral platform connecting people” card when your own research says you’re actively harming them. The “innocent bystander” act is wearing thin.
And don’t lose sight of the bigger picture here: this is just the tip of a very large iceberg. Meta is currently facing lawsuits from 41 different state attorneys general, not to mention a high-profile trial with school districts coming up in June. The legal walls are closing in from every direction, and the “it’s not in the DSM” defense feels like a very flimsy, paper-thin shield against a very large and very angry army of regulators and parents.
Is social media addiction officially a mental disorder?
As of right now, “Social Media Use Disorder” is not listed as a formal, standalone diagnosis in the DSM-5-TR. However, the American Psychiatric Association has been very clear that this doesn’t mean the condition doesn’t exist. It simply means that it’s a field currently under intense study and hasn’t reached the final, formal classification stage that the scientific community requires for a manual update. The reality is often ahead of the paperwork.
What is Meta’s main defense in these trials?
Meta’s legal team is essentially leaning on a technicality. They argue that because social media addiction isn’t a recognized clinical diagnosis like drug or alcohol addiction, they can’t be held legally liable for “addicting” their users. They also love to compare social media use to other forms of entertainment, like binging a TV show, to make it seem like a normal, harmless leisure activity rather than a calculated psychological loop.
Who are the whistleblowers involved in these cases?
The trials have featured some pretty damning testimony from former employees like Arturo Bejar and former executive Brian Boland. Both have gone on the record to criticize Meta for its culture of prioritizing growth and “engagement” metrics over the basic safety and well-being of its users—specifically the younger, more vulnerable ones. Their testimony provides a rare look at the internal warnings that were supposedly ignored by top leadership.
The Turning Point: Are We Done With ‘Move Fast and Break Things’?
Ultimately, these trials represent a massive shift in how we view the responsibility of Big Tech. For years, these companies have enjoyed a sort of “Wild West” immunity. They could build whatever they wanted, release it into the world, and then blame the users for how they chose to use it. But that “personal responsibility” argument starts to fall apart when the product itself is specifically engineered to bypass your willpower and exploit your brain’s biology. You can’t blame a person for falling into a hole that you dug and covered with leaves.
If Meta loses these cases—or even if they just suffer enough reputational damage to move the needle—we could be looking at a massive wave of new regulation. We might see mandatory “circuit breakers” for young users to force them off the app, stricter age verification that actually works, or even a total overhaul of how recommendation algorithms are allowed to function in the first place. The era of “move fast and break things” might finally be coming to a close, mostly because it turns out the “things” they were breaking were us.
In the end, it doesn’t really matter what the DSM says today. What matters is what the data shows, what the parents are seeing in their homes every day, and what those internal documents reveal about the company’s true priorities. Meta can try to hide behind a dictionary for a while, but the reality of the situation is staring everyone else right in the face. We’re not just watching a show; we’re part of a giant, global social experiment, and the results are finally starting to come in. And let’s be honest: they don’t look good for the house.
This article is sourced from various news outlets and court reports. The analysis and presentation represent our editorial perspective on the ongoing legal proceedings.