We’ve Seen This Pattern Before. We’re in It Again.
There was a time when cigarettes weren’t treated like a serious problem. They were just part of life.
You saw them everywhere. Ads, TV, even doctors backing them. People knew they probably weren’t great for you, but they weren’t looked at as dangerous in any real sense.
Then the evidence started to build.
And what followed wasn’t some clean, decisive shift. It was slower than that. Messier.
Pushback. Messaging. Delay.
“Use responsibly.”
“Not proven.”
“Personal choice.”
If any of that sounds familiar, it should. Because we’re watching something very similar play out again.
This time it’s not cigarettes. It’s social media — and now, more recently, AI layered into it.
And it’s not just a theory at this point. It’s already starting to move into the same phase we’ve seen before.
The lawsuits are beginning.
There are cases tied to mental health, addiction, even extreme outcomes like suicide and violence. Some of them are already working their way through the courts, trying to answer a question that’s going to matter a lot:
Where does responsibility actually sit?
At the same time, the companies are starting to position themselves.
Meta, for example, has already signaled it will appeal recent rulings and continues to argue that platforms shouldn’t be held responsible for how people use them — framing it as an issue of personal responsibility and, in some cases, free speech.
And on the AI side, you’re seeing a similar move take shape.
In Illinois, there’s a proposed bill — backed by companies like OpenAI — that would limit when AI developers can be held liable, even in cases of large-scale harm.
The idea is that if the system wasn’t used intentionally or recklessly by the company itself, then the responsibility shouldn’t fall on them.
That includes scenarios involving serious outcomes — even things like mass harm or catastrophic misuse.
That doesn’t make it right or wrong, but it does make it recognizable, because we’ve seen this part before too.
Evidence starts to build.
Pressure follows.
Accountability begins to take shape.
And at the same time, the system pushes back — not by stopping, but by redefining where responsibility begins and ends.
If you look back at tobacco, the change didn’t happen all at once. The evidence came in waves, the pressure built gradually, and eventually regulation followed. Warning labels, age limits, restrictions on advertising.
It helped. Especially when it came to kids.
But the system itself didn’t disappear. It adjusted.
Cigarettes turned into vaping. Same underlying pull, just delivered in a way that felt newer, cleaner, easier to justify.
And a lot of the people who got pulled in early? They’re still dealing with it.
Alcohol followed a similar path.
At one point, hard liquor wasn’t even allowed on television. Over time that line moved, and now it’s completely normalized again. Branded, marketed, everywhere.
Not banned. Not removed.
Just reshaped.
And this is where it starts to connect to what’s happening now.
We’re seeing the same kinds of concerns show up again, but this time it’s kids growing up inside social media and AI-driven systems. Not just using them here and there, but spending a real part of their day inside them.
That’s a different level of exposure.
The patterns are hard to miss once you step back.
Heavy early exposure.
Unclear long-term effects.
Systems designed to keep attention.
Pushback when concerns are raised.
And a lot of the response still focused on the edges.
Screen time tools. Parental controls. Usage limits.
All of that helps. It’s not nothing.
But none of it really changes what the system itself is built to do.
And to be clear, the focus on kids makes sense.
That’s the one place where there’s still some level of control. Parents, schools, policy — there are levers you can actually pull. Delaying exposure, setting boundaries, putting guardrails in place… all of that matters, and it’s worth doing.
We’ve seen with things like tobacco that those kinds of efforts can make a real difference over time.
But that’s only part of the picture, because there’s a much larger group already inside the system.
Adults.
People 18 and older who aren’t going to be regulated in the same way, and realistically, aren’t going to be “protected” from it either.
And that’s a potentially harder problem.
It’s not as visible. It’s not as easy to measure. And it doesn’t get talked about as much. But it’s there.
People who’ve been using these platforms for years, long enough to feel the shift, even if they wouldn’t call it a problem. Attention doesn’t hold the same way. Reactions come quicker. It’s easier to get pulled into something you didn’t intend to engage with. Not constantly. Not dramatically. But consistently enough that it adds up.
And unlike kids, this isn’t something that’s going to be solved with restrictions.
It’s closer to what we’ve seen with tobacco and alcohol over the long run.
Once people are already in it, the solution isn’t control. It’s awareness. Education. Understanding what’s actually happening and adjusting behavior over time. That’s a much heavier lift. And it’s still ongoing, even decades later, with things we already understand far better.
I smoked for 12 years, and I still have a few drinks when I feel like it. This isn’t about cutting everything out. Most adults aren’t going to do that anyway. It’s about recognizing the patterns in play while you’re inside them.
If you zoom out, the pattern is pretty consistent.
Something useful shows up first. Adoption spreads quickly. The downsides take time to fully surface. Once they do, there’s pushback, then some level of regulation, usually around the edges, and eventually the system adapts and keeps going.
We’re not at the end of that cycle right now. We’re somewhere in the middle of it, and that’s what makes this harder to deal with.
Tobacco and alcohol are substances.
Social media and AI are environments.
You can regulate what people ingest. It’s a lot more complicated to regulate what people see, interact with, and get drawn into — especially when those systems are constantly adapting in real time.
So even if real progress is made, especially when it comes to kids, the underlying system doesn’t just disappear.
It keeps optimizing. It keeps refining.
And in the meantime, people are still living inside it.
Kids growing up in it. Adults adjusting to it.
Not necessarily addicted. Not completely unaware.
But very likely being shaped by it more than they realize.
That’s the part that tends to get overlooked.
And if this really is the same pattern… then we’re not at the beginning of the story. We’re somewhere in the middle of it — whether we want to admit that or not.