You Didn't Notice Anonymous Internet Browsing Was Just Killed
What a new Supreme Court ruling means for privacy, free speech and the future of the internet
H.B. 1181 101
Texas’ H.B. 1181, an online age verification law, makes complete sense on its surface. It’s designed to prevent children from accessing pornography by requiring proof of age prior to entering an adult site. I mean, who wouldn’t support this?
But what’s unique about this law is that it places the onus on any website that is one-third “adult content” to conduct the ID check. The technical demands, compliance costs, privacy implications and visitor fears profoundly complicate the law.
The 2025 Free Speech Coalition v. Paxton case, which reached the Supreme Court, debated whether requiring a government ID to access material on the internet infringes upon free speech and our access to information. As argued, the protection of children has unintentional chilling effects on the rest of the population simply trying to browse the internet.
Last week, the Supreme Court upheld the law, deciding that using age verification “to prevent children from accessing sexually explicit content” is within a state’s authority.
What this means...
Firstly, what’s noteworthy about this decision is that it departs from prior rulings in similar cases. In 1997's Reno v. ACLU and 2004's Ashcroft v. ACLU, the Court struck down laws that restricted access to sexual content online. They ruled that making "indecent" or "offensive" speech a crime — just because minors might see it — violated the First Amendment.
In such cases, the Supreme Court historically ruled that laws burdening adults’ access to speech required the highest level of First Amendment review (aka “strict scrutiny”). Under strict scrutiny, a law must be “narrowly tailored” and use the least speech-restrictive means available.
For decades, the legal principle was clear: you cannot burden adults' constitutional rights in the name of protecting children.
But with the most recent 2025 Free Speech Coalition v. Paxton ruling, this is no longer the case.
The new ruling holds that no person — adult or child — has a First Amendment right to access speech that is inappropriate for minors without first submitting proof of age.
According to the Court, the modern internet has made access to adult material too easy, and that demands new precedent.
The reaction...
As the Electronic Frontier Foundation (EFF), the leading nonprofit defending civil liberties online, explained,
“[The court’s decision] ignores the many ways in which verifying age online is significantly more burdensome and invasive than doing so in person.”
Checking an ID for beer at a store is ephemeral. Uploading an ID on the internet is not: it can be associated with one’s online activity, creating trackable, hackable, and commodifiable records of private desires whose future uses are uncertain.
Also, buying alcohol is not a right protected by the United States Constitution, while the right to access protected speech as an adult is.
The EFF continues,
“This Supreme Court broke a fundamental agreement between internet users and the state that has existed since its inception: the government will not stand in the way of people accessing First Amendment-protected material.
There is no question that multiple states will now introduce similar laws to Texas.”
Two dozen states across the U.S. have already proposed similar laws. And at least three states “have no limit on the percentage of material required before the law applies.”
In some states and countries you can skip handing over an ID by instead submitting to a biometric facial scan or giving up your banking information before accessing an adult website.
With these developments in mind, Supreme Court Justice Kagan stated in her dissent,
“In today’s opinion, speech rights are pushed to the sidelines, or entirely off the field.”
“Many reasonable people, after all, view the speech at issue here as ugly and harmful for any audience. But the First Amendment protects those sexually explicit materials, for every adult. So a State cannot target that expression, as Texas has here, any more than is necessary to prevent it from reaching children.”
Or as the more eloquent saying goes,
“Censorship is telling a man he can't have a steak just because a baby can't chew it.”
This ruling isn't just legal precedent; it's an artifact symbolizing our moment's anxiety about tech-mediated sexuality, paternalism and privacy online.
Context on obscenity...
Obscenity plays a supporting role here, as a historical stratagem for censorship.
In 1973’s Miller v. California Supreme Court case, it was determined that obscene material falls outside First Amendment protection. In other words, anything “obscene” is not protected. But the issue then is how obscenity is legally defined. Under the Miller Test for Obscenity, content is obscene only if it meets all three of the following criteria:
Appeals to prurient interest when judged by contemporary community standards
Note: “Community” here is meaningless as a kink community has different standards than a church community. Community standards are subjective...
Depicts sexual conduct in a patently offensive way as defined by state law
Note: Similarly, what is “obviously” offensive to one may not be offensive to another. Offense is also inherently subjective...
Lacks serious literary, artistic, political, or scientific value when taken as a whole
Note: Art is human expression, therefore anything that is expressive can be considered art and therefore valuable...
In short, obscenity is not black and white, but a grey, amorphous, culturally subjective Rorschach test. This ambiguity wasn't a bug, but a feature, helping set a high bar for what can be deemed obscene and therefore censorable.
But today’s age verification laws are designed to lower this bar.
Age verification laws don't require material to meet the Miller Test. Now, material only needs to be deemed “harmful to minors.” And as we know, many have a field day labeling anything they personally dislike as “harmful.”
Elementary school books included.
Meanwhile, the Interstate Obscenity Definition Act (IODA) is currently looking to redefine obscenity by updating the second part of the Miller Test to:
“Depicts, describes, or represents, an actual or simulated sexual act or sexual contact”
Such a wide scope and such loose subjectivity are inevitably prone to weaponization.
The implications...
Anything a government deems “sexual” or “harmful” may now require one to hand over their passport, biometric data or banking information. More concerning, such laws may demand this personal information just to access platforms where only some fraction of the material is “sexual.”
How do you even define “sexual material?”
Definitions can and will effortlessly stretch beyond porn — which in itself does not have a clear definition. Targeted content could include any form of sexual education, sexual health, medical material, dating advice, support communities, LGBTQ+ resources, classic literature with sexual themes, or artistic photography or films.
And how exactly will one measure a breakdown of “one-third adult content” on a site? Couldn’t a platform also bloat its library with G-rated AI fluff to dilute its “adult percentage”? The rough sketch below shows how easily that ratio could be gamed.
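To make the dilution point concrete, here is a minimal back-of-the-envelope sketch in Python. All numbers are hypothetical, and since H.B. 1181 does not spell out how the one-third ratio is computed, this simply treats it as a share of total items in a catalog.

```python
# Hypothetical illustration: diluting an "adult content" ratio below a
# one-third threshold by padding the library with innocuous filler.

def adult_fraction(adult_items: int, total_items: int) -> float:
    """Share of a site's library that is adult content."""
    return adult_items / total_items

THRESHOLD = 1 / 3        # the statutory "one-third" line

adult = 40_000           # hypothetical adult items on a platform
total = 100_000          # hypothetical total library size
print(f"Before padding: {adult_fraction(adult, total):.1%}")   # 40.0% -> law applies

# Pad the catalog with 30,000 non-adult items (e.g., G-rated AI fluff).
padded_total = total + 30_000
after = adult_fraction(adult, padded_total)
print(f"After padding:  {after:.1%}")                          # ~30.8% -> under the line
print(f"Over the threshold? {after >= THRESHOLD}")             # False
```

Nothing about this requires sophistication; a threshold defined over a self-reported catalog invites exactly this kind of arithmetic.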
Regardless, many sites cannot and will not implement effective age checks due to technical or financial limitations. Those unable or unwilling to comply will soon be inaccessible or face five-figure fines per violation. One woman is already suing for millions for “psychological distress” on behalf of her son, who cunningly used her laptop to access porn despite age verification laws already in effect in her state. (Goes to show you how effective they are...) Pornhub has pulled out of all states and countries requiring age verification, not out of disinterest in safety but because of the impossible costs, claims of ineffectiveness, and data privacy burdens.
As you can imagine, many internet users are contemplating the potential nightmares of intertwining government identification, internet activity and sexual preferences. Millions are not game to hand over a driver’s license or selfie to a porn site... even if they’re directed to a third party.
These fears are legitimate, backed by real scar tissue. In 2015, Ashley Madison, the extramarital affair platform, suffered a 60GB data breach because hackers “didn’t approve” of its existence. The fallout included suicides and a $567M class-action lawsuit.
As you can guess, these laws and memories will not curb the desire for adult content...
Within only four hours of Florida’s HB 3 age verification law coming into effect, VPN usage spiked 1,150%.
Further, when adults are deterred from accessing lawful content, they’re nudged towards smaller, darker platforms that have no problem flouting the rules.
Researchers at New York University's Center for Social Media & Politics and Stanford University's Polarization & Social Change Lab found that while traffic to Pornhub dropped in states where the platform pulled out, searches for another large noncompliant platform rose 48%.
These laws curb traffic to compliant sites and drive users to VPNs and noncompliant sites.
We know from unfortunate experience that laws and practices like FOSTA/SESTA, financial de-platforming, book bans, shadow banning LGBTQ+ content and reproductive restrictions do not measurably help the public, but actively endanger select communities.
The deeper story...
This isn't about pornography; it's about “performative protection.”
Texas legislators know this law won't comprehensively protect children, because kids aren't getting their sexual scripts just from porn, but from all across the internet, social media, TV and film, where sexual socialization happens on a much larger scale. We also know that where there’s a will, there’s a way...
We are in the theater of moral action, where posturing and appearance are more applauded than actual safety and humanity.
It’s late-stage moral legislation, designed not to solve problems, but to manage anxieties about problems that we can more effectively address ourselves with self-reflection and accountability — not by sacrificing privacy and speech.
Mind you, all this time, there have been proposed alternatives including:
Age verification at the device, browser or app store level (which would spare individual sites from handling IDs at all; see the sketch after this list),
Reformed sexual education and porn literacy programs like Cindy Gallop and Erika Lust’s efforts (preparing families and children for the inevitable),
Better family-controlled site and device filtering solutions (or a willingness to engage with existing adequate features),
Using confirmed email addresses to triangulate age from associated data (vs. more sensitive documents), and
Greater competence and confidence among parents, empowering them to have uncomfortable conversations with their children about these topics.
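The first of these is worth sketching. Below is a purely hypothetical Python sketch of device-level age attestation, assuming an operating system or browser that holds a one-time verified age flag; every name in it (DeviceAgeAttestor and so on) is invented for illustration, since no such standard API exists today. The point is the privacy property: sites receive a yes/no answer, never the underlying document.

```python
# Hypothetical sketch: device-level age attestation. The OS/browser verifies
# age once (e.g., a parent setting or a one-time ID check) and stores only a
# boolean. Websites query that boolean instead of collecting documents.
# All names here are invented for illustration; this is not a real API.

from dataclasses import dataclass

@dataclass
class DeviceAgeAttestor:
    """Stands in for an OS- or browser-level service holding a verified age flag."""
    verified_adult: bool  # set once, on-device; the ID never leaves the device

    def attest(self) -> bool:
        # In a real design this would be a signed token; the site still only
        # learns yes/no, not who you are or how the flag was established.
        return self.verified_adult

def gate_adult_content(attestor: DeviceAgeAttestor) -> str:
    """What a site would do with the attestation instead of an ID upload."""
    return "grant access" if attestor.attest() else "show age-appropriate page"

print(gate_adult_content(DeviceAgeAttestor(verified_adult=True)))   # grant access
print(gate_adult_content(DeviceAgeAttestor(verified_adult=False)))  # show age-appropriate page
```

The design choice worth noticing: the sensitive document is checked once, locally, and individual websites never touch it, which is precisely the vulnerability site-by-site laws like H.B. 1181 create.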
These are rarely considered.
Instead, the recent legal ruling normalizes the idea that certain types of content require government intervention, verification and potential tracking. It’s the domestication of a surveillance state through panic. We're creating a generation of digital natives who will understand privacy as conditional, sexuality as inherently suspicious, and government oversight as the natural price of adult desire. You can believe these things and also want to protect children.
There are solutions that protect both children and speech. This is not a binary trade-off.
It’s all a part of the plan...
The evil genius of this strategy lies in its procedural invisibility. Unlike heavy-handed content bans of previous panics, age verification works through what appears to be neutral friction. It's not censorship, it's just age verification. This is far more insidious than traditional censorship because it masquerades as iron-clad common sense.
You don’t have to be a conspiracy theorist to think this is some devious political plot to control what content is accessible on the internet.
Russell Vought, the architect of Project 2025, admitted on hidden camera that these age verification laws are the “back door” — gotta love the Freudian pun — to achieving a broader porn ban that conservatives wouldn’t be able to otherwise implement directly.
The question isn't whether this will protect children — it won't. Curiosity and sexuality are inevitable.
Rather, the question is whether we're comfortable living in an online environment where curiosity is surveilled and sexual desire is demonized.
The internet's original promise was radical disintermediation and the democratization of information — indifferent to whether you personally enjoyed or approved of all that material.
As sociology professor Hannah Wohl put it in The Hill,
“We may not all want to defend porn. But if we care about free speech and digital privacy, we must. Because what starts with porn rarely ends there.”
Where we go from here...
Are you willing to accept the documentation of desire as the price of digital citizenship?
I know I’m not.
What this means, then, is that we need renewed energy and comfort to talk openly about the taboo. Yet the taboo isn’t porn or sexuality — it’s the complex, nuanced conversation about human agency and tolerance.
Government and parental concerns are grave — but so are the shortcomings and the overreach of this approach.
When we refuse to openly discuss stigma, we keep it swept under the rug, where it festers, and we surrender privacy and speech to those who exploit discomfort for political gain.
We’ve ironically sacrificed free speech because we’re uncomfortable discussing the most shameful aspects of it.
Again, this is not about porn, but about long ignored questions we must now face: How should sexual autonomy look in digital environments? How do we balance child protection with adult privacy? And how do we effectively address real harms vs. imaginary ones used to justify control?