Every day, children encounter something online they were never meant to see. This is the story playing out in homes across America.

Harmful content exists freely on the open internet — unguarded, unverified, and accessible to anyone with a screen. No ID check. No age gate. Just one click away from a child.

It's time to change that.
Children are naturally curious. They don't understand what they're encountering. A pop-up, a shared link, a search result gone wrong — and suddenly they're exposed to content that leaves a mark no child should carry.
Exposed children share what they've seen — with siblings, classmates, friends. The harm multiplies. Without age verification, there is no barrier, no safeguard, no second chance to protect them.
Shock. Confusion. Fear. Studies link early exposure to increased anxiety, depression, and distorted views of relationships. These aren't statistics — they're children who needed protection that wasn't there.
We can stop this. The solution already exists.
A commonsense, constitutional approach to ensure age verification online — just like in the real world.
Research indicates that first exposure to pornographic material occurs on average between ages 10 and 12, often accidentally through pop-ups, social sharing, or algorithmic recommendation.
Unlike brick-and-mortar stores that verify ID before selling age-restricted materials, most websites rely on self-certification ("I am 18") with no meaningful verification.
California Penal Code Section 313 prohibits the sale of harmful materials to minors. This Act simply extends those protections to the digital realm.
Unrestricted access to pornographic material poses documented risks to adolescent development. Research and clinical evidence support the need for this legislation.
Adolescents' brains are still developing—particularly areas tied to impulse control, decision-making, and reward processing. During these critical years, exposure to high-stimulation sexual content can reinforce compulsive viewing patterns and potentially alter dopamine and reward systems. Younger users are especially vulnerable to habit formation, making early intervention essential.
Studies have documented associations between frequent exposure to pornography and increased anxiety, depression, and loneliness among adolescents. Many young people report feelings of shame, guilt, and confusion following exposure. These psychological effects can persist and interfere with healthy development during formative years.
Early exposure may contribute to unrealistic expectations about intimacy and bodies, objectification of others, reduced empathy, and confusion about consent and healthy relationship boundaries. Adolescents who consume pornographic material before developing emotional maturity often struggle to distinguish fantasy from reality, affecting their future relationship quality.
Many children are first exposed to pornographic material before age 13—frequently by accident through pop-ups, unfiltered search results, social media sharing, or algorithmic exposure. This early, often involuntary exposure can normalize sexual content before emotional readiness, creating long-term effects on perception and behavior.
Research suggests that repeated exposure can lead to desensitization, whereby users seek increasingly extreme material to achieve the same level of stimulation. This escalation pattern, combined with uncontrolled internet access, can rapidly expose minors to material that would be illegal to distribute to them offline.
Given the documented risks, the Protection of Minors from Online Pornography Act focuses on a narrow, constitutionally grounded solution: requiring reasonable age verification to ensure that material harmful to minors is not freely accessible online.
Extending California Penal Code Sections 313–313.4 to address online pornography — a comprehensive, constitutional framework for age verification.
California already prohibits the knowing sale, exhibition, or distribution of harmful sexual material to minors under Penal Code Sections 313–313.4. This Act extends those protections to the internet — requiring commercial entities that publish or distribute material harmful to minors online to verify users' ages before granting access.
Commercial pornography websites — defined as sites where more than one-third of content is harmful matter as defined in Section 313(a) — must take reasonable measures to verify age before allowing access. This applies to all persons knowingly publishing or distributing such material, with safe harbor protections for good-faith compliance.
The bill permits multiple commercially reasonable methods, including government-issued identification verification, digital identity verification services, transactional data verification (e.g., credit card checks), and biometric or third-party identity verification systems.
The Attorney General may also adopt regulations specifying additional approved methods.
Critical to this Act is Section 3(b): commercial entities cannot retain, store, or transmit personal identifying information obtained for age verification after verification is completed. They cannot sell, transfer, or use such data for any other purpose — including advertising, profiling, or commercial analytics. Misuse of verification data carries both civil and criminal penalties.
Companies that implement commercially reasonable age verification technology have an affirmative defense to any action brought under the Act. This encourages rapid compliance without fear of retroactive liability.
The Act does not apply to cloud service providers, search engines, or other providers of internet or telephone technologies that host content created by others, nor to bona fide news gathering organizations, educational institutions, libraries, or scientific publications.
Extends existing California Penal Code Sections 313–313.4 by adding sections addressing online pornography
SECTION 1. Legislative Findings and Declarations
The Legislature finds and declares the following:
1. The State of California has a compelling interest in protecting minors from harm resulting from exposure to harmful material of a sexual nature unprotected by the First Amendment for minors;
2. Existing California law prohibits the knowing sale, exhibition or distribution of such material to minors and requires the exercise of reasonable care by the provider in ascertaining the true age of a minor before the sale, exhibition or distribution of such material;
3. The internet has created unprecedented, direct access for minors to internet pornography not adequately addressed by existing law;
4. This Act would extend the protections for minors in existing law to the internet by requiring that commercial entities that publish or distribute material harmful for minors online make reasonable efforts to verify the age of users before granting access;
5. In Free Speech Coalition v. Paxton, the United States Supreme Court held that states may require commercial pornography websites to verify the age of users before providing access to certain pornographic material harmful for minors.
6. Age-verification technologies exist that allow verification without permanently retaining personal identifying information and therefore protect the privacy rights of adult users.
7. It is the policy of this State that minors should not have unrestricted access to pornographic material harmful for minors and outside the protection of the First Amendment for minors.
8. It is not the intention of this Act to regulate or restrict the content of material distributed online but rather to restrict the access of minors to certain material harmful for minors and unprotected by the First Amendment.
SECTION 2. Definitions
For purposes of this Act:
(a) "Person" shall have the meaning set forth in California Penal Code Section 313(c);
(b) "Distribute" means to issue, sell, give, provide, deliver, transfer, transmit, circulate or disseminate by any means with or without consideration therefor;
(c) "Minor" means a person under the age of 18 years.
(d) "News gathering organization" means a newspaper, news publication or news source, printed or online or mobile, of current news or matters of public interest, a radio broadcast station, television station, cable television operator, or wire service, or a bona fide employee of same, acting within the scope and purpose of that employment;
(e) "Knowingly" means being aware of the character of the matter;
(f) "Publish" means to communicate, distribute or otherwise make available on a publicly available internet website with or without consideration therefor;
(g) "Matter" means any picture, drawing, photograph, video, motion picture or other visual representation or image, live or recorded, regardless of the means of producing such matter including, but not limited to, images created by Artificial Intelligence technology;
(h) "Harmful matter" shall have the meaning set forth in California Penal Code Section 313(a);
(i) "Substantial portion" means more than one-third of its total content when the totality of content is viewed on an annual basis;
(j) "Age verification" means use of a commercially reasonable method of verifying that a person attempting to access a website is 18 years of age or older, including, but not limited to:
1. Government-issued identification verification
2. Digital identity verification services
3. Transactional data verification (e.g., credit card or similar methods)
4. Biometric or third-party identity verification systems
SECTION 3. Age Verification Requirement
(a) A person that knowingly publishes or distributes matter on an internet website, a substantial portion of which is harmful matter as defined in Section 313(a), shall take reasonable measures to ensure age verification before allowing access.
(b) A commercial entity shall not retain, store, or transmit personal identifying information obtained for the purpose of age verification after verification is completed and shall not sell, transfer or use such verification data for any other purpose including, but not limited to, for advertising, profiling, or commercial analytics.
SECTION 4. Civil Liability
(a) A commercial entity that knowingly violates Section 3(a) or Section 3(b) of this Act shall be liable to the State of California for a civil penalty not to exceed $50,000 per day of violation.
(b) The Attorney General, district attorneys, and city attorneys may bring an action to enforce this section.
SECTION 5. Private Right of Action
(a) A parent or legal guardian of a minor who accesses material harmful to minors in violation of this Act may bring a civil action for actual damages, injunctive relief and attorneys' fees and costs.
(b) An adult whose identity is retained, stored, transmitted or utilized in violation of this Act may bring a civil action for actual damages, injunctive relief and attorneys' fees and costs.
SECTION 6. Safe Harbor
A commercial entity that implements commercially reasonable age-verification technology shall have an affirmative defense to any action brought under this Act.
SECTION 7. Exemptions
This Act does not apply to:
1. Cloud service providers, search engines or other providers of internet or telephone technologies that host or facilitate the transmission of content by others;
2. Bona fide news gathering organizations, educational institutions, libraries, or scientific publications;
SECTION 8. Criminal Liability
(a) Every person who violates Section 3(a) is punishable by a fine of not more than two thousand dollars ($2,000), by imprisonment in the county jail for not more than one year, or by both that fine and imprisonment;
(b) Every person who violates Section 3(b) is punishable by a fine of not more than two thousand dollars ($2,000), by imprisonment in the county jail for not more than one year, or by both that fine and imprisonment.
SECTION 9. Attorney General Rulemaking Authority
The Attorney General may adopt regulations consistent with this Act specifying additional age verification methods.
SECTION 10. Severability
If any provision of this Act is held invalid, the remaining provisions shall remain in effect.
Age verification laws are spreading across America. Here's the current landscape — and how to bring this critical protection to your state.
Everything you need to advance this legislation in your state:
This Act is constitutionally sound, strategically tailored, and achievable. Here's what you need to know.
The internet has fundamentally changed the landscape of minor access to pornography. While California Penal Code Section 313 has successfully restricted sales of harmful materials in brick-and-mortar contexts for decades, online platforms operate without similar gatekeeping. The Protection of Minors from Online Pornography Act extends a proven, constitutionally sound framework to the digital realm.
Unlike content-restriction laws, this Act is content-neutral: it does not expand the definition of what material is "harmful," nor does it restrict what adults can access. It simply requires that existing legal protections be meaningfully enforced by preventing unrestricted minor access online.
The Miller test (1973) defines "obscenity" as material that: (1) appeals to the prurient interest, applying contemporary community standards; (2) depicts or describes sexual conduct in a patently offensive way; and (3) taken as a whole, lacks serious literary, artistic, political, or scientific value.
Importantly, First Amendment doctrine recognizes a separate "harmful to minors" category (Ginsberg v. New York, 1968): content may be protected speech for adults yet still be constitutionally restricted for minors. The Act leverages this doctrine: it does not declare material "obscene" (which would affect adult access), but rather restricts minor access to material already deemed harmful to minors under PC §313(a). Free Speech Coalition v. Paxton confirmed this approach is constitutional.
Robust age verification methods exist that protect user privacy while confirming age. Here's how they work.
Users upload or present a government-issued ID (driver's license, passport). Third-party verification services confirm the ID is valid and the birthdate shows age 18+. The ID image is typically discarded immediately after verification.
Services like Intellinetics or Vouched use AI and biometrics to verify identity without storing personal data long-term. Users take a selfie; the service confirms identity and age against government records, then deletes the biometric data.
Credit card or payment verification confirms the account holder is 18+ (most credit accounts require age 18+). No personal data is retained—only a confirmation that age verification occurred.
Services like IDology or LexisNexis verify age against public records (address history, etc.). These systems confirm age without retaining personal identifying information.
Unlike Facebook, Google, or other ad-tech platforms that retain user data for profiling and targeting, age verification systems under this Act are designed as ephemeral checkpoints. Verification occurs, access is granted, and the verification data is discarded. No profile is created. No ad targeting occurs. This is privacy by design.
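The "ephemeral checkpoint" design described above can be illustrated in code. This is a minimal, hypothetical Python sketch — not part of the Act and not any vendor's API. The `verify_with_provider` function stands in for a third-party verification service; the key property shown is that identifying data is discarded once the yes/no age signal is produced, as Section 3(b) requires.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class VerificationResult:
    """The only fact the website keeps: a yes/no age signal."""
    is_adult: bool

def verify_with_provider(id_document: dict) -> int:
    """Stand-in for an external verification service (hypothetical).
    For this sketch, age is computed from the birth year in the document."""
    return date.today().year - id_document["birth_year"]

def ephemeral_age_check(id_document: dict) -> VerificationResult:
    """Verify age, then discard identifying data (cf. Section 3(b)).

    Only the boolean result leaves this function; the document is
    cleared so no personal data survives the check.
    """
    try:
        age = verify_with_provider(id_document)
        return VerificationResult(is_adult=age >= 18)
    finally:
        id_document.clear()  # drop name, birth date, ID number, etc.

doc = {"name": "Jane Doe", "birth_year": 1990}
result = ephemeral_age_check(doc)
print(result.is_adult)  # True — the account holder is over 18
print(doc)              # {} — nothing retained after verification
```

The `finally` block is the design point: no code path, including an error path, exits the checkpoint with the identifying data still held.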