General (07)
Date: 6/10/2025 2:36:44 PM
From: "durfeycraig778@gmail.
To:
Subject: [EXTERNAL] PUBLIC COMMENT CALLING FOR A SOCIAL MEDIA RESOLUTION TO RAISE AWARENESS OF HARM FROM TOO MUCH SCREEN TIME.
Attachment: ORANGE COUNTY BOARD OF HEALTH.pdf; Resolution Addressing Social Media Use, Screen Time, and Youth.pdf; 202520260AB2_Assembly Privacy And Consumer Protection.pdf; 202520260AB2_Assembly Judiciary.pdf; 202520260AB2_Assembly Floor Analysis.pdf
(P.R.D.D.C.)
PARENTS FOR THE RIGHTS OF DEVELOPMENTALLY DISABLED CHILDREN
CRAIG A. DURFEY, FOUNDER OF P.R.D.D.C.
P.O. BOX 2001, GARDEN GROVE, CA 92842
SOCIALEMOTIONALPAWS.COM
FACEBOOK: CRAIG DURFEY
U.S. HOUSE OF CONGRESS H2404 - HONORING CRAIG DURFEY FOR HIS FIGHT AGAINST AUTISM ... Ms. LORETTA SANCHEZ of California.
https://www.govinfo.gov/content/pkg/CREC-2003-03-27/pdf/CREC-2003-03-27.pdf
New website: socialemotionalpaws.org
Orange County Board of Supervisors
County Administration North
400 W. Civic Center Drive
Santa Ana, CA 92701

Orange County Board of Education
200 Kalmus Drive
Costa Mesa, CA 92626
Phone: 714-966-4012
Fax: 714-432-1916
E-mail: contact@ocbe.us

Congressman Lou Correa
1127 Longworth House Office Building
Washington, DC 20515
Phone: (202) 225-2415

Congressman Derek Tran
2082 Rayburn House Office Building
Washington, DC 20515
Phone: (202) 225-2965

Senator Tony Strickland
1021 O Street, Suite 6730
Sacramento, CA 95814-4900
Phone: (916) 651-4036

Assemblyman Tri Ta
14361 Beach Blvd., Ste. 211
Westminster, CA 92683
Phone: 714-379-0970

Mayor Stephanie
Garden Grove City Hall
11222 Acacia Parkway
Garden Grove, CA 92840
Dear Mayor Stephanie,

I request that the following be placed on the agenda as a proposed item: RESOLUTION ADDRESSING SOCIAL MEDIA USE, SCREEN TIME, AND YOUTH MENTAL HEALTH.
WHEREAS, according to a recent survey, one-third of all teens report being online and using social media platforms "almost constantly"; 1 and

WHEREAS, a 2023 U.S. Surgeon General's Advisory warned that while social media may offer benefits, there is growing evidence of risks to youth mental health including anxiety, depression, and reduced quality of sleep; 2 and

WHEREAS, studies have found correlations between time spent on social media and mental health, including one study that found risk of anxiety and depression doubled in youth that spent more than three hours daily on social media; 3 and

WHEREAS, adolescence represents a critical time in brain development and studies have reported social media may result in addictive behavior, decreased attention span, and lower test scores; and

WHEREAS, growing reports suggest social media can expose youth to cyberbullying, inappropriate content, and heavily edited comparisons that can lead to eating disorders, body dissatisfaction, and suicidal ideation; 6 and

WHEREAS, evidence-based strategies to reduce the risks of social media include implementing device-free school policies, setting screen-time limits at home, avoiding screens before bedtime, and educating families about the psychological effects of social media; and

WHEREAS, time spent on social media and screens decreases time for exercise, sleep, in-person time with friends, and other activities that are thought to improve mental health and healthy social development (for youth and adults).

https://www.orangecountync.gov/DocumentCenter/View/31474/Resolution-Addressing-Social-Media-Use-Screen-Time-and-Mental-Health_Approved-20250326_For-Sharing?bidId=
New Poll Reveals Strong Bipartisan Opposition to Proposed Ban on State AI Laws

Poll from Common Sense Media and Echelon Insights reveals strong and bipartisan concern about a congressional proposal to ban state-level AI laws for the next decade.

SAN FRANCISCO, May 29, 2025 — Common Sense Media and Echelon Insights today released the findings of a new poll showing that Americans across the political spectrum strongly oppose a decade-long ban on state AI safety laws included in the budget reconciliation bill narrowly passed by the House on May 22 and now being considered by the Senate.

The key takeaways from the poll include:

Concern about the potential effects of AI — especially on kids and teens — is widespread and bipartisan. 93% of voters — including 95% of Republicans — are concerned about kids being exposed to highly sexualized AI-generated content online. Nine-in-ten (90%) worry about the effect of social media on kids and teens.

Some 86% of voters prefer an approach that focuses on protecting kids and teens from dangers online, while only 7% prioritize an approach that generally avoids regulation in pursuit of economic growth and innovation.
Commerce Committee Advances Schatz-Cruz Bipartisan Legislation To Keep Kids Safe, Healthy, Off Social Media

Kids Off Social Media Act Sets Social Media Age Minimum To 13, Prohibits Use Of Algorithms To Feed Addictive Content To Teens Under 17.

Today, the U.S. Senate Commerce, Science, and Transportation Committee approved the Kids Off Social Media Act. Authored by U.S. Senators Brian Schatz (D-Hawai‘i), a senior member of the Senate Commerce Committee, Ted Cruz (R-Texas), Chair of the Senate Commerce Committee, Chris Murphy (D-Conn.), and Katie Britt (R-Ala.), the bipartisan legislation will keep kids off social media and help protect them from its harmful impacts. To do that, the bill would set a minimum age of 13 to use social media platforms and prevent social media companies from feeding algorithmically-targeted content to users under the age of 17. In addition to Schatz, Cruz, Murphy, and Britt, the Kids Off Social Media Act is cosponsored by U.S. Senators Peter Welch (D-Vt.), Ted Budd (R-N.C.), John Fetterman (D-Pa.), Angus King (I-Maine), Mark Warner (D-Va.), and John Curtis (R-Utah).

“There is no good reason for a nine-year-old to be on Instagram or Snapchat. The growing evidence is clear: social media is making kids more depressed, more anxious, and more suicidal. Yet tech companies refuse to do anything about it because it would hurt their bottom line. This is an urgent health crisis, and Congress must act with the boldness and urgency it demands,” said Senator Schatz. “Protecting kids online is not a partisan issue, and our bipartisan coalition – which includes several parents of kids and teenagers – represents the millions of parents across the country who’ve long been asking for help.”

Parents overwhelmingly support the mission of the Kids Off Social Media Act. A survey conducted by Count on Mothers shows that over 90 percent of mothers agree that there should be a minimum age of 13 for social media. Additionally, 87 percent of mothers agree that social media companies should not be allowed to use personalized recommendation systems to deliver content to children. Pew finds similar levels of concern from parents, reporting that 70 percent or more of parents worry that their teens are being exposed to explicit content or wasting too much time on social media, with two-thirds of parents saying that parenting is harder today compared to 20 years ago — and many of them cited social media as a contributing factor.

The Kids Off Social Media Act is supported by Public Citizen, National Organization for Women, National Association of Social Workers, National League for Nursing, National Association of School Nurses, KidsToo, Count on Mothers, American Federation of Teachers, American Counseling Association, National Federation of Families, National Association of Pediatric Nurse Practitioners, National Council for Mental Wellbeing, Parents Television and Media Council, Tyler Clementi Foundation, Parents Who Fight, Conservative Ladies of America, David’s Legacy Foundation, Digital Progress, HAS Coalition, Parents Defending Education Action, Concerned Women for America Legislative Action Committee, and the American Academy of Child and Adolescent Psychiatry.
I request support in asking other government bodies and agencies to adopt this resolution as an action item, to raise awareness of the harm caused by the lack of a duty of care.

AB-2 Injuries to children: civil penalties. (2025-2026)
Bill Text - AB-2 Injuries to children: civil penalties.
ASSEMBLY THIRD READING
AB 2 (Lowenthal and Patterson)
As Amended April 3, 2025

Social media harms to children. From 2010 to 2019, "rates of depression and anxiety—fairly stable during the 2000s—rose by more than 50% in many studies" and "[t]he suicide rate rose 48% for adolescents ages 10 to 19." This trend tracks "the years when adolescents in rich countries traded their flip phones for smartphones and moved much more of their social lives online—particularly onto social-media platforms designed for virality and addiction."1

According to the Surgeon General:
[T]he current body of evidence indicates that while social media may have benefits for some children and adolescents, there are ample indicators that social media can also have a profound risk of harm to the mental health and well-being of children and adolescents.
ASSEMBLY COMMITTEE ON JUDICIARY
Ash Kalra, Chair
AB 2 (Lowenthal) – As Amended April 3, 2025
SUBJECT: INJURIES TO CHILDREN: CIVIL PENALTIES
KEY ISSUE: SHOULD SOCIAL MEDIA PLATFORMS BE LIABLE FOR SPECIFIED STATUTORY DAMAGES FOR BREACHING THEIR DUTY OF ORDINARY CARE AND SKILL TO A CHILD BY CAUSING INJURY TO THE CHILD?

According to the author:
[AB 2] amends Section 1714 only by adding statutory damages against platforms that are found in court to be liable under current law for negligently causing harm to children under the age of 18. Under the bill, if a company is proven to have failed to exercise its already established duty of operating with ordinary care, the company becomes financially liable for a set amount of $5,000 per violation, up to a maximum penalty of $1 million per child, or three times the amount of the child’s actual damages, whichever is applicable. This financial liability aims to incentivize platforms who count their profits in the tens of billions to proactively safeguard children against potential harm by changing how they operate their platforms.
ASSEMBLY COMMITTEE ON PRIVACY AND CONSUMER PROTECTION
Rebecca Bauer-Kahan, Chair
AB 2 (Lowenthal) – As Amended March 17, 2025
PROPOSED AMENDMENTS
SUBJECT: Injuries to children: civil penalties
SYNOPSIS

State law provides that everyone, including individuals, businesses, and other entities, has a duty of “ordinary care and skill” in the “management” of their “property or person” – the long-established standard for negligence. This bill, which is identical to last year’s AB 3172 (Lowenthal) as it passed out of this Committee, provides that a large social media platform that violates this duty and harms a minor is additionally liable for the higher of $5,000 per violation, with a per-child maximum of $1,000,000, or three times the amount of the child’s actual damages.
Thank You
Craig Durfey
Founder of P.R.D.D.C.
Date of Hearing: April 1, 2025
Fiscal: No
ASSEMBLY COMMITTEE ON PRIVACY AND CONSUMER PROTECTION
Rebecca Bauer-Kahan, Chair
AB 2 (Lowenthal) – As Amended March 17, 2025
PROPOSED AMENDMENTS
SUBJECT: Injuries to children: civil penalties
SYNOPSIS
State law provides that everyone, including individuals, businesses, and other entities, has a duty
of “ordinary care and skill” in the “management” of their “property or person” – the long-
established standard for negligence. This bill, which is identical to last year’s AB 3172
(Lowenthal) as it passed out of this Committee, provides that a large social media platform that
violates this duty and harms a minor is additionally liable for the higher of $5,000 per violation,
with a per-child maximum of $1,000,000, or three times the amount of the child’s actual
damages.
The bill is sponsored by Common Sense Media and the Los Angeles County Office of Education,
and supported by educational and children’s safety groups. Proponents contend that augmented
financial liability will incentivize platforms, who count their profits in the tens of billions, to
proactively safeguard children against potential harm by changing how they operate their
platforms.
Opponents include TechNet, California Chamber of Commerce, Computer and Communications
Industry Association, and Electronic Frontier Foundation. They argue, among other things, that
the bill is largely preempted by federal law, will lead to a flood of unmeritorious litigation, and
will restrict protected speech.
Clean-up amendments are proposed in Comment #6.
If passed by this Committee, this bill will next be heard by the Assembly Judiciary Committee.
THIS BILL:
1) Finds and declares:
a. Subdivision (a) of Section 1714 of the Civil Code already makes every person and
corporation, including social media platforms, financially responsible for an injury
occasioned to another by their want of ordinary care or skill in the management of
their property or person.
b. Children are uniquely vulnerable on social media platforms.
c. The biggest social media platforms invent and deploy features they know injure large
numbers of children, including contributing to child deaths.
d. The costs of these injuries are unfairly being paid by parents, schools, and taxpayers,
not the platforms.
e. The bill is necessary to ensure that the social media platforms that are knowingly
causing the most severe injuries to the largest number of children receive heightened
damages to prevent injury from occurring to children in the first place.
2) Provides that a social media platform that violates subdivision (a) of Section 1714 and
breaches its responsibility of ordinary care and skill to a child is, in addition to any other
remedy, liable for statutory damages for the larger of the following:
a. $5,000 per violation up to a maximum, per child, of $1,000,000.
b. Three times the amount of the child’s actual damages.
3) Makes waivers of the bill’s provisions void and unenforceable.
4) Defines:
a. “Child” as a minor under 18 years of age.
b. “Social media platform” as a social media platform, as defined in Section 22675 of
the Business and Professions Code (see below), that generates more than
$100,000,000 per year in gross revenues.
5) States that the duties, remedies, and obligations imposed by the bill are cumulative to the
duties, remedies, or obligations imposed under other laws and shall not be construed to
relieve a social media platform from any duties, remedies, or obligations imposed under any
other law.
6) Contains a severability clause and clarifies that its provisions do not apply to cases pending
before January 1, 2026.
EXISTING LAW:
1) Prohibits, under Section 230 of the Communications Decency Act, treating a provider or
user of an interactive computer service as the publisher or speaker of any information
provided by another information content provider. (47 U.S.C. § 230(c)(1).)
2) Defines “social media platform” as a public or semipublic internet-based service or
application that has users in California and that meets both of the following criteria:
a. A substantial function of the service or application is to connect users in order to
allow them to interact socially with each other within the service or application. (A
service or application that provides email or direct messaging services does not meet
this criterion based solely on that function.)
b. The service or application allows users to do all of the following:
i. Construct a public or semipublic profile for purposes of signing into and using
the service or application.
ii. Populate a list of other users with whom an individual shares a social connection
within the system.
iii. Create or post content viewable by other users, including, but not limited to, on
message boards, in chat rooms, or through a landing page or main feed that presents
the user with content generated by other users. (Bus. & Prof. Code § 22675(f).)
3) Provides that everyone is responsible, not only for the result of their willful acts, but also for
an injury occasioned by their want of ordinary care or skill in the management of their
property or person, except so far as the latter has, willfully or by want of ordinary care,
brought the injury upon themselves. (Civ. Code § 1714(a).)
COMMENTS:
1) Author’s statement. According to the author:
AB 2 amends Section 1714 of the Civil Code by adding statutory damages against platforms
that are found in court to be liable under current law for negligently causing harm to children
under the age of 18. Under the bill, if a company is proven to have failed to exercise its
already established duty of operating with ordinary care, the company becomes financially
liable for a set amount of $5,000 per violation, up to a maximum penalty of $1 million per
child, or three times the amount of the child’s actual damages, whichever is applicable. This
financial liability aims to incentivize platforms who count their profits in the tens of billions
to proactively safeguard children against potential harm by changing how they operate their
platforms.
2) Social media’s impact on children. In May 2023, U.S. Surgeon General Vivek Murthy
issued an advisory warning of the potential mental health impacts of social media on young
people.1 The advisory calls for more research and concludes that while “the current body of
evidence indicates that while social media may have benefits for some children and adolescents,
there are ample indicators that social media can also have a profound risk of harm to the mental
health and well-being of children and adolescents.”2
According to the Surgeon General, adolescents, in a critical formative period of brain
development, are especially vulnerable to potential mental health impacts of social media.3
While noting that several complex factors shape social media’s influence on children and
adolescents, the Surgeon General points to two primary risk factors: 1) harmful content, and 2)
excessive and problematic use.
Harmful content. According to the Surgeon General, “extreme, inappropriate, and harmful
content continues to be easily and widely accessible by children and adolescents” and is “spread
1 “Social Media and Youth Mental Health: The U.S. Surgeon General’s Advisory” (May 23, 2023) p. 6 (emphasis
added), https://www.hhs.gov/sites/default/files/sg-youth-mental-health-social-media-advisory.pdf. (“Surgeon
General’s Advisory”)
2 Id. at p. 4.
3 “Extractive Technology is Damaging our Attention and Mental Health,” Center for Humane Technology,
https://www.humanetech.com/attention-mental-health.
through direct pushes, unwanted content exchanges, and algorithmic designs.”4 Such content
includes:
Extreme content such as live depictions of self-harm acts, like asphyxiation or cutting,
“which can normalize such behaviors, including through the formation of suicide pacts
and posing of self-harm models for others to follow.”5
Bullying and harassment: roughly two-thirds of adolescents are “often” or “sometimes”
exposed to hate-based content, with nearly 75% of adolescents stating that social media
sites do a fair to poor job of addressing online harassment and bullying.6
Predatory behaviors, including financial or sexual exploitation of children and
adolescents; nearly 6-in-10 adolescent girls surveyed had received unwanted advances
from strangers on social media platforms.7
Leaked internal platform studies indicate that youth exposure to unwanted, disturbing, graphic,
or sexual content is common and facilitated by platform design.8 According to documents
obtained by the Wall Street Journal, one in eight users under the age of 16 experienced unwanted
sexual advances on Instagram, facilitated by lax privacy settings.9
Additionally, the advisory cites a synthesis of 20 studies demonstrating that many users,
especially adolescent girls, experience envy and social comparison, leading to body
dissatisfaction, disordered eating behaviors, and low self-esteem. “When asked about the impact
of social media on their body image, nearly half (46%) of adolescents aged 13–17 said social
media makes them feel worse, 40% said it makes them feel neither better nor worse, and only
14% said it makes them feel better.”10 Internal studies by platforms also indicate similar patterns
of social comparison, with negative effects on wellbeing.11 In an internal Meta study, younger
and female users reported much greater rates of feeling “worse about yourself because of other
4 Surgeon General’s Advisory, supra, at p. 8.
5 Ibid.
6 Alhajji et al., “Cyberbullying, Mental Health, and Violence in Adolescents and Associations With Sex and Race:
Data From the 2015 Youth Risk Behavior Survey” Global pediatric health (2019),
https://journals.sagepub.com/doi/10.1177/2333794X19868887; Vogels, “Teens and Cyberbullying,” Pew Research
Center: Internet, Science & Tech (2022), https://www.pewresearch.org/internet/2022/12/15/teens-and-
cyberbullying-2022/.
7 Nesi, et al. “Teens and mental health: How girls really feel about social media” Common Sense Media (2023),
https://www.commonsensemedia.org/research/teens-and-mental-health-how-girls-really-feel-about-social-media.
8 “Minnesota Attorney General’s Report on Emerging Technology and Its Effects on Youth Well-Being” (Feb.
2025), p. 10-11. https://www.ag.state.mn.us/Office/Reports/EmergingTechnology_2025.pdf. (“Minnesota Attorney
General’s Report”)
9 Jeff Horwitz, “His Job Was to Make Instagram Safe for Teens. His 14-Year-Old Showed Him What the App Was
Really Like” The Wall Street Journal (Nov. 2, 2023), https://www.wsj.com/tech/instagram-facebook-teens-
harassment-safety-5d991be1?mod=hp_featst_pos3.
10 Bickham et al., “Adolescent Media Use: Attitudes, Effects, and Online Experiences” Boston Children’s Hospital
Digital Wellness Lab (2022), https://digitalwellnesslab.org/wp-content/uploads/Pulse-Survey_Adolescent-Attitudes-Effects-and-Experiences.pdf.
11 Minnesota Attorney General’s Report, supra, p. 11-12.
peoples’ posts on Instagram,” with 27.4% of 13-15 year old females reporting this experience
over a 7-day period, compared to 14.6% of males in the same age group.12
Excessive and problematic use. The advisory cites studies showing that on a typical weekday,
nearly one in three adolescents report using screens – most commonly social media – until
midnight or later.13 One third or more of girls aged 11-15 feel “addicted” to certain platforms.
Excessive use correlates with attention problems, feelings of exclusion, and sleep problems.14
Poor sleep, in turn, is linked with neurological development issues, depression, and suicidality.15
These findings are borne out by the observations of platforms themselves: internal Meta research
detailed in a recent lawsuit concluded that “when social media use displaces sleep in adolescents,
it is negatively correlated to indicators of mental health.”16
Excessive use is driven in part by systems that are optimized to maximize user engagement
through design features, such as recommendation algorithms, likes, push notifications, auto-play,
and endless scroll.17 According to a former social media company executive’s statements, such
features were designed intentionally to increase time spent through features that “give you a little
dopamine hit every once in awhile.”18 These features “can trigger pathways comparable to
addiction.”19 Young people with still-developing pre-frontal cortexes who crave social reward
and lack inhibition are especially susceptible.20
3) Negligence. Civil Code section 1714(a) provides: “Everyone is responsible, not only for the
result of his or her willful acts, but also for an injury occasioned to another by his or her want of
ordinary care or skill in the management of his or her property or person, except so far as the
latter has, willfully or by want of ordinary care, brought the injury upon himself or herself.” To
establish negligence, “the plaintiff must show that the defendant had a duty to use due care, that
he breached that duty, and that the breach was the proximate or legal cause of the resulting
injury.”21 “A duty exists only if ‘the plaintiff's interests are entitled to legal protection against the
defendant’s conduct.’”22 “‘[A]s a general matter, there is no duty to act to protect others from the
conduct of third parties.’”23 However, “[i]n a case involving harm caused by a third party, a
person may have an affirmative duty to protect the victim of another’s harm if that person is in
what the law calls a ‘special relationship’ with either the victim or the person who created the
12 Arizona et al. v. Meta Platforms, Inc., et al., Case No. 4:23-cv-05448, Complaint (N.D. Cal. Oct. 24, 2023),
https://storage.courtlistener.com/recap/gov.uscourts.nmd.496039/gov.uscourts.nmd.496039.36.2.pdf .
13 Rideout, V., & Robb, M. B. “Social media, social life: Teens reveal their experiences” Common Sense Media
(2018), https://www.commonsensemedia.org/sites/default/files/research/report/2018-social-mediasocial-life-
executive-summary-web.pdf.
14 Surgeon General’s Advisory, supra, at p. 10.
15 Ibid.
16 Arizona et al. v. Meta Platforms, Inc., supra.
17 Burhan & Moradzadeh, “Neurotransmitter Dopamine and its Role in the Development of Social Media
Addiction” 11 Journal of Neurology & Neurophysiology 507 (2020), https://www.iomcworld.org/open-
access/neurotransmitter-dopamine-da-and-its-role-in-the-development-of-social-mediaaddiction.pdf.
18 Alex Hern, ‘Never get high on your own supply’ – why social media bosses don’t use social media,” The
Guardian (Jan. 23, 2018), https://www.theguardian.com/media/2018/jan/23/never-get-high-on-your-own-supply-
why-social-media-bosses-dont-use-social-media.
19 Surgeon General’s Advisory, supra, at p. 9.
20 Ibid.
21 Nally v. Grace Community Church (1988) 47 Cal.3d 278, 292.
22 Brown v. USA Taekwondo (2021) 11 Cal.5th 204, 213, internal quotes omitted.
23 Id. at p. 214.
harm.”24 A special relationship “‘gives the victim a right to expect’ protection from the
defendant, while a special relationship between the defendant and the dangerous third party is
one that ‘entails an ability to control [the third party’s] conduct.’”25
4) This bill augments liability for social media platforms that negligently harm children.
This bill provides that a social media platform that violates Section 1714(a) and breaches its
responsibility of ordinary care and skill to a child – defined as a minor under 18 years of age – is,
in addition to any other remedy, liable for statutory damages for the larger of:
$5,000 per violation up to a per-child maximum of $1,000,000; or
Three times the amount of the child’s actual damages.
A social media platform for these purposes is one that meets an existing statutory definition and
generates more than $100,000,000 per year in gross revenues. The bill would also provide that
any waivers of the bill’s provisions are void and unenforceable as contrary to public policy.
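To make the statutory-damages rule above concrete, the following is a minimal illustrative sketch; it is not part of the bill text or the committee analysis, and the function name and inputs are hypothetical. It simply applies the larger-of rule described above: $5,000 per violation capped at $1,000,000 per child, versus three times the child's actual damages.

def statutory_damages(violations: int, actual_damages: float) -> float:
    # Illustrative sketch only: the "larger of" rule described in the analysis.
    # $5,000 per violation, capped at $1,000,000 per child, compared against
    # three times the child's actual damages; the larger amount applies,
    # in addition to any other available remedy.
    per_violation_total = min(5_000 * violations, 1_000_000)
    treble_damages = 3 * actual_damages
    return max(per_violation_total, treble_damages)

# Example: 40 violations and $100,000 in actual damages yields
# min(200,000, 1,000,000) = 200,000 versus 3 x 100,000 = 300,000,
# so 300,000 would be the statutory figure under this reading.
print(statutory_damages(40, 100_000))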
The bill is identical to last year’s AB 3172 as it passed this Committee, by an 11-0 vote. The bill
was amended in Senate Appropriations to apply only to knowing and willful failure to exercise
ordinary care to a child, cap statutory damages at $250,000, and require no less than 51% of the
penalties to go to a state fund dedicated to raising awareness among adolescents on safe social
media use. The bill was moved to the Senate’s inactive file.
5) Constitutional considerations. Opponents of the bill raise concerns relating to freedom of
speech and federal preemption.
First Amendment. The United States and California Constitutions prohibit abridging, among
other fundamental rights, freedom of speech.26 “The Free Speech Clause of the First Amendment
. . . can serve as a defense in state tort suits.”27 “[T]he basic principles of freedom of speech and
the press, like the First Amendment’s command, do not vary when a new and different medium
for communication appears.”28 Additionally, “the creation and dissemination of information are
speech . . . .”29 Dissemination of speech is different from “expressive conduct,” which is conduct
that has its own expressive purpose and may be entitled to First Amendment protection.30
Laws that are not content specific are generally subject to “intermediate scrutiny,” which
requires that the law “be ‘narrowly tailored to serve a significant government interest.’”31 In
other words, the law “‘need not be the least restrictive or least intrusive means of’ serving the
government’s interests,” but “‘may not regulate expression in such a manner that a substantial
portion of the burden on speech does not serve to advance its goals.’”32 This bill does not
24 Id. at p. 215.
25 Id. at p. 216.
26 U.S. Const., 1st and 14th Amends; Cal. Const. art. I, § 2.
27 Snyder v. Phelps (2011) 562 U.S. 443, 451.
28 Joseph Burstyn v. Wilson (1952) 343 U.S. 495, 503.
29 Sorrell v. IMS Health Inc. (2011) 564 U.S. 552, 570.
30 Ibid.
31 Packingham v. North Carolina (2017) 582 U.S. 98, 98.
32 McCullen v. Coakley (2014) 573 U.S. 464, 486, emphasis added.
regulate expression; it augments liability for large platforms that violate an existing duty and
harm children.
Federal preemption. Section 230(c)(1) of the federal Communications Decency Act of 1996
shields online platforms from liability for third-party content: “No provider or user of an
interactive computer service shall be treated as the publisher or speaker of any information
provided by another information content provider.”33 This provision has been hailed as the law
that created the modern internet, fostering free expression online and allowing an array of
innovative services and spaces to flourish, from search engines to social media.34 It has also
come with a destructive side, absolving platforms of responsibility for virtually all third-party
harms arising from the use of their services – “a protection not available to print material or
television broadcasts.”35
Section 230 was intended to promote investment in online companies and encourage “‘Good
Samaritan’ blocking and screening of offensive material” 36 without fear of liability for
defamation.37 Courts soon adopted an expansive interpretation – a key early decision construed
“publisher” immunity as encompassing “traditional editorial functions” such as deciding whether
to publish, remove, or even alter content.38 Consequently, the plaintiff, a victim of online
defamation by an anonymous user, had no recourse against the platform despite its failure to
timely remove the content, which would have resulted in liability in the offline world. Following
this logic, courts have extended Section 230 well beyond the defamation context, routinely
concluding that online intermediaries are not liable for harms related to third-party illicit
content.39 “The common thread weaving through these cases is that the courts have sapped
§230’s Good Samaritan concept of its meaning.”40
This sweeping grant of immunity has been the subject of widespread criticism and calls for
reform.41 Senators Lindsey Graham and Dick Durbin are planning to introduce a bill that would
sunset Section 230.42 Justice Clarence Thomas has called for the Supreme Court to review the
scope of Section 230.43 Ninth Circuit Judge Ryan Nelson recently stated that courts have
“stretch[ed] the statute’s plain meaning beyond recognition,” leading to “perverse effects.”44 The
33 47 U.S.C. § 230(c)(1). Section 230 also (1) provides a safe harbor for good faith content moderation, (2) preempts
contrary state laws, and (3) enumerates exemptions for enforcement of federal criminal statutes, intellectual property
laws, communications privacy laws, and sex trafficking.
34 See e.g., Kosseff, The Twenty-Six Words that Created the Internet (2019).
35 Quinta Jurecic, “The politics of Section 230 reform: Learning from FOSTA’s mistakes” Brookings (Mar. 1,
2022), https://www.brookings.edu/articles/the-politics-of-section-230-reform-learning-from-fostas-mistakes.
36 § 230(c).
37 Fair Hous. Council v. Roommates.com, LLC (9th Cir. 2008) 521 F.3d 1157, 1163.
38 Zeran v. Am. Online, Inc (4th Cir. 1997) 129 F.3d 327.
39 Michael Rustad & Thomas Koenig, “The Case for a CDA Section 230 Notice-and-Takedown Duty” (2023) 23
Nev.L.J. 533, 561-574.
40 Danielle Keats Citron, “How to Fix Section 230” (2023) 103 B.U.L. Rev. 713, 727.
41 E.g., John Lucas, “AG Moody Joins with Other Attorneys General to Urge Congress to Stop Protecting Illegal
Activity on the Net,” Capitolist (May 23, 2019), https://thecapitolist.com/ag-moody-joins-with-other-attorneys-
general-to-urge-congress-to-stop-protecting-illegal-activity-on-the-net.
42 Lauren Feiner, “Lawmakers are trying to repeal Section 230 again” The Verge (Mar. 21, 2025),
https://www.msn.com/en-us/politics/government/lawmakers-are-trying-to-repeal-section-230-again/ar-
AA1BptAI?ocid=BingNewsVerp.
43 Doe ex rel. Roe v. Snap, Inc. (2024) 144 S. Ct. 2493 (Thomas, J., dissenting from denial of certiorari).
44 Calise v. Meta Platforms, Inc. (9th Cir. 2024) 103 F.4th 732, 747 (Nelson, J. concurring) (Calise).
Ninth Circuit “should revisit our precedent,” he urged, particularly in light of “artificial
intelligence raising the specter of lawless and limitless protections.”45
Courts have emphasized, however, that Section 230 immunity is not limitless.46 Section 230 is
not “an all-purpose get-out-of-jail-free card”47 that “create[s] a lawless no-man’s-land on the
internet.’”48 The Ninth Circuit has “consistently eschewed an expansive reading of the statute
that would render unlawful conduct ‘magically . . . lawful when [conducted] online,’ and
therefore ‘giv[ing] online businesses an unfair advantage over their real-world counterparts.’”49
Under Ninth Circuit precedent, Section 230(c)(1) immunity exists for “(1) a provider or user of
an interactive computer service (2) whom a plaintiff seeks to treat, under a state law cause of
action, as a publisher or speaker (3) of information provided by another information content
provider.”50 With respect to the third prong, Section 230 protection extends only to claims that
“derive[] from the defendant’s status or conduct as a publisher or speaker.”51 If, instead, the
claim “springs from something separate from the defendant’s status as a publisher, such as from .
. . obligations the defendant has in a different capacity,” Section 230 immunity does not apply.52
Examples of such cases involving negligence include:
A networking website owner’s negligent failure to warn a woman who was raped by two
users of the website who posed as talent scouts to lure her to a fake audition, where it was
alleged that an outside source had informed the owner about the predatory scheme.53
Snap’s allegedly defectively-designed app, which promoted content that encouraged two
teen boys who died in a high-speed car accident to drive at dangerous speeds.54
While these cases are highly fact-specific and there are precedents to the contrary,55 these cases
show that negligence claims against social media platforms can survive a Section 230 defense.
6) Amendments. The author has agreed to the following technical clean-up amendments:
(a) A social media platform that violates subdivision (a) of Section 1714 and breaches its
responsibility of ordinary care and skill by causing injury to a child shall, in addition to any
other remedy, be liable for statutory damages for the larger of the following:
[. . .]
(b) Any waiver of this subdivision section shall be void and unenforceable as contrary to
public policy.
45 Ibid.
46 Calise, supra, 103 F.4th at p. 739, citing cases.
47 Doe v. Internet Brands, Inc. (9th Cir. 2016) 824 F.3d 846, 853.
48 HomeAway.com v. City of Santa Monica (9th Cir. 2018) 918 F.3d 676, 683.
49 Ibid.
50 Barnes v. Yahoo!, Inc. (9th Cir. 2009) 570 F.3d 1096, 1109.
51 Id. at p. 1102.
52 Calise, supra, 103 F.4th at p. 742.
53 Doe v. Internet Brands, Inc., supra, 824 F.3d at pp. 852-853.
54 Lemmon v. Snap, Inc. (9th Cir. 2021) 995 F.3d 1085, 1092.
55 Doe v. MySpace, Inc. (5th Cir. 2008) 528 F.3d 413.
[. . . ]
ARGUMENTS IN SUPPORT: The Los Angeles County Office of Education, co-sponsors of the
bill, write:
Social media platforms must be held accountable for the harm they cause, particularly to
minors who are uniquely vulnerable to the harmful effects of online engagement. Research
has repeatedly shown the detrimental impact of social media on young people’s mental
health, contributing to a range of issues, including increased instances of cyberbullying,
mental health crises, and even acts of violence.
While social media platforms prioritize user engagement and growth, they often fail to
adequately consider the safety and wellness of younger users. Given this imbalance, it is
crucial that social media companies are required to uphold a standard of ordinary care in their
management of content and interactions involving minors. This bill would establish much-
needed accountability by holding social media platforms liable for civil penalties if they fail
to exercise the necessary care to protect children on their platforms.
Just as institutions and businesses serving youth are held accountable for ensuring the safety
and well-being of their patrons, social media companies should be held to the same standard.
AB 2 represents an important step toward protecting the mental and physical health of
children and ensuring that platforms act responsibly toward the younger population that
depends on them.
Children’s Advocacy Institute writes:
Making platforms pay more if a court finds they have negligently hurt children is not in any
way, shape, or form burdensome or unfair to stubbornly bad acting platforms. As the former,
long-time Chair of the Orange County Republican Party, Fred Whitaker, wrote in supporting
a similar bill before this Committee last year:
Thus, all the opposition to this bill needs to do to avoid any prospect of
liability under the bill is simply exercise reasonable care not to harm
children. Surely, a company like Meta which in 2021 earned an incredible
$100 billion profit (General Motors which we used to think of as a big
company earned 10 billion) can afford to exercise such care. If it doesn’t,
it should pay for the harm it causes.
That’s the American way.
ARGUMENTS IN OPPOSITION: In opposition to the bill, TechNet, California Chamber of
Commerce, and Computer and Communications Industry Association jointly write:
To the extent this bill provides an incentive for platforms to change their policies and
features, the extreme risk of liability will likely result in companies severely limiting or
completely eliminating online spaces for teens.
Litigation leads to uneven and inconsistent outcomes, with different companies choosing to
limit the immense exposure this bill will create in different ways. There are two main ways
platforms could respond to the vague requirements and extreme liability in this bill, neither
of which are good outcomes for teens.
First, companies could adjust their policies and terms of service to exclude all users under the
age of 18. This would be a tremendous and detrimental blow to teens’ ability to access
information and the open internet. As discussed below, this violates First Amendment
principles and protections for teens. However, even if a platform stated in its terms of service
that teens under 18 were not allowed on the platform and took steps to prevent their access,
that may not be enough to avoid liability for a teen who accesses the site anyway and has a
negative outcome.
Second, companies could also adjust their terms of service so that users under the age of 18
have a heavily sanitized version of the platform. This could include limiting which users
teens can interact with (e.g. only users approved by parents), which features they have access
to (no messaging or public posting), and even what content they can interact with or view (no
political, news, or other “potentially harmful” content). This might reduce but would not
prevent every instance of harm to teens given the nebulousness and subjectivity that is
inherent in defining “harm”.
This bill’s implicit concern is harmful content. It is impossible for companies to identify and
remove every potentially harmful piece of content because there’s no clear consensus on
what exactly constitutes harmful content, apart from clearly illicit content. Determining what
is harmful is highly subjective and varies from person to person, making it impossible to
make such judgments on behalf of millions of users. Faced with this impossible task and the
liability imposed by this bill, some platforms may decide to aggressively over restrict content
that could be considered harmful for teens. For instance, content promoting healthy eating
could be restricted due to concerns it could lead to body image issues. Similarly, content
about the climate crisis or foreign conflicts would need to be restricted as it could lead to
depression, anxiety, and self-harm. Additionally, beneficial information like anti-drug or
smoking cessation programs, mental health support, and gender identity resources could get
overregulated because of the impossibility of deciding what is harmful to every user.
Furthermore, platforms would need to evaluate whether to eliminate fundamental features
and functions of their platform, features that are the reason teens and users go to their
platforms, due to the legal risk involved. For instance, since direct messaging features could
potentially be misused for contacting and bullying other teens, such features would likely be
removed.
Teens’ use of these platforms would be overly policed and sanitized to such a degree that
they would surely leave our sites in favor of others that don’t meet AB 2’s $100 million
revenue threshold. Collectively, our organizations represent platforms that take their
responsibility to their users incredibly seriously and have devoted millions of dollars to
increasing the safety and enjoyment of their platforms. Teens will seek out the ability to
interact online, whether it is on our platforms or on others, including ones that don’t
prioritize their safety and well-being.
Electronic Frontier Foundation adds:
The heavy statutory damages imposed by A.B. 2 will result in broad censorship via scores of
lawsuits that may claim any given content online is harmful to any child. California should
not enact a law that would be more harmful to children and will not be enforceable in any
event. Further, should it become law, it will also be ineffective because federal law preempts
Californians’ ability to hold online services civilly liable for harm caused by user-generated
content.
REGISTERED SUPPORT / OPPOSITION:
Support
Common Sense Media
Los Angeles County Office of Education (Sponsor)
California Charter Schools Association
Children's Advocacy Institute
Jewish Family and Children's Services of San Francisco, the Peninsula, Marin and Sonoma
Counties
Organization for Social Media Safety
Opposition
CalChamber
Computer & Communications Industry Association
Electronic Frontier Foundation
TechNet - Technology Network
Analysis Prepared by: Josh Tosney / P. & C.P. / (916) 319-2200
Date of Hearing: April 8, 2025
ASSEMBLY COMMITTEE ON JUDICIARY
Ash Kalra, Chair
AB 2 (Lowenthal) – As Amended April 3, 2025
SUBJECT: INJURIES TO CHILDREN: CIVIL PENALTIES
KEY ISSUE: SHOULD SOCIAL MEDIA PLATFORMS BE LIABLE FOR SPECIFIED
STATUTORY DAMAGES FOR BREACHING THEIR DUTY OF ORDINARY CARE AND
SKILL TO A CHILD BY CAUSING INJURY TO THE CHILD?
SYNOPSIS
Social media has become a ubiquitous element of our society. Across platforms, social media is
used as a tool to establish and maintain personal relationships; keep up to date with current
events; engage in political organizing; and develop professional relationships. In the United
States alone, approximately 70 percent of people use some form of social media. These
demographics broken down by age demonstrate a significant percentage of younger users. This
bill presents the latest approach to the issue of social media platforms’ responsibility to these
younger users by imposing statutory damages on social media platforms who breach their duty
of ordinary care and skill in relation to a minor. Like a number of its predecessors, including the
author’s practically identical version from just last year, the current measure seeks to squarely
situate social media platforms’ liability within existing parameters of tort law.
This bill is sponsored by Common Sense Media and supported by the California Charter Schools
Association, the California Initiative for Technology and Democracy (CITED), the Children’s
Advocacy Institute, the Consumer Federation of California, the Jewish Family and Children’s
Services of San Francisco, the Peninsula, Marin and Sonoma Counties, the Los Angeles County
Office of Education, and the Organization for Social Media Safety. It is opposed by a coalition of
tech-industry groups and advocates led by TechNet, and by the Electronic Frontier Foundation (EFF). The
bill was previously heard by the Assembly Committee on Privacy and Consumer Protection
where it was approved on a vote of 9-0.
SUMMARY: Provides for statutory penalties available in a negligence cause of action brought
on behalf of a child-user of a social media platform for harm caused by the platform.
Specifically, this bill:
1) Makes findings and declarations regarding the duty of care imposed on everyone by Civil
Code Section 1714 (a) and the risks to children on social media platforms.
2) Authorizes recovery of the larger of the following in a successful claim against a social
media platform that alleges the platform violated Civil Code Section 1714 (a) by causing
injury to a child:
a) Five thousand dollars ($5,000) per violation up to a maximum, per child, of one million
dollars ($1,000,000);
b) Three times the amount of the child’s actual damages;
3) Makes any waiver of 2) void and unenforceable as contrary to public policy.
4) Defines the following:
a) “Child” means a minor under 18 years of age;
b) “Social media platform” means a social media platform, as defined in Section 22675 of
the Business and Professions Code, that generates more than one hundred million dollars
($100,000,000) per year in gross revenues.
5) Establishes that the duties, remedies, and obligations imposed by the provisions of the bill are
cumulative to the duties, remedies or obligations under other law and shall not be construed
to relieve a social media platform from any duties, remedies, or obligations imposed under
any other law.
6) Includes a severability clause.
EXISTING LAW:
1) Establishes, under Section 230 of the Communications Decency Act, that no provider or user
of an interactive computer service shall be treated as the publisher or speaker of any
information provided by another information content provider. (47 U.S.C. Section 230(c)(1).)
2) Provides that every person is responsible, not only for the result of their willful acts, but also
for an injury occasioned to another by the person’s want of ordinary care or skill in the
management of their property or person, except so far as the latter has, willfully or by want
of ordinary care, brought the injury upon themselves. (Civil Code Section 1714 (a).)
3) Defines “social media platform” as a public or semipublic internet-based service or
application that has users in California and meets specified criteria. (Business and
Professions Code Section 22945.)
FISCAL EFFECT: As currently in print, this bill is keyed non-fiscal.
COMMENTS: Social media has become a ubiquitous element of our society. Across platforms,
social media is used as a tool to establish and maintain personal relationships; keep up to date
with current events; engage in political organizing; and develop professional relationships. In the
United States alone, approximately 70 percent of people use some form of social media. These
demographics, when broken down by age, include a significant percentage of younger users.
Approximately 84 percent of people between the ages of 18 and 29 years use at least one social
media site, while the utilization rate for those between the ages of 50 and 64 years drops to 73
percent. Participation rates also vary between the social media platforms themselves. For
example, while 65 percent of users between 18 and 29 years use Snapchat, only 42 percent have
Twitter accounts. Perhaps particularly relevant to this bill, over 70 percent of 18- to 29-year-olds
use Instagram, while that number drops to 48 percent for those between 30 and 49 years, and to 29
percent for those between the ages of 50 and 64. (Social Media Fact Sheet (April 7, 2021) Pew
Research Center, available at: https://www.pewresearch.org/internet/fact-sheet/social-media/.) As
usage of social media apps and websites has grown, so has the research on its consequences. The
prevalence of social media in daily life impacts all demographics but has had, perhaps
predictably, an outsized effect on younger populations.
Over the course of the last few years, the Legislature has evaluated a substantial number of bills
attempting to hold social media platforms accountable for harms caused to minors who use their
products. Bills have taken a variety of approaches, including attempting to make platforms liable
for addicting children (AB 2408 (Cunningham, 2022)); requiring platforms that provide services
to children to comply with certain safety requirements (AB 2273 (Wicks, 2022)); establishing
reporting mechanisms to facilitate removal of child pornography on platforms (AB 1394 (Wicks,
2023)); authorizing individuals to bring claims against entities that distribute child pornography
(SB 646 (Cortese, 2023)); and making platforms liable for features that cause harm to child users
(SB 680 (Skinner, 2023)). Last year, the author shepherded a measure nearly identical to this bill
through the Legislature (AB 3172), which died on the Senate Floor.
According to the author:
[AB 2] amends Section 1714 only by adding statutory damages against platforms that are
found in court to be liable under current law for negligently causing harm to children under
the age of 18. Under the bill, if a company is proven to have failed to exercise its already
established duty of operating with ordinary care, the company becomes financially liable for
a set amount of $5,000 per violation, up to a maximum penalty of $1 million per child, or
three times the amount of the child’s actual damages, whichever is applicable. This financial
liability aims to incentivize platforms who count their profits in the tens of billions to
proactively safeguard children against potential harm by changing how they operate their
platforms.
This bill presents the latest approach to the issue of quantifying social media platforms’
responsibility to their young users. The bill imposes statutory damages on social media platforms
that breach their duty of ordinary care and skill to a minor. Like a number of its predecessors,
the current measure seeks to squarely situate social media platforms’ liability within existing
parameters of tort law. The bill does so by cross-referencing Civil Code section 1714 (a), which
provides that “[e]veryone is responsible, not only for the result of his or her willful acts, but also
for an injury occasioned to another by his or her want of ordinary care or skill in the management
of his or her property or person, except so far as the latter has, willfully or by want of ordinary
care, brought the injury upon himself or herself.”
The text of this bill is fairly straightforward. In a new section immediately after Civil Code
Section 1714, the bill makes a social media platform that violates subdivision (a) of Section 1714
by causing injury to a child liable for statutory damages for the larger of the following:
1) Five thousand dollars per violation up to a per-child maximum of one million dollars;
2) Three times the amount of the child’s actual damages.
The bill defines a social media platform as a public or semipublic internet-based service or
application that has users in California, meets specified criteria regarding its function, and
generates more than one hundred million dollars per year in gross revenues.
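For illustration only, the arithmetic of the "larger of" comparison described above can be sketched in a few lines of Python. The function name, the example figures, and the assumption that each "violation" is counted separately are hypothetical additions for this analysis; they are not drawn from the bill text.

def statutory_damages(violations, actual_damages):
    # Hypothetical sketch of the AB 2 formula: $5,000 per violation,
    # capped at $1,000,000 per child, compared against three times the
    # child's actual damages; the plaintiff recovers the larger figure.
    per_violation = min(5000 * violations, 1000000)
    treble = 3 * actual_damages
    return max(per_violation, treble)

# Example: 40 alleged violations and $80,000 in actual damages.
# Per-violation figure: min(200,000, 1,000,000) = 200,000
# Treble damages: 240,000 -> recovery would be 240,000
print(statutory_damages(40, 80000))

As the example suggests, the per-violation figure controls only until trebled actual damages exceed it, and the $1,000,000 cap applies only to the per-violation calculation.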
Section 230 of the Federal Communications Decency Act. The federal Communications
Decency Act (CDA) provides that “[n]o provider or user of an interactive computer service shall
be treated as the publisher or speaker of any information provided by another information
content provider,” and affords broad protection from civil liability for the good faith content
moderation decisions of interactive computer services. (47 U.S.C. Sec. 230(c)(1) and (2).)
Though Section 230 was originally passed in response to judicial inconsistency with respect to
the liability of internet service providers under statutes pertaining to “publishers” of content
created by others, it has since been interpreted to confer on operators of social media platforms and
other online services broad immunity from liability for content posted by others.
Section 230 also indicates that “[n]othing in this section shall be construed to prevent any State
from enforcing any State law that is consistent with this section,” but further provides that “[n]o
cause of action may be brought and no liability may be imposed under any State or local law that
is inconsistent with this section.” (47 U.S.C. Sec. 230(e)(3).) The latter provision has generally
been interpreted to expressly preempt any state law that has the effect of treating a social media
or other online platform as the publisher of information posted by other users, including
prescriptive requirements relating to content moderation. This is consistent with the law’s
original intent, which was to ensure that internet platforms facilitating the sharing of content can
do so without considerable risk of liability in the event that content is not meticulously policed.
Since the development of social media platforms, the extent of Section 230’s immunity shield
has been heavily litigated. Questions have arisen regarding which elements of a user’s experience,
and therefore of a social media platform’s business model, benefit from Section 230 and which
fall outside its scope. In Fair Housing Council v. Roommates.com, LLC, Roommates.com was
sued by a coalition of fair housing councils in California for allegedly violating the federal Fair
Housing Act and state housing discrimination laws. (Fair Housing Council of San Fernando
Valley v. Roommates.com, LLC, 521 F.3d 1157 (2007).) The claim was based largely on the
website’s search function which appeared to prefer certain profiles over others, seemingly on the
basis of elements of the user’s identity, including their gender and sexual orientation, which were
collected by Roommates.com through a mandatory questionnaire developed by the website. The
District Court ruled in favor of Roommates.com, holding that the website was protected by
Section 230. The councils subsequently appealed to the Ninth Circuit, which held that the
website’s use of the information it required users to submit in order to utilize the website
rendered Roommates.com outside the protection of Section 230. In its analysis, the court reasoned
that a “website operator can be both a service provider and a content provider: If it passively
displays content that is created entirely by third parties, then it is only a service provider with
respect to that content. But as to content that it creates itself, or is ‘responsible, in whole or in
part’ for creating or developing, the website is also a content provider.” (Roommates.com, LLC,
521 F.3d at p. 1163 (2007).) With regard to Roommates.com’s own role in developing the users’
profiles, the court argued, “Roommate is ‘responsible’ at least ‘in part’ for each subscriber’s
profile page, because every such page is a collaborative effort between Roommate and the
subscriber.” (Id. at p. 1167.)
Two years later in Barnes v. Yahoo! Inc., the Ninth Circuit established a three-part test for
determining whether a website benefits from the liability shield of Section 230. According to
Barnes, Section 230 “only protects from liability (1) a provider or user of an interactive
computer service (2) whom a plaintiff seeks to treat, under a state law cause of action, as a
publisher or speaker (3) of information provided by another information content provider.”
(Barnes v. Yahoo! Inc., 570 F.3d 1096, 1100 - 1101 (2009).)
In 2023, the United States Supreme Court issued decisions in a pair of related cases – Twitter v.
Taamneh and Gonzalez v. Google. In November 2015, the Islamic State (ISIS) took
responsibility for a series of coordinated terrorist attacks in Paris. Among the at least 130 people
killed was 23-year-old American Nohemi Gonzalez. Shortly after Nohemi’s death, her father
filed suit against Google, Twitter, and Facebook, arguing that the platforms were liable for
aiding and abetting international terrorism by failing to appropriately respond to, or address,
terrorist organizations’ use of their services. (They additionally argued that Google was not
immune under Section 230 because its algorithm recommended ISIS videos to the users who
ultimately conducted the acts of terrorism.) The petitioners lost in both the district court and on
appeal to the Ninth Circuit, and subsequently appealed to the United States Supreme Court. In
May 2023, the Court issued its decisions in both matters, holding that “the failure to allege that
the platforms here do more than transmit information by billions of people – most of whom use
the platforms for interactions that once took place via mail, on the phone, or in public areas – is
insufficient to state a claim that defendants knowingly gave substantial assistance and thereby
aided and abetted ISIS’ acts. A contrary conclusion would effectively hold any sort of
communications provider liable for any sort of wrongdoing merely for knowing that the
wrongdoers were using its services and failing to stop them.” (Twitter, Inc. v. Taamneh, (2023)
143 S. Ct. 1206, 1213.) In issuing its ruling, the Court never reached the question of whether
Section 230 shielded the platforms from liability, and instead only adjudicated the question of
whether either platform was liable under the Antiterrorism Act as an aider and abettor. However,
the same logic would seem to apply to both questions. While we cannot assume how the Court
would rule on the matter, it seems likely that it would reason along similar lines.
The extensive jurisprudence relating to the liability of a social media platform for content distributed
by a user on the site that may have led to injury appears to bolster the immunity conferred by
Section 230. However, the question of whether that immunity extends to choices made by the
platform itself is not quite so straightforward.
In Lemmon v. Snap, Snap (commonly known as Snapchat) attempted to argue that the CDA
immunized them from liability in a negligent design lawsuit. (Lemmon v. Snap, Inc. (2021) 995
F.3d 1085.) In Lemmon, the plaintiff-parents’ children were tragically killed in a car accident. At
the time of the accident, one of the young men, Landen Brown, opened Snapchat and used a
feature called the “Speed Filter,” which allowed users to superimpose the speed at which they
were traveling at the moment an image or video was taken. When the accident occurred, the boys
were traveling as fast as 123 miles per hour. The parents’ claim against Snapchat focused not on
Landen’s video, but on the “Speed Filter” itself, and on the allegation that Snapchat “knew or
should have known that its users believed [that Snapchat will reward them] for ‘recording a
100-MPH or faster [s]nap’” using the filter. (Id. at pp. 5-6.)
In response, Snap argued that Section 230 shielded them from liability for the publication of
third party materials, in this case referring to the image with the superimposed speed. (Id. at pp.
1090 – 1091.) The Ninth Circuit applied the three-pronged test set forth by Barnes v. Yahoo! Inc.
(2009) 570 F.3d 1096, described above. (Id. at p. 1091.) The court ultimately decided against
Snap “because the Parents’ claim turns on Snap’s design of Snapchat.” (Id. at p. 1092.) While the
merits of the parents’ claim in Lemmon were never ultimately adjudicated, the case may indicate
that basing liability on a feature of the platform itself, one that the platform knew or reasonably
should have known would be a substantial factor in causing harm to its users, may preserve a law
from being struck down under a Section 230 preemption argument.
The opposition argues that the bill violates Section 230, because it “effectively assumes that all
features are harmful and imposes liability on a site for offering any of those features to children.”
It is possible this bill presents an opportunity for plaintiffs to bring claims more akin to Lemmon
than Twitter v. Taamneh, particularly considering that the language of the measure itself includes
no reference to specific types of content. Ultimately, as with numerous measures introduced in
the Legislature each year, whether or not this bill survives legal challenge is a question to be
answered by the courts.
First Amendment Concerns. The First Amendment of the United States Constitution provides
that, “Congress shall make no law abridging the freedom of speech, or of the press.” As applied
to the states through the Fourteenth Amendment, and as interpreted by the courts, the First
Amendment prohibits any law or policy, at any level of government, from abridging freedom of
speech. Legislation seeking to regulate speech can generally be distinguished as either content-
based or content-neutral. Content-based laws, or laws that target a particular type of speech, must
meet a strict scrutiny standard and must therefore be the least restrictive means to achieve a
compelling government purpose in order to withstand legal challenge. Content-neutral laws, on
the other hand, or laws that serve a purpose unrelated to the speech itself, need only pass
intermediate scrutiny. Under this standard, the law “need not be the least restrictive or least
intrusive means of” serving the government’s interests, but “may not regulate expression in a
manner that a substantial portion of the burden on speech does not serve to advance its goals.” It
is also possible that a law that is facially neutral on the issue of speech may nonetheless violate
the First Amendment if it creates a “chilling effect.”
Both organizations opposed to AB 2 argue that this bill runs afoul of the First Amendment. EFF
contends:
Under AB 2, allowing online discussion of these higher-risk activities could lead to court-
imposed penalties for the online information provider based on claims that discussing these
activities online harmed a child.
That’s a big problem for every Californian’s ability to access information online. It’s also a
First Amendment violation. Requiring platforms to apply the vague standard of ‘ordinary
care and skill’ is subjective and depends on many factors. The state cannot enact a law that
forces online services to steer clear of conversations about controversial or benign topics
such as LGBTQ+ youth or high school football, the overwhelming majority of which will be
protected speech. Relatedly, the state cannot set up a legal regime that allows anyone to seek
to censor speech they disagree with or view as dangerous.
Nothing in the text of the bill imposes a requirement on social media platforms to avoid specific
topics or content. Rather, EFF’s argument appears to be that the economic risk of hosting
content that may cause harm to a child is so significant that social media platforms will
overcorrect and, in so doing, silence particular conversations on the platform:
“allowing online discussion of these higher-risk activities could lead to court-imposed penalties
for the online information provider based on claims that discussing these activities online harmed
a child.”
It is worthwhile to note that any legislation imposing requirements or potential liability on a
social media platform risks a chilling effect. Whether or not the current measure, and the liability
it proposes, would inevitably cause a chilling effect and thus violate the First Amendment
depends in no small part on the success rate of claims brought under the bill’s provisions and thus
the actual liability to platforms. As discussed throughout this analysis, platforms would only face
actual liability if a plaintiff is able to bring a successful claim. If a claim is barred under either
Section 230 or the First Amendment, it would be dismissed and no liability would attach.
While this analysis is free to speculate on whether the bill violates the First Amendment, much
like the question of whether the bill is preempted by Section 230, the constitutional validity of
the proposal is a question that would be answered by the courts.
The opposition contends that the standard of care proposed by AB 2 is unclear and will result
in significant financial risk to platforms. TechNet submits:
It is entirely unclear what will constitute a violation of a platform’s ‘responsibility of
ordinary care and skill’ in this context. Feasibly, any sort of negative impact on a child could
be sufficient for a plaintiff to allege a breach of the platform’s responsibility of ordinary care
and skill. Every platform feature, every interaction between users, and every post that a teen
sees could be the basis for a lawsuit. Reasonable people, even parents in the same household,
might disagree about what is harmful to a particular teen. AB 2 asks social media to decide
what is harmful to every user and exercise ordinary care to prevent that harm. This ambiguity
will be impossible for platforms to operationalize. […] This vagueness, combined with high
per-violation statutory penalties […] that are decoupled from a plaintiff’s actual harm and
potential lawsuits from outside of California, will invite a flood of frivolous litigation.
The sponsors contend that the bill does not create any new duty of care, but rather specifies
damages for a violation of an existing duty of care owed to a child user (as opposed to an adult).
Under this theory, a claim brought under this new provision that is not dismissed as barred by
either Section 230 or the First Amendment would need to establish the same elements of any
other negligence claim – duty, breach of duty, causation (encompassing cause in fact and
proximate cause), and harm. Each of these claims is sure to be fact-specific. In practice, it seems
that many claims will likely face steep uphill climbs to success. However, it is not impossible
that a court may find merit in a claim that hinges, not on any particular content, but on an
element of the platform itself, more akin to the approach suggested in Lemmon.
The difficulty of bringing a negligence claim against the platforms is not raised to throw cold water
on the concept proposed by this bill. Rather, it is raised to demonstrate that, despite the potential
appearance of significant economic liability for platforms, the actual risk is arguably quite
minimal. It seems that platforms subject to liability under this measure, which is already limited
to those generating more than $100,000,000 per year in gross revenue, are unlikely to face
significant financial losses as a result of claims arising from the new statute. While it is possible
that the potential of such a high reward could draw additional claims to be filed, it is far from a
foregone conclusion that every claim will be successful.
ARGUMENTS IN SUPPORT: This bill is sponsored by Common Sense Media. It is supported
by the California Charter Schools Association, the California Initiative for Technology and
Democracy (CITED), the Children’s Advocacy Institute, the Consumer Federation of California,
the Jewish Family and Children’s Services of San Francisco, the Peninsula, Marin and Sonoma
Counties, the Los Angeles County Office of Education, and the Organization for Social Media
Safety. In support of the bill the sponsors submit:
This bill establishes statutory damages under California’s existing negligence law for harms
to minors related to social media that can be proven in court. That is the only change to
California law that this bill makes. These financial penalties are needed and intended to
motivate large social media companies to do what they currently refuse to do - ensure that the
way they design and operate their platforms does not injure young users. There is mounting
evidence, including from internal company communications, that social media platforms
contribute to our youth mental health crisis and to other direct harms to kids and teens,
including accessing fentanyl and other illegal drugs.
As the use of social media continues to climb among children and adolescents, so too does
the urgency for legislative action. AB 2 offers a path to mitigate the risks faced by our youth
in an increasingly connected world, ensuring that social media companies operate with the
due care our children deserve.
Again, AB 2 makes no other change to California law other than to introduce specific
financial liabilities for platforms whose products or designs are proven in court to result in
harm to minors, incentivizing those companies to prioritize the safety of their younger users.
In light of the compelling association between social media use and injuries to young users,
including effects on their mental well-being, we strongly urge your support for AB 2. Your
action on this bill will be a significant step toward protecting our children and teens from the
avoidable harms perpetuated through the negligence of social media companies.
ARGUMENTS IN OPPOSITION: The bill is opposed by a coalition of tech-industry groups and advocates
led by TechNet, and by the Electronic Frontier Foundation (EFF). EFF submits:
We respectfully oppose A.B. 2, authored by Assemblymember Lowenthal, which would
restrict all Californians’ access to online information. A.B. 2 would allow for plaintiffs suing
online information providers to collect statutory damages of up to $1 million dollars based on
the vaguest of claims that the service violated “its responsibility of ordinary care and skill to
a child.” To be sure, children can be harmed online. A.B. 2, however, takes a deeply flawed
and punitive approach to protecting children that will disproportionately harm everyone’s
ability to speak and to access information online.
A.B. 2 picks up where Assemblymember Lowenthal’s A.B. 3172 left off. Where A.B. 3172
set forth breaches of “ordinary care” that are “knowingly and willfully” made, A.B. 2 returns
to the lower negligence standard in existing section 1714(a) and simply refers to a social
media company breaching its “responsibility of ordinary care and skill to a child.” The
negligence standard is constitutionally deficient under the First Amendment, and what
constitutes a social media company’s duty of care to a minor remains vague.
The heavy statutory damages imposed by A.B. 2 will result in broad censorship via scores of
lawsuits that may claim any given content online is harmful to any child.
California should not enact a law that would be more harmful to children and will not be
enforceable in any event. Further, should it become law, it will also be ineffective because
federal law preempts Californian’s ability to hold online services civilly liable for harm
caused by user-generated content.
[…]
The platforms that do not block or moderate spaces where certain topics are discussed will
likely instead attempt to age-verify users, in order to shield minors from allegedly harmful
conversations, and to serve as a defense in lawsuits. As EFF has explained in other contexts,
mandatory online age verification is itself a bad idea.
As age verification requirements spread, Californians will be required to hand over much
more private data simply to access online information. Mandatory online age verification
invariably harms adults’ rights to speak anonymously or to access lawful speech online.
Further, age verification that relies on government-issued identification harms the tens of
millions of Americans, already vulnerable and often low-income, who do not have an ID.
Age verification induced by A.B. 2 could cause these Californians to lose access to basic
online information and services, such as the ability to seek housing and employment.
Californians should be concerned about the various ways children are harmed online, and
should be exploring ways to prevent those harms. This includes enacting legislation that
protects everyone’s privacy online, including children. Those proposals have the benefit of
reducing many online harms and being constitutional. A.B. 2 unfortunately will not reduce
online harms to children and will likely be struck down as unconstitutional. For all these
reasons, we must oppose A.B. 2 and respectfully urge the committee’s “no” vote.
REGISTERED SUPPORT / OPPOSITION:
Support
California Charter Schools Association
California Initiative for Technology & Democracy, a Project of California Common Cause
California School Boards Association
Children’s Advocacy Institute
Common Sense
Consumer Federation of California
Jewish Family and Children's Services of San Francisco, the Peninsula, Marin and Sonoma
Counties
Los Angeles County Office of Education
Organization for Social Media Safety
Opposition
California Chamber of Commerce
Civil Justice Association of California (CJAC)
Computer & Communications Industry Association
Electronic Frontier Foundation
TechNet-Technology Network
Analysis Prepared by: Manuela Boucher-de la Cadena / JUD. / (916) 319-2334
ASSEMBLY THIRD READING
AB 2 (Lowenthal and Patterson)
As Amended April 3, 2025
Majority vote
SUMMARY
Augments the liability large social media platforms may face if they violate existing law by
causing an injury to a minor through failure to exercise ordinary care.
Major Provisions
1) Provides that a social media platform that, with respect to a minor, violates the existing
statute governing liability for negligent harm caused to others and causes injury to the minor,
is, in addition to any other liability owed, liable for statutory damages for the larger of the
following:
a) $5,000 per violation up to a maximum, per minor, of $1 million.
b) Three times the amount of the minor's actual damages.
2) Provides that a waiver of the bill's provisions is void and unenforceable as contrary to public
policy.
3) Applies only to social media platforms that generate more than $100 million per year in gross
revenues.
4) Provides that all other duties, remedies, and obligations imposed under other provisions of
law continue to apply.
COMMENTS
Social media harms to children. From 2010 to 2019, "rates of depression and anxiety—fairly
stable during the 2000s—rose by more than 50% in many studies" and "[t]he suicide rate rose
48% for adolescents ages 10 to 19." This trend tracks "the years when adolescents in rich
countries traded their flip phones for smartphones and moved much more of their social lives
online—particularly onto social-media platforms designed for virality and addiction."1
According to the Surgeon General:
[T]he current body of evidence indicates that while social media may have benefits for some
children and adolescents, there are ample indicators that social media can also have a
profound risk of harm to the mental health and well-being of children and adolescents. At
1 Haidt, End the Phone-Based Childhood Now (March 13, 2024) The Atlantic,
https://www.theatlantic.com/technology/archive/2024/03/teen-childhood-smartphone-use-mental-health-
effects/677722/.
this time, we do not yet have enough evidence to determine if social media is sufficiently
safe for children and adolescents.2
Social media companies have known for some time that social media use can be harmful to
young users, and despite that knowledge, have continued to use algorithms and other design
features to capture and hold their attention. Whistleblower Frances Haugen, for instance,
revealed in 2021 that Facebook was well aware of the apparent connection between the teen
mental health crisis and social media – including the severe harm to body image visited
disproportionately on teen girls as a result of social comparison on these platforms – but
nonetheless sought to recruit more children and expose them to addictive features that would
lead to harmful content.3 Such revelations underscore the culpability of some social media
companies in propagating features detrimental to the wellbeing of youth through intentional
design choices that maximize engagement with profit-motivated online services.
Enhanced liability for negligence. State law provides that everyone, including individuals,
businesses, and other entities, has a duty of "ordinary care and skill" in the "management" of
their "property or person" – the long-established standard for negligence. This bill provides that a
social media platform that violates this duty and harms a minor is additionally liable for either
$5,000 per violation, with a per-minor maximum of $1,000,000, or three times the amount of the
minor's actual damages.
Under existing law, social media platforms, like other entities, owe everyone a duty of care. The
breach of this duty in a manner that causes harm can give rise to negligence lawsuits. This bill
does not change those underlying principles. It simply increases the amount of damages that may
be recovered if the injured party in such cases is a minor.
Constitutional considerations. Under existing law, some cases against social media platforms do
not make it past the hurdles posed by Section 230 of the federal Communications Decency Act
of 1996 and the First Amendment to the United States Constitution.
Section 230 states, "No provider or user of an interactive computer service shall be treated as the
publisher or speaker of any information provided by another information content provider."4
Case law suggests that social media platforms continue to have a duty of care to users and that
negligence claims arising from a platform's independent conduct, rather than its status as
a publisher of third-party content, are compatible with Section 230.
The First Amendment to the United States Constitution, among other things, prohibits states
from abridging freedom of speech. Because the bill does not regulate speech, it likely does not
facially violate the First Amendment. Nevertheless, as with negligence claims under existing law,
there may be some cases in which the application of the bill to a particular situation unduly
infringes on speech.
2 Surgeon General's Advisory, Social Media and Youth Mental Health (2023), p. 4,
https://www.hhs.gov/surgeongeneral/reports-and-publications/youth-mental-health/social-media/index.html.
3 Facebook Whistleblower Frances Haugen Testifies on Children & Social Media Use: Full Senate Hearing
Transcript (Oct. 5, 2021), https://www.rev.com/blog/transcripts/facebook-whistleblower-frances-haugen-testifies-
on-children-social-media-use-full-senate-hearing-transcript.
4 47 U.S.C. Section 230(c)(1).
See the policy committee analyses for more discussion of these issues.
According to the Author
AB 2 amends Section 1714 of the Civil Code by adding statutory damages against platforms
that are found in court to be liable under current law for negligently causing harm to children
under the age of 18. Under the bill, if a company is proven to have failed to exercise its
already established duty of operating with ordinary care, the company becomes financially
liable for a set amount of $5,000 per violation, up to a maximum penalty of $1 million per
child, or three times the amount of the child's actual damages, whichever is applicable. This
financial liability aims to incentivize platforms who count their profits in the tens of billions
to proactively safeguard children against potential harm by changing how they operate their
platforms.
Arguments in Support
The Los Angeles County Office of Education, a co-sponsor of the bill, writes:
Social media platforms must be held accountable for the harm they cause, particularly to
minors who are uniquely vulnerable to the harmful effects of online engagement. Research
has repeatedly shown the detrimental impact of social media on young people's mental
health, contributing to a range of issues, including increased instances of cyberbullying,
mental health crises, and even acts of violence.
While social media platforms prioritize user engagement and growth, they often fail to
adequately consider the safety and wellness of younger users. Given this imbalance, it is
crucial that social media companies are required to uphold a standard of ordinary care in their
management of content and interactions involving minors. This bill would establish much-
needed accountability by holding social media platforms liable for civil penalties if they fail
to exercise the necessary care to protect children on their platforms.
Arguments in Opposition
A coalition of industry opponents jointly write:
First, companies could adjust their policies and terms of service to exclude all users under the
age of 18. This would be a tremendous and detrimental blow to teens' ability to access
information and the open internet. As discussed below, this violates First Amendment
principles and protections for teens. However, even if a platform stated in its terms of service
that teens under 18 were not allowed on the platform and took steps to prevent their access,
that may not be enough to avoid liability for a teen who accesses the site anyway and has a
negative outcome.
Second, companies could also adjust their terms of service so that users under the age of 18
have a heavily sanitized version of the platform. This could include limiting which users
teens can interact with (e.g. only users approved by parents), which features they have access
to (no messaging or public posting), and even what content they can interact with or view (no
political, news, or other "potentially harmful" content). This might reduce but would not
prevent every instance of harm to teens given the nebulousness and subjectivity that is
inherent in defining "harm".
This bill's implicit concern is harmful content. It is impossible for companies to identify and
remove every potentially harmful piece of content because there's no clear consensus on what
exactly constitutes harmful content, apart from clearly illicit content. Determining what is
harmful is highly subjective and varies from person to person, making it impossible to make
such judgments on behalf of millions of users. Faced with this impossible task and the
liability imposed by this bill, some platforms may decide to aggressively over restrict content
that could be considered harmful for teens. For instance, content promoting healthy eating
could be restricted due to concerns it could lead to body image issues. Similarly, content
about the climate crisis or foreign conflicts would need to be restricted as it could lead to
depression, anxiety, and self-harm. Additionally, beneficial information like anti-drug or
smoking cessation programs, mental health support, and gender identity resources could get
overregulated because of the impossibility of deciding what is harmful to every user.
Furthermore, platforms would need to evaluate whether to eliminate fundamental features
and functions of their platform, features that are the reason teens and users go to their
platforms, due to the legal risk involved. For instance, since direct messaging features could
potentially be misused for contacting and bullying other teens, such features would likely be
removed.
Teens' use of these platforms would be overly policed and sanitized to such a degree that
they would surely leave our sites in favor of others that don't meet AB 2's $100 million
revenue threshold. Collectively, our organizations represent platforms that take their
responsibility to their users incredibly seriously and have devoted millions of dollars to
increasing the safety and enjoyment of their platforms. Teens will seek out the ability to
interact online, whether it is on our platforms or on others, including ones that don't prioritize
their safety and well-being.
FISCAL COMMENTS
As currently in print, this bill is keyed nonfiscal.
VOTES
ASM PRIVACY AND CONSUMER PROTECTION: 9-0-6
YES: Dixon, Berman, Bryan, Lowenthal, McKinnor, Ortega, Patterson, Pellerin, Ward
ABS, ABST OR NV: Bauer-Kahan, DeMaio, Irwin, Macedo, Petrie-Norris, Wilson
ASM JUDICIARY: 9-0-3
YES: Kalra, Wicks, Bryan, Connolly, Harabedian, Pacheco, Papan, Stefani, Zbur
ABS, ABST OR NV: Dixon, Sanchez, Tangipa
UPDATED
VERSION: April 3, 2025
CONSULTANT: Josh Tosney / P. & C.P. / (916) 319-2200 FN: 0000226