General (9)
Theresa Bass
From:Craig A Durfey <cadurfey@gmail.com>
Sent:Sunday, March 5,
To:Craig A Durfey; Public Comment; Theresa Bass; farrahkhan@cityofirvine.org;
cityclerk@santa-ana.org; citycouncil@cityoflapalma.org;
cityclerk@newportbeachca.gov; cityhall@buenapark.com; City Manager;
COB_Response; clerk@cityofirvine.org; Whill@Cityofirvine.org;
outreach@cityofirvine.org; jaguirre@cityofirvine.org; tpetropulos@cityofirvine.org;
TGOODBRAND@CITYOFIRVINE.ORG
Subject: [EXTERNAL] Fwd: Rep. Kim Introduces Bipartisan Bill to Support Community-Based
Youth and Young Adult Suicide Prevention Efforts.
Attachments:Bill Text - AB-638 Mental Health Services Act_ early intervention and prevention
programs_.pdf; 2023-02-14 - Testimony - Bride (1).pdf; 2023-02-14 - Testimony -
Golin1.pdf; 2023-02-14 - Testimony - Pizzuro.pdf; 2023-02-14 - Testimony -
Lembke.pdf; 2023-02-14 - Testimony - Prinstein1.pdf; H.R.7255 - Garrett Lee Smith
Memorial Reauthorization Act.pdf
Warning: This email originated from outside the City of Anaheim. Do not click links or open attachments unless you
recognize the sender and are expecting the message.
---------- Forwarded message ---------
From: <durfeycraig778@gmail.
Date: Sun, Mar 5, 2023 at 5:42 PM
Subject: Rep. Kim Introduces Bipartisan Bill to Support Community-Based Youth and Young Adult Suicide Prevention
Efforts.
To: <kim.vandermeulen@mail.house.gov>, Nick Dibs < , <senator.umberg@senate.ca.gov>,
<ADAM.BOMAN@asm.ca.gov>, <cindyt@ggcity.org>, stevej <stevej@garden-grove.org>,
<stephanie.jordan@sen.ca.gov>, <joneill@garden-grove.org>, Pam Haddad <pamha@ci.garden-grove.ca.us>, George
Brietigam <georgeb@ggcity.org>, Kim Nguyen <KIMN@ggcity.org>, <joed@ggcity.org>, Public Records Request
<cityclerk@ggcity.org>, Teresa Pomeroy <teresap@ggcity.org>, Lisa Kim <lisak@ci.garden-grove.ca.us>, Gabriela Mafi
<gmafi@ggusd.us>, Walter Muneton <walter.muneton@ggusd.us>, Teri Rocco <teri.rocco@ggusd.us>, Lan Nguyen
<lan.nguyen@ggusd.us>, Bob Harden <bob.harden@ggusd.us>, <Ddbarnes@ocsd.org>, <fganzales@ocsheriff.gov>,
<amire@ggpd.org>, <mariom@ggpd.org>, <jonanhan@ggpd.org>, Dina Nguyen <dina.nguyen@ggusd.us>
Cc: <cadurfey@gmail.
The sender has shared a OneDrive file with you. To view it, click the link below.
Rep Kim Introduces Bipartisan Bill to Support Community-Based Youth and Young Adult Suicide Prevention
Efforts.docx
03-05-2023
(P.R.D.D.C.)
PARENTS FOR THE RIGHTS OF DEVELOPMENTALLY DISABLED CHILDREN
CRAIG A. DURFEY FOUNDER OF P.R.D.D.C.
CELL
SOCIALEMOTIONALPAWS.COM
FACEBOOK: CRAIG DURFEY
U.S. HOUSE OF CONGRESS H2404 - HONORING CRAIG DURFEY FOR HIS FIGHT AGAINST AUTISM
... Ms. LORETTA SANCHEZ of California.
https://www.govinfo.gov/content/pkg/CREC-2003-03-27/pdf/CREC-2003-03-27.pdf
new website socialemotionalpaws.org
Orange County Sheriff-Coroner Don Barnes
550 N Flower Street, Santa Ana, CA 92703
714-647-1800
Assemblyman Tri Ta
State Capitol
P.O. Box 942849
Sacramento, CA 94249-0070
Phone - 916-319-2070
Senator Janet Nguyen
1021 O Street, Suite 7610
Sacramento, CA 95814
Phone: (916) 651-4036
Chief Amir El-Farra
11301 Acacia Pkwy, Garden Grove, CA 92840
(714) 741-5704
Mayor Steve Jones
City of Garden Grove
11222 Acacia Parkway
Garden Grove, CA 92840
City Council
CA State Senate
CA State Assembly
To whom it may concern
Congresswoman Michelle Steel
1127 Longworth House Office Building
Washington, DC 20515
Phone: (202) 225-2415
Dear Congresswoman Michelle Steel:
Today there is a growing need for educational awareness at all levels, among parents, teachers,
school board members, and first responders, regarding social media addiction and accountability.
H.R. 7255, the Garrett Lee Smith Memorial Reauthorization Act, ought to include children for
early intervention; we know that too much screen time has medical effects, including brain
injury, and, as noted in CA SCR 73 (Blue Light, 2019), causes eye myopia, sleep deprivation, and
mental health issues.
On Thursday, March 3, 2022, Supervisor Foley hosted a public hearing describing the
fentanyl crisis in Orange County alongside Chairman Doug ...: Supervisor Foley's Fentanyl
Hearing with Local Law Enforcement, Health Experts, & Policymakers,
https://www.youtube.com/watch?v=FVABSrVJE9I . One of the key points was
awareness education for fifth- and sixth-grade prevention.
# 1. U.S. SENATE HEARING, COUNTERING ILLICIT FENTANYL TRAFFICKING:
https://socialemotionalpaws.com/blog-post-1/f/us-senate-hearing-ecountering-illicit-
fentanyl-trafficking
# 2. Kids’ mental health, safety in the spotlight as social media exec:
https://socialemotionalpaws.com/blog-post-1/f/kids%E2%80%99-mental-health-safety-in-
the-spotlight-as-social-media-exec
# 3. 10 things to know about how social media affects teens' brains:
https://socialemotionalpaws.com/blog-post-1/f/10-things-to-know-about-how-social-media-
affects-teens-brains
# 4. U.S. Senate Hearing Protecting Our Children Online:
https://socialemotionalpaws.com/blog-post-1/f/protecting-our-children-online
# 5. Mitch J. Prinstein, PhD, ABPP, Chief Science Officer, American Psychological
Association, Washington, D.C.
Page 1-3
Today, we are seeing the repercussions of our underinvestment and lack of focus on
children’s mental health. Depression rates for teens doubled between 2009 and 2019 and suicide
is the second leading cause of death for U.S. youth, up 4% since 2020, with one in five teens
considering suicide during the pandemic and eating disorder emergency room admissions for girls
12 to 17 years old doubling since 2019 1. Furthermore, since the start of the pandemic, over
167,000 children have lost a parent or caregiver to the virus 2. This kind of profound loss can have
significant impacts on the mental health of children, leading to anxiety, depression, trauma, and
stress-related conditions 3. Faced with such data, in December 2021, the U.S. Surgeon General
issued an advisory calling for a unified national response to the mental health challenges young
people are facing 4. The rarity of such advisories further underscores the need for action to help
stem the mental health crisis of children and adolescents.
There are many reasons why youth are experiencing this crisis today, and it is likely that
there are simultaneous contributors to the outcomes presented above. Today, we are here to talk
about whether youths’ engagement with social media, and other online platforms, may be a
relevant factor.
Many psychological scientists, including myself and my colleagues, have been
asking this same question for years. We seek to understand how this new context in which youths’
social interactions occur may be related to development, including potential benefits or risks that
may be conferred by the online environment. As the discipline with expertise on all of human
behavior, our work has been broad in scope; and to date, our focus has been on the adolescent
period, during which more complex and mature behaviors are developed through intricate and
precise interactions among neural, biological, social, contextual, and social systems. Today,
although this remains a relatively nascent body of research, I would like to share what we know
so far, so policymakers, educators, parents, caregivers, and youth can learn from what we are
beginning to discover and make choices that will ensure the safety of youth.
In this testimony, I outline emerging research with findings that have begun to suggest
possible benefits, as well as possible adverse effects, of technology and social media use on
adolescent development. I also present legislative and regulatory solutions that, if enacted, would
represent positive steps towards learning more about, and hopefully solving, this problem.
I am calling for new legislation and regulations that increase research funding and
provide education on how children can use online platforms without experiencing the most
harmful impacts; legislation that creates a requirement that social media companies protect the
well-being of child users; legislation that prohibits problematic business practices and prevents
companies from tricking and manipulating users; and bills that provide more leverage for federal
regulators to
Page 9-10
Risks for Addictive Social Media Use. Youths’ biological vulnerabilities also have
significant implications for “problematic social media use” or addictive behaviors; note that the
regions of the brain activated by social media use overlap considerably with the regions involved
in addictions to illegal and dangerous substances 14. As noted above, the developing brain is built
to increase a desire for social rewards (that social media delivers abundantly), without the ability
to show the capacities of inhibition and restraint capable among adults. This suggests that youth
may be at risk for extraordinarily frequent uses of social media. Several bodies of research reveal
that this indeed may be a very significant concern. For instance, data suggest that almost half of
all adolescents report that they use social media “almost constantly” 15. Research also has
compared social media use to diagnostic criteria for substance use dependencies, revealing that
many adolescents report an inability to stop using social media, even when they want to,
remarkable efforts to maintain access to social media, the use of social media to regulate their
emotions, a need for increasing social media use to achieve the same level of pleasure (i.e.,
tolerance symptoms), withdrawal symptoms following abstinence, and significant impairment in
their daily educational, social, and work routines. A recent study revealed that over 54% of 11- to
13-year-old youth reported at least one of these symptoms of problematic social media use 16. About
85% of youth report spending more time than intended online and 61% report failing when
trying to stop or reduce their use of social media 17.
Alterations in Brain Development. Youths’ biological vulnerability to technology and
social media, and their resulting frequent use of these platforms, also has the potential to alter
youths’ neural development since our brains develop in response to the environment we live in.
Recent studies have revealed that technology and social media use is associated with changes in
structural brain development (i.e., changing the size and physical characteristics of the brain). In
addition, research with my own colleagues at the University of North Carolina at Chapel Hill
recently has revealed that technology and social media use also is associated with changes in how
the brain works. Our data has revealed that youth indeed spend a remarkable amount of time
using their devices 18. Objective data measured by teens’ phones themselves indicated that the
average number of times that youth in sixth grade picked up their phones was over 100, with some
interrupting daily activities to pick up their phones over 400 times a day. On average, adolescents
https://socialemotionalpaws.com/blog-post-1/f/protecting-our-children-online
Hearings to examine protecting our children online.
118th Congress (2023-2024)
Committee: Senate Judiciary
Date: 02/14/2023 (11:00 AM EST)
Location: 216 Hart Senate Office Building, Washington, D.C.
Website: https://www.judiciary.senate.gov/
https://www.congress.gov/event/118th-congress/senate-event/333588
Full Committee Hearing
Date: Tuesday, February 14th, 2023
Time: 11:00am
Location: Hart Senate Office Building Room 216
Presiding: Chair Durbin
Status: Time Change
https://www.judiciary.senate.gov/committee-activity/hearings/protecting-our-children-online
Kristin Bride
Survivor Parent and Social Media Reform Advocate Portland, OR
Testimony of Kristin Bride
United States Senate Committee on the Judiciary
Hearing on Protecting Our Children Online
February 14, 2023
Thank you, Chairman Durbin, Ranking Member Graham, and members of the committee. My
name is Kristin Bride. I am a survivor parent and social media reform advocate, and member of
the bipartisan Council for Responsible Social Media.
I am testifying here today to bring a face to the harms occurring every day resulting from the
unchecked power of the social media industry. This is my son Carson Bride with beautiful blue
eyes, an amazing smile, and a great sense of humor, who will be forever 16 years old. As
involved parents raising our two sons in Oregon, we thought we were doing everything right.
We waited until Carson was in 8th grade to give him his first cell phone, an old phone with no apps.
We talked to our boys about online safety and the importance of never sending anything online
that you wouldn’t want your name and face next to on a billboard. Carson followed these
guidelines. Yet tragedy still struck our family.
It was June 2020; Carson had just gotten his first summer job making pizzas, and after a
successful first night of training, he wrote his upcoming work schedule on our kitchen calendar.
We expressed how proud we were of him for finding a job during the pandemic. In so many
ways, it was a wonderful night, and we were looking forward to summer. The next morning, I
woke to the complete shock and horror that Carson had hung himself in our garage while we
slept.
In the weeks that followed, we learned that Carson had been viciously cyberbullied by his
“Snapchat friends,” his high school classmates who were using the anonymous apps Yolo and
LMK on Snapchat to hide their identities. It wasn’t until Carson was a freshman in high school
that we finally allowed him to have social media because that was how all the students were
making new connections. What we didn’t know is that apps like Yolo and LMK were using
popular social media platforms to promote anonymous messaging to hundreds of millions of teen
users.
After his death, we discovered that Carson had received nearly 100 negative, harassing, sexually
explicit, and humiliating messages, including 40 in just one day. He asked his tormentors to
“swipe up” and identify themselves so they could talk things out in person. No one ever did. The
last search on his phone before Carson ended his life was for hacks to find out the identities of
his abusers.
Anonymous apps like Whisper, Sarahah, and YikYak have a long history of enabling
cyberbullying, leading to teen suicides.1 The critical flaws in these platforms are compounded by
the fact that teens do not typically report being cyberbullied. They are too fearful that their
phones to which they are completely addicted will be taken away or that they will be labeled a
snitch by their friends.
https://www.judiciary.senate.gov/imo/media/doc/2023-02-14%20-%20Testimony%20-%20Bride.pdf
# 6. Can Technology Encourage Mass Shootings? with Dr. Lisa Strohman:
https://socialemotionalpaws.com/blog-post-1/f/can-technology-encourage-mass-shootings-
with-dr-lisa-strohman
# 7. Violent Video Games On The Brain: What It Looks Like, with Dr. Li
https://socialemotionalpaws.com/blog-post-1/f/violent-video-games-on-the-brain-what-it-
looks-like-with-dr-li
#8. ‘Addictive as cocaine’: Parents sue Fortnite creators
https://socialemotionalpaws.com/blog-post-1/f/%E2%80%98addictive-as-
cocaine%E2%80%99-parents-sue-fortnite-creators
# 9. Daniel Amen |The most important lesson from 83,000 brain scans:
https://socialemotionalpaws.com/blog-post-1/f/daniel-amen-%7Cthe-most-important-
lesson-from-83000-brain-scans
# 10. HOAG HOSPITAL MENTAL ILLNESS SOCIAL MEDIA ADDICTIONS:
https://socialemotionalpaws.com/blog-post-1/f/hoag-hospital-mental-illness-social-media-
addictions
# 11. GAMING, SOCIAL MEDIA AND MENTAL WELLNESS PRESENTED BY SINA SAFAHIE:
https://socialemotionalpaws.com/blog-post-1/f/gaming-social-media-and-mental-wellness-
presented-bysina-safahie
# 10. Nearly half of all U.S. teens have been cyberbullied, Pew Research:
https://socialemotionalpaws.com/blog-post-1/f/nearly-half-of-all-us-teens-have-been-
cyberbullied-pew-researc
# 11. Tech Addiction: Digital Madness- How Social Media Is Driving Our:
https://socialemotionalpaws.com/blog-post-1/f/tech-addiction-digital-madness--how-social-
media-is-driving-our
# 12. Surgeon general warns 13 is too young for children to be on socia:
https://socialemotionalpaws.com/blog-post-1/f/surgeon-general-warns-13-is-too-young-for-
children-to-be-on-socia
and https://www.cnn.com/videos/business/2023/01/29/vivek-murthy-social-media-13-too-
young-brown-nr-sot-vpx-contd.cnn
# 13. Surgeon General Vivek Murthy warns 13 is far too young for ...
Jan 29, 2023 — Surgeon General Vivek Murthy warned children should be banned from
social media until they're between 16 and 18 to avoid a 'distorted' sense ...
https://www.dailymail.co.uk/news/article-11690563/Surgeon-General-Vivek-Murthy-warns-
13-far-young-children-sign-social-media-sites.html
# 14. Parents who say their kids are addicted to 'Fortnite' slam Epic G:
https://socialemotionalpaws.com/blog-post-1/f/parents-who-say-their-kids-are-
addicted-to-fortnite-slam-epic-g
# 15. Al Muratsuchi's Assembly Bill (AB) 272, Banning Smartphones in Sc:
"Growing evidence shows excessive smartphone use at school interferes with a
student’s education and success, encourages cyberbullying, and contributes to teenage
anxiety, depression, and suicide,” stated Assemblymember Muratsuchi. "This new law
will encourage school districts to develop their own policy that strikes a balance
between allowing appropriate student use of smartphones at school, while making sure
that smartphones are not interfering with a student's educational, social and emotional
development."
Evidence has shown that unrestricted use of smartphones by students at schools lowers
academic performance, particularly among low-achieving students; promotes
cyberbullying; and contributes to teenage mental health issues. Between 2009 and 2017,
the number of 14 to 17 year olds experiencing clinical level depression jumped more than
60%, with a 47% increase among 12 to 13 year olds. AB 272 will take effect in January
2020. https://socialemotionalpaws.com/blog-post-1/f/al-muratsuchis-assembly-bill-ab-272-
banning-smartphones-in-sc
# 16. Orange County YMCA Esports for children under 13 years old, which should never be permitted:
https://ymcaoc.org/esports/
# 17.Federal legislation provides guidance to States by identifying a minimum set of acts
or behaviors that define child abuse and neglect. The Federal Child Abuse Prevention and
Treatment Act (CAPTA) (42 U.S.C.A. § 5106g), as amended by the CAPTA Reauthorization
Act of 2010, defines child abuse and neglect as, at minimum:
"Any recent act or failure to act on the part of a parent or caretaker which results in
death, serious physical or emotional harm, sexual abuse or exploitation"; or
"An act or failure to act which presents an imminent risk of serious harm."
This definition of child abuse and neglect refers specifically to parents and other
caregivers. A "child" under this definition generally means a person who is younger than
age 18 or who is not an emancipated minor.
What is child abuse or neglect? What is the definition of child abuse and neglect? |
HHS.gov
# 18. World Health Organization Recommends Against Screen Time for Infa:
https://socialemotionalpaws.com/blog-post-1/f/world-health-organization-recommends-
against-screen-time-for-infa
# 19. What Does Too Much Screen Time Do to Children’s Brains?
https://socialemotionalpaws.com/blog-post-1/f/what-does-too-much-screen-time-do-to-
children%E2%80%99s-brains-1
# 20.MRIs show screen time linked to lower brain development in presch:
https://socialemotionalpaws.com/blog-post-1/f/mris-show-screen-time-linked-to-lower-
brain-development-in-presch-3
# 21. Press Release will Esports recognized as Child Abuse brain injury:
https://socialemotionalpaws.com/blog-post-1/f/press-release-will-esports-recognized-as-
child-abuse-brain-injury
# 22.The key was awareness education:
Rep. Kim Introduces Bipartisan Bill to Support Community-Based Youth and
Young Adult Suicide Prevention Efforts:
H.R.7255 - Garrett Lee Smith Memorial Reauthorization Act
117th Congress (2021-2022)
Sponsor: Rep. McMorris Rodgers, Cathy [R-WA-5] (Introduced 03/28/2022)
Committees: House - Energy and Commerce
Latest Action: House - 03/29/2022 Referred to the Subcommittee on Health. (All
Actions)
Tracker: This bill has the status Introduced.
https://www.congress.gov/bill/117th-congress/house-bill/7255/text
Shown Here:
Introduced in House (03/28/2022)
117th CONGRESS
2d Session
H. R. 7255
To amend title V of the Public Health Service Act to reauthorize the Garrett Lee Smith
Memorial Act, and for other purposes.
IN THE HOUSE OF REPRESENTATIVES
March 28, 2022
Mrs. Rodgers of Washington (for herself, Mrs. Trahan, Mrs. Axne, and Mrs. Kim of
California) introduced the following bill; which was referred to the Committee on Energy
and Commerce
A BILL
To amend title V of the Public Health Service Act to reauthorize the Garrett Lee Smith
Memorial Act, and for other purposes.
Be it enacted by the Senate and House of Representatives of the United States of
America in Congress assembled,
SECTION 1. SHORT TITLE.
This Act may be cited as the “Garrett Lee Smith Memorial Reauthorization Act”.
SEC. 2. SUICIDE PREVENTION RESOURCE CENTER.
Section 520C of the Public Health Service Act (42 U.S.C. 290bb–34) is amended—
(1) in subsection (a), by striking “tribes, tribal organizations” and inserting “Tribes, Tribal
organizations”;
(2) in subsection (b), by striking “tribal” each place it appears and inserting “Tribal”; and
(3) in subsection (c), by striking “$5,988,000 for each of fiscal years 2018 through 2022”
and inserting “$9,000,000 for each of fiscal years 2023 through 2027”.
SEC. 3. GARRETT LEE SMITH STATE AND TRIBAL YOUTH SUICIDE PREVENTION AND
EARLY INTERVENTION GRANT PROGRAM.
Section 520E of the Public Health Service Act (42 U.S.C. 290bb–36) is amended—
(1) in subsection (a), by striking “tribal” each place it appears and inserting “Tribal”;
(2) in subsection (b)(1)(C)—
(A) by striking “Indian tribe or tribal organization” and inserting “Indian Tribe or Tribal
organization”; and
(B) by striking “tribal youth” and inserting “Tribal youth”;
(3) in subsection (c), in the matter preceding paragraph (1), by striking “tribal” each
place it appears and inserting “Tribal”;
(4) in subsection (e)(3), by striking “tribal” and inserting “Tribal”; and
(5) in subsection (m), by striking “$30,000,000 for each of fiscal years 2018 through
2022” and inserting “$40,000,000 for each of fiscal years 2023 through 2027”.
SEC. 4. GARRETT LEE SMITH CAMPUS SUICIDE PREVENTION PROGRAM.
Section 520E–2(i) of the Public Health Service Act (42 U.S.C. 290bb–36b(i)) is amended by
striking “2018 through 2022” and inserting “2023 through 2027”.
SEC. 5. MENTAL AND BEHAVIORAL HEALTH OUTREACH AND EDUCATION.
Section 549(f) of the Public Health Service Act (42 U.S.C. 290ee–4(f)) is amended by
striking “2018 through 2022” and inserting “2023 through 2027”.
https://youngkim.house.gov/media/press-releases/rep-kim-introduces-bipartisan-bill-
support-community-based-youth-and-young
42 USC 290ee-4: Mental and behavioral health outreach and education on college campuses
From Title 42 - THE PUBLIC HEALTH AND WELFARE, CHAPTER 6A - PUBLIC HEALTH
SERVICE, SUBCHAPTER III-A - SUBSTANCE ABUSE AND MENTAL HEALTH SERVICES
ADMINISTRATION, Part D - Miscellaneous Provisions Relating to Substance Abuse and
Mental Health
§290ee–4. Mental and behavioral health outreach and education on
college campuses
(a) Purpose
It is the purpose of this section to increase access to, and reduce the stigma
associated with, mental health services to ensure that students at institutions of higher
education have the support necessary to successfully complete their studies.
(b) National public education campaign
The Secretary, acting through the Assistant Secretary and in collaboration with the
Director of the Centers for Disease Control and Prevention, shall convene an interagency,
public-private sector working group to plan, establish, and begin coordinating and
evaluating a targeted public education campaign that is designed to focus on mental and
behavioral health on the campuses of institutions of higher education. Such campaign
shall be designed to-
(1) improve the general understanding of mental health and mental disorders;
(2) encourage help-seeking behaviors relating to the promotion of mental health,
prevention of mental disorders, and treatment of such disorders;
(3) make the connection between mental and behavioral health and academic
success; and
(4) assist the general public in identifying the early warning signs and reducing the
stigma of mental illness.
(c) Composition
The working group convened under subsection (b) shall include-
(1) mental health consumers, including students and family members;
(2) representatives of institutions of higher education;
(3) representatives of national mental and behavioral health associations and
associations of institutions of higher education;
(4) representatives of health promotion and prevention organizations at institutions
of higher education;
(5) representatives of mental health providers, including community mental health
centers; and
(6) representatives of private-sector and public-sector groups with experience in the
development of effective public health education campaigns.
(d) Plan
The working group under subsection (b) shall develop a plan that-
(1) targets promotional and educational efforts to the age population of students at
institutions of higher education and individuals who are employed in settings of
institutions of higher education, including through the use of roundtables;
(2) develops and proposes the implementation of research-based public health
messages and activities;
(3) provides support for local efforts to reduce stigma by using the National Health
Information Center as a primary point of contact for information, publications, and
service program referrals; and
(4) develops and proposes the implementation of a social marketing campaign that is
targeted at the population of students attending institutions of higher education and
individuals who are employed in settings of institutions of higher education.
(e) Definition
In this section, the term "institution of higher education" has the meaning given such
term in section 1001 of title 20.
(f) Authorization of appropriations
To carry out this section, there are authorized to be appropriated $1,000,000 for the
period of fiscal years 2018 through 2022.
(July 1, 1944, ch. 373, title V, §549, as added Pub. L. 114–255, div. B, title IX, §9033, Dec.
13, 2016, 130 Stat. 1261.)
CA State law AB-638 Mental Health Services Act: early intervention and prevention
programs. (2021-2022)
(e) Prevention and early intervention funds may be used to broaden the provision of
community-based mental health services by adding prevention and early intervention
services or activities to these services, including prevention and early intervention
strategies that address mental health needs, substance misuse or substance use
disorders, or needs relating to co-occurring mental health and substance use services.
(f) In consultation with mental health stakeholders, and consistent with regulations from
the Mental Health Services Oversight and Accountability Commission, pursuant to
Section 5846, the department shall revise the program elements in Section 5840
applicable to all county mental health programs in future years to reflect what is learned
about the most effective prevention and intervention programs for children, adults, and
seniors.
I request a letter of support. Currently, far too many silos have impeded effective progress,
and our nation’s crisis of children’s welfare is at stake. I request that U.S. Congress H.R. 7255,
the Garrett Lee Smith Memorial Reauthorization Act, be brought back with additional
language to include social media, as outlined in the reference links above, along with fentanyl
crisis awareness campaign funding and law enforcement investments. I request language
requiring all schools, from elementary to high school, as a condition of federal funds, to address
the lack of training with social media and to be held accountable for child abuse such as esports
in schools, and prohibiting by law its promotion at the YMCA for children under 13 years old.
I also request the creation of federal and California State bipartisan caucuses to address the
many silo impediments, hold hearings, and raise awareness. The U.S. Senate Judiciary reports
that 150 organizations support social media product safety; please add one more to the list:
P.R.D.D.C., Craig A. Durfey.
Thank You
Craig A Durfey
Testimony of Kristin Bride
United States Senate Committee on the Judiciary
Hearing on Protecting Our Children Online
February 14, 2023
Thank you, Chairman Durbin, Ranking Member Graham, and members of the committee. My
name is Kristin Bride. I am a survivor parent and social media reform advocate, and member of
the bipartisan Council for Responsible Social Media.
I am testifying here today to bring a face to the harms occurring every day resulting from the
unchecked power of the social media industry. This is my son Carson Bride with beautiful blue
eyes, an amazing smile, and a great sense of humor, who will be forever 16 years old. As
involved parents raising our two sons in Oregon, we thought we were doing everything right. We
waited until Carson was in 8th grade to give him his first cell phone, an old phone with no apps.
We talked to our boys about online safety and the importance of never sending anything online
that you wouldn’t want your name and face next to on a billboard. Carson followed these
guidelines. Yet tragedy still struck our family.
It was June 2020; Carson had just gotten his first summer job making pizzas, and after a
successful first night of training, he wrote his upcoming work schedule on our kitchen calendar.
We expressed how proud we were of him for finding a job during the pandemic. In so many
ways, it was a wonderful night, and we were looking forward to summer. The next morning, I
woke to the complete shock and horror that Carson had hung himself in our garage while we
slept.
In the weeks that followed, we learned that Carson had been viciously cyberbullied by his
“Snapchat friends,” his high school classmates who were using the anonymous apps Yolo and
LMK on Snapchat to hide their identities. It wasn’t until Carson was a freshman in high school
that we finally allowed him to have social media because that was how all the students were
making new connections. What we didn’t know is that apps like Yolo and LMK were using
popular social media platforms to promote anonymous messaging to hundreds of millions of teen
users.
After his death, we discovered that Carson had received nearly 100 negative, harassing, sexually
explicit, and humiliating messages, including 40 in just one day. He asked his tormentors to
“swipe up” and identify themselves so they could talk things out in person. No one ever did. The
last search on his phone before Carson ended his life was for hacks to find out the identities of
his abusers.
Anonymous apps like Whisper, Sarahah, and YikYak have a long history of enabling
cyberbullying, leading to teen suicides.1 The critical flaws in these platforms are compounded by
the fact that teens do not typically report being cyberbullied. They are too fearful that their
phones to which they are completely addicted will be taken away or that they will be labeled a
snitch by their friends.
Yolo’s own policies stated that they would monitor for cyberbullying and reveal the identities of
those who do so. I reached out to Yolo on 4 separate occasions in the months following Carson’s
death, letting them know what happened to my son and asking them to follow their own policies.
I was ignored all 4 times. At this point, I decided to fight back.
I filed a National Class Action Lawsuit in May 2021, against Snap Inc., Yolo, and LMK.2 We
believe Snap Inc. suspended Yolo and LMK from their platform because of our advocacy.
However, our complaint against Yolo and LMK for product liability design defects and
fraudulent product misrepresentation was dismissed in the Central District Court of California
last month, citing Section 230 immunity.3 And still, new anonymous apps like NGL and sendit
are appearing on social media platforms and charging teens subscription fees to reveal the
messenger or provide useless hints.
I speak before you today with the tremendous responsibility to represent the many other parents
who have also lost their children to social media harms. In the audience are Rose Bronstein from
Illinois who lost her son Nate and Christine McComas from Maryland who lost her daughter
Grace, both to suicide after being viciously cyberbullied over social media. Our numbers
continue to grow exponentially with teen deaths from dangerous online challenges fed to them
on TikTok, sextortion over Facebook, fentanyl-laced drugs purchased over Snapchat, and deaths
from eating disorder content over Instagram. I have included the stories of my fellow survivor
parent advocates in this written testimony.
Let us be clear—these are not coincidences, accidents, or unforeseen consequences. They are the
direct result of products designed to hook and monetize America’s children.
It should not take grieving parents filing lawsuits on behalf of their dead children to hold this
industry accountable for their dangerous and addictive product designs. Federal legislation like
the Kids Online Safety Act (KOSA), which requires social media companies to have a duty of
care when designing their products for America’s children, is long overdue. We need our
lawmakers to step up, put politics aside, and finally protect all children online.
Thank you for this opportunity, and I look forward to answering your questions.
Cyberbullying Frequency (2022, Pew Research Center)4
US Teens aged 13-17 reported:
• 46% experienced cyberbullying, with offensive name calling being the most common
type of harassment
• 22% had false rumors spread about them
• 17% received explicit images they didn't ask for
• 15% report being constantly asked where they are, what they are doing, or who they are
with by someone other than a parent
• 10% reported receiving physical threats
• 7% reported having explicit images of them shared without their consent
Cyberbullying Impact (2018, Cyberbullying Research Center)5
Cyberbullying is more devastating than traditional bullying because:
• The victim may not know who is bullying them due to anonymity.
• Hurtful actions go viral, which expands the audience and the number of aggressors without limit.
• It is easier to be cruel on-line as no social cues exist.
Cyberbullying and Suicidal Ideations (2022, JAMA Network Open Study)6
• Cyberbullying was the #1 cause of suicidal ideations in adolescents aged 10-13 years old
based on a study of 10,414 United States adolescents.
Cyberbullying Reporting:
Reasons teens don’t report cyberbullying (2021)7:
• Fear of losing their access to their technology:
o The Pew Research Center reports that 65% of parents have taken away a teen’s
phone or internet privileges as punishment.8
• They don’t want to be seen as a snitch and lose even more social status.
• They are ashamed of being a target.
Parent Concerns (2023, Pew Research)9
• 35% of parents are worried that their kids may be bullied (2nd to Anxiety and Depression)
Citations:
1 Iain Martin, Hugely Popular NGL App Offers Teenagers Anonymity In Comments About Each Other
(June 29, 2022), FORBES at https://www.forbes.com/sites/iainmartin/2022/06/29/hugely-popular-ngl-app-offers-teenagers-anonymity-in-comments-about-each-other/
2 Bride et al. v. Snap Inc., Yolo Technologies Inc., Lightspace Inc., No. 21-cv-6680 (Central District of
California), ECF No. 1 (Class Action Complaint)
3 Bride et al. v. Snap Inc., Yolo Technologies Inc., Lightspace Inc., No. 21-cv-6680 (Central District of
California), ECF No. 142 (Order Dismissing Complaint)
4 Vogels, E. (2022, Dec 15), Teens and Cyberbullying 2022, Pew Research Center,
https://www.pewresearch.org/internet/2022/12/15/teens-and-cyberbullying-2022/
5 Hinduja, Sameer, PhD, and Patchin, Justin W., PhD, Cyberbullying Identification, Prevention and Response
(2018), at https://cyberbullying.org/Cyberbullying-Identification-Prevention-Response-2018.pdf
6 Arnon S, Brunstein Klomek A, Visoki E, et al. (2022), Association of Cyberbullying Experiences and
Perpetration With Suicidality in Early Adolescence, JAMA Network Open, 2022;5(6):e2218746,
doi:10.1001/jamanetworkopen.2022.18746
7 Dong, Menga (2021, Feb 9), Why Teens Don’t Report Cyberbullying, at
https://desis.osu.edu/seniorthesis/index.php/2021/02/09/why-teens-dont-report-cyberbullying/
8 Pew Research Center (2016, Jan 7), Parents, Teens & Digital Monitoring, at
https://www.pewresearch.org/internet/2016/01/07/parents-teens-and-digital-monitoring/
9 Pew Research Center statistics, via https://www.axios.com/2023/01/29/kids-parents-mental-health-depression-anxiety
Social Media Harms Parent Survivor Advocates
Tawainna Anderson, Pennsylvania
Tawainna lost her 10-year-old daughter, Nylah, last year when she tried the “Blackout
Challenge” in a closet of their home. TikTok’s algorithm served Nylah a video featuring the
dangerous challenge on her "For You" page. Tawainna discovered her daughter’s body next to
her phone, and the strangulation marks on her neck suggested she desperately tried to free herself
before she died.
Joann Bogard, Indiana
Joann’s son Mason died at age 15 years old after he participated in a challenge he’d seen on
YouTube called “the Choking Game.” He was rushed to the hospital, but his parents had to make
the heart wrenching decision to take him off life support and donate his organs. In the years
since, Joann has reported hundreds of choking game videos to YouTube, TikTok, and other
platforms but they have universally told her the videos don’t violate their guidelines, despite
killing hundreds of children like Mason, because they have a commercial interest in maximizing
content on their platforms.
Kristin Bride, Oregon
Kristin’s son, Carson was 16 years old when he died by suicide after he was viciously
cyberbullied by his high school “Snapchat Friends” who were using the anonymous apps Yolo
and LMK to hide their identities. Carson received over 100 humiliating, threatening and
sexually explicit messages before he ended his life. The last search on his phone was for hacks
to find out who was abusing him. When Kristin repeatedly contacted Yolo asking them to
follow their own stated policies for monitoring and revealing the identities of those who
cyberbully on their platform, she was ignored all 4 times.
Rose Bronstein, Illinois
Rose’s son Nate was 15 years old when he died by suicide after he was viciously cyberbullied by
over 20 high school classmates. Nate received hateful and threatening messages via iMessage. A
Snapchat message was created by a classmate and reposted 7 times by others. It included threats
of physical harm and death. The Snapchat quickly went viral and reached hundreds of Chicago
area students. Nate also received a separate Snapchat message that read “go kill yourself”.
LaQuanta Hernandez, Texas
LaQuanta’s 13-year-old daughter, Jazmine, was cyberbullied for months via TikTok and
Instagram on the basis of her race. The bullies sent her racist comments and photos, including
photoshopping her face onto Emmett Till’s body after being lynched by the KKK. Instagram
took over three days to take down the posts. Jazmine was too scared to sleep in her own bed until
the posts were taken down.
Tracy Kemp, Texas
Tracy’s 14-year-old son Brayden was among a group of Black students who were targeted by
racist cyberbullies on Instagram and Snapchat. The accounts used the school’s name and logo
and called on other students to take and submit pictures of Black students without their consent.
She says the racist cyberbullying has drastically affected her son’s mental health. The anonymity
these platforms provide encourages this type of abusive and bullying behavior.
Rosemarie Maneri, New York
Shylynn was 16 years old when she was contacted by an adult via Facebook who coerced her
into sending inappropriate photos of herself. Although she tried to block him, he reached out to
her best friend and her best friend’s mom to get back into her life. He then threatened to release
her photos to her friends and family if she did not continue to send him photos and continue the
relationship with him. Embarrassed, scared and not knowing what to do to make it all go away,
Shylynn died from suicide at just 18 years old.
Christine McComas, Maryland
Christine’s 14-year-old daughter Grace went from being a joyful, active teen to death by suicide
in less than a year after malicious, death-wishing and dehumanizing cyber-abuse on Twitter.
Christine screenshot the abuse and sought help from multiple public agencies including schools,
police and the court system to no avail. The screenshot proof of social media abuse led to the
unanimous passage of Maryland's criminal statute named Grace's Law less than a year after her
death. An update to Grace's Law (2.0) was passed in 2019 to keep up with digital dangers.
Annie McGrath, Wisconsin
Annie’s son Griffin died at 13 years old after he participated in an online challenge called “the
Choking Game.” Griffin had a YouTube channel and was trying to get more likes and comments
on his videos, which may have tempted him to participate in the dangerous challenge.
Maurine Molak, Texas
David Molak died by suicide at the age of 16 after months of devastating and relentless
cyberbullying by a group of students on Instagram, text, video, and GroupMe. Bullies threatened
him and told him he should never go back to school. The cyberbullying left him feeling helpless
and hopeless because neither he nor his parents could make it stop.
Amy Neville, Arizona
At 14, Amy’s son, Alexander Neville, had his whole life ahead of him until he took a single pill
that he was led to believe was oxycodone. However, it contained deadly fentanyl. Snapchat made
it easy for a drug dealer to connect with him. Unfortunately, Alexander’s case is not a one-off
situation. This happens every day all across our country.
Erin Popolo, New Jersey
Erin's daughter, Emily Murillo, was a special education student who was bullied in person for
most of her school career. During the pandemic shutdown, the bullies continued to reach out to
her via Snapchat and Instagram. At 17 years old Emily lost hope that she would ever be viewed
as ‘normal’ by her peers and died by suicide in January of 2021. The bullying continued as
hackers hijacked Emily’s Zoomed funeral, sending cruel messages, and posting inappropriate
images on the Zoom for all of Emily’s mourning family to see, until they finally had to stop the
funeral.
Despina Prodromidis, New York
Despina’s daughter Olivia died at 15 years old after meeting an adult stranger over Snapchat – a
common problem across platforms who introduce kids to adult strangers to keep them engaged
and online. This man gave her a drug which turned out to be pure fentanyl.
Neveen Radwan, California
Neveen’s 15-year-old daughter, Mariam, was an avid user of several social media platforms at
the time of her anorexia diagnosis. These apps constantly bombarded her with “pro-ano” (pro-
anorexia) content. The algorithms targeted her with “skinny challenges” and manipulated content
that triggered her illness to an all-time high. She then embarked on a life-threatening journey of
over 2 years, in multiple hospitals, and almost dying multiple times.
Mary Rodee, New York
Mary’s son, Riley, died by suicide at 15 years old. He was sextorted on Facebook by an adult
who pretended to be a teenage girl and then threatened to release compromising images of Riley
unless he gave them thousands of dollars. Riley, ashamed and frightened, died just six hours after
the contact began. Facebook never responded when Mary and Riley’s father reported the
incident.
Judy Rogg, California
Judy’s son, Erik Robinson, died at 12 years old after participating in the “choking challenge” that
was and continues to be widely circulated on YouTube. Erik rarely used YouTube – he heard
about the challenge from a friend who did, a sadly common pattern that shows even children
whose parents don’t allow them access to social media are vulnerable to its harms. Investigators
determined that Erik died from this just the day after he learned about it. He had no idea that this
could cause harm or death.
Deb Schmill, Massachusetts
Deb’s daughter, Becca, died at 18 years old of fentanyl poisoning from drugs she and a friend
purchased from a dealer they found on Snapchat. Becca was sexually assaulted at 15 by a boy
she’d met on social media and, shortly after the assault, her peers started cyberbullying her by
text and over Snapchat. Becca turned to drugs to help ease the pain and was unaware the drugs
she bought over Snapchat – a massive, nearly untraceable drug market thanks to the platform’s
design – contained fentanyl.
Written Testimony of Josh Golin
Executive Director, Fairplay
Before the Senate Judiciary Committee
Hearing on “Protecting our Children Online”
February 14, 2023
My name is Josh Golin and I am Executive Director of Fairplay.
I would like to thank Chairman Durbin, Ranking Member Graham, and the Distinguished
Members of the Committee for holding this hearing of critical importance to America’s families,
and for inviting me to testify.
For more than a decade, social media companies have been performing a vast uncontrolled
experiment on our children. They use the reams of data they collect on young people and
endless A/B testing to fine tune their platforms’ algorithms and design to maximize
engagement, because more time and activity on a platform means more revenue. And because
the way these platforms engage with young people is largely unregulated, there is no obligation
to consider and mitigate the harmful effects of their design choices on children and teens.
The resulting impact on children and families has been devastating. Compulsive overuse,
exposure to harmful and age-inappropriate content, cyberbullying, eating disorders, harms to
mental health, and the sexual exploitation of children are just some of the problems linked to
Big Tech’s insidious business model.
It doesn’t have to be this way. Instead of prioritizing engagement and data collection, apps,
websites, and online platforms could be built in ways that reduce risks and increase safeguards
for children and teens. With many young people now spending a majority of their waking hours
online and on social media, improving the digital environment so it is safer and not exploitative
or addictive is one of the most important things we can do to address the mental health crisis.
But that won’t happen through self-regulation. It is past time for Congress to enact legislation
that expands privacy protections for young people and requires online operators to prioritize
children’s wellbeing in their design choices. Without meaningful congressional action, children
and teens will continue to be harmed in the most serious and tragic ways by Instagram, TikTok,
Snapchat, YouTube, and thousands of lesser known apps, websites, and platforms.
My testimony today will describe how many of the most serious issues facing children and
teens online are a direct result of design choices made to further companies’ bottom lines, and
Congress’s failure to enact meaningful safeguards. I will then describe the types of protections
that should be included in any online safety and privacy legislation.
I. About Fairplay
Fairplay is the leading independent watchdog of the children’s media and marketing industries.
We are committed to building a world where kids can be kids, free from the false promises of
marketers and the manipulations of Big Tech. Our advocacy is grounded in the overwhelming
evidence that child-targeted marketing – and the excessive screen time it encourages –
undermines kids’ healthy development.
Through corporate campaigns and strategic regulatory filings, Fairplay and our partners have
changed the child-targeted marketing and data collection practices of some of the world’s
biggest companies. In 2021, we led a large international coalition of parents, advocates, and
child development experts to stop Meta from releasing a version of Instagram for younger
children.1 Our 2018 Federal Trade Commission complaint against Google for violating the
Children’s Online Privacy Protection Act (COPPA) led to the 2019 FTC settlement that required
Google to pay a record fine and to limit data collection and targeted advertising on child-
directed content on YouTube.2 With our partners at the Center for Digital Democracy, we have
filed other requests for investigation at the FTC that remain pending. We have documented, for
example, that Google Play recommends apps for young children that violate COPPA and uses
unfair monetization techniques;3 that TikTok has not complied with the 2019 FTC Consent
Decree that it was violating COPPA;4 and that Prodigy, a popular online math game assigned to
millions of elementary school students across the country, uses manipulative design to unfairly
promote expensive subscriptions to children.5
Fairplay also leads the Designed with Kids in Mind Coalition, which advocates for regulations
that would require operators to make the best interests of children a primary consideration
when designing apps, websites, and platforms likely to be accessed by young people.6 Fairplay
and many of our coalition members actively supported the successful passage of the California
Age Appropriate Design Code. We were also lead organizers on the 2022 federal legislative
campaigns for the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection
Act. And in November of last year, we filed a Petition for Rulemaking, signed by 21
organizations, urging the FTC to declare that certain design techniques used by online platforms
to maximize engagement are unfair practices.7
1 Brett Molina and Terry Collins, Facebook postponing Instagram for kids amid uproar from parents, lawmakers,
USA Today (Sept. 27, 2021),
https://www.usatoday.com/story/tech/2021/09/27/instagram-kids-version-app-children-pause/5881425001/.
2 Campaign for a Commercial-Free Childhood (now Fairplay) and Center for Digital Democracy, Request to
Investigate Google’s YouTube Online Service and Advertising Practices for Violating the Children’s Online Privacy
Protection Act, Counsel for Center for Digital Democracy and Campaign for a Commercial-Free Childhood before
the Federal Trade Commission (filed April 2, 2018), https://fairplayforkids.org/advocates-say-googles-youtube-violates-federal-childrens-privacy-law/.
3 Campaign for a Commercial-Free Childhood (now Fairplay) and Center for Digital Democracy, Request to
Investigate Google’s Unfair and Deceptive Practices in Marketing Apps for Children, Counsel for Center for Digital
Democracy and Campaign for a Commercial-Free Childhood before the Federal Trade Commission (filed Dec. 12,
2018), https://fairplayforkids.org/apps-which-google-rates-safe-kids-violate-their-privacy-and-expose-them-other-harms/.
4 Campaign for a Commercial-Free Childhood (now Fairplay) and Center for Digital Democracy, Complaint and
Request for Investigation of TikTok for Violations of the Children’s Online Privacy Protection Act and Implementing
Rule, Counsel for Campaign for a Commercial-Free Childhood and Center for Digital Democracy before the Federal
Trade Commission (filed May 14, 2020), https://fairplayforkids.org/wp-content/uploads/2020/05/tik_tok_complaint.pdf.
5 Campaign for a Commercial-Free Childhood (now Fairplay), Request for Investigation of Deceptive and Unfair
Practices by the Edtech Platform Prodigy, Campaign for a Commercial-Free Childhood before the Federal Trade
Commission (filed Feb. 19, 2020), https://fairplayforkids.org/wp-content/uploads/2021/02/Prodigy_Complaint_Feb21.pdf.
Fairplay is also home to the Screen Time Action Network, a collaborative community of
practitioners, educators, advocates, and parents who work to reduce excessive technology use
harming children, adolescents, and families. The Action Network hosts seven work groups,
including Online Harms Prevention, a group whose members include today’s witness Kristin
Bride and several other parents who have tragically lost their children to social media harms.
II. Children and teens spend a significant portion of their day using digital media.
Digital device use begins in early childhood: Nearly half of 2- to 4-year-olds and more than two-
thirds of 5- to 8-year-olds have their own tablet or smartphone.8 Preschool-age children
average 2.5 hours of screen media use per day, and five- to eight-year-olds average about 3
hours.9 In a study of elementary school-aged children’s digital media use during the pandemic,
approximately one-third of parents reported that their children began using social media at a
younger age than they had originally planned.10
Despite the fact that all major social media sites have a minimum age of 13 in their terms of
service, a growing number of younger children use platforms like TikTok, Snapchat and
Instagram. About half of parents of children ages 10 to 12 and 32% of parents of kids ages 7 to
9 reported their child used social media apps in the first six months of 2021.11 That same year,
18% of 8- to 12-year-olds reported using social media every day, a 38% increase from just two
years prior.12 Leaked documents from TikTok revealed the company used machine learning to
analyze user accounts and classified one-third of the platform’s users as under 14,13 which
suggests platform operators are well aware that children lie about their age in order to access
social media.
6 Coalition members include Accountable Tech, American Academy of Pediatrics, Center for Digital Democracy,
Center for Humane Technology, Children and Screens, Common Sense, Electronic Privacy Information Center,
Exposure Labs: The Creators of The Social Dilemma, Fairplay, ParentsTogether, and RAINN:
https://designedwithkidsinmind.us/.
7 Center for Digital Democracy & Fairplay, In the Matter of Petition for Rulemaking to Prohibit the Use on Children
of Design Features that Maximize for Engagement (filed Nov. 17, 2022), https://fairplayforkids.org/wp-content/uploads/2022/11/EngagementPetition.pdf.
8 Victoria Rideout & Michael B. Robb, The Common Sense Census: Media Use by Kids Age Zero to Eight, 2020,
Common Sense Media at 25, (2020), https://www.commonsensemedia.org/sites/default/files/research/report/2020_zero_to_eight_census_final_web.pdf.
9 Id.
10 Tiffany Munzer, Chioma Torres, et al., Child Media Use During COVID-19: Associations with Contextual and
Social-Emotional Factors, 43 Journal of Developmental and Behavioral Pediatrics at 3 (2022),
https://pubmed.ncbi.nlm.nih.gov/36106745/.
11 Kristen Rogers, Children under 10 are using social media. Parents can help them stay safe online, CNN, (Oct. 18,
2021), https://www.cnn.com/2021/10/18/health/children-social-media-apps-use-poll-wellness/index.html
12 Victoria Rideout, Alanna Peebles, et al., The Common Sense Census: Media Use by Tweens and Teens at 12,
(2022), https://www.commonsensemedia.org/sites/default/files/research/report/8-18-census-integrated-report-final-web_0.pdf.
Further, research indicates the pandemic has increased screen media use for preteens and
teenagers. In 2021, preteens (ages 8 to 12) averaged over 5.5 hours of entertainment screen
time per day and teens (ages 13 to 18) averaged a remarkable 8.5 hours daily - a 17% increase
from 2019 for both age groups.14 Much of this time is spent on the major social media
platforms. Ninety-five percent of teens say they use YouTube, and 67% say they use TikTok.15
Thirty-five percent of teens say they are using one of the top five online platforms – YouTube,
TikTok, Instagram, Snapchat, or Facebook – “almost constantly.”16
Teens’ and preteens’ daily screen time varies based on race and household income. White
preteens average 4.5 hours of entertainment screen time use daily, compared to Black preteens
(6.5 hours) and Hispanic/Latino preteens (7 hours). White teens spend approximately 8 hours
per day on screens for entertainment, while Black and Hispanic/Latino teens average
approximately two hours more.17 Preteens in higher-income households spend just under 4.5
hours of screen time per day, compared to preteens in middle-income households (5.75 hours)
and lower-income households (7.5 hours). Teens in higher-income households spend about 2.5
hours less daily on screens for entertainment compared to teens in lower- and middle-income
households (7 and 9.5 hours daily, respectively).18
III. Overuse of digital media is linked to a number of serious harms for young people
Increased time online and social media use is linked to serious harms for young people. As the
Surgeon General has observed – and as described in detail in Section IV of this testimony –
“[b]usiness models are often built around maximizing user engagement as opposed to
safeguarding users’ health and ensuring that users engage with one another in safe and healthy
ways . . . This translates to technology companies focusing on maximizing time spent, not time
well spent.”19 By maximizing time and activities online, platforms’ engagement-driven design
choices harm minors in a number of ways, including undermining mental
health, harming body image, fostering problematic internet use, harming physical health,
13 Raymond Zhong and Sheera Frenkel, A Third of TikTok’s U.S. Users May Be 14 or Under, Raising Safety Questions,
New York Times, (Aug. 14, 2020), https://www.nytimes.com/2020/08/14/technology/tiktok-underage-users-
ftc.html.
14 Common Sense, The Common Sense Census: Media Use by Tweens and Teens at 12 (2022),
https://www.commonsensemedia.org/sites/default/files/research/report/8-18-census-integrated-report-final-
web_0.pdf.
15 Emily A. Vogels et al., Teens, Social Media and Technology 2022, Pew Research Center (Aug. 10, 2022),
https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022.
16 Id.
17 Victoria Rideout, Alanna Peebles, et al., The Common Sense Census: Media Use by Tweens and Teens at 12,
(2022), https://www.commonsensemedia.org/sites/default/files/research/report/8-18-census-integrated-report-
final-web_0.pdf.
18 Id.
19 Protecting Youth Mental Health: The U.S. Surgeon General’s Advisory at 25 (2021),
https://www.hhs.gov/sites/default/files/surgeon-general-youth-mental-health-advisory.pdf.
increasing minors’ risk of contact with dangerous or harmful people, and increasing minors’
exposure to age-inappropriate and otherwise harmful content.
Harm to mental health
Maximizing minors’ time and activities online is linked with worse psychological wellbeing in
minors in concrete and serious ways that cannot be ignored in the context of the current youth
mental health crisis.
Heavy users of digital media are more likely to be unhappy, to be depressed, or to have
attempted suicide.20 Two nationally representative surveys of U.S. adolescents in grades 8
through 12 found “a clear pattern linking screen activities with higher levels of depressive
symptoms/suicide-related outcomes and nonscreen activities with lower levels.”21 The same
research found that suicide-related outcomes became elevated after two hours or more a day
of electronic device use.22 Among teens who used electronic devices five or more hours a day, a
staggering 48% exhibited at least one suicide risk factor.23 Of particular concern, a large and
growing body of research indicates a strong link between time spent on social media—some of
the services most relentless in their deployment of engagement-maximizing techniques—and
serious mental health challenges.24 More frequent and longer social media use is associated
with depression,25 anxiety,26 and suicide risk factors.27
Even if some of these documented associations are explained by children’s underlying
emotional challenges, the design features that online platforms deploy to maximize
engagement are likely to have differential negative effects on these young people. For example,
children with more negative emotionality may seek endless scrolling as a means of dissociating
20 Jean M. Twenge & W. Keith Campbell, Media Use Is Linked to Lower Psychological Well-Being: Evidence from
Three Datasets, 90 Psychol. Q., 311 (2019). https://pubmed.ncbi.nlm.nih.gov/30859387/
21 Jean M. Twenge et al., Increases in Depressive Symptoms, Suicide-Related Outcomes, and Suicide Rates Among
U.S. Adolescents After 2010 and Links to Increased New Media Screen Time, 6 Clinical Psychol. Sci. 3, 9 (2018)
https://doi.org/10.1177/2167702617723376. See also Jane Harness et al., Youth Insight About Social Media Effects
on Well/Ill-Being and Self-Modulating Efforts, 71 J. Adolescent Health, 324-333 (Sept. 1, 2022),
10.1016/j.jadohealth.2022.04.011; Amy Orben et al., Windows of Developmental Sensitivity to Social Media, 13
Nature Comm., 1649, (2022), 10.1038/s41467-022-29296-3
22 Id.
23 Id.
24 See, e.g., K.E. Riehm et al., Associations Between Time Spent Using Social Media and Internalizing and
Externalizing Problems Among US Youth, 76 JAMA Psychiatry, 1266 (2019),
https://doi.org/10.1001/jamapsychiatry.2019.2325; N. McCrae et al., Social Media and Depressive Symptoms in
Childhood and Adolescence: A Systematic Review, 2 Adolescent Res. Rev., 315 (2017),
https://doi.org/10.1007/s40894-017-0053-4; H. Allcott et al., The Welfare Effects of Social Media, 110 Econ. Rev.
Am. 629 (2020), https://www.aeaweb.org/articles?id=10.1257/aer.20190658
25 Jean M. Twenge & W. Keith Campbell, Media Use Is Linked to Lower Psychological Well-Being: Evidence from
Three Datasets, 90 Psychol. Q. at 312 (2019). https://pubmed.ncbi.nlm.nih.gov/30859387/
26 Royal Society for Public Health, #StatusOfMind: Social Media and Young People’s Mental Health and Wellbeing 8
(May 2017), https://www.rsph.org.uk/static/uploaded/d125b27c-0b62-41c5-a2c0155a8887cd01.pdf
27 Jean M. Twenge & W. Keith Campbell, Media Use Is Linked to Lower Psychological Well-Being: Evidence from
Three Datasets, 90 Psychol. Q. (2019). https://pubmed.ncbi.nlm.nih.gov/30859387/
from emotional distress,28 yet may be recommended more negative content based on their
previous behavior.29 Former Meta employee Frances Haugen has described how the company
(then called Facebook) documented this harmful cycle in its own internal research on
Instagram: “And what's super tragic is Facebook's own research says, as these young women
begin to consume this -- this eating disorder content, they get more and more depressed. And it
actually makes them use the app more. And so, they end up in this feedback cycle where they
hate their bodies more and more.”30
Harm to body image
Design features that maximize time spent on social media can also lead to heightened exposure
to content which increases minors’ susceptibility to poor body image and, consequently,
disordered eating. A 2019 study of 7th and 8th graders in the International Journal of Eating
Disorders “suggest[ed] that [social media], particularly platforms with a strong focus on image
posting and viewing, is associated with elevated [disordered eating] cognitions and behaviors in
young adolescents.”31 Another study found a positive correlation between higher Instagram use
and symptoms of orthorexia nervosa.32 Personal stories from sufferers of disordered eating have
highlighted the link to social media,33 as has Meta’s own internal research; the documents
Frances Haugen shared with the Wall Street Journal in 2021 revealed that Facebook has been
aware at least since 2019 that “[w]e make body image issues worse for one in three teen
girls.”34
Risk of problematic internet use and its associated harms
Maximizing time and activities online also fosters “problematic internet use”—psychologists’
term for excessive internet activity that exhibits addiction, impulsivity, or compulsion.35 A 2016
28 Amanda Baughan et al., “I Don’t Even Remember What I Read”: How Design Influences Dissociation on Social
Media, CHI Conference on Human Factors in Computing Systems, 1-13 (2022),
https://dl.acm.org/doi/pdf/10.1145/3491102.3501899.
29 Kait Sanchez, Go Watch this WSJ investigation of TikTok’s Algorithm, The Verge, (July 21, 2021),
https://www.theverge.com/2021/7/21/22587113/tiktok-algorithm-wsj-investigation-rabbit-hole.
30 Scott Pelley, Whistleblower: Facebook is misleading the public on progress against hate speech, violence,
misinformation, CBS (Oct. 3, 2021), https://www.cbsnews.com/news/facebook-whistleblower-frances-haugen-
misinformation-public-60-minutes-2021-10-03/.
31 Simon M. Wilksch et al., The Relationship Between Social Media Use and Disordered Eating in Young Adolescents,
53 Int. J. Eat. Disord. 96, 104 (2020).
32 Pixie G. Turner & Carmen E. Lefevre, Instagram Use Is Linked to Increased Symptoms of Orthorexia Nervosa, 22
Eating Weight Disorders 277, 281 (2017).
33 See, e.g., Jennifer Neda John, Instagram Triggered My Eating Disorder, Slate (Oct. 14, 2021),
https://slate.com/technology/2021/10/instagram-social-media-eating-disorder-trigger.html; Clea Skopeliti, ‘I Felt
My Body Wasn’t Good Enough’: Teenage Troubles with Instagram , The Guardian (Sep. 18, 2021),
https://www.theguardian.com/society/2021/sep/18/i-felt-my-body-wasnt-good-enough-teenage-troubles-with-
instagram.
34 Georgia Wells et al., Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show, W.S.J. (Sept.
14, 2021), https://www.wsj.com/articles/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-
show-11631620739.
35 Chloe Wilkinson et al., Screen Time: The Effects on Children’s Emotional, Social, and Cognitive Development,
Informed Futures, at 6, (2021), https://informedfutures.org/wp-content/uploads/Screen-time-The-effects-on-
childrens-emotional-social-cognitive-development.pdf.
nationwide survey of minors ages 12 to 18 found that 61% of teens thought they spent too
much time on their mobile devices, and 50% felt “addicted” to them.36 In a 2022 Pew Research
survey, 35% of teens said they are on YouTube, TikTok, Instagram, Snapchat, or Facebook
“almost constantly.”37 And a report released last week by Amnesty International on young
people ages 13-24 found “a staggering 74% of respondents report checking their social media
accounts more than they would like to. Respondents bemoaned the ‘addictive’ lure of the
constant stream of updates and personalized recommendations, often feeling ‘overstimulated’
and ‘distracted.’”38
Problematic internet use, in turn, is linked to a host of additional problems. For example, one
study of 564 children between the ages of 7 and 15 found that problematic internet use was
positively associated with depressive disorders, Attention Deficit Hyperactivity Disorder,
general impairment, and increased sleep disturbances.39 A meta-analysis of peer-reviewed
studies involving cognitive findings associated with problematic internet use in both adults and
adolescents found “firm evidence that [problematic internet use]. . . is associated with cognitive
impairments in motor inhibitory control, working memory, Stroop attentional inhibition and
decision-making.”40 Another study of over 11,000 European adolescents found that among
teens exhibiting problematic internet use, 33.5% reported moderate to severe depression;
22.2% reported self-injurious behaviors such as cutting; and 42.3% reported suicidal ideation.41
The rate of attempted suicides was a staggering ten times higher for teens exhibiting
problematic internet use than for their peers who exhibited healthy internet use.42
Harm to physical health
Maximizing minors’ time spent online at the expense of sleep or movement also harms their
physical health. When minors are driven to spend more time online, they sleep less for a variety
of reasons – because it is impossible to be online and sleep at the same time, because
stimulation before bedtime disrupts sleep patterns, and because many of the design features
used by online platforms make users feel pressured to be connected constantly, and that
feeling often doesn’t go away at bedtime. Research shows that minors who exhibit problematic
36 Common Sense, Dealing with Devices: Parents, 10-11 (2016),
https://www.commonsensemedia.org/sites/default/files/research/report/commonsense_dealingwithdevices-topline_release.pdf.
37 Emily A. Vogels et al., Teens, Social Media and Technology 2022, Pew Research Center (Aug. 10, 2022),
https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022.
38 Amnesty International, “We are totally exposed”: Young people share concerns about social media’s impact on
privacy and mental health in global survey (Feb. 7, 2023)
https://www.amnesty.org/en/latest/news/2023/02/children-young-people-social-media-survey-2/.
39 Restrepo et al., Problematic Internet Use in Children and Adolescents: Associations with Psychiatric Disorders and
Impairment, 20 BMC Psychiatry 252 (2020), https://doi.org/10.1186/s12888-020-02640-x.
40 Konstantinos Ioannidis et al., Cognitive Deficits in Problematic Internet Use: Meta-Analysis of 40 Studies, 215
British Journal of Psychiatry 639, 645 (2019), https://pubmed.ncbi.nlm.nih.gov/30784392/.
41 Michael Kaess et al., Pathological Internet use among European adolescents: psychopathology and self-
destructive behaviours, 23 Eur. Child & Adolescent Psychiatry 1093, 1096 (2014),
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4229646/.
42 Id.
internet use often suffer from sleep problems.43 One-third of teens report waking up and
checking their phones for something other than the time at least once per night.44 Some teens
set alarms in the middle of the night to remind them to check their notifications or complete
video game tasks that are only available for a limited time.45
These behaviors in turn create new risks for young people. Screen time before bed is associated
with lower academic performance.46 Teenagers who use social media for more than five hours
per day are about 70% more likely to stay up late on school nights.47 A lack of sleep in teenagers
has been linked to inability to concentrate, poor grades, drowsy-driving incidents, anxiety,
depression, thoughts of suicide, and even suicide attempts.48
A large body of research demonstrates that more time online displaces physical activity49 and is
consistently correlated with minors’ risk of obesity, which in turn increases their risk of serious
illnesses like diabetes, high blood pressure, heart disease, and depression.50 Further, when
minors spend more time online, they are exposed to more advertisements for unhealthy food
and beverages,51 which are heavily targeted toward minors52 and disproportionately marketed
to Black and Hispanic youth.53 In addition, poor sleep quality (which, as discussed above, is
associated with problematic internet use) increases the risk of childhood obesity by 20%.54
43 Anita Restrepo, Tohar Scheininger, et al., Problematic Internet Use in Children and Adolescents: Associations with
Psychiatric Disorders and Impairment, 20 BMC Psychiatry 252 (2020), https://doi.org/10.1186/s12888-020-02640-
x.
44 Common Sense, Screens and Sleep: The New Normal: Parents, Teens, Screens, and Sleep in the United States at 7
(2019), https://www.commonsensemedia.org/sites/default/files/research/
report/2019-new-normal-parents-teens-screens-and-sleep-united-states-report.pdf.
45 Emily Weinstein & Carrie James, Behind Their Screens: What Teens Are Facing (And Adults Are Missing), MIT
Press, at 38 (2022).
46 Chloe Wilkinson et al., Screen Time: The Effects on Children’s Emotional, Social, and Cognitive Development at 4
(2021), https://informedfutures.org/wp-content/uploads/Screen-time-The-effects-on-childrens-emotional-social-
cognitive-development.pdf.
47 Heavy Social Media Use Linked to Poor Sleep, BBC News (Oct. 23, 2019), https://www.bbc.com/
news/health-50140111.
48 Among teens, sleep deprivation an epidemic, Stanford News Ctr. (Oct. 8, 2015),
https://med.stanford.edu/news/all-news/2015/10/among-teens-sleep-deprivation-an-epidemic.html.
49 E de Jong et al., Association Between TV Viewing, Computer Use and Overweight, Determinants and Competing
Activities of Screen Time in 4- to 13-Year-Old Children, 37 Int’l J. Obesity 47, 52 (2013),
https://pubmed.ncbi.nlm.nih.gov/22158265/.
50 Jeff Chester, Kathryn C. Montgomery, et al., Big Food, Big Tech, and the Global Childhood Obesity Pandemic at 3
(2021), https://www.democraticmedia.org/sites/default/files/field/public-files/2021/full_report.pdf.
51 Id.
52 Jeff Chester, Kathryn C. Montgomery, et al., Big Food, Big Tech, and the Global Childhood Obesity Pandemic at 3
(2021), https://www.democraticmedia.org/sites/default/files/field/public-files/2021/full_report.pdf.
53 University of Connecticut Rudd Center for Food Policy & Health et. al., Targeted Food and Beverage Advertising
to Black and Hispanic Consumers: 2022 Update, (Nov. 2022), https://uconnruddcenter.org/wp-
content/uploads/sites/2909/2022/11/TargetedMarketing2022-Executive-Summary.pdf.
54 Yanhui Wu et al., Short Sleep Duration and Obesity Among Children: A Systematic Review and Meta-Analysis of
Prospective Studies, 11 Obesity Rsch. & Clinical Prac. 140, 148 (2015),
https://pubmed.ncbi.nlm.nih.gov/27269366/; Michelle A. Miller et al., Sleep Duration and Incidence of Obesity in
Infants, Children, and Adolescents: A Systematic Review and Meta-Analysis of Prospective Studies, 41 Sleep 1, 15
(2018), https://pubmed.ncbi.nlm.nih.gov/29401314/.
Harms to safety
The pressure to spend more time on digital media platforms and maximize interactions with
other users also puts children at risk of predation. Twenty-five percent of 9-to-17-year-olds
report having had an online sexually explicit interaction with someone they believed to be an
adult.55 In 2020, 17% of minors – including 14% of 9-12-year-olds – reported having shared a
nude photo or video of themselves online. Of these children and teens, 50% reported having
shared a nude photo or video with someone they had not met in real life, and 41% reported
sharing with someone over the age of 18.56
Design features that maximize engagement also increase young people’s risk of cyberbullying. A
2022 survey by the Pew Research Center found that nearly 50% of teens reported being
cyberbullied.57 Sexual minority and gender expansive youth report being exposed to
anonymous forms of cyberbullying more than their heterosexual and cisgender counterparts.58
Cyberbullying is linked to increased risky behaviors such as smoking and increased risk of
suicidal ideation.59
It’s worth noting that these serious threats to children’s safety aren’t limited to social media.
The FTC’s recent settlement with Epic Games documented how the default text and voice chat
settings on Fortnite led children and teens to communicate with strangers, including adults. As
a result, children were subject to harassment, bullying, and predation while playing the wildly
popular game.60
IV. The platforms where children spend the majority of their time online are designed to
maximize engagement, often at the expense of children’s wellbeing and safety.
Digital platforms are designed to maximize engagement. The longer a user is on a platform and
the more they do on the platform, the more data the user generates. Tech companies and their
marketing partners use this valuable data to target users with advertising.61 Gaming app
companies employ teams of experts who specialize in user acquisition and retention.62 The
55 Thorn. “Responding to Online Threats: Minors’ Perspectives on Disclosing, Reporting, and Blocking.” (May 2021),
https://info.thorn.org/hubfs/Research/Responding%20to%20Online%20Threats_2021-Full-Report.pdf.
56 Thorn. “Understanding sexually explicit images, self-produced by children.” (9 Dec. 2020),
https://www.thorn.org/blog/thorn-research-understanding-sexually-explicit-images-self-produced-by-children/.
57 Emily A. Vogels et al., Teens and Cyberbullying 2022, Pew Research Center (Dec. 2022),
https://www.pewresearch.org/internet/2022/12/15/teens-and-cyberbullying-2022/.
58 Bauman, S., & Baldasare, A., Cyber aggression among college students: Demographic differences, predictors of
distress, and the role of the university, 56 Journal of College Student Development 317 (2015),
https://doi.org/10.1353/csd.2015.0039.
59 van Geel M, Vedder P, Tanilon J. Relationship Between Peer Victimization, Cyberbullying, and Suicide in Children
and Adolescents: A Meta-analysis, JAMA Pediatr. 2014;168(5):435–442. doi:10.1001/jamapediatrics.2013.4143
https://jamanetwork.com/journals/jamapediatrics/fullarticle/1840250.
60 Case 5:22-cv-00518-BO, Epic Games: Complaint for Permanent Injunction, (Dec. 19, 2022),
https://www.ftc.gov/system/files/ftc_gov/pdf/2223087EpicGamesComplaint.pdf.
61 See generally 5Rights Foundation. “Pathways: How digital design puts children at risk.” (July 2021),
https://5rightsfoundation.com/uploads/Pathways-how-digital-design-puts-children-at-risk.pdf.
62 See, e.g., Leading User Acquisition in the quickly growing mobile games industry: Get to know Winnie Wen of Jam
City, Jam City (Nov. 15, 2021), https://www.jamcity.com/leading-user-acquisition-in-the-quickly-growing-mobile-
major social media platforms – including Facebook, Instagram, YouTube, and TikTok – have
both in-house and external research initiatives focused on documenting and improving
engagement, as well as utilizing neuromarketing and virtual reality techniques to measure
effectiveness.63
Engagement-maximizing design features prey upon minors’ developmental vulnerabilities and
can lead to significant harm. These features put children at risk because they can foster
problematic internet use and its associated harms. In addition, many of the techniques used to
extend engagement create new risks and harms in their own right. They include social
manipulation design features, variable reward design features, and algorithmic content
recommendation systems.
Social manipulation design features
Social manipulation design features leverage a minor’s desire for social relationships to
encourage users to spend more time and/or perform more activities on a website or service.
These features are the hallmarks of social media platforms: follower, view, and like counts;
interaction streaks; displays of the names of users who have commented, viewed, or liked a
piece of content; and prompts that encourage a user to share with a larger audience by adding
suggested new friends or making their account or posts public.
Younger adolescents have specific developmental needs for social connectedness and are
particularly attuned to social validation.64 Children develop a need to fit in with their peers
around age six65 and the need to be noticed and admired by others around age ten.66 Social
games-industry-get-to-know-winnie-wen-of-jam-city/; Mediation that supports everything your app business needs
to scale, ironSource, https://www.is.com/mediation/; Mihovil Grguric, 15 Key Mobile Game Metrics That
Developers MUST Track, udonis (Sept. 20, 2022), https://www.blog.udonis.co/mobile-marketing/mobile-
games/key-mobile-game-metrics.
63 See, e.g., Meta Careers, Shape the Future of Marketing with the Marketing Science Team, Meta (Sept. 19, 2018),
https://www.metacareers.com/life/come-build-with-the-facebook-marketing-science-team/; Bob Arnold & Anton
Miller, How Google’s Media Lab Boosts YouTube Ad Results, AdAge (May 14, 2021),
https://adage.com/article/google/how-googles-media-lab-boosts-youtube-ad-results/2335796; TikTok Insights,
TikTok for Business (2022), https://www.tiktok.com/business/en-US/insights; TikTok Ads Break Through Better
than TV and Drive Greater Audience Engagement, TikTok for Business,
https://www.tiktok.com/business/library/TikTokDrivesGreaterAudienceEngagement.pdf; How Virtual Reality
Facilitates Social Connection, Meta, https://www.facebook.com/business/news/insights/how-virtual-reality-
facilitates-social-connection.
64 Nicholas D. Santer et al., Early Adolescents’ Perspectives on Digital Privacy, Algorithmic Rights and Protections
for Children (2021) at 6, 30.
65 In particular, between the ages of six and nine, children start to feel the need to fit in to peer social groups. See
Jun Zhao et al., ‘I Make Up a Silly Name’: Understanding Children’s Perception of Privacy Risks Online , CHI
Conference on Human Factors in Computing Systems Proceedings (May 2, 2019),
https://doi.org/10.1145/3290605.3300336.
66 Zara Abrams, Why Young Brains Are Especially Vulnerable to Social Media, APA (Feb. 3, 2022),
https://www.apa.org/news/apa/2022/social-media-children-teens (“Starting around age 10, children’s brains
undergo a fundamental shift that spurs them to seek social rewards, including attention and approval from their
peers.”).
acceptance evokes activation in the brain’s reward center.67 Further, minors’ prefrontal cortex,
which helps regulate responses to social rewards, is not as mature as adults’.68 These factors all
converge to create a feedback loop: because minors crave this social reinforcement, they seek
it out, yet they lack the tools to protect themselves against the allure of the “rewards” that
these manipulative design features purportedly promise.
Social manipulation design features also exploit young people’s tendency for social comparison
and recreate, on a 24/7 basis, the high school cafeteria experience where everyone can
instantly see who is popular and who is not. Features such as like and follower counts and
comment displays induce anxiety in minors that they or their content may not be as popular as
that of their peers. In the words of one high school student, “[I]f you get a lot of likes, then
‘Yay,’ you look relevant, but then if you don’t get a lot of likes and/or views, it can completely
crush one’s confidence. Especially knowing that you're not the only one who’s able to see it.”69
Snapchat streaks literally quantify the strength of users’ relationships and create pressure on
users to communicate with their friends on the app daily.70 Teens report feeling obligated to
maintain Snapstreaks to “feel more popular” and show that they “care about that person.”71
Ultimately, these design features create strong incentives for young people to engage in
potentially harmful behaviors. Their drive for social rewards “lead[s] to greater relinquishing of
security in certain arenas to gain social validation and belonging, for example, disclosing
publicly to participate in online communities and accrue large amounts of likes, comments, and
followers.”72 Young users quickly learn that they can improve their social media metrics by
posting frequently and posting particularly provocative or risqué content.73 Such posts can
increase the risk of cyberbullying and sexual exploitation. In addition, the pressure to
67 Eveline Crone & Elly A. Konijn, Media Use and Brain Development During Adolescence, 9 Nature Comm. 1, 4
(2018), https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5821838/.
68 For example, adults “tend to have a fixed sense of self that relies less on feedback from peers” and “adults have
a more mature prefrontal cortex, an area that can help regulate emotional responses to social rewards.” Zara
Abrams, Why Young Brains Are Especially Vulnerable to Social Media, APA (Feb. 3, 2022),
https://www.apa.org/news/apa/2022/social-media-children-teens.
69 Katie Joseff, Social Media Is Doing More Harm than Good, Common Sense Media (Dec. 17, 2021),
https://www.commonsensemedia.org/kids-action/articles/social-media-is-doing-more-harm-than-good.
70 Taylor Lorenz, Teens Explain the World of Snapchat’s Addictive Streaks, Where Friendships Live or Die, Insider
(Apr. 14, 2017, 1:58 PM), https://www.insider.com/teens-explain-snapchat-streaks-why-theyre-so-addictive-and-
important-to-friendships-2017-4; Lori Janjigian, What I Learned After Taking Over My 13-Year-Old Sister’s Snapchat
for Two Weeks, Business Insider (Aug. 4, 2016, 11:53 AM), https://www.businessinsider.com/how-teens-are-using-
snapchat-in-2016.
71 Id.
72 Nicholas D. Santer et al., Early Adolescents’ Perspectives on Digital Privacy, Algorithmic Rights and Protections
for Children (2021) at 6 (citing J.C. Yau & S. M. Reich, “It's Just a Lot of Work”: Adolescents’ Self-Presentation Norms
and Practices on Facebook and Instagram, 29 J. Res. on Adolescence 196, 196-209 (2019)).
73 For example, adolescent girls report feeling pressure to post sexualized selfies as a means of generating
attention and social acceptance from their peers. Macheroni, G., Vincent, J., Jimenez, E. ‘Girls Are Addicted to Likes
so They Post Semi-Naked Selfies’: Peer Mediation, Normativity and the Construction of Identity Online, 9
Cyberpsychology: Journal of Psychosocial Research on Cyberspace (May 1, 2015), https://doi.org/10.5817/CP2015-
1-5.
demonstrate popularity through high friend, follower, and like counts can lead children to
accept friend requests from strangers, putting them at risk of predation.
Variable reward design features
One objective of persuasive design is to reduce friction so that platforms are easier to use, and
so young people will keep using them. Low-friction variable rewards are highly effective at
maximizing the amount of time users spend on the service. The psychology that renders these
features effective is based on research that predates the internet by many years, beginning
with experiments by renowned psychologist B.F. Skinner in the early 20th century.74 Research
by Skinner and others revealed that when test subjects – both humans and other animals – are
rewarded unpredictably for a given action, they will engage in the action for a longer period of
time than if the reward is predictable.75 Specifically, the brain generates more dopamine in
response to an uncertain reward than in response to an expected and reliable one.76 The
tendency of variable rewards to drive compulsive behavior is sometimes referred to as the
“Vegas Effect,” and is the primary mechanism at work in slot machines.77 In the words of Nir
Eyal, a consumer psychology expert who wrote the popular industry how-to Hooked: How to
Build Habit-Forming Products, “[v]ariable schedules of reward are one of the most powerful
tools that companies use to hook users.”78
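To make this distinction concrete, the short sketch below (written in Python purely for illustration; the schedule parameters are hypothetical and the code is not drawn from any platform) compares a fixed reward schedule with a variable one that pays out at the same average rate. Only the variable schedule leaves the next reward unpredictable, which is the property the research above associates with longer and more compulsive engagement.
import random

def fixed_schedule(n_actions, every=4):
    # Reward arrives predictably after every fourth action.
    return [1 if (i + 1) % every == 0 else 0 for i in range(n_actions)]

def variable_schedule(n_actions, p=0.25):
    # Reward arrives unpredictably, at the same average rate (1 in 4).
    return [1 if random.random() < p else 0 for _ in range(n_actions)]

random.seed(1)
print("fixed:   ", fixed_schedule(20))
print("variable:", variable_schedule(20))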
One common example of variable rewards design features is the infinite or endless scroll
mechanism with variable content. When a platform uses endless scroll, a user is continuously
fed new pieces of content as they scroll down a feed or page, and they never know what might
appear next. Harvard researchers Emily Weinstein and Carrie James explain in their recent book
on teens and technology: “Apps like TikTok have an endless database of content to offer users.
Some videos are pointless or boring or upsetting; others give a fleeting reward in the form of
funny, relatable, or compelling content.”79 The pursuit of the next “rewarding” piece of content
keeps users scrolling. As one 16-year-old told Weinstein and James, Snapchat is “so addictive
because it’s so easy to go on to the next thing…. And you never know what amazing thing could
be on the next Story, and all you have to do is tap once and you get to the next thing.”80
74 J. E. Staddon & D. T. Cerutti, Operant Conditioning, 54 Annual Review of Psychology 115 (2003),
https://doi.org/10.1146/annurev.psych.54.101601.145124; B. F. Skinner, Two Types of Conditioned Reflex: A Reply
to Konorski and Miller, 16 J. Gen. Psychology, 272 (1937), https://doi.org/10.1080/00221309.1937.9917951.
75 Laura MacPherson, A Deep Dive into Variable Designs and How to Use Them, DesignLi (Nov. 8, 2018),
https://designli.co/blog/a-deep-dive-on-variable-rewards-and-how-to-use-them/; Mike Brooks, The "Vegas Effect"
of Our Screens, Psychol. Today (Jan. 4, 2019), https://www.psychologytoday.com/us/blog/tech-happy-
life/201901/the-vegas-effect-our-screens.
76 Anna Hartford & Dan J. Stein, Attentional Harms and Digital Inequalities, 9 JMIR Mental Health 2, 3 (Feb. 11,
2022), https://pubmed.ncbi.nlm.nih.gov/35147504/.
77 Mike Brooks, The “Vegas Effect” of Our Screens, Psychol. Today (Jan. 4, 2019),
https://www.psychologytoday.com/us/blog/tech-happy-life/201901/the-vegas-effect-our-screens.
78 Nir Eyal, The Hook Model: How to Manufacture Desire in 4 Steps, Nir and Far, https://www.nirandfar.com/how-
to-manufacture-desire/.
79 Emily Weinstein & Carrie James, Behind Their Screens: What Teens Are Facing (And Adults Are Missing), MIT
Press, at 33 (2022); see also GCFGlobal.org, Digital Media Literacy: Why We Can’t Stop Scrolling,
https://edu.gcfglobal.org/en/digital-media-literacy/why-we-cant-stop-scrolling/1/.
80 Id. at 34.
All popular social media platforms, including those used heavily by minors such as TikTok,
Snapchat, Instagram, and Facebook, feature endless scroll feeds strategically designed to
intermittently surface content that users are algorithmically predicted to engage with. An
internal TikTok document said that the app maximizes for two metrics: user retention and time
spent.81 Similarly, a product manager for YouTube’s recommendation system explained that the
platform’s recommendation algorithm “is designed to do two things: match users with videos
they’re most likely to watch and enjoy, and . . . recommend videos that make them happy. . . .
[S]o our viewers keep coming back to YouTube, because they know that they’ll find videos that
they like there.”82 And Adam Mosseri of Instagram said, “[W]e make a set of predictions. These
are educated guesses at how likely you are to interact with a post in different ways…. The more
likely you are to take an action, and the more heavily we weigh that action, the higher up you’ll
see the post.”83
Tech companies know that variable rewards are a valuable tool to increase users’ activity and
time spent online and ultimately, to maximize profits. But they are similarly aware of the risks
associated with these types of rewards. For example, in 2020, responding to internal research
indicating that teen users had difficulty controlling their use of Facebook and Instagram, a Meta
employee wrote to a colleague: “I worry that the driving [users to engage in more frequent]
sessions incentivizes us to make our product more addictive, without providing much more
value… Intermittent rewards are the most effective (think slot machines), reinforcing behaviors
that become especially hard to extinguish.”84 Ultimately, these sophisticated variable reward
techniques prey upon minors’ developmental sensitivity to rewards.
Algorithmic content recommendation systems
Algorithms designed to maximize engagement fill young people’s feeds with the content that is
most likely to keep them online, even when that means exposing them to a post, image, or
video that is dangerous or abusive. Platforms such as YouTube, TikTok, and Instagram serve
users content based on automated suggestions. Algorithms choose which content to suggest to
children and teens based on the vast amount of data they collect on users, such as likes, shares,
comments, interests, geolocation, and information about the videos a user watches and for
how long. As described above, these algorithms are designed to extend engagement by
discerning which pieces of content a user is most likely to engage with – not whether the
content or overall online experience is beneficial to the user.85
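As a rough, purely illustrative sketch of the kind of expected-engagement ranking described above (the post names, predicted probabilities, and weights below are hypothetical; this is not any platform’s actual code), a feed of candidate posts can be scored by a weighted sum of predicted interactions and sorted so that whatever a user is most likely to engage with appears first, regardless of whether it benefits the user.
from dataclasses import dataclass

@dataclass
class Candidate:
    post_id: str
    p_like: float     # predicted probability the user will like the post
    p_comment: float  # predicted probability the user will comment on it
    p_share: float    # predicted probability the user will share it

# Hypothetical weights expressing how heavily each predicted action counts.
WEIGHTS = {"p_like": 1.0, "p_comment": 3.0, "p_share": 5.0}

def engagement_score(c: Candidate) -> float:
    # Weighted sum of predicted interactions; higher scores rank higher in the feed.
    return sum(weight * getattr(c, field) for field, weight in WEIGHTS.items())

def rank_feed(candidates):
    # Nothing in this objective asks whether the content is good for the user.
    return sorted(candidates, key=engagement_score, reverse=True)

feed = rank_feed([
    Candidate("calming-nature-video", p_like=0.20, p_comment=0.01, p_share=0.01),
    Candidate("provocative-diet-video", p_like=0.35, p_comment=0.10, p_share=0.08),
])
print([c.post_id for c in feed])  # the post predicted to be more "engaging" ranks first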
81 Ben Smith, How TikTok Reads Your Mind, New York Times, (Dec. 5, 2021),
https://www.nytimes.com/2021/12/05/business/media/tiktok-algorithm.html.
82 Creator Insider, Behind the Algorithms - How Search and Discovery Works on YouTube, YouTube (Apr. 16, 2021),
https://youtu.be/9Fn79qJa2Fc.
83 Adam Mosseri, Shedding More Light on How Instagram Works, Instagram (June 8, 2021),
https://about.instagram.com/blog/announcements/shedding-more-light-on-how-instagram-works.
84 Spence v. Meta Platforms, N.D. Cal. Case No. 3:22-cv-03294 at 82 (June 6, 2022) (citing Facebook Papers: “Teen
Girls Body Image and Social Comparison on Instagram – An Exploratory Study in the US” (March 2020), at p. 8).
85 A former YouTube engineer observed: “recommendations are designed to optimize watch time, there is no
reason that it shows content that is actually good for kids. It might sometimes, but if it does, it is coincidence.”
Orphanides, K.G. “Children's YouTube is still churning out blood, suicide and cannibalism.” Wired, (March 23,
2018), https://www.wired.co.uk/article/youtube-for-kids-videos-problems-algorithm-recommend
Algorithmic recommendations can be particularly dangerous when they target children and
teens’ greatest vulnerabilities. Investigations have repeatedly demonstrated the way social
media feeds deliver harmful mental health and eating disorder content to accounts registered
to minors. A December 2022 report by the Center for Countering Digital Hate (CCDH) found
that newly created TikTok accounts registered to teenagers that watched or liked videos about
body image, mental health, or eating disorders received videos in their For You feed related to
self-harm, suicide, or eating disorders within minutes.86 These videos appeared on the
accounts’ For You feeds every 206 seconds on average. CCDH also studied the For You feeds of
newly created TikTok accounts registered to teenagers that included the phrase “loseweight” in
their usernames. Those accounts received videos about self-harm, suicide, or eating disorders
in their For You feeds every 66 seconds on average.87
Other reports have made similar findings: A 2021 Wall Street Journal investigation documented
how TikTok users were served videos that encouraged eating disorders and discussed suicide.88
The same year, Senator Richard Blumenthal’s office created an account for a fake 13-year-old
girl that “liked” content about dieting, and the account was served pro-eating disorder and self-
harm content within 24 hours.89 Young users’ engagement with this harmful content is valuable
to tech companies: Our 2022 report detailed how Meta profits from 90,000 unique pro-eating
disorder accounts that reach 20 million people, one-third of whom are minors, some as young
as nine.90
Content recommendation algorithms also expose minors to videos of dangerous viral
“challenges,” which have tragically led to the serious injury and death of many young people. For
example, media reports have documented how “the blackout challenge” on TikTok, in which
young people hold their breath or choke themselves until they pass out, is responsible for the
deaths of several children.91 Many families say that their children learned about the challenge
through recommended videos on their For You feeds.92
V. Apps, websites, and platforms target children with unfair surveillance advertising and
influencer marketing techniques.
86 Center for Countering Digital Hate, Deadly by Design: TikTok Pushes Harmful Content Promoting Eating
Disorders and Self-harm into users’ feeds, (Dec. 15, 2022), https://counterhate.com/research/deadly-by-design/
87 Id.
88 Wall Street Journal Staff, Inside TikTok’s Algorithm: A WSJ Video Investigation, Wall Street Journal, (July 21,
2021), https://www.wsj.com/articles/tiktok-algorithm-video-investigation-11626877477.
89 Nihal Krishan, Senate office impersonates 13-year-old girl on Instagram to flag eating disorder content, Yahoo
News, (Sep. 30 2021), https://www.yahoo.com/entertainment/senate-office-impersonates-13-old-
212700515.html.
90 Fairplay, Designing for Disorder: Instagram’s Pro-eating Disorder Bubble at 1 (Apr. 2022),
https://fairplayforkids.org/wp-content/uploads/2022/04/designing_for_disorder.pdf.
91 Olivia Carville, TikTok’s Viral Challenges Keep Luring Young Kids to Their Deaths, Bloomberg, (Nov. 30, 2022),
https://www.bloomberg.com/news/features/2022-11-30/is-tiktok-responsible-if-kids-die-doing-dangerous-viral-
challenges; Anne Marie Lee, Child deaths blamed on TikTok ‘blackout challenge’ spark outcry, CBS News, (Aug. 19,
2021), https://www.cbsnews.com/news/tik-tok-blackout-challenge-child-deaths/.
92 Michael Levenson and April Rubin, Parents Sue TikTok, Saying Children Died After Viewing ‘Blackout Challenge’,
New York Times, (July 6, 2022), https://www.nytimes.com/2022/07/06/technology/tiktok-blackout-challenge-
deaths.html.
Digital platforms also harm children and teens through unfair digital advertising practices,
including surveillance advertising and influencer marketing. These techniques make it harder
for young people to recognize content as advertising designed to influence their behaviors and
defend themselves against it, rendering them vulnerable to the influence of corporate actors
that can collect and utilize data to target them with precision.
Children face pervasive and inappropriate advertising from a young age: According to one
study, more than 95% of early childhood videos on YouTube contain ads, and one in five videos
viewed by children 8 and under contained ads that were not age-appropriate, such as ads that
featured violent or sexualized content.93 Researchers have also found a high rate of age-
inappropriate advertisements on preschool apps94 and have found that the educational
potential of children's apps is severely degraded by the high number of disruptive ads that
appear, particularly on free apps that are more likely to be used by low-income children.95
Surveillance advertising
Surveillance advertising – targeted advertising using personal data collected by websites and
platforms – is the dominant form of marketing online. Programmatic data-driven advertising
accounted for 90% of display ads in the U.S. last year.96 This pervasive form of advertising draws
on massive amounts of data about young people. By some estimates, advertisers already
possess over 13 million data points about a child by the time they turn 13, despite the fact that
the Children’s Online Privacy Protection Act (COPPA) requires parental permission before
sharing the personal information of children 12 and under with advertisers.97 These data are
drawn from countless daily activities, including web surfing, interacting with friends on social
media, and recording messages and exchanging images and other communications on
computers, phones, and tablets.98 Smart home technologies allow companies to collect data on
a young person’s home life; extended reality (virtual, augmented, and mixed reality) devices
can collect unique biometric data.
93 Radesky, J. S., Schaller, A., Yeo, S. L., Weeks, H. M., & Robb, M.B. “Young kids and YouTube: How ads, toys, and
games dominate viewing.” Common Sense Media, (2020),
https://d2e111jq13me73.cloudfront.net/sites/default/files/uploads/research/2020_youngkidsyoutube-
report_final-release_forweb.pdf.
94 Meyer M, Adkins V, Yuan N, Weeks HM, Chang YJ, Radesky J. “Advertising in Young Children's Apps: A Content
Analysis.” J Dev Behav Pediatr, (Jan. 2019), https://pubmed.ncbi.nlm.nih.gov/30371646/.
95 Meyer, M., Zosh, J.M., McLaren, C., Robb, M., McCaffery, H., Golinkoff, R.M., Hirsh-Pasek, K., & Radesky, J. “How
educational are “educational” apps for young children? App store content analysis using the Four Pillars of
Learning framework.” Journal of Children and Media, (2021),
https://www.tandfonline.com/doi/abs/10.1080/17482798.2021.1882516?journalCode=rchm20.
96 Meaghan Yuen, Programmatic Digital Display Advertising in 2022: Ad Spend, Formats, and Forecast, Insider
Intelligence (May 23, 2022), https://www.insiderintelligence.com/insights/programmatic-digital-display-ad-
spending/.
97 SuperAwesome Launches Kid-Safe Filter to Prevent Online Ads from Stealing Children’s Personal Data,
SuperAwesome (Dec. 6, 2018), https://www.superawesome.com/superawesome-launches-kid-safe-filter-to-
prevent-online-ads-from-stealing-childrens-personal-data/.
98 Wolfie Christl, Corporate Surveillance in Everyday Life: How Companies Collect, Combine, Analyze, Trade, and Use
Personal Data on Billions, Cracked Labs (June 2017),
https://crackedlabs.org/dl/CrackedLabs_Christl_CorporateSurveillance.pdf.
Kids and teens cannot appreciate the depth and breadth of these data collection systems, nor
the way they are used to target them with precision. Younger children largely think about
privacy in interpersonal terms, such as the ability to be left alone and control access to physical
places.99 As children get older, they may start to think about privacy in terms of freedom from
surveillance at school or by the government, but they do not think about privacy in the sense
that companies might use information about them to influence their purchasing choices, for
example. 100
Ultimately, surveillance ads are inherently unfair when targeted to children. As Fairplay, Global
Action Plan, and Reset Australia described in a report about Facebook:
On the one side is a child, poorly equipped to distinguish between advertising
and information, especially within digital contexts. On the other, Facebook with
its vast troves of data about the child, including but not limited to their browsing
history, mood, insecurities, their peers’ interests, and more. This power
imbalance makes surveillance advertising inherently more manipulative than
contextual digital advertising, let alone traditional analogue advertising.101
As with algorithmically recommended content, surveillance ads can be used to target and
exacerbate young people’s vulnerabilities. Leaked documents from Facebook revealed in 2017
that the company told advertisers it could help them target teens at moments when they are
feeling specific emotions, such as “silly,” “defeated,” “overwhelmed,” “useless” and “a
failure.”102 Facebook Australia told advertisers it could specify when teens are likely to
experience certain moods, sharing that “earlier in the week, teens post more about
‘anticipatory emotions’ and ‘building confidence,’ while weekend teen posts contain more
‘reflective emotions’ and ‘achievement broadcasting.’”103
This capability allows marketers to target vulnerable young people with ads for harmful
products. Ads for risky “Flat Tummy Teas” and dangerous exercise routines target young
women on Instagram. Early digital marketing campaigns for Juul vaping products were
deliberately targeted at young audiences.104 Researchers were able to target ads to teenagers
99 Kaiwen Sun et al., They See You’re a Girl if You Pick a Pink Robot with a Skirt: A Qualitative Study of How Children
Conceptualize Data Processing and Digital Privacy Risks, CHI Conference on Human Factors in Computing Systems
(May 2021), https://dblp.org/rec/conf/chi/SunSASGRS21; Priya Kumar et al., No Telling Passcodes Out Because
They’re Private: Understanding Children’s Mental Models of Privacy and Security Online, 1 Proceedings of the ACM
on Human-Computer Interaction 64, (Nov. 2017), https://pearl.umd.edu/wp-content/uploads/2017/08/kumar-
etal-2018-CSCW-Online-First.pdf.
100 Mariya Stoilova et al., Digital by Default: Children’s Capacity to Understand and Manage Online Data and
Privacy, 8 Media and Commc’n 197, 200, (2020), http://dx.doi.org/10.17645/mac.v8i4.3407.
101 Yi-ching Ho, E., Farthing, R., How Facebook still targets surveillance ads to teens, Reset Australia, Fairplay, and
Global Action Plan (Nov. 2021), https://fairplayforkids.org/wp-content/uploads/2021/11/fbsurveillancereport.pdf.
102 Sam Machkovech, Report: Facebook Helped Advertisers Target Teens Who Feel “Worthless”, ArsTechnica (May
1, 2017), https://arstechnica.com/information-technology/2017/05/facebook-helpedadvertisers-target-teens-
who-feel-worthless/.
103 Id.
104 Jidong Huang et al., Vaping versus JUULing: how the extraordinary growth and marketing of JUUL transformed
the US retail e-cigarette market, 28 Tobacco Control 146, 150 (Feb. 22, 2019),
https://doi.org/10.1136%2Ftobaccocontrol-2018-054382 (“JUUL was one of the first major retail e-cigarette
on Facebook based on their interests in gambling, alcohol, and dieting.105 While Meta
announced in 2021 that they were restricting advertisers’ ability to target teens based on their
interests, this change was misleading, as the company’s ad targeting algorithm still used the
data it collected on young people to determine who is most likely to be vulnerable to a given
ad.106
Even in cases where the products aren’t as harmful as alcohol or dieting aids, surveillance
advertising exploits children. As Common Sense notes, “Kids may be profiled as gamers,
impulsive purchasers, or anxious oversharers – and then unfairly targeted by ads that
encourage more of these things.”107
Influencer marketing
Product placement and host-selling are not permitted on children’s television, where
regulations require clear separation between content that is advertising and content that is not.
The online marketing ecosystem does not have similar rules, and as a result, advertising,
entertainment, and informational content are deeply intertwined.
One of the ways that marketers reach kids and teens online is by advertising products through
influencers and trusted fictional characters. This method of advertising is highly appealing to
marketers because it is seen as more “authentic” and it capitalizes on the relationships that kids
and teens form with the characters and media figures they see online. This advertising sector is
huge and getting bigger. Market research shows that influencer marketing is currently growing
by billions of dollars annually.108 Influencer marketing reaches even the youngest kids online:
“kidfluencers” on YouTube receive millions of views on videos of themselves unboxing and
showing off new toys from brands and marketers.
Research demonstrates that influencer marketing overcomes children and teenagers’ nascent
cognitive ability to understand and defend themselves against advertising. For example, young
people identify closely with these media characters and figures and develop feelings or
brands that relied heavily on social media to market and promote its products.”); Julia Cen Chen-Sankey et al., E-
cigarette Marketing Exposure and Subsequent Experimentation Among Youth and Young Adults, 144 Pediatrics at
8 (Nov. 2019), https://doi.org/10.1542/peds.2019-1119; see also Erik Larson et al., Juul Reaches $439 Million
Settlement Over Marketing to Kids, Bloomberg Law, (Sept. 6, 2022), https://news.bloomberglaw.com/health-law-
and-business/juul-reaches-439-million-multi-state-settlement-over-marketing.
105 Farthing, Rys, et al., Profiling Children for Advertising: Facebook’s Monetisation of Young People’s Personal Data,
Reset Australia, (April 2021), https://au.reset.tech/uploads/resettechaustralia_profiling-children-for-advertising-
1.pdf.
106 Id. In February 2023, Meta announced yet another change to its ad targeting for teens and now claims it will not
use teens’ interests or online activities at all for the targeting of ads to minors. As of this writing, Fairplay has not
had the opportunity to verify this claim.
107 Joseph Jerome and Ariel Fox Johnson, AdTech and Kids: Behavioral Ads Need a Time-Out, Common Sense,
(2021), https://d2e111jq13me73.cloudfront.net/sites/default/files/uploads/AdTech%20and%20Kids.pdf.
108 Traackr, 2022 Influencer Marketing Impact Report at 2, (2022),
https://www.traackr.com/content/influencermarketing-impact-report-2022; State of Influencer Marketing 2022,
Influencer Marketing Hub at 10, (2022),
https://influencermarketinghub.com/ebooks/Influencer_Marketing_Benchmark_Report_2022.pdf.
friendships known as parasocial relationships.109 As a result of these relationships, kids and
teens have difficulty responding to content from a beloved character or creator as an
advertisement,110 and can therefore be unduly influenced by marketers. As Fairplay outlined in
its comments to the Federal Trade Commission last year, the existing system of disclosures –
even when it is followed – does very little to alert kids and teens to the massive amounts of
advertising content they encounter online every day.111
This form of stealth marketing negatively impacts kids and teens. Children who watch unboxing
videos are more likely to nag their parents for products and throw a tantrum if the answer is
“no” than when they watch regular commercials.112 In internal Meta research leaked by Frances
Haugen, teens specified that influencers and their materialistic, over-the-top “money for
nothing” – or effortlessly rich – lifestyles triggered social comparisons and contributed to young
people feeling bad about themselves. The research emphasized the cumulative effect of
influencer marketing: “However, users report seeing multiple pieces of content from celebrities
and influencers in each app session, multiplying their effect. In addition, their friends mimic
celebrities’ beauty and fashion standards, further compounding the effects of one piece of
content.”113
VI. Congress must take action to protect young people online.
When kids are in digital spaces for learning, socializing, and relaxing, they deserve the
opportunity for the most positive experience, designed in a way that understands and supports
their unique ways of seeing the world. They should be able to explore in developmentally-
appropriate ways without being manipulated into spending more time or targeted by
algorithms that amplify harmful content.
We cannot continue to hope that tech platforms will unilaterally disarm in the race for young
people’s valuable attention. Nor can we expect young people to extract themselves from the
109 Amanda N. Tolbert & Kristin L. Drogos, Tweens’ Wishful Identification and Parasocial Relationships With
YouTubers, 10 Frontiers In Psychology 1, (2019),
https://www.frontiersin.org/articles/10.3389/fpsyg.2019.02781/full; Frans Folkvord, K.E. Bevelander & Esther
Rozendaal, et al., Children’s bonding with popular YouTube vloggers and their attitudes toward brand and product
endorsements in vlogs: an explorative study, 20 Young Consumers Insight And Ideas For Responsible Marketers
(2019), https://doi.org/10.1108/YC-12-2018-0896.
110 Emmelyn Croes & Jos Bartels, Young adults’ motivations for following social influencers and their relationship to
identification and buying behavior, 125 Computers In Human Behavior at 7, (2021),
https://doi.org/10.1016/j.chb.2021.106910; 4 Brigitte Naderer, Jörg Matthes & Stephanie Schäfer, Effects of
disclosing ads on Instagram: the moderating impact of similarity to the influencer, 40 International Journal of
Advertising 686, 687-88 (2021).
111 See generally Comments of Fairplay, Alexander Neville Foundation, et al. in the Matter of Protecting Kids from
Stealth Advertising in Digital Media (filed July 18, 2022), https://fairplayforkids.org/wp-
content/uploads/2022/07/influencer-comments.pdf.
112 Harsha Gangadharbatla & Deepti Khedekar, The Role of Parental Mediation and Persuasion Knowledge in
Children’s Consumption of Unboxing Videos, 22 Advertising & Society Quarterly (2021),
https://muse.jhu.edu/article/813891.
113 The Wall Street Journal, Teen Girls Body Image and Social Comparison on Instagram – An Exploratory Study in
the U.S., (Sep. 29, 2021) https://s.wsj.net/public/resources/documents/teen-girls-body-image-and-social-
comparison-on-instagram.pdf.
exploitative platforms where their friends are, or expect overworked parents to navigate
confusing settings across multiple platforms and monitor every moment their kids are online.
The last time Congress passed a law to protect children online was 25 years ago. The digital
landscape has changed dramatically, in many unforeseen ways, since the passage of the
Children’s Online Privacy Protection Act in 1998, when smartphones, YouTube, social media,
multiplayer gaming with voice chat, and virtual reality didn’t even exist. In addition, COPPA only
covers children until they turn 13 and has failed to effectively keep kids ages 12 and under off
of platforms like Snapchat, Instagram and TikTok, leaving significant demographics vulnerable
to exploitation and harm. Congress’s continued inaction has emboldened Big Tech to develop
an exploitative business model without considering or mitigating its harmful effects on children
and teens. Consequently, the social media platforms that define youth culture and norms and
shape children’s values, behavior, and self-image were developed with little to no thought
given to how young people might be negatively affected.
We cannot expect a 25-year-old framework to adequately protect children from today’s
sophisticated persuasive technologies powered by big data and machine learning or in the
rapidly developing metaverse. We need new legislation that puts brakes on this harmful
business model and curbs dangerous and unfair design practices.
At a minimum, such legislation should:
1. Extend privacy protections to teens. Currently, COPPA only covers children until their
13th birthday. It is critical to limit the collection of adolescents’ data, which fuels
harmful recommendations and puts young people at risk of privacy harms.
2. Ban targeted advertising to children and teens to protect them from harmful marketing
targeted to their vulnerabilities. Surveillance ads not only take advantage of young
people’s developing capacities and sell them on harmful products, but they also
incentivize tech platforms to prioritize engagement over safety.
3. Require tech companies to make the best interests of children and teens a primary
consideration in the design and operation of their platforms, including their algorithms.
It is important that such liability be broad enough to capture current harmful practices,
such as quantified popularity, as well as emerging features and products. The latter is
particularly important given the rapid development of metaverse applications targeted
to young people.114 Companies should have a duty to prevent and mitigate harms to
young people before new features or products are released.
4. Prohibit the use of dark patterns, which are used to undermine young people’s
autonomy and manipulate them into spending more time or money on a platform.
5. Impose transparency requirements, including access to algorithms, that enable outside
researchers to better understand the impacts of social media on young people. We
114 See, e.g., Salvador Rodriguez, Meta Pursues Teen Users as Horizon Metaverse App Struggles to Grow, The Wall
Street Journal (Feb. 8, 2023) https://www.wsj.com/articles/meta-to-revamp-horizon-metaverse-app-plans-to-
open-for-teen-use-as-soon-as-march-11675749223.
shouldn’t have to rely on courageous whistleblowers like Frances Haugen to
understand how social media platforms are impacting our youth.
6. Require minors’ privacy and account settings to be on the most protective by default,
rather than putting the onus on youth or their parents to navigate a maze of confusing
settings just to have a safer, more age-appropriate experience.
7. Have a clear and effective enforcement mechanism, such as a division at the FTC, solely
dedicated to protecting young people and their privacy online.
The good news is that two bills which together would do all of the above, the Kids Online Safety
Act and the Children and Teens’ Online Privacy Protection Act, advanced out of the Commerce
Committee with broad bipartisan support last July – the first such legislation to advance out of
committee in more than two decades. The Committee votes came on the heels of a number of
important hearings with whistleblowers, child development experts, and tech executives in the
Senate Judiciary and Commerce Committees and House Energy and Commerce Committee,
which established a clear record of harm and the need for new online protections for young
people.
The bad news, of course, is that neither bill became law or even received a floor vote. And
every day that the status quo continues, children are suffering – and even dying – from
preventable harms.
We’ve named the problem and debated the solutions. Now it’s time to build on last year’s
momentum and disrupt the cycle of harm by passing privacy and safety-by-design legislation.
Let’s make 2023 the year that Congress finally takes a huge step toward creating the internet
children and families deserve.
Thank you again for having me here today and I look forward to discussing all of this with you.
Written Testimony of Emma Lembke,
Founder and Executive Director of the LOG OFF Movement
United States Senate Committee on the Judiciary: Protecting Our Children Online
February 14, 2023
My name is Emma Lembke. I am originally from Birmingham, Alabama. I am currently a
college sophomore studying Political Science at Washington University in St. Louis. I am
honored and humbled to be here today.
I created my first social media account on Instagram when I was 12. I was in 6th grade and I
was the last in my friend group allowed on social media platforms. At the time, I distinctly
remember watching these apps pull my friends' attention away from games of tag and
down, towards their screens. To 12-year-old me, these platforms almost seemed magical;
tools that could deepen society’s connective, expressive, and exploratory capabilities.
It felt as though I, a girl from Birmingham, Alabama, had the world at my fingertips, but as I
began to spend more time on these platforms, I was met with a harsh reality. Social media
was not magic. It was an illusion, a carefully designed product predicated on maximizing
my attention at the cost of my well-being.
As my screen time steadily increased, my mental and physical health suffered. The constant
quantification of my worth through likes, comments, and followers increased my anxiety
and deepened my depression. As a young woman, being exposed to unrealistic body
standards and harmful recommended content severely damaged my sense of self and led
me towards disordered eating. I became the living embodiment of Facebook’s own 2019
internal research finding that their platforms made body image issues worse for one in
three teen girls.
No matter the harm I incurred, addictive features like the endless scroll and autoplay
pulled me back into the online world where I continued to suffer. And there, I remained for
over three years, scrolling mindlessly for 5-6 hours a day. I eventually reached a personal
breaking point in the 9th grade that caused me to temporarily remove social media apps
from my device. I am still recovering today from the damage caused by social media and
hyper aware that many of its effects are long lasting, if not permanent.
Senators, my story does not exist in isolation – it is a story representative of my generation,
Generation Z. As the first digital natives, we grew up alongside technology. We have never
known a world without the internet. Every answer has been a Google search away, every
moment captured on Facebook or Instagram.
To be clear, social media can enhance our connective, expressive, and exploratory
capabilities, but we are only just beginning to understand the consequences associated
with growing up online. Yet, it is from our lived experience as Generation Z - the generation
most harmed - that we can begin to build the most promising solutions. Decision makers
from other generations must hear from us to fully understand the challenges and
opportunities associated with being a young person in the digital world. It is only when
young people are given a space at the table that effective solutions can emerge and safer
online spaces can be created. The power of youth voices in the space is far too great to
continue to be ignored.
This is why, as a senior in high school, after years of researching and reflecting on my own
relationship with social media, I founded the LOG OFF Movement. I knew a community had
to be created by young people for young people to tackle the complexities of social media
and its impact on younger generations.
Through LOG OFF, I have engaged with youth around the world who have shared their
experiences of harm with me. I’ve listened to stories of unwanted direct messages, vicious
cyberbullying, and dangerous pro-anorexia rabbit holes. While our stories may differ, as
young people we share the frustration of being portrayed as passive victims of Big Tech
when in reality, we are ready to be included as active agents of change, rebuilding new and
safer online spaces for the next generation. Ten years from now, social media will not be
what it is today; it will be what people of my generation build it to be. We want to build it
differently, we want to build it right.
I came here today as the representative for those young changemakers. To be the voice not
just of those of my generation who have been harmed or who are currently struggling, but
as a voice for all the 12-year-old girls yet to come. The genie is out of the bottle, and screen
time across younger generations is only increasing, with the share of US teenagers who are
online continuously nearly doubling from 2015 to 2018, from 24% to 45%. In 2020, 81% of 14- to
22-year-olds said they used social media either “daily” or “almost constantly.”
As a society, we will never go back to a time where social media does not exist, nor should
we. But make no mistake, unregulated social media is a weapon of mass destruction that
continues to jeopardize the privacy, safety, and wellbeing of all American youth. This harm
does not stop at the borders of the United States, this is a global crisis. The United States
has a unique opportunity to lead the world in putting a stop to predatory and targeted
actions by Big Tech against the world’s most vulnerable.
It’s time to act and, Senators, I urge you to meaningfully regulate these companies not just
for my generation but with my generation. Integrating our lived experience into the
regulatory process is essential to getting it right.
Thank you for having me here today. I look forward to answering your questions.
TESTIMONY OF
CEO John Pizzuro, Raven
Commander, New Jersey Internet Crimes Against Children (Ret)
New Jersey State Police (Ret)
for the
UNITED STATES SENATE
COMMITTEE ON THE JUDICIARY
Protecting Our Children Online
February 14, 2023
Chairman Durbin, Ranking Member Graham, and distinguished Senators, thank you for the
opportunity to testify today on Protecting Our Children Online. For me, there is no more
significant issue than safeguarding our children, as well as those who protect them from harm.
I wish I did not have to be here to testify on this issue because it would mean our children are
safe when they go online. The truth is, we have not protected our children sufficiently due to the
ever-increasing use of social media apps and the growth of their online lives. Their risk for harm
has increased at such a significant pace that shielding them from abuse and exploitation has
become untenable. To quote a sentiment shared by thousands of global experts in this space:
“We cannot arrest our way out of this problem.” Today there are countless victims of Child
Sexual Abuse Material (CSAM), sextortion, and other exploitative crimes. The sad reality is that
we are failing to protect our children from the threats they face online.
Those who would protect our youth are overburdened and under-resourced, which makes
children vulnerable. Our nation’s young people are unable to escape from the bombardment of
posts, reels, and online social interaction. A major disadvantage of our global society is that any
offender can reach any victim, anywhere in the world, through any app or gaming platform. We
live in a world where everyday tasks increasingly are accomplished through apps, from
shopping, to making a flight reservation, to – sadly – even children buying drugs.
I am here today as the CEO of Raven, an advocacy group comprised of 14 professionals,
including nine retired Internet Crimes Against Children (ICAC) Task Force Commanders, who
have committed their lives to the advocacy and protection of children. The Internet Crimes
Against Children Task Force Program (ICAC program) helps state and local law enforcement
agencies develop an effective response to technology-facilitated child sexual exploitation and
Internet crimes against children. The ICAC program is a national network of 61 coordinated task
forces, with at least one in each state, representing more than 4,700 federal, state, and local law
enforcement and prosecutorial agencies. These agencies are engaged in both proactive and
reactive investigations, forensic investigations, and criminal prosecutions. This ICAC program
also encompasses training and technical assistance, victim services, and community education.1
1 The ICAC Task Force program was developed in 1998 in response to the increasing number of children and
teenagers using the Internet, the proliferation of child sexual abuse images available electronically, and heightened
online activity by predators seeking unsupervised contact with potential underage victims. The Providing Resources,
I am retired from the New Jersey State Police, where I served as the Commander of the Internet
Crimes Against Children task force from 2015 to 2021. I personally experienced the struggles of
how best to protect our children online. We witnessed children targeted by offenders across all
platforms – no social media or gaming platform was safe, from apps such as Snapchat, Twitter,
Kik, Telegram, Discord, LiveMe, and Meetme, to gaming platforms and online games such as
Minecraft, Roblox, and Fortnite. And these represent just a fraction of the places where
offenders regularly interact with children. If a platform allows individuals to chat, or provides a way to
share photographs and videos, I assure you there is a very real danger that offenders are using
that access to groom or sexually exploit minors. Sadly, in addition to sexual exploitation, the
platforms allow children to buy drugs such as Fentanyl.2
Our children’s world has become focused on “likes,” followers, and views, and in this way social
media exploits vulnerabilities in our children’s psychology. In an interview with Axios, the
former President of Facebook stated, “That means that we needed to sort of give you a little
dopamine hit every once in a while, because someone liked or commented on a photo or a post
or whatever ... It's a social-validation feedback loop ... You're exploiting a vulnerability in human
psychology ... [The inventors] understood this, consciously, and we did it anyway.” 3
That interview occurred on November 9, 2017 – more than five years ago – and our dependence
on technology has only increased. Cell phones have become ubiquitous, even in elementary
schools, providing offenders with an entirely new way to exploit children on the playground.
Children are made vulnerable on these platforms as the result of poor moderation, the absence of
age or identity verification, and inadequate or missing safety mechanisms. Of course, as the
amount of screentime has increased, so has the likelihood that children can be groomed and
manipulated.
Grooming is defined as simply manipulating and gaining a child’s trust, but it is much more than
that. Grooming is what offenders do to victimize children, and it happens daily to unsuspecting
children who cannot see the danger. Children do not recognize the threat online because they
primarily engage in their online world in a safe place. As a result, the amygdala, the fear center
of their brain, is not activated, and children do not see the danger. This is what offenders will
capitalize on.
While sending compliments, virtual currency, gift cards, and other incentives are certainly part
of grooming, today’s offenders do even more to gain children’s trust. Offenders research
children to learn what they like and do not like, what music they listen to, and so on. The offender
will then mirror their words and repeat their exact language. The child will then see someone who
Officers, and Technology to Eradicate Cyber Threats to Our Children Act ("the PROTECT Act") of 2008 (P.L. 110-401,
codified at 42 USC 17601, et seq.), authorized the ICAC program through FY 2013. On November 2, 2017, the
Providing Resources, Officers, and Technology to Eradicate Cyber Threats to (PROTECT) Our Children Act of
2017 was signed into law, reauthorizing the ICAC Task Force Program through FY 2022. More information is
available at https://www.icactaskforce.org/.
2 https://ktla.com/news/local-news/mother-mourns-sons-death-from-fentanyl-laced-drugs-purchased-on-snapchat/.
3 https://www.axios.com/2017/12/15/sean-parker-facebook-was-designed-to-exploit-human-vulnerability-
1513306782
is just like them. Chat forums on Tor share success stories about grooming children of
all ages. Each offender will attempt to groom hundreds of children using various techniques
beyond just sending a picture or a video. We discuss numerous “in real life” dangers in school
curriculums, yet online grooming is not among them.
As the New Jersey ICAC Commander, I struggled with the significant increases in
investigations, arrests, and victims we faced each year. For example, in 2015 we received 2,315
Cybertips and made 125 arrests, and by the end of 2019 we had 8,000 Cybertips and we made
420 arrests. We understood the importance of trying to keep up, but even creative attempts to
“do more with less” became unsustainable. And this was prior to COVID, when screentime
increased substantially and cemented our children’s reliance on apps. These challenges were
frustratingly present with every ICAC task force across the United States. The most staggering
increase we faced was self-generated CSAM cases – children taking sexual images of themselves
at the request of offenders. These were not images of older teens sending photos of themselves
to their boyfriends and girlfriends – we began to see images of 7, 8, and 9-year-olds in sexual
poses. The online landscape is horrifying because offenders know this is where our children
live, and they recognize there are not enough safeguards to keep them at bay.
During one case, I received a call from a Child Advocacy Center in another state. The advocate
told me a mother had just arrived with her 8-year-old daughter after she found sexual abuse
videos on the child’s phone. An offender had obtained a sexually abusive video of an 11-year-old
girl, and then used that video to coerce 60 children to share sexually explicit videos of
themselves. This included a video of a 12-year-old girl abusing her 1½-year-old brother. These
child victims were located throughout the United States and Canada and were using a popular
live-streaming app. This is one example of thousands of cases throughout the United States and
the globe.4
The Protect Our Children Act of 2008 created a funding mechanism for Internet Crimes Against
Children task forces that are responsible for 90% of the child exploitation investigations in the
United States. But things have changed in this space since 2008. In 2008 there was an average of
one computer per household. Today, families in the U.S. have an average of 20 Internet-capable
devices, including phones, tablets, laptops, and gaming consoles. And the volume of data
investigators must comb through to find victims has increased significantly.
Today, law enforcement is often unable to proactively investigate child exploitation cases due to
the volume of Cybertips. As a result of the exponential increase in Cybertips (these tips increased
by 2,800% between 2012 and 2021), law enforcement agencies have been forced to become
4 https://www.app.com/story/news/crime/2019/09/24/lakewood-sex-offender-had-more-than-1-000-images-child-
porn-his-iphone-feds-say/2435710001/.
reactive, and most can no longer engage in the proactive operations that are designed to target
the most dangerous offenders.5
It is important to understand that the CyberTipline is challenging law enforcement not only with
respect to the quantity of leads, but also the quality of leads. Most of the investigative leads
provided by service providers, through NCMEC, to the ICAC Task Forces are not actionable,
meaning they do not contain sufficient information to permit an investigation to begin. The lack
of uniformity in what is reported by service providers results in law enforcement being forced to
sort through thousands of leads, trying desperately to identify worthwhile cases: those involving
abusers and offenders who are considered particularly sadistic and dangerous. The Ackerman
case out of the Fourth Circuit, and the Wilson case out of the Ninth Circuit, have also increased
the burden on law enforcement officers trying to review CyberTips.
As noted above, the sheer volume of Cybertips also prevents law enforcement from pursuing
proactive investigative efforts that would efficiently target the most egregious offenders. For
example, peer-to-peer file sharing investigations and operations used to allow ICAC Task Forces
to efficiently locate and apprehend hands-on offenders.6 In the last 90 days alone, 99,172 IP
addresses throughout the United States have distributed known CSAM images and videos
through peer-to-peer networks. Yet only 782, less than 1%, are being
investigated (see Exhibit 1). Consistently, 75% of these cases have resulted in successful
prosecutions. Significantly, the most rigorous studies involving interviews with offenders have
shown that between 57% and 85% of individuals arrested for these crimes have committed
undetected sexual abuse of minors; on average, those offenders have assaulted between 10 and 13
victims.7 Due to the overwhelming volume of Cybertips, law enforcement is simply not
investigating peer-to-peer to the degree that it wants and should.
EXHIBIT 1
5 Reactive investigations take place when law enforcement receives information, such as a CyberTip, that a crime
has occurred. A proactive investigation involves the use of intelligence to try to identify potential offenders.
6 https://www.nj.gov/njsp/news/2016/20160818.shtml
7 https://www.ojp.gov/ncjrs/virtual-library/abstracts/butner-study-redux-report-incidence-hands-child-victimization-
child.
ICAC Task Forces throughout the United States used to regularly conduct undercover operations
targeting offenders who traveled to meet and assault individuals they believed were 10- to 14-
year-olds. All of these undercover investigations are performed using social media apps or
online ads that solicit the sexual assault of children. When arrests are made, investigators rarely
find it is the first time the offender has traveled to sexually abuse a child.
These offenders bring drugs, alcohol, sex toys, and other paraphernalia. In one case, an offender
brought a dog leash and collar so he could be “walked” by a 12-year-old.8 Task forces
throughout the U.S. would conduct these operations on a routine basis, and they were very
successful. The North Florida ICAC task force, for example, conducted 48 of these operations,
arresting thousands of individuals, and obtained a conviction rate of 98.7%. Unfortunately, task
forces are no longer able to perform these types of operations; they are resource-intensive, and
the volume of reactive cases prohibits it.
The Darknet, including Tor, has become the newest online haven for child exploitation.9 Some
forums and boards contain the most abusive child exploitation videos and images law
enforcement has encountered. Chat forums allow offenders to create “best practices” on how to
groom and abuse children effectively. A post named the “Art of Seduction,” which explained how
to “seduce” children, was read more than 54,000 times. Other posts discuss the best way to
introduce sexual activity to children without alarming them or offer such topics as “Thoughts on
having oral sex with 0-2-year-olds.” These conversations are horrific, yet Tor is easily
downloaded as a web browser, and children and teens can install it on their phones and begin
accessing it within minutes.
8 https://www.nj.gov/oag/newsreleases19/pr20190424a.html.
9 The Dark Net is an encrypted portion of the internet that is not indexed by search engines where users can
communicate anonymously without divulging identifying information, such as a user's location. Tor is one network
on the Dark Net.
In one undercover operation a registered sex offender paid to sexually abuse an 11-year-old,
spoke about how he was able to victimize his two-year-old nephew, and described how he
groomed children into providing him with child sexual abuse videos. 10 The offender sent screen
shots of his texts with children with whom he had connected using Kik, which revealed his
technique for convincing them to send him sexually explicit material. He admitted sexually
assaulting a massage therapist and indicated he wanted to kidnap an eight-year-old child, but he
was afraid of being caught.
Another offender, a Jersey City police officer, used the Wickr and Kik apps to communicate with
his victims. He used those apps to communicate with undercover investigators, where he attempted to
pay to sexually assault an 8-year-old and a 10-year-old girl. He then traveled to Atlantic City with
condoms and cash, with the intent of abusing the children. These are just a few examples of the
depravity that law enforcement deals with daily. The crimes that lead to their apprehension are
nearly always only the tip of the iceberg – there is never just one victim.
The details of these undercover investigations shock the conscience. There is no shortage of case
reports describing the sexual abuse of 11-year-olds. Or a mother who is targeted by an offender
because her 5-year-old is too young to text but is of the age of interest to the offender. Or the
offender who brought a stuffed animal for the 10-year-old he was going to rape, along with a
bottle of Viagra and other sexual devices for when the Viagra failed.
These cases do not affect only our children. They also impact the law enforcement
community. Investigators, prosecutors, child advocacy professionals, and everyone involved in
investigating these horrendous acts must bear witness to depraved images, sounds, words, videos,
and case specifics that erode their mental health. The toll these cases place on law enforcement’s
mental state comes with a price. We need to support these law enforcement professionals from a
wellness standpoint. Many times, our law enforcement professionals suffer in silence with
limited resources. Every day I would come to work and worry about the damage these cases do
to the people investigating them. I am concerned about the lack of resources available
to the law enforcement community from a wellness standpoint. No one can prepare you for what
you see in these cases; once you see them, they are challenging to unsee. These cases will stay
with investigators throughout their lives, to the detriment of themselves and their families.
The reality is everything happens online. Offenders, including registered sex offenders, are
lurking in the same places where our children are communicating with their friends or playing
online games. There is very little to stop these predators from communicating with, and then
grooming, any child they perceive as vulnerable. Those who seek to police these spaces are in
need of significant help if they are to bring about change.
This past summer, I took a short walk on the beach in Point Pleasant. It was a beautiful 80-
degree day, and along my half-mile walk I counted 67 children and teens on their phones, 12 of
whom were making a TikTok video. I then came across a four-year-old who was lost and could
not find his parent. Statistically, at least 1/4 of those children will be victimized. We are at a
point where we need to identify what works and provide authorities with sufficient resources to
10 https://www.justice.gov/usao-edca/pr/sacramento-county-man-sentenced-25-years-prison-sexual-exploitation-
child.
increase their protective capabilities. Children need our help. Every day, social media
companies write posts and release one press release after another in which they tout their
successes at keeping children safe. While appreciated, these actions constitute mere drops in the
bucket. One can simply look at the statistics to determine the real story: what is truly happening
to our children. Based on what I have experienced, I can confidently tell you three things: at
the moment the predators are winning, our children are not safe, and those who are fiercely
committed to protecting them are drowning and will continue to do so unless we can get them the
resources they need.
Written Testimony
of
Mitch Prinstein, PhD, ABPP
Chief Science Officer
American Psychological Association
Protecting Our Children Online
Before the U.S. Senate Committee on Judiciary
February 14, 2023
Chairman Durbin, Ranking Member Graham, and members of the Judiciary Committee,
thank you for the opportunity to testify today on the online dangers facing our children and teens.
I am Dr. Mitch Prinstein, Chief Science Officer at the American Psychological Association (APA).
APA Services, Inc. is the companion organization of the American Psychological Association,
which is the nation’s largest scientific and professional nonprofit organization representing the
discipline and profession of psychology, as well as over 146,000 members and affiliates who are
clinicians, researchers, educators, consultants, and students in psychological science. Through the
application of psychological science and practice, our association’s mission is to use psychological
science and information to benefit society and improve lives.
I am grateful you have called attention to youth and the online environment. Our youth are
struggling in many ways, largely due to our society’s failure to adequately attend to child and
adolescent mental health.
My testimony is broken down into the following sections to help inform the Committee
about the complexities of the challenges before us and to help shape policy solutions:
• Overview
• Online/Social Media Behaviors and Youth Mental Health
• Psychological Effects of Lost Opportunities While Youth Are Online
• Potential Solutions and Policy Implications
Overview
Today, we are seeing the repercussions of our underinvestment and lack of focus on
children’s mental health. Depression rates for teens doubled between 2009 and 2019. Suicide
is the second leading cause of death for U.S. youth, up 4% since 2020; one in five teens
considered suicide during the pandemic; and eating disorder emergency room admissions for girls
12 to 17 years old have doubled since 2019 1. Furthermore, since the start of the pandemic, over
167,000 children have lost a parent or caregiver to the virus 2. This kind of profound loss can have
significant impacts on the mental health of children, leading to anxiety, depression, trauma, and
stress-related conditions 3. Faced with such data, in December 2021, the U.S. Surgeon General
issued an advisory calling for a unified national response to the mental health challenges young
1Radhakrishnan, L. (2022). Pediatric Emergency Department Visits Associated with Mental Health Conditions
Before and During the COVID-19 Pandemic — United States, January 2019–January 2022. MMWR. Morbidity and
Mortality Weekly Report, 71(8). https://doi.org/10.15585/mmwr.mm7108e2; Curtin, S. (2022). Vital Statistics
Rapid Release Provisional Numbers and Rates of Suicide by Month and Demographic Characteristics: United
States, 2021. https://www.cdc.gov/nchs/data/vsrr/vsrr024.pdf; Daly, M. (2021). Prevalence of Depression Among
Adolescents in the U.S. From 2009 to 2019: Analysis of Trends by Sex, Race/Ethnicity, and Income. Journal of
Adolescent Health. https://doi.org/10.1016/j.jadohealth.2021.08.026; Suicide. (n.d.). National Institute of Mental
Health (NIMH). Retrieved February 10, 2023, from
https://www.nimh.nih.gov/health/statistics/suicide#%3A~%3Atext%3DSuicide%20is%20a%20Leading%20Cause%
20of%20Death%20in%20the%20United%20States%2C-
According%20to%20the%26text%3DSuicide%20was%20the%20second%20leading%2Cages%20of%2035%20and
%2044; Yard, E. (2021). Emergency Department Visits for Suspected Suicide Attempts Among Persons Aged 12–25
Years Before and During the COVID-19 Pandemic — United States, January 2019–May 2021. MMWR.
Morbidity and Mortality Weekly Report, 70(24), 888–894. https://doi.org/10.15585/mmwr.mm7024e1.
2 Hidden Pain: Children Who Lost a Parent or Caregiver to COVID-19 and What the Nation Can Do To Help Them
| COVID Collaborative. (n.d.). Www.covidcollaborative.us. https://www.covidcollaborative.us/initiatives/hidden-
pain.
3 Almeida, I. L. L., Rego, J. F., Teixeira, A. C. G., & Moreira, M. R. (2021). Social isolation and its impact on child
and adolescent development: a systematic review. Revista paulista de pediatria: orgao oficial da Sociedade de
Pediatria de Sao Paulo, 40, e2020385. https://doi.org/10.1590/1984-0462/2022/40/2020385.
people are facing 4. The rarity of such advisories further underscores the need for action to help
stem the mental health crisis of children and adolescents.
There are many reasons why youth are experiencing this crisis today, and it is likely that
there are simultaneous contributors to the outcomes presented above. Today, we are here to talk
about whether youths’ engagement with social media, and other online platforms, may be a
relevant factor. Many psychological scientists, including myself and my colleagues, have been
asking this same question for years. We seek to understand how this new context in which youths’
social interactions occur may be related to development, including potential benefits or risks that
may be conferred by the online environment. As the discipline with expertise on all of human
behavior, our work has been broad in scope; and to date, our focus has been on the adolescent
period, during which more complex and mature behaviors are developed through intricate and
precise interactions among neural, biological, social, contextual, and social systems. Today,
although this remains a relatively nascent body of research, I would like to share what we know
so far, so policymakers, educators, parents, caregivers, and youth can learn from what we are
beginning to discover and make choices that will ensure the safety of youth.
In this testimony, I outline emerging research with findings that have begun to suggest
possible benefits, as well as possible adverse effects, of technology and social media use on
adolescent development. I also present legislative and regulatory solutions that, if enacted, would
represent positive steps towards learning more about, and hopefully solving, this problem. I am
calling for new legislation and regulations that increase research funding and provide education
on how children can use online platforms without experiencing the most harmful impacts;
legislation that creates a requirement that social media companies protect the well-being of child
users; legislation that prohibits problematic business practices and prevents companies from
tricking and manipulating users; and bills that provide more leverage for federal regulators to
4 Richtel, M. (2021, December 7). Surgeon General Warns of Youth Mental Health Crisis. The New York Times.
https://www.nytimes.com/2021/12/07/science/pandemic-adolescents-depression-
anxiety.html#:~:text=The%20United%20States%20surgeon%20general.
clamp down on known harmful impacts while building internal expertise to prepare to tackle newly
discovered harms. APA supported these efforts in past Congresses and commits to work to see
these proposals enacted because, as I present below, scientific data are beginning to suggest areas
of serious concern that must not be allowed to continue unchecked.
Before we discuss specific impacts of online platforms or solutions, it is important to
acknowledge that causal data are not available for many of these issues, since the experimental
designs needed to make cause-and-effect statements would be considered unethical or require
access to currently inaccessible data. This underscores the need for increased access to data and
funding for high-quality research. However, as with non-causal research revealing the effects of
childhood adversity on mental health, or the effects of combat on PTSD among veterans, extant,
rigorous science can nevertheless allow us to reach reasonable conclusions that can shape policy.
It also is important to acknowledge that technology and social media may not, in
themselves, be problematic for child development, as each device and platform offers a multitude
of features and communication opportunities that users can choose from. Extensive research has
demonstrated that the amount of screentime alone is not likely associated with negative
psychological outcomes among youth 5. Moreover, not all youth exposed to identical stimuli are
affected in the same ways. Thus, the most appropriate question is: what specific online behaviors,
features, or content may be associated with benefit or risk to which youth? This is the focus of the
most recent work among psychological scientists, yielding some comforting, but also some
worrying results.
But first, to understand the role of social media in youths’ development, it is necessary to
understand the role of social interactions more generally at this critical developmental stage.
5 Odgers CL, Jensen MR. Annual Research Review: Adolescent mental health in the digital age: facts, fears, and
future directions. J Child Psychol Psychiatry. 2020;61(3):336-348. doi:10.1111/jcpp.13190.
Children’s interactions with peers are not merely for fun. It is within the social context that
most children’s education occurs; thus, peer interactions significantly affect cognitive
development. The peer context also is the milieu in which children learn social rules, norms, and
expectations; develop emotional competence and morality; and in which all of children’s behaviors
are consistently reinforced (or corrected), thus influencing long-term behavioral development.
Indeed, numerous studies have revealed that children’s interactions with peers have enduring
effects on their occupational status, salary, relationship success, emotional development, mental
health, and even on physical health and mortality over 40 years later 6. These effects are stronger
than the effects of children’s IQ, socioeconomic status, and educational attainment. These
enduring effects likely occur because of remarkably powerful and reciprocal interactions between
youths’ social experiences and their biological development. Children’s brains and peripheral
nervous systems influence how they interact with peers, and in turn, those experiences change the
development of their brain structures, neural pathways, and even how their nervous system
responds to stress throughout their lives.
Our brains, our bodies, and our society have been evolving together to shape human
development for millennia, influencing our communities, our culture, and our society. Within the
last twenty years, the advent of portable technology and social media platforms is changing what
took 60,000 years to evolve. We are just beginning to understand how this may impact youth
development.
I will first discuss the potential effects of technology and social media use on youth mental
health. This will include an outline of five main issues emerging from the research, including the
risks of pre-adulthood use of social media, the ramifications that come from unmonitored (and
“liked”) content online, the potential effects of digital stress, the encouragement of social
comparisons, and research demonstrating benefits of social media use among youth. In the
6 For a review, see: Prinstein, M. J., & Giletta, M. (2020). Future Directions in Peer Relations Research. Journal of
Clinical Child & Adolescent Psychology, 49(4), 556–572. https://doi.org/10.1080/15374416.2020.1756299.
following section, I will discuss the psychological effects of opportunities lost while youth spend
time online. Last, I will discuss potential solutions and policy recommendations.
Online/ Social Media Behaviors and Youth Mental Health
Pre-adulthood use of technology and social media may be particularly concerning. There
is reason to be significantly concerned about the age at which many youth begin using
technology and social media. Developmental neuroscientists have revealed that there are two
highly critical periods for adaptive neural development. Aberrations in our brain growth during
these periods may have lifetime implications. One of these is the first year of life. The second
begins at the outset of puberty and lasts until early adulthood (i.e., from approximately 10 to 25
years old). This latter period is highly relevant, as this is when a great number of youths are
offered relatively unfettered access to devices and unrestricted or unsupervised use of social
media and other online platforms 7. Within the age range of 10-25 years, change occurs
gradually and steadily; thus risks likely are greater towards the beginning of this range and
become attenuated as youth mature. Herein, this period is referred to as “pre-adulthood.”
At the outset of puberty, adolescents’ brains begin developing in a specific, pre-determined
sequence. Generally, sub-cortical areas shared with many mammalian species mature before areas
at the top layer of the brain, which is responsible for many of our more human capabilities, such
as premeditation, reflection, and inhibition. Among these initial areas developing among most
youth, typically starting at the ages of 10-12 years old, are regions associated with our craving for
“social rewards,” such as visibility, attention, and positive feedback from peers. In contrast,
regions involved in our ability to inhibit our behavior, and resist temptations (i.e., the prefrontal
cortex) do not fully develop until early adulthood (i.e., approximately 10-15 years later). In other
words, when it comes to youths’ cravings for social attention, they are “all gas pedal with no
7 Vogels, E. A., Gelles-Watnick, R., & Massarat, N. (2022, August 10). Teens, social media and technology 2022.
Pew Research Center. https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022/.
brakes.” Adolescence is thus a developmentally vulnerable period during which youth may be
especially motivated to pursue social rewards, and not yet fully capable of restraining themselves.
Research suggests that technology and social media use may exploit this biological
vulnerability among youth. Data reveal that social media stimuli, such as receiving “likes” or
followers, activate the social reward regions of the brain 8. In other words, these features of social
media capitalize on youths’ biologically based need for social rewards before they are able to
regulate themselves from over-use. This has at least four significant implications for youth mental
health.
Social Media and Loneliness. Although ostensibly social media platforms are built to
foster interpersonal contacts and connections, they are not designed primarily to foster meaningful
and mutually rewarding relationships that confer psychological benefits. Relationships are most
beneficial to youths’ psychological development when they are characterized by support,
emotional intimacy, disclosure, positive regard, reliable alliance (e.g., “having each other’s
backs”), and trust 9. It is possible to use social media to foster exactly these types of relationship
qualities, such as through direct messaging features. However, these are not the functions that are
highlighted on most platforms. More typically, users are directed towards the number of “likes,”
followers, or reposts they received, often without immediate access to the identity of those who
engaged with their profile or content. In other words, platforms are more apt to orient users
towards their metrics than towards people themselves, which has led many youth to upload curated or
filtered content to portray themselves most favorably. Note that these features of social media, and
the resulting behaviors of those who use them, create the exact opposite of the qualities needed
for successful and adaptive relationships (i.e., disingenuous, anonymous, depersonalized). In other
8 Sherman, L. E., Hernandez, L. M., Greenfield, P. M., & Dapretto, M. (2018). What the brain 'Likes': neural
correlates of providing feedback on social media. Social cognitive and affective neuroscience, 13(7), 699–707.
https://doi.org/10.1093/scan/nsy051.
9 Furman, W., Bukowski, W. M., Newcomb, A. F., & Hartup, W. W. (1996). The company they keep: Friendship in
childhood and adolescence. Cambridge studies in social and emotional development. In W. Bukowski., A.
Newcomb & W. Hartup (Eds), The measurement of friendship perceptions: Conceptual and methodological (41-65).
words, social media offers the “empty calories of social interaction,” that appear to help satiate our
biological and psychological needs, but do not contain any of the healthy ingredients necessary to
reap benefits. Anecdotally, teens’ behavior bears this out – the “Finsta” phenomenon reflects
digital natives’ attempt to find more honest and intimate relationships with one another, but
without experience in doing so first offline. Scientific data also support this claim; research reveals
that in the hours following social media use, teens paradoxically report increases rather than
decreases in loneliness 10.
Heightened Risk for Negative Peer Influence. Adolescents frequently are exposed to
content online depicting illegal, immoral, dangerous, and unethical behavior. The architecture of
many social media platforms allows users to like, repost, or comment on this content. Emerging
data suggest that these features of social media present a significant risk to adolescents’ mental
health. Specifically, data reveal that social media may change adolescents’ susceptibility to
maladaptive behavior through both biological and psychological pathways. Research examining
adolescents’ brains while on a simulated social media site, for example, revealed that when
exposed to illegal, dangerous imagery, activation of the prefrontal cortex was observed, suggesting
healthy inhibition towards maladaptive behaviors. However, when these same images were shown
with icons indicating that they were “liked” on social media, there was a significant decrease in
activation of the brain’s inhibition center, suggesting that the “likes” may reduce youths’
inhibition (i.e., perhaps increasing their proclivity) towards dangerous and illegal behavior.11 This
is evidence that social media features are changing how youths’ brains respond to images in ways
that confer risk for the development of maladaptive behavior.
10 Armstrong-Carter, E., Garrett, S. L., Nick, E. A., Prinstein, M. J., & Telzer, E. H. (2022). Momentary links
between adolescents’ social media use and social experiences and motivations: Individual differences by peer
susceptibility. Developmental Psychology. Advance online publication. https://doi.org/10.1037/dev0001503.
11 See for example, Sherman, L. E., Hernandez, L. M., Greenfield, P. M., & Dapretto, M. (2018). What the brain
'Likes': neural correlates of providing feedback on social media. Social cognitive and affective neuroscience, 13(7),
699–707. https://doi.org/10.1093/scan/nsy051.
There also is evidence that these features of social media may promote a psychological
affinity for dangerous and risk-taking behavior. For instance, a study of young high school
students revealed that adolescents’ exposure to “liked” posts depicting alcohol use was associated
with changes in teens’ perceptions of their peers’ acceptance of alcohol use, which in turn predicted
these same teens’ early engagement in heavy episodic drinking (i.e., five or more drinks on a single
occasion) 12. Related research has demonstrated that individuals are more likely to “like” a post
that they see others have “liked” before them, and this may increase the likelihood of exposure to
similarly themed posts via AI-derived algorithms 13. These findings illustrate clear and powerful
ways that the features embedded in social media platforms may have an important and highly
concerning effect on youth mental health. Note, it is also possible that these same processes can
be used to influence peers towards positive behaviors; however, this has not been adequately
investigated.
Risks for Addictive Social Media Use. Youths’ biological vulnerabilities also have
significant implications for “problematic social media use” or addictive behaviors; note that the
regions of the brain activated by social media use overlap considerably with the regions involved
in addictions to illegal and dangerous substances 14. As noted above, the developing brain is built
to increase a desire for social rewards (that social media delivers abundantly), without the ability
to show the capacities of inhibition and restraint capable among adults. This suggests that youth
may be at risk for extraordinarily frequent uses of social media. Several bodies of research reveal
that this indeed may be a very significant concern. For instance, data suggest that almost half of
12 Nesi J, Rothenberg WA, Hussong AM, Jackson KM. Friends’ Alcohol-Related Social Networking Site Activity
Predicts Escalations in Adolescent Drinking: Mediation by Peer Norms. J Adolesc Health. 2017;60(6):641-647.
doi:10.1016/j.jadohealth.2017.01.009.
13 Egebark J, Ekström M. Liking what others “Like”: using Facebook to identify determinants of conformity. Exp
Econ. 2017;21(4):1-22. doi:10.1007/s10683-017-9552-1.
14 De-Sola Gutiérrez, J., Rodríguez de Fonseca, F., & Rubio, G. (2016). Cell-Phone Addiction: A Review. Frontiers
in Psychiatry, 7(175). https://doi.org/10.3389/fpsyt.2016.00175; Griffiths, M. D., Kuss, D. J., & Demetrovics, Z.
(2014). Social networking addiction: An overview of preliminary findings. In K. P. Rosenberg & L. Curtiss Feder
(Eds.), Behavioral addictions: Criteria, evidence, and treatment (pp. 119–141). Elsevier Academic Press.
https://doi.org/10.1016/B978-0-12-407724-9.00006-9; Kirby, B., Dapore, A., Ash, C., Malley, K., & West, R.
(2020). Smartphone pathology, agency and reward processing. Lecture Notes in Information Systems and
Organisation, 321-329. https://doi.org/10.1007/978-3-030-60073-0_37.
all adolescents report that they use social media “almost constantly” 15. Research also has
compared social media use to diagnostic criteria for substance use dependencies, revealing that
many adolescents report an inability to stop using social media, even when they want to,
remarkable efforts to maintain access to social media, the use of social media to regulate their
emotions, a need for increasing social media use to achieve the same level of pleasure (i.e.,
tolerance symptoms), withdrawal symptoms following abstinence, and significant impairment in
their daily educational, social, and work routines. A recent study revealed that over 54% of 11- to 13-
year-old youth reported at least one of these symptoms of problematic social media use 16. About
85% of youth report spending more time than intended online, and 61% report failing when
trying to stop or reduce their use of social media 17.
Alterations in Brain Development. Youths’ biological vulnerability to technology and
social media, and their resulting frequent use of these platforms, also has the potential to alter
youths’ neural development since our brains develop in response to the environment we live in.
Recent studies have revealed that technology and social media use is associated with changes in
structural brain development (i.e., changing the size and physical characteristics of the brain). In
addition, research with my own colleagues at the University of North Carolina at Chapel Hill
recently has revealed that technology and social media use also is associated with changes in how
the brain works. Our data have revealed that youth indeed spend a remarkable amount of time
using their devices 18. Objective data measured by teens’ phones themselves indicated that the
average number of times that youth in sixth grade picked up their phones was over 100, with some
interrupting daily activities to pick up their phones over 400 times a day. On average, adolescents
15 Vogels, E. A., Gelles-Watnick, R., & Massarat, N. (2022, August 10). Teens, social media and technology 2022.
Pew Research Center. https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022/.
16 Boer M, Stevens GWJM, Finkenauer C, van den Eijnden RJJM. The course of problematic social media use in
young adolescents: A latent class growth analysis. Child Dev. 2022;93(2):e168-e187. doi:10.1111/cdev.13712
17 The Common Sense Census: Media Use by Tweens and Teens. (2021).
https://www.commonsensemedia.org/sites/default/files/research/report/8-18-census-integrated-report-final-
web_0.pdf.
18 Armstrong-Carter, E., Garrett, S. L., Nick, E. A., Prinstein, M. J., & Telzer, E. H. (2022). Momentary links
between adolescents’ social media use and social experiences and motivations: Individual differences by peer
susceptibility. Developmental psychology.
also reported an average of 8.2 hours of time on their devices each day, with some logging double
this amount 19. The phone “apps” adolescents picked up their devices to use most often were
popular social media platforms. Our research using annual fMRI brain scans revealed that more
frequent use of adolescents’ devices (i.e., predominantly for social media) was associated with
changes in how their brains developed. More phone “pickups” were associated with distinctive
development of certain brain regions. In short, results suggest that heavy social media use may
promote brain development in a way that makes adolescents more inclined to focus on social
rewards (e.g., attention from peers) and may alter self-control 20.
Youth’s Exposure to Unmonitored Content Poses Potential Risks. There are two domains
of problematic content online that many youth are exposed to. Research demonstrates that this
also likely contributes to mental health difficulties among children and adolescents. One domain
pertains to content that actively showcases and promotes engagement in psychologically
disordered behavior, such as sites that discuss eating disordered behaviors (i.e., “pro-Ana” sites
that encourage fasting, laxative use, excessive exercise) and pro-cutting sites depicting
nonsuicidal self-injury 21. Research indicates that this content has proliferated on social media
sites, not only depicting these behaviors, but teaching young people how to engage in each, how
to conceal these behaviors from adults, actively encouraging users to engage in these behaviors,
and socially sanctioning those who express a desire for less risky behavior 22. Moreover, in some
cases this content is not removed nor are trigger warnings included to protect vulnerable youth
from the effects that exposure to this content can have on their own behavior. This underscores
the need for platforms to deploy tools to filter content, display warnings, and create reporting
structures to mitigate these harms.
19 Maza MT, Fox KA, Kwon S-J, et al. Association of habitual checking behaviors on social media with longitudinal
functional brain development. JAMA Pediatr. 2023;177(2):160-167. doi:10.1001/jamapediatrics.2022.4924.
20 See above.
21 Lewis, S. P., Heath, N. L., St Denis, J. M., & Noble, R. (2011). The scope of nonsuicidal self-injury on YouTube.
Pediatrics, 127(3), e552–e557. https://doi.org/10.1542/peds.2010-2317.
22 Whitlock JL, Powers JL, Eckenrode J. The virtual cutting edge: the internet and adolescent self-injury. Dev
Psychol. 2006 May;42(3):407-17. doi: 10.1037/0012-1649.42.3.407. PMID: 16756433.
A second area of concern regarding online content pertains to the frequency of online
discrimination and cyberbullying, including youths’ posts that encourage their peers to attempt
suicide. Research demonstrates that online victimization, harassment, and discrimination against
racial, ethnic, gender, and sexual minorities is frequent online and often targeted at young people
23. LGBTQ+ youth experience a heightened level of bullying, threats, and self-harm on social
media. One in three young LGBTQ+ people have said that they had been sexually harassed online,
four times as often as other young people 24. Brain scans of adults and youths reveal that online
harassment activates the same regions of the brain that respond to physical pain and trigger a
cascade of reactions that replicate physical assault and create physical and mental health damage
25. Moreover, research has revealed that online discrimination often is harsher and more severe
than offline discriminatory experiences. Results reveal that the effects of online discrimination
and bullying on youths’ risk for depression and anxiety are significant above and beyond the
effects of the experiences these same youth have offline. The permanence, potential for
worldwide dissemination, anonymity, and the like, repost, and comment features afforded on most
social media platforms seem to contribute to youths’ mental health difficulties. As with other forms
of harassment and associated harms, new policies and processes are needed to blunt the impact of
these harms.
The Potential Effects of Digital Stress. Social media platforms frequently include a variety
of features designed to maintain users' engagement online or to encourage users to return to the app.
Psychological theory and research have begun to reveal that this has become a significant source
23 Moreno, M. A., Chassiakos, Y. R., Cross, C., Hill, D., Ameenuddin, N., Radesky, J., Hutchinson, J., Boyd, R.,
Mendelson, R., Smith, J., Swanson, W. S., & Media, C. C. (2016). Media use in school-aged children and adolescents. Pediatrics, 138(5). https://doi.org/10.1542/peds.2016-2592; Tynes, B. M., Giang, M. T., Williams, D. R., & Thompson, G. N. (2008). Online racial discrimination and psychological adjustment among adolescents. Journal of Adolescent Health, 43(6), 565-569. https://doi.org/10.1016/j.jadohealth.2008.08.021.
24 Out Online: The Experiences of LGBT Youth on the Internet. (2013). GLSEN. https://www.glsen.org/news/out-online-experiences-lgbt-youth-internet.
25 Cannon, D. S., Tiffany, S. T., Coon, H., Scholand, M. B., McMahon, W. M., & Leppert, M. F. (2007). The PHQ-9 as a brief assessment of lifetime major depression. Psychological Assessment, 19(2), 247-251. https://doi.org/10.1037/1040-3590.19.2.247.
of stress. This is highly relevant since stress is one of the strongest predictors of children’s and
adolescents' mental health difficulties, including suicidal behavior. "Digital stress" is characterized by a) connection overload (i.e., notifications and implicit social requirements to participate on social media platforms), b) fear of missing out on conversations and other social interactions taking place exclusively online, c) the need to remain constantly available to others online, and d) approval anxiety (i.e., concerns about the response to one's own posts); each of these factors shapes the way youth think about their connection to online platforms 26. Nearly half of all young people participating in online platforms report experiencing
digital stress. Research demonstrates that higher levels of digital stress are associated with greater
increases in depressive symptoms among adolescents 27.
Social Media Encourages Social Comparisons. The quantitative nature of social media,
combined with the use of visual stimuli, creates a fertile ground for social comparisons.
Adolescents, who are in a period defined by psychologists as one of identity development via reflected appraisal (i.e., evaluating oneself based on feedback from peers), are especially likely to engage with social media in ways that allow them to compare their appearance, friends, and social activities with what they see online, especially when those in their own social network
are commenting and “liking” these same posts. The opportunity for constant feedback,
commentary, quantitative metrics of approval, and 24-hour social engagement is unprecedented
among our species. Research suggests that these social comparison processes, and youths’
tendency to seek positive feedback or status (i.e., more “likes,” followers, online praise) is
associated with a risk for depressive symptoms 28. In addition, psychological science demonstrates
26 Steele, R. G., Hall, J. A., & Christofferson, J. L. (2020). Conceptualizing Digital Stress in Adolescents and Young
Adults: Toward the Development of an Empirically Based Model. Clinical child and family psychology review,
23(1), 15–26. https://doi.org/10.1007/s10567-019-00300-5.
27 Nick, E. A., Kilic, Z., Nesi, J., Telzer, E. H., Lindquist, K. A., & Prinstein, M. J. (2022). Adolescent Digital
Stress: Frequencies, Correlates, and Longitudinal Association With Depressive Symptoms. The Journal of
adolescent health: official publication of the Society for Adolescent Medicine, 70(2), 336–339.
https://doi.org/10.1016/j.jadohealth.2021.08.025.
28 Choukas-Bradley, S., Nesi, J., Widman, L., & Galla, B. M. (2020). The Appearance-Related Social Media
Consciousness Scale: Development and validation with adolescents. Body Image, 33, 164-174.
that exposure to this online content is associated with lower self-image and distorted body
perceptions among young people. This exposure creates strong risk factors for eating disorders,
unhealthy weight-management behaviors, and depression 29. As with other impacts of online
platforms, evidence indicates that these body image issues are particularly prevalent among LGBTQ+ youth, leaving them more predisposed to eating disorders, depression, bullying, substance abuse, and other mental health harms.
Potentially Beneficial Effects of Social Media Use. It is important to acknowledge that
research on social media use and adolescent development is relatively new, as are many social
media platforms. In addition, there has been remarkably little funding designated for research on
this topic. Consequently, the long-term effects of social media use on youth development are relatively uncharted. For instance, above I discussed some of the potential effects of technology and social media use on brain development. Yet, it is unknown whether adolescent brain development, known for its plasticity, may "correct" some of the alterations in brain structure
or function, whether compensatory neural processes may develop, or whether these alterations
may confer unknown future strengths.
In addition, there is some research demonstrating that social media use is linked with
positive outcomes that may benefit psychological development among youth. Perhaps most
notably, psychological research suggests that young people form and maintain friendships online.
These relationships often afford opportunities to interact with a more diverse peer group than
offline, and the relationships are close and meaningful and provide important support to youth in
https://doi.org/10.1016/j.bodyim.2020.02.017 ; Hawes, T., Zimmer-Gembeck, M. J., & Campbell, S. M. (2020).
Unique associations of social media use and online appearance preoccupation with depression, anxiety, and
appearance rejection sensitivity. Body Image, 33, 66-76. https://doi.org/10.1016/j.bodyim.2020.02.010 ; Nesi, J.L., &
Prinstein, M.J. (2015). Using social media for social comparison and feedback seeking: Gender and popularity moderate associations with depressive symptoms. Journal of Abnormal Child Psychology, 43(8), 1427–1438.
29 Carrotte, E. R., Vella, A. M., & Lim, M. S. (2015). Predictors of "liking" three types of health and fitness-related
content on social media: A cross-sectional study. Journal of Medical Internet Research, 17(8), e205.
https://doi.org/10.2196/jmir.4803; https://doi.org/10.1016/j.paid.2011.11.011.
times of stress 30. The buffering effects of social support from peers have been well documented in
the psychological literature 31. This may be especially important for youth with marginalized
identities, including racial, ethnic, sexual, and gender minorities. Digital platforms provide an
important space for self-discovery and expression for LGBTQ+ youth.
Research also suggests that during the COVID-19 lockdowns of 2020-2021, one-on-one communication (i.e., direct messaging) on social media and sharing funny content reduced stress among youth. There also is some evidence that youth are more likely to engage in civic activism online than offline 32.
A growing area of research has also focused on leveraging youths' interest in online activities as an opportunity for digital-based intervention 33. Adolescents report high levels of comfort with, and a preference for, online communication, especially when discussing mental health.
Studies also show that adolescents commonly use the internet for mental health information 34.
30 Anderson, M., & Jiang, J. (2018, November 28). 2. Teens, friendships and online groups. Pew Research Center: Internet, Science & Tech.
https://www.pewresearch.org/internet/2018/11/28/teens-friendships-and-online-groups/; Charmaraman L, Hodes R,
Richer AM. Young Sexual Minority Adolescent Experiences of Self-expression and Isolation on Social Media:
Cross-sectional Survey Study. JMIR Ment Health. 2021;8(9):e26207. doi:10.2196/26207; Massing-Schaffer M, Nesi
J, Telzer EH, Lindquist KA, Prinstein MJ. Adolescent Peer Experiences and Prospective Suicidal Ideation: The
Protective Role of Online-Only Friendships. J Clin Child Adolesc Psychol. 2022;51(1):49-60.
doi:10.1080/15374416.2020.1750019; Marciano L, Ostroumova M, Schulz PJ, Camerini A-L. Digital Media Use
and Adolescents’ Mental Health During the Covid-19 Pandemic: A Systematic Review and Meta-Analysis. Front
Public Health. 2021;9:793868. doi:10.3389/fpubh.2021.793868; Baskin-Sommers A, Simmons C, Conley M, et al.
Adolescent civic engagement: Lessons from Black Lives Matter. Proc Natl Acad Sci USA. 2021;118(41).
doi:10.1073/pnas.2109860118.
31 Cohen, S., & Wills, T. A. (1985). Stress, social support, and the buffering hypothesis. Psychological Bulletin,
98(2), 310–357. https://doi.org/10.1037/0033-2909.98.2.310.
32 Marciano, L., Ostroumova, M., Schulz, P. J., & Camerini , A. L. (2022). Digital Media Use and Adolescents'
Mental Health During the Covid-19 Pandemic: A Systematic Review and Meta-Analysis. Frontiers in public health,
9, 793868. https://doi.org/10.3389/fpubh.2021.793868.
33 Bradford, S., & Rickwood, D. (2015). Young people’s views on electronic mental health assessment: Prefer to
type than talk? Journal of Child and Family Studies, 24(5), 1213 –1221. https://doi.org/10.1007/s10826-014-9929-0.
34 Intervention and Prevention in the Digital Age. (2022). In J. Nesi, E. Telzer, & M. Prinstein (Eds.), Handbook of
Adolescent Digital Media Use and Mental Health (pp. 363-416). Cambridge: Cambridge University Press.
doi:10.1017/9781108976237.019; Park, E., & Kwon, M. (2018). Health-Related Internet Use by Children and
Adolescents: Systematic Review. Journal of Medical Internet Research, 20(4), e120.
https://doi.org/10.2196/jmir.7731.
These elements, taken together, present the possibility that digital modes of treatment and other
health interventions may be particularly effective for young people.
Research into the field of digital mental health interventions is growing and the existing
information is heavily skewed toward more established modalities (e.g., telehealth, online/web-
based interventions). Evidence supports the use of videoconferencing as an effective form of
treatment for youth mental health across a range of problems 35. While many computerized
programs and internet-based treatment programs were found to be of moderate to high quality, a
systematic review of the literature found that the inclusion of a therapist or clinician improved
outcomes in adolescents with depression and anxiety over those that were self-paced 36. Young
people with a history of suicidal ideation often prefer to initially seek and receive healthcare online
37. Even when individuals have strong support systems offline, they may struggle to access that
support in times of need 38. Early indications suggest that online support may be appealing because of its
immediate nature and because the interactions are among peers with shared experience and
35 Myers, K. M., Valentine, J. M., Melzer, S. M. (2007, Nov). Feasibility, acceptability, and sustainability of
telepsychiatry for children and adolescents. Psychiatric Services, 58(11), 1493-1496. https://doi.org/10.1176/ps.2007.58.11.1493; Nelson, E. L., Cain, S., & Sharp, S. (2017, Jan). Considerations for conducting telemental health with children and adolescents. Child and Adolescent Psychiatric Clinics of North America,
26(1), 77-91. https://doi.org/10.1016/j.chc.2016.07.008.
36 Clarke, T. C., Black, L. I., Stussman, B. J., Barnes, P. M., & Nahin, R. L. (2015). Trends in the use of complementary health approaches among adults: United States, 2002-2012. National health statistics reports, (79), 1-16; Wozney L, McGrath P, Gehring N, Bennett K, Huguet A, Hartling L, Dyson M, Soleimani A, Newton A.
eMental Healthcare Technologies for Anxiety and Depression in Childhood and Adolescence: Systematic Review of
Studies Reporting Implementation Outcomes. JMIR Ment Health 2018;5(2):e48. https://mental.jmir.org/2018/2/e48;
Hollis, C., Falconer, C. J., Martin, J. L., Whittington, C., Stockton, S., Glazebrook, C., & Davies, E. B. (2017). Annual Research Review: Digital health interventions for children and young people with mental health problems - a systematic and meta-review. Journal of Child Psychology and Psychiatry, and Allied Disciplines, 58(4), 474-503.
https://doi.org/10.1111/jcpp.12663.
37 Frost, M., Casey, L. M., & O’Gorman, J. G. (2017). Self-injury in young people and the help-negation effect.
Psychiatry Research, 250, 291–296. https://doi.org/10.1016/j.psychres.2016.12.022.
38 Kruzan, K. P., Whitlock, J., & Bazarova, N. N. (2021). Examining the Relationship Between the Use of a Mobile
Peer-Support App and Self-Injury Outcomes: Longitudinal Mixed Methods Study. JMIR Mental Health, 8(1),
e21854. https://doi.org/10.2196/21854; Lavis, A., & Winter, R. (2020). #Online harms or benefits? An ethnographic
analysis of the positives and negatives of peer-support around self-harm on social media. Journal of Child
Psychology and Psychiatry, and Allied Disciplines, 61(8). https://doi.org/10.1111/jcpp.13245.
experiential knowledge 39. Yet, it is crucial for young people to have access to in-person screenings
and clinician support.
Psychological Effects of Lost Opportunities While Youth Are Online
Every hour youth spend online is an hour that is not being spent on alternative (“in real
life") activities. In some cases, this may limit adolescents' exposure to peer contexts in which
substance use and sexually risky behaviors occur. However, youths’ online activities also may
preclude engagement in activities necessary for successful maturation and psychological
adaptation. Perhaps most concerning is the extent to which research has demonstrated that
technology and social media use is interfering with youths’ sleep.
Research has supported the link between technology use and sleep in several ways.
Perhaps most compelling are data from meta-analyses (i.e., a statistical integration of findings from
across an entire body of research) indicating that 60% of adolescents report using technology in
the hour before bedtime, and more screen time is associated with poorer sleep health and failure
to meet sleep duration requirements set by the American Academy of Sleep Medicine, partly due
to delayed melatonin release, delayed bedtimes, and increases in overstimulation and difficulty
disengaging from online social interactions. Interventions to reduce nighttime screen use are
successful in increasing sleep duration 40.
This has critical implications for adolescent development. Research suggests that
insufficient sleep is associated with poor school performance, difficulties with attention, stress
39 Marchant, A., Hawton, K., Stewart, A., Montgomery, P., Singaravelu, V., Lloyd, K., Purdy, N., Daine, K., &
John, A. (2017). A systematic review of the relationship between internet use, self-harm and suicidal behaviour in
young people: The good, the bad and the unknown. PLOS ONE, 12(8), e0181722.
https://doi.org/10.1371/journal.pone.0181722; Thoits, P. A. (2011). Mechanisms Linking Social Ties and Support to
Physical and Mental Health. Journal of Health and Social Behavior, 52(2), 145 –161.
https://doi.org/10.1177/0022146510395592 .
40 Telzer EH, Goldenberg D, Fuligni AJ, Lieberman MD, Gálvan A. Sleep variability in adolescence is associated
with altered brain development. Dev Cogn Neurosci. 2015;14:16-22. doi:10.1016/j.dcn.2015.05.007.
regulation, and increased risk for automobile accidents. Neuroscientific research has demonstrated
that inconsistent sleep schedules are associated with changes in structural brain development in
adolescent years. In other words, youths’ preoccupation with technology and social media may
deleteriously affect the size of their brains 41.
In addition, note that youth also engage with online and social media apps while
participating in other activities. Indeed, early studies show that when youth are engaging in
schoolwork, they often are doing so alongside the use of social media platforms, a phenomenon
called "media multitasking" 42. Research clearly demonstrates that most humans cannot truly multitask but instead rapidly shift between tasks, a process associated with poorer memory and comprehension among youth 43. Evidence shows that these phenomena only worsen with heavier social media use: symptoms such as mind wandering and higher levels of impulsivity are more common among young adults who use social media more frequently 44.
Potential Solutions and Policy Implications
41 Achterberg M, Becht A, van der Cruijsen R, et al. Longitudinal associations between social media use, mental
well-being and structural brain development across adolescence. Dev Cogn Neurosci. 2022;54:101088.
doi:10.1016/j.dcn.2022.101088.
42 Jeong, S.-H., & Hwang, Y. (2012). Does Multitasking Increase or Decrease Persuasion? Effects of Multitasking on Comprehension and Counterarguing. Journal of Communication, 62(4), 571-587. https://doi.org/10.1111/j.1460-2466.2012.01659.x; van der Schuur, W. A., Baumgartner, S. E., Sumter, S. R., & Valkenburg, P. M. (2015). The consequences of media multitasking for youth: A review. Computers in Human Behavior, 53, 204-215. https://doi.org/10.1016/j.chb.2015.06.035; L. Mark Carrier, Larry D. Rosen, Nancy A. Cheever, Alex F. Lim, Causes, effects, and practicalities of everyday multitasking, Developmental Review (2015), doi: 10.1016/j.dr.2014.12.005.
43 Ralph, B. C., Thomson, D. R., Cheyne, J. A., & Smilek, D. (2014). Media multitasking and failures of attention in
everyday life. Psychological research, 78(5), 661–669. https://doi.org/10.1007/s00426-013-0523-7.
44 Ophir, E., Nass, C., & Wagner, A. D. (2009). Cognitive control in media multitaskers. Proceedings of the National Academy of Sciences of the United States of America, 106(37), 15583-15587. https://doi.org/10.1073/pnas.0903620106; Ralph, B. C., Thomson, D. R., Cheyne, J. A., & Smilek, D. (2014). Media multitasking and failures of attention in everyday life. Psychological research, 78(5), 661-669. https://doi.org/10.1007/s00426-013-0523-7; Baumgartner, S. E., Weeda, W. D., van der Heijden, L. L., & Huizinga, M. (2014). The Relationship Between Media Multitasking and Executive Function in Early Adolescents. The Journal of Early Adolescence, 34(8), 1120-1144. https://doi.org/10.1177/0272431614523133; Baumgartner, Susanne & van der Schuur, Winneke & Lemmens, Jeroen & te Poel, Fam. (2018). The Relationship Between Media Multitasking and Attention Problems in Adolescents: Results of Two Longitudinal Studies. Human Communication Research. 44. 3-30. 10.1093/hcre.12111.
The internet and the introduction of social media platforms have literally changed our
species through new forms of social interaction, new rules for discourse, the rapid spread of
information, and concomitant changes in the types of relationships that previously had defined the
human race for millennia. This is an extraordinarily high priority area for additional scientific
research; however, this work has been woefully underfunded. Currently, federal agencies lack the direction, expertise, and dedicated funding to adequately research both the positive and negative impacts of online platforms. Tech companies responsible for these platforms employ dozens of researchers focused on designing products and observing how users engage with them. The federal government must match or exceed this commitment to ensure the public has an adequate understanding of how these platforms work, how users, especially children, are using them, and what their impact is. The research that is needed should be longitudinal to allow for
long-term follow-up. Research should capture the experience of diverse samples, utilize the
benefits of technology to capture objective measures of behavior, include technology (e.g., fMRI)
to study biopsychosocial effects, and importantly, should make use of the data available to social
media companies to fully understand the effects of social media and protect the common good.
This effort must be paired with required increases in transparency and access to data for researchers
to further understand online activity. New transparency and reporting requirements should ensure
user privacy, while creating new mechanisms for researchers and policymakers to understand how
these online spaces operate.
Recently, Congress allocated $15M to research on social media and adolescent mental
health. This is appreciated, yet it is barely enough to fund 3-5 individual studies that
would meet the abovementioned specifications. At least $100M in funds will be needed to reflect
a serious commitment to this research area across federal agencies. And, as we are on the precipice
of a new digital age with artificial intelligence (AI) and machine learning directly impacting us
across the lifespan, it is paramount that our country invest in research to protect future generations.
Such research also might address the role of social media algorithms in shaping users' experience.
This requires access to data for independent researchers to understand how algorithms work 45.
Social media companies employing algorithms to display content to users should take steps to
provide explanations on how these technologies work and how they might drive or reward certain
types of posts or behavior. Data from algorithms, along with internal research, should also be made
public to allow researchers and policymakers to achieve a greater understanding of the impacts of
social media on users, particularly children. Federal agencies should prioritize research into the
impacts of social media and provide private researchers with grants and other support to ensure
findings relating to these platforms are made broadly available.
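As one purely hypothetical illustration of the transparency being called for, the sketch below (in Python) ranks posts while writing an audit record for each scoring decision that could, suitably anonymized, be shared with independent researchers. The scoring formula, field names, and log format are assumptions made for this example and do not describe any actual platform's algorithm.

# Illustrative sketch only: rank posts and record how each score was computed
# so the decisions can be audited. All logic and names are hypothetical.
import json
import time

def rank_posts(posts, audit_log):
    scored = []
    for post in posts:
        # Hypothetical score: engagement-weighted recency.
        score = post["likes"] * 2 + post["comments"] * 3 - post["age_hours"]
        scored.append((score, post))
        audit_log.append({
            "timestamp": time.time(),
            "post_id": post["id"],  # anonymized identifier
            "score": score,
            "factors": {"likes": post["likes"], "comments": post["comments"],
                        "age_hours": post["age_hours"]},
        })
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [post for _, post in scored]

if __name__ == "__main__":
    log = []
    rank_posts(
        [{"id": "a1", "likes": 10, "comments": 2, "age_hours": 5},
         {"id": "b2", "likes": 50, "comments": 8, "age_hours": 30}],
        log,
    )
    print(json.dumps(log, indent=2))

Publishing records like these, alongside plain-language explanations of the ranking factors, is one way companies could give researchers and policymakers the visibility this testimony calls for.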
There is much more Congress and federal agencies can do to provide education around
how best to use online platforms to mitigate harmful impacts. A coalition of more than 150
organizations, led by APA, has called on the Surgeon General to create and distribute resources
dedicated to teaching children and caregivers about online social media use 46. There is a clear
need for an education campaign that enhances the public’s understanding of the potential harms
posed by social media and encourages caregivers and children to educate themselves with
evidence-informed suggestions for its appropriate use. At the same time, it is important to
acknowledge social media’s potential to provide children with a healthy space for convening and
companionship. While we recognize the need for additional research in this area, the very real
harms of social media are impacting our children today, and more must be done to communicate
and mitigate the impacts of online social media use. Educating young users and their caregivers
about how best to use the platforms to mitigate negative impacts is an essential intervention that
can start today. A public education campaign should include information about the specific
dangers social media poses to adolescents, how parents and caregivers can best navigate learning
45 Epps-Darling, A., Bouyer, R. T., & Cramer, H. (2020, October). Artist gender representation in music streaming.
In Proceedings of the 21st International Society for Music Information Retrieval Conference (Montréal, Canada)
(ISMIR 2020). ISMIR (pp. 248-254); Bravo, D. Y., Jefferies, J., Epps, A., & Hill, N. E. (2019). When things go
viral: Youth’s discrimination exposure in the world of social media. In Handbook of Children and Prejudice (pp.
269-287). Springer, Cham. https://doi.org/10.1007/978-3-030-12228-7_15.
46 (2023). Apaservices.org. https://www.apaservices.org/advocacy/news/surgeon-general-dangers-social-media
more about these dangers, how best to communicate the risks with their children, and ultimately
how to educate their children on the best methods for using social media in a safe way.
APA also advocates for Congress and federal agencies to require social media companies
to do more to combat this issue. Platforms can create and provide new tools aimed at mitigating
the harms associated with platform use. Requiring social media companies to provide children and their caregivers with options to change their settings can promote mental health by protecting their information, disabling features that are particularly addictive, and allowing them to opt out of algorithmic processes that serve up problematic or harmful content, as sketched below. Social media companies can also be required to set protective defaults for young users.
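As a purely illustrative sketch of what protective defaults might look like, the example below starts accounts identified as belonging to minors with engagement-maximizing features disabled until a caregiver explicitly changes them. Every setting name, threshold, and default value here is a hypothetical assumption for this example, not a description of any platform's real configuration.

# Hypothetical sketch of age-based protective defaults; all field names and
# values are illustrative, not drawn from any real platform.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class AccountSettings:
    infinite_scroll: bool           # autoplaying, endless feeds
    push_notifications: bool        # re-engagement prompts
    personalized_feed: bool         # algorithmic content ranking
    quiet_hours: Tuple[int, int]    # hours when notifications are muted
    third_party_data_sharing: bool

def default_settings(age: int) -> AccountSettings:
    if age < 18:
        # Minors start with addictive features off and nighttime notifications muted.
        return AccountSettings(
            infinite_scroll=False,
            push_notifications=False,
            personalized_feed=False,
            quiet_hours=(21, 7),
            third_party_data_sharing=False,
        )
    return AccountSettings(True, True, True, (0, 0), True)

if __name__ == "__main__":
    print(default_settings(14))

The design choice worth noting is that safety is the default state and opting into riskier features requires an explicit, caregiver-visible change, rather than the reverse.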
Warnings on harmful content should also be considered to reduce exposure of young
people to content that may negatively impact their mental health or well-being and companies
should be held accountable for the proliferation of this content. Social media companies should acknowledge the known impacts of their platforms, provide warnings and resources to parents and caregivers of young users, develop plans to mitigate known harms, and determine whether these warnings and plans were effective, with iterative updates based on these findings. Social media
platforms must work to prevent and mitigate harmful content, such as promotion of self-harm,
suicide, eating disorders, substance use, and sexual exploitation. Independent audits can assess risks and determine whether platforms are taking meaningful steps to prevent damage; these audits must be paired with enforcement actions and accountability mechanisms for when platforms fail to effectively mitigate harms to children.
As discussed throughout this testimony, more must be done to specifically protect those
children belonging to traditionally marginalized and minoritized communities. Mental health and
other harms can disproportionately fall on LGBTQ+ youth, and resources should be dedicated to
ensuring a reduction in these harms. More must be required of platforms to discourage and prevent
cyberbullying and other forms of online hate and discrimination. Reporting structures should be
more robust so that incidents can be tracked and deterred. Reforms to platform user
experience should be prioritized to ensure members of these communities are protected from
disproportionate harm.
Specific legislation has been proposed across the federal government that would take
productive steps in mitigating the known negative impacts of social media. The Kids Online Safety
Act (KOSA) is one such piece of legislation. In 2022, APA CEO Arthur C. Evans Jr., PhD, said,
“The Kids Online Safety Act is an important first step in reining in the harms caused to children
by social media platforms,” and “enacting measures that curtail harmful practices while
authorizing research to understand additional impacts is a thoughtful strategy”47. KOSA and other
previously proposed legislative fixes, such as updates to the Children's Online Privacy Protection Act, represent important steps by Congress, and I encourage their debate and adoption.
APA is heartened by the focus on mental health in Congress, and eager to work with this
committee and its members to develop legislation and enact the bills cited above. Your actions
now can make all the difference in how our young people interact with and are impacted by online
spaces. Together, psychology, other scientific disciplines, parents, caregivers, teachers, tech
companies, and policymakers can work to solve this serious problem. APA is a ready partner and
looks forward to working with the committee to put in place critical changes to our current system
that improve the lives of our children and support the flourishing of online spaces.
47 (2023). Apaservices.org. https://www.apaservices.org/advocacy/news/kids-online-safety-legislation
https://leginfo.legislature.ca.gov/faces/billNavClient.xhtml?bill_id=202120220AB638
Date Published: 10/07/2021 02:00 PM
AB-638 Mental Health Services Act: early intervention and prevention programs.(2021-2022)
Assembly Bill No. 638
CHAPTER 584
An act to amend Section 5840 of the Welfare and Institutions Code, relating to mental health, and
making an appropriation therefor.
[ Approved by Governor October 06, 2021. Filed with Secretary of State
October 06, 2021. ]
LEGISLATIVE COUNSEL'S DIGEST
AB 638, Quirk-Silva. Mental Health Services Act: early intervention and prevention programs.
Existing law, the Mental Health Services Act (MHSA), an initiative measure enacted by the voters as Proposition
63 at the November 2, 2004, statewide general election, establishes the continuously appropriated Mental Health
Services Fund to fund various county mental health programs and requires counties to spend those funds on
mental health services, as specified. The MHSA requires counties to establish a program designed to prevent
mental illnesses from becoming severe and disabling and authorizes counties to use funds designated for
prevention and early intervention to broaden the provision of those community-based mental health services by
adding prevention and early intervention services or activities.
Existing law authorizes the MHSA to be amended by a 2/3 vote of the Legislature if the amendments are
consistent with, and further the purposes of, the MHSA.
This bill would amend the MHSA by including in the prevention and early intervention services authorized to be
provided, prevention and early intervention strategies that address mental health needs, substance misuse or
substance use disorders, or needs relating to cooccurring mental health and substance use services. By
authorizing a new use for continuously appropriated funds, this bill would make an appropriation. The bill would
state the finding and declaration of the Legislature that this change is consistent with, and furthers the intent of,
the MHSA.
Vote: 2/3 Appropriation: yes Fiscal Committee: yes Local Program: no
THE PEOPLE OF THE STATE OF CALIFORNIA DO ENACT AS FOLLOWS:
SECTION 1. Section 5840 of the Welfare and Institutions Code is amended to read:
5840. (a) The State Department of Health Care Services, in coordination with counties, shall establish a program
designed to prevent mental illnesses from becoming severe and disabling. The program shall emphasize
improving timely access to services for underserved populations.
(b) The program shall include the following components:
(1) Outreach to families, employers, primary care health care providers, and others to recognize the early
signs of potentially severe and disabling mental illnesses.
(2) Access and linkage to medically necessary care provided by county mental health programs for children
with severe mental illness, as defined in Section 5600.3, and for adults and seniors with severe mental illness,
as defined in Section 5600.3, as early in the onset of these conditions as practicable.
(3) Reduction in stigma associated with either being diagnosed with a mental illness or seeking mental health
services.
(4) Reduction in discrimination against people with mental illness.
(c) The program shall include mental health services similar to those provided under other programs that are
effective in preventing mental illnesses from becoming severe, and shall also include components similar to
programs that have been successful in reducing the duration of untreated severe mental illnesses and assisting
people in quickly regaining productive lives.
(d) The program shall emphasize strategies to reduce the following negative outcomes that may result from
untreated mental illness:
(1) Suicide.
(2) Incarcerations.
(3) School failure or dropout.
(4) Unemployment.
(5) Prolonged suffering.
(6) Homelessness.
(7) Removal of children from their homes.
(e) Prevention and early intervention funds may be used to broaden the provision of community-based mental
health services by adding prevention and early intervention services or activities to these services, including
prevention and early intervention strategies that address mental health needs, substance misuse or substance
use disorders, or needs relating to cooccurring mental health and substance use services.
(f) In consultation with mental health stakeholders, and consistent with regulations from the Mental Health
Services Oversight and Accountability Commission, pursuant to Section 5846, the department shall revise the
program elements in Section 5840 applicable to all county mental health programs in future years to reflect what
is learned about the most effective prevention and intervention programs for children, adults, and seniors.
SEC. 2. The Legislature finds and declares that this act is consistent with, and furthers the intent of, the Mental
Health Services Act within the meaning of Section 18 of that act.