Susana Barrios

From: durfeycraig778@gmail.com
Sent: Wednesday, AM
To: cadurfey@gmail.com; hauwie.tie@asm.ca.gov; David.Ochoa@sen.ca.gov; CA.GOV; Assemblymember.Davies@assembly.ca.gov; REPLOUCORREA@MAIL.HOUSE.GOV; response@ocgov.com; kim.vandermeulen@mail.house.gov; assemblymember.quirk-silva@assembly.ca.gov; Christopher.Aguilera@asm.ca.gov; 'Teresa Pomeroy'; SENATOR.GONZALEZ@senate.ca.gov; 'GGEA President'; SHEA.Committee@senate.ca.gov; Public Comment; 'Walter Muneton'; senator.nguyen@senate.ca.gov; assemblymember.rendon@assembly.ca.gov; 'Dina Nguyen'; Theresa Bass; 'Gabriela Mafi'; KTLA 5 News; senator.umberg@senate.ca.gov; SEDN.committee@senate.ca.gov; SHEA.Committee@senate.ca.gov; 'Public Records Request'; '"FOX11NEWS@FOXTV.COM"'; FourthDistrict@bos.lacounty.gov; FirstDistrict@bos.lacounty.gov; ocbe@ocde.us; julia.kingsley@asm.ca.gov; response@ocgov.com; cadurfey@gmail.com; KCAL 9; Superintendent@cde.ca.gov
Cc: ABC7 General Release
Subject: [EXTERNAL] Variety: Instagram Algorithms Connect 'Vast' Network of Pedophiles Seeking Child Pornography, According to Researchers

Warning: This email originated from outside the City of Anaheim. Do not click links or open attachments unless you recognize the sender and are expecting the message.

05-01-2024
(P.R.D.D.C.) PARENTS FOR THE RIGHTS OF DEVELOPMENTALLY DISABLED CHILDREN
CRAIG A. DURFEY, FOUNDER OF P.R.D.D.C.
U.S. HOUSE OF CONGRESS H2404 - HONORING CRAIG DURFEY FOR HIS FIGHT AGAINST AUTISM ... Ms. LORETTA SANCHEZ of California.
https://www.govinfo.gov/content/pkg/CREC-2003-03-27/pdf/CREC-2003-03-27.pdf

To whom it may concern:

Variety: Instagram Algorithms Connect 'Vast' Network of Pedophiles Seeking Child Pornography, According to Researchers
Parent company Meta says it has established a task force to combat the problems.

"Child exploitation is a horrific crime. We work aggressively to fight it on and off our platforms, and to support law enforcement in its efforts to arrest and prosecute the criminals behind it," the rep said in a statement. "Predators constantly change their tactics in their pursuit to harm children, and that's why we have strict policies and technology to prevent them from finding or interacting with teens on our apps, and hire specialist teams who focus on understanding their evolving behaviors so we can eliminate abusive networks."

Between 2020 and 2022, according to Meta, its policy enforcement teams "dismantled 27 abusive networks," and in January 2023 it disabled more than 490,000 accounts for violating child-safety policies. As of the fourth quarter of 2022, Meta's technology removed more than 34 million pieces of child sexual exploitation content from Facebook and Instagram, more than 98% of which was detected before it was reported by users, the company said.

According to the Journal's report, "Technical and legal hurdles make determining the full scale of the [pedophile] network [on Instagram] hard for anyone outside Meta to measure precisely." The article cited the Stanford Internet Observatory research team's identification of 405 sellers of what the researchers deemed "self-generated" child-sex material (accounts purportedly run by children themselves) using hashtags associated with underage sex. The WSJ story also cited data compiled via the network mapping software Maltego that found 112 of those accounts collectively had 22,000 unique followers.
The Journal's report noted that Instagram accounts that offer to sell illicit sex material "generally don't publish it openly" and that such accounts often link to "off-platform content trading sites." Researchers found that Instagram enabled people to search "explicit hashtags such as #pedowhore and #preteensex" and then connected them to accounts that used the terms to advertise child-sex material for sale, according to the report. Per the Journal, test accounts set up by researchers that viewed a single such account "were immediately hit with 'suggested for you' recommendations of purported child-sex-content sellers and buyers, as well as accounts linking to off-platform content trading sites. Following just a handful of these recommendations was enough to flood a test account with content that sexualizes children."

In addition, certain Instagram accounts "invite buyers to commission specific acts," with some "menus" listing prices for videos of children harming themselves or "imagery of the minor performing sexual acts with animals," according to the Journal report, citing the findings of Stanford Internet Observatory researchers. "At the right price, children are available for in-person 'meet ups,'" the Journal reported.

Among other internet platforms, Snapchat and TikTok do not appear to facilitate networks of pedophiles seeking child-abuse content in the way that Instagram has, according to the Journal report. On Twitter, the Stanford Internet Observatory team identified 128 accounts offering to sell child-sex-abuse content; according to the researchers, Twitter didn't recommend such accounts to the same degree as Instagram, and Twitter also removed such accounts "far more quickly," the Journal reported. (An emailed request for comment to Twitter's press account resulted in an autoreply with a poop emoji.)

Thank You
Craig Durfey