CLEVELAND — Area police and federal agents believe a Rocky River man raped a 2-year-old girl inside a home day care in Parma Heights. However, their case might have stalled from day one had the suspect not allegedly recorded and photographed his reported crimes.
According to federal court documents, 20-year-old Conner Walker admitted to sharing CSAM images on a messaging app. If you're unfamiliar with the term, "CSAM" stands for "child sexual abuse material"; the switch to this acronym, a departure from the phrase "child pornography," is meant to more clearly convey the horror of the crimes involved.
"The biggest challenge to the FBI and law enforcement, in general, in concerning these investigations are going to be end-to-end encrypted applications," William Hasty, leader of the Cleveland FBI's Violent Crimes Task Force, says.
Authorities believe Walker shared the CSAM on Session, an encrypted app whose ads promise to "scramble your data." Investigators say such encrypted apps have become a safe haven for predators, but in Walker's case, documents show a foreign partner of the FBI pinpointed the CSAM as North American in origin.
Working with an image showing only half of the toddler's face, the FBI used its own technology to match it to a photo the girl's mom had posted on Facebook. Once the victim was identified, agents were able to arrest Walker, who they say confessed to his crimes.
Last month, Meta announced it would soon roll out encrypted messaging on Facebook and Instagram. This comes despite protests from law enforcement, including the Internet Crimes Against Children (ICAC) Task Force at the Cuyahoga County Prosecutor's Office.
"The national Center [For Missing and Exploited Children) is getting upward of 20-30 million cyber tips a year just from Meta platforms, and a lot of that is coming from Messenger and Instagram," ICAC Commander Dave Frattare told 3News Investigates. "Our question to Meta is how much of those tips are going to dry up?
Another new wrinkle: What about artificial intelligence, which is increasingly being used to produce CSAM?
"Uncharted territory from an enforcement standpoint," Hasty admitted.
Ohio Attorney General Dave Yost recently joined attorneys general from every other state in the U.S. in urging Congress to make AI-generated CSAM illegal, saying, "The time to prevent this is now, before it happens." But the technology is already here in the form of AI face swaps.
For example, I swapped my own face with that of WKYC news producer Taylor Moore using just a single still photo. Looks pretty realistic, right? And it took me just a few minutes.
"It's almost like a Hydra," Hasty lamented. "We can do our good work and cut one head off, but another one is going to prop back up again due to the nature of the internet."
For law enforcement, the battle is fought one case at a time. But for parents?
"There's nothing we can do on the front end, short of continuing our message of prevention," Frattare said. "Talk[ing] to children and adults about what they're putting on the internet."
As for AI-generated CSAM, it's a murky area of the law. We found lawmakers in at least seven states have introduced bills to outlaw such images, and just this week, Ohio state Sens. Louis Blessing (R-Colerain Township) and Terry Johnson (R-McDermott) introduced Senate Bill 217, which would make it a third-degree felony to create or distribute "simulated obscene material."
"Child pornography has long been outlawed in Ohio, but the unchecked rise of AI has created a gray area for predators to fuel their sick fantasies," Yost said in a statement supporting the bill. "We need to act quickly to protect Ohio's children by expanding existing child pornography laws to cover artificial intelligence."