A multi-state investigation into artificial intelligence-generated child sexual abuse material (CSAM) continues to affect victims with ties to Cascade, as former classmates describe shock, uncertainty, and ripple effects across the community.
LaKie Gardner says she first learned about the situation through a message from a friend.
Quentin Shores reports.
“She called me and told me that there was this thing with these A.I. photos of us from when we were young, and that it was someone in our class, and it still doesn't quite seem like it can be a real thing that's actually happening.”
Gardner described how upsetting it was to see photographs from her youth manipulated with generative AI: "It was so weird. I could not go to sleep after that. I got up, and I got in my car, and I just drove in circles around my little town because I just didn't know what else to do with myself.”
Gardner contacted police in Great Falls and learned that investigators had identified a suspect: former classmate Dalten Bryne Montana Johnson.
Court documents in the case state:
Upon his arrest, the Defendant was advised of his Miranda rights and agreed to speak with law enforcement. Post waiver, the Defendant admitted he owned the account where the CSAM was discovered. Defendant admitted that he has had a pornography addiction since 2020 and admitted that he has previously saved sexually explicit images depicting people who may have been children. Also in the account were a significant number of images which appear to be CSAM, but where the age of the victim is currently unknown, commonly referred to as “age difficult” CSAM.
In addition, law enforcement discovered that the Defendant possessed approximately 450 images of female children that are not nude, but which depict the children wearing swimsuits, leotards, and other tight, form-fitting, or revealing clothing. The children depicted in these photographs are between the ages of 6 and 17. The Defendant admitted that he had access to these children as they are either relatives or friends of his own children. Furthermore, the Defendant’s wife has operated an in-home daycare at the residence for nearly 10 years.
Gardner said she only faintly remembered Johnson from school: "I don't really remember him that much, like, other than he was there. Like he wasn’t weird. I never had any issues with him. Like I probably was in a reading group with him.”
Investigators say Johnson was arrested on counts of sexual exploitation of a minor. Authorities said the evidence included a hard drive with a file labeled "Cascade," containing the names of former classmates along with altered nude images made from social media photos taken when they were younger.
Other former students expressed similar disbelief about the suspect.
“He was kind of quiet, but he was social," said former Cascade resident Rachel Burk. "He was part of our groups, our clubs. When he did speak, he was hilarious. He was always kind. There was zero sign.”
Johnson is currently in jail in Utah, facing six counts of sexual exploitation of a minor.
For victims, the case has gone beyond individual harm to affect the entire Cascade community.
“Beyond, I think our individual lives, it's just it upsets the whole community because it's not just 1 or 2 of us. It's an entire class system over multiple years. So, it's my whole, it's everybody I knew in high school,” stated Burk.
In a statement about the case, Montana Attorney General Austin Knudsen referenced House Bill 82, which criminalizes grooming and allows prosecutors to pursue cases involving pedophiles who use artificial intelligence to digitally alter photos of minors.
Victims say the case shows how new technology can reach communities of any size, anywhere.
“This is some crazy thing that could happen to you. If it happens in a small place, like Cascade, and if it happens in a place like Utah, that's everywhere. That's everyone.”
As lawmakers continue to weigh laws against AI exploitation, victims say the consequences are already being felt in Cascade. The case remains in its early stages, with several states involved and further developments expected.