A recent report has confirmed some horrific behavior at Walt Disney World Resort.
Guests visit the “Most Magical Place on Earth” feeling comfortable in the knowledge that it’s a safe space dedicated to family-friendly entertainment. Often the most horrific stories to emerge from Disney’s theme parks stem from the behavior of other guests, as is the case in a recent report.
As per Forbes, Justin Culmo was arrested in mid-2023 after spending over a decade as "one of about 20 high-priority targets" among global child exploitation detectives. After his arrest, he allegedly confessed to using the AI image generator Stable Diffusion to create thousands of illegal images of children from pictures taken at the Central Florida theme park resort with a GoPro.
Federal agents presented the case to a group of Australian law enforcement officials in early August, and a source later provided Forbes with details of the presentation.
Culmo was indicted in Florida for a range of child exploitation crimes. These include allegations of abusing his two daughters, secretly filming minors, and distributing child sexual abuse material (CSAM) on the dark web, a part of the internet that isn't indexed by search engines and requires special software to access.
A jury trial has since been set for October; Culmo is pleading not guilty. He has not been charged with producing AI-generated CSAM, which is also considered a crime under U.S. law.
“This is not just a gross violation of privacy, it’s a targeted attack on the safety of children in our communities,” said Jim Cole, a former Department of Homeland Security agent who tracked Culmo’s online activity. “This case starkly highlights the ruthless exploitation that AI can enable when wielded by someone with the intent to harm.”
Disney has reportedly said that investigators have not contacted the company about Culmo’s alleged activities at the resort, which encompasses four theme parks: Magic Kingdom Park, EPCOT, Disney’s Hollywood Studios, and Disney’s Animal Kingdom. The U.S. Attorney’s Office for the Middle District of Florida declined to comment further on the case to Forbes.
This isn’t the first report of Stable Diffusion 1.5 being used to create AI-generated CSAM. Cole told Forbes that “there are no built-in safeguards. It’s why offenders use it almost exclusively.”
He has since founded Onemi-Global Solutions, a consultancy that assists tech companies and non-profit organizations with child protection. Stability AI previously told Forbes that it was not responsible for Stable Diffusion 1.5, which was released by AI tool developer Runway.
Share your thoughts on this story with Inside the Magic in the comments.