Catholic high school students who made and shared deep-fake nudes of classmates won’t face charges

Teenage boys who used artificial intelligence software to create nude photos from social media pictures posted by girls at a Catholic high school won’t face any criminal charges, CBC News has learned. 

The London District Catholic School Board won’t say what consequences, if any, the boys faced or will face. The southwestern Ontario board also refused to answer questions about whether any students were identified as being responsible for making and sharing the deep-fake nude images generated by artificial intelligence software.  

“It’s important for the public and the community to understand what measures are being taken, have they done something, is the board actually taking it seriously,” said Kaitlyn Mendes, a Western University sociologist who studies gender inequalities and how they intersect with media. 

“I would hope that in these cases, for if and when it happens again, there’s clear and open communication about what they’re doing because it’s important to have some consistency, to know what the responses are. If I had a child who was going to that school, I would want to know what happened.” 

A spokesperson for the school board referred CBC News to a safe-schools policy that includes an explanation of progressive discipline for elementary and high school students. 

London police say they’ve spoken to the individuals involved “who indicated that this matter would be best resolved without criminal charges.” 

WATCH | Concerns over AI-generated sexual images: Taylor Swift deepfakes taken offline. It’s not so easy for regular people

Fake, AI-generated sexually explicit images of Taylor Swift were feverishly shared on social media until X took them down after 17 hours. But many victims of the growing trend lack the means, clout and laws to accomplish the same thing.

The incident happened in early April and prompted the school to send an email to parents, warning that those sharing the photos through group chats were bringing “harmful consequences” to the school community.

“The creation and distribution of this material could result in disciplinary measures,” the email to parents said. 

One of the teens whose picture was altered told CBC News at the time she felt humiliated and embarrassed by the pictures. Some students at the school said they knew who was responsible, but the 16-year-old girl did not. 

‘It’s not a joke’

Teens who use relatively new AI generators to turn innocuous images into sexual ones often don’t understand how harmful those images can be, said Lindsay Lobb, a director with the Canadian Centre for Child Protection.

“There has been messaging to young people about the impact of sharing nudes, but what we’re seeing in many of these cases is that they’re not understanding the impact of sharing what they’re considering is a fake image. It’s important for them to understand that it’s not funny and it’s not a joke,” she said. 

“Whether the images that are being non-consensually distributed are real or whether they’re fake in some capacity, the impact is real.”

Criminal charges, the experts agree, should be laid as a last resort, but education is a good first step.

“In most circumstances there’s not a benefit to criminalizing children,” said Lobb. “We want to educate them on the seriousness of it and how to get help if they are victims.” 

Having your image altered and shared can be devastating, said Mendes, and those who do so need to understand the impact. 

“I would hope there’s some kind of restorative process where hopefully the boys can understand how they violated the girls’ privacy, their bodily autonomy, their sexual integrity.” 

LISTEN | The growing problem of AI-created nude images (Columnists from CBC Radio, 5:14):

In December of 2023, a Winnipeg high school warned parents that AI-created nudes of students were being circulated online. The CBC’s Manjula Selvarajah looks at the problem of teens and tech-facilitated sexual violence.

Police officers who deal with the non-consensual sharing of nudes have started to see more AI-altered images, said London Police Staff Sgt. Jason Eddy, who has worked in digital forensics for 17 years.

Much of his work centres on the Internet child exploitation unit and a province-wide effort to protect children from online exploitation and abuse.

“London Police Service has seen several instances of AI-generated content. It’s a worldwide issue,” he said.

Some nude images or child sexual abuse images are generated completely by artificial intelligence, while others use real people’s pictures. “There are those who are actively harming children and then there are teens not knowing the repercussions, not understanding the harm that could result,” Eddy said. 

Criminal charges can be laid, but in instances like the one at St. Thomas Aquinas Catholic Secondary School, education goes a long way, he said. “The harms may not be apparent to them but we need to articulate to them that their actions are doing significant damage.”

Parents also need to talk to their kids and teens about not sharing nudes or sexual imagery, and about reporting such activity to a parent or the school.

“The biggest messaging is take an active role in your child’s use of technology. We’re not trying to say that everything is bad, but don’t be afraid to report stuff, that way we can intervene,” Eddy said. 

Apple and Google both offer technology that parents can enable on their kids’ devices to monitor usage, web searches, apps and social media.

The Google app is called Family Link, and Apple has child safety features that can be enabled in iOS.
