Kids want social media apps to do more to protect them from the spread of deepfake nudes

Kids are using AI to create nude deepfakes of people they know.
  • Kids as young as 9 are sending nude photos, and Gen-AI is making the problem worse.
  • 1 in 10 kids have friends who have used AI to create nudes of their peers, a survey found.
  • When faced with sexual interactions online, kids prefer platform reporting tools to talking with adults.

It's problematic enough that some kids are sending nude photos of themselves to friends and even online strangers. But artificial intelligence has elevated the problem to a whole new level.

About 1 in 10 children say their friends or peers have used generative AI to create nudes of other kids, according to a new report from Thorn. The nonprofit, which fights child sexual abuse, surveyed more than 1,000 children ages 9 to 17 in late 2023 for its annual report.

Thorn found that 12% of children ages 9 to 12 knew of friends or classmates who had used AI to create nudes of their peers, and 8% preferred not to answer the question. Among 13- to 17-year-olds, 10% said they knew of peers who had used AI to generate nudes of other kids, and 11% preferred not to answer. This was the first of Thorn's surveys to ask children about the use of generative AI to create deepfake nudes.

"While the motivation behind these events is more likely driven by adolescents acting out than an intent to sexually abuse, the resulting harms to victims are real and should not be minimized in attempts to wave off responsibility," the Thorn report said.

Sexting culture is hard enough to tackle without AI in the mix. Thorn found that 25% of minors consider it "normal" to share nudes of themselves (a slight decrease from surveys dating back to 2019), and 13% of those surveyed reported having already done so, a slight decline from 2022.

The nonprofit says sharing nude photos can lead to sextortion, in which bad actors use the images to blackmail or exploit the sender. Kids who had considered sharing nudes cited the risk of leaks or exploitation as a reason they ultimately chose not to.

This year, for the first time, Thorn asked young people about being paid for sending naked pictures, and 13% of kids surveyed said they knew of a friend who had been compensated for their nudes, while 7% did not answer.

Kids want social media companies to help

Generative AI allows for the creation of "highly realistic abuse imagery from benign sources such as school photos and social media posts," Thorn's report said. As a result, victims who previously reported an incident to authorities can easily be revictimized with new, customized abusive material. Actor Jenna Ortega, for example, recently said she was sent AI-generated nude images of herself as a child on X, formerly Twitter, and opted to delete her account entirely.

Her reaction isn't far off from how most kids respond in similar situations, Thorn reported.

The nonprofit found that kids, one-third of whom have had some sort of online sexual interaction, "consistently prefer online safety tools over offline support networks such as family or friends."

Children often just block bad actors on social media instead of reporting them to the social media platform or an adult.

Thorn found kids want to know "how to better leverage online safety tools to defend against such threats," which they see as a normal and unremarkable part of life on social media.

"Kids show us these are preferred tools in their online safety kit and are seeking more from platforms in how to use them. There is a clear opportunity to better support young people through these mechanisms," Thorn's analysis said.

In addition to wanting information and tutorials on blocking and reporting someone, over one-third of respondents said they wanted apps to check in with users about how safe they feel, and a similar share wanted platforms to offer support or counseling after a bad experience.



