How is AI being used to sexually exploit youth online?
Technology is changing all the time, and in the last decade we have seen Artificial Intelligence (AI) used in many new ways: smart devices, language translators, chatbots, virtual assistants like Alexa or Siri, and much more. It can help people get their work done, be more creative, and explore new ideas. However, although there are benefits to using AI, let’s talk about some of the ways AI can be used that are negative, harmful, or even illegal. Being informed helps you make decisions about how you can benefit from AI safely.
What is Generative AI (GAI)?
According to the National Center for Missing and Exploited Children (NCMEC), GAI “allows a user to create new images, videos, audio and text based on user requests or prompts” and is being used as a way to sexually exploit children. Someone may take a child’s or youth’s image from social media or elsewhere online and alter it to create child sexual abuse material (CSAM) showing children or youth nude or engaging in sexual behaviors. The child may be fully clothed in the original image, but the image can be changed to portray the child in situations that never happened.
Anyone could be the person creating these fake images - an adult or another youth. Sometimes the images are created for personal use because someone is sexually interested in or attracted to children, and sometimes a youth creates them as a way to bully or harass another person.
These kinds of abusive and illegal images can also be used to blackmail or manipulate a youth - demanding money so that generated images showing the youth in embarrassing, and possibly sexual, situations don’t get shared with others, or coercing the youth into engaging in other sexual behaviors. This is known as “sextortion.”
The number of cases involving GAI is rising; NCMEC reports that over the last two years it has received more than 7,000 reports of someone using GAI to create sexualized images of children and youth. This can happen to anyone.
Why are GAI images of children abusive?
It may seem like creating fake sexual images of someone isn’t harmful because the images aren’t real, but doing this is abusive and illegal and can harm a youth in many ways. These images are created without the youth’s permission, which can cause feelings of being violated and/or victimized. This can be scary and stressful, and can create issues around self-esteem and safety.
Creating these fake images can also make a youth more vulnerable to additional abuse. Additionally, each time these images are shared or viewed, a youth can experience further trauma.
Youth are especially vulnerable to sexual extortion, and are increasingly the target of money-making schemes that involve blackmailing them with these fake images - threatening the young person that everyone will see these sexual images of them and that no one will know they are fake. Youth might worry even more than adults would if this happens to them, because of fears that they will be in trouble, that no one will believe the pictures aren’t real, and that they will be shamed in front of friends and family.
For some, the impact of being the subject of sexualized GAI images can be long-term; survivors of this type of abuse have described effects that lasted years after the images were created. These impacts aren’t always immediate, and can affect a youth’s mental health into adulthood.
Creating sexualized GAI images without someone’s permission, downloading and sharing them, or using them with the intention of causing harm or extortion is illegal. The best way to help stop abusive GAI is to learn more about it, to let someone know if you are aware that it is being used in a harmful way, and to practice safe behaviors yourself - do not create sexualized GAI, and do not download or share it.
Someone used my picture to create CSAM, and now they want me to give them money or they will post it online. What do I do?
Recently there has been an increase in reports to NCMEC about youth, particularly males between the ages of 14 and 17, who are experiencing sextortion: someone uses AI to create fake sexualized images of a person and threatens to share those images online or with others unless that person pays money or engages in other sexual behaviors, like sending real sexualized pictures or videos.
The misuse of GAI is a critical issue that is getting more and more attention because of the harm it can cause. It’s important that people understand what GAI is, why sexualized GAI images are abusive, and what steps they can take to stay safe. It can be incredibly scary and overwhelming to be in a situation where your images have been used to create sexually explicit content. No one deserves to have that happen, and it is illegal and abusive for someone to do this.
Getting help if images of you have been created using GAI
If you are under 18, it will be important to get an adult involved who can help you take some of these next steps, like reporting. You don’t have to figure this out on your own. This could be any safe adult you feel comfortable talking to about difficult or scary topics and who would be able to speak up on your behalf - a parent, a friend’s parent, a teacher, counselor, doctor, or coach. This isn’t something you should get in trouble for - remember, people can use any public image posted online to create these kinds of images without that person’s consent. You did not do anything to deserve this happening, and you did not ask for this to happen. A safe adult can help make sure that you are safe, especially if the person who created these images is threatening, blackmailing, or manipulating you into paying them money or doing anything else that may be unsafe.
It might seem scary to talk to a parent or another safe adult about this; it is understandable that you might be afraid that you’ll be blamed, that people won’t believe these images aren’t real, or that rumors will start. But you deserve help, and you don’t need to go through this alone. Our helpline can talk you through how to ask an adult in your life for help.
Making a report about GAI
If you believe that your images have been used to create sexualized images - which is considered child sexual abuse material (CSAM) - or you know someone who is creating images like this using AI, you can make a report to the CyberTipline. When you report, include any information you have about who created these images (their name, or the phone number or screen name they are using to contact you). Also sharing where the images have been posted online helps authorities make sure that the person responsible is held accountable and that these images do not continue to be viewed and shared.
Another very helpful resource is Take It Down, a site where young people concerned that sexually explicit images of them have been shared online can make an anonymous request to have those images removed. Is Your Explicit Content Out There? shares information about how to report sexually exploitive content to specific platforms or internet service providers. Both of these resources are from NCMEC.
If you are creating GAI images of children yourself
Maybe you didn’t realize that creating sexualized images of someone using AI is abusive and illegal, and that creating such images of a minor specifically can be considered sexual abuse. Or maybe you did, but felt like the images weren’t hurting anybody because they aren’t real. No matter the reason, using GAI to create sexualized images of a youth does cause harm, and it is something you can face serious consequences for. We don’t share this to shame or scare people who are creating images like this, but it is important to be clear that this is not okay - and if you are creating images like this, there IS help available to make safer choices.