The Dark Side Of Generative AI — Are You Unknowingly Exposing Your Child To Online Predators & Pedophiles?

All age groups

Zahirah

24.3K views

1 week ago


From flipping through lovingly curated old photo albums to sharing pictures online and tagging our loved ones, we have come a long way. If you’re a millennial, you know exactly what I’m talking about. Babies nowadays have digital footprints before they take their first footstep. Many parents love to celebrate every special moment and milestone of their little one by posting pictures and videos online, but little do they realize that this seemingly harmless habit could be putting their child at grave risk. If you have been doing it too, here’s why you need to stop right away.


The Dark Side Of Sharenting & Generative AI

The term sharenting is a blend of two words, sharing and parenting, and it refers to the practice of sharing personal information about your children with people beyond your close circle. Posting their pictures and videos online is part of it. Now, the question is: how can that possibly be dangerous?

Generative AI was meant to be a tool that makes our lives easier, but over time it has started to reshape how we communicate. You must have seen the Studio Ghibli AI trend that took social media by storm; even celebrities couldn’t hold themselves back from jumping in. This is just one of the many trends powered by generative AI. Our social media display pictures are now being replaced by AI-generated avatars.

While we often tend to focus on the positive aspects, we simply cannot ignore the dark side of generative AI, for instance, the use of ‘nudifier’ apps to convert innocent pictures into sexually explicit images. It’s true. International law enforcement agencies warn that predators and pedophiles are using AI to transform innocent child photos into child sexual abuse material (CSAM), and that the volume of such illegal pictures has been increasing.

The threat was identified by the Internet Watch Foundation in October 2023, when it found more than 20,000 AI-generated images on the dark web in a single month, with over 3,000 of them depicting child sexual abuse. Pedophiles also use AI tools to add a child’s face to adult porn videos.


In 2024, it was found that 3,500 more AI-generated images had been added to the same dark-web forum. In fact, the degree of abuse in these fake images was even more severe, classified as Category A abuse.

Mike Prado, Deputy Chief at the DHS ICE Cyber Crimes Unit, revealed that predators are now taking pictures of children on the streets and modifying them to generate CSAM.

In 2023, Justin Clumo, a pedophile, took pictures of children at Disney World with his GoPro and later converted them into CSAM using the AI tool Stable Diffusion. When he was arrested, he admitted to creating thousands of such images. This is disgusting and scary at the same time.

Although this may seem new, it has been happening for quite some time now, and generative AI tools are not the only means by which pedophiles are targeting our children. In 2016, the BBC found that pedophiles were using secret Facebook groups to exchange explicit images of children.

The next time you post your child’s pictures online, think twice, because you never know what price you might end up paying for it.
