Children’s commissioner: Pornography affecting 8-year-olds’ behaviour

Perhaps the most important part of the Ashcroft decision for emerging issues around AI-generated child sexual abuse material was the part of the statute that the Supreme Court did not strike down. That provision of the law prohibited “more common and lower tech means of creating virtual (child sexual abuse material), known as computer morphing,” which involves taking pictures of real minors and morphing them into sexually explicit depictions.

Learning that someone you know has been viewing child sexual abuse material (child pornography) can be deeply shocking, and it is normal to feel angry, disgusted, scared, or confused, or all of these things at once. Even though this person is not putting their hands on a child, this is child sexual abuse, and yes, it should be reported.

Hundreds of these videos are offered openly via social media, with payment made via digital wallet or bank transfer. The situation shows how vulnerable children are to criminal networks that make huge profits from their exploitation. As children grow up, some degree of sexual experimentation and curiosity about their bodies is quite normal; that is not what we find in these ‘self-generated’ images and videos of child sexual abuse.

How is CSAM Harmful to Viewers?

In Brazil, the Statute of the Child and Adolescent makes it a crime to sell or exhibit photos and videos of explicit sex scenes involving children and adolescents. It is also a crime to disseminate these images by any means and to possess files of this type. In SaferNet’s view, anyone who consumes images of child sexual violence is also an accomplice to child sexual abuse and exploitation. However, web crimes against children have become more sophisticated over time, SaferNet explained during an event in São Paulo.

A review of the research on children and young people who display harmful sexual behaviour (HSB) online

“Of these active links, we found 41 groups in which it was proven there was not only distribution of child sexual abuse images, but also buying and selling. It was a free market, a trade in images of child sexual abuse, with real images, some self-generated images, and other images produced by artificial intelligence,” said Thiago Tavares, president of SaferNet Brasil.

Some adults may justify looking at CSAM by telling themselves or others that they would never behave sexually with a child in person, or that no “real” child is being harmed. However, survivors have described how difficult healing is when their past abuse continues to be viewed by strangers, making it hard for them to reclaim that part of their lives.

  • Many people don’t realize that non-touching behaviors, such as taking photographs of a child in sexual poses or exposing your genitals to a child for sexual arousal, are child sexual abuse.
  • In some cases, sexual abuse (such as forcible rape) is involved during production.
  • In most situations you do not need to wait for “evidence” of child abuse before filing a report with child protective services or the police.
  • The website has “failed to properly protect children and this is completely unacceptable”, a spokesperson said.
  • DeMay’s father said adults have to be warned that a phone gives their children access to the whole planet.
  • OnlyFans was a big winner during the pandemic, exploding in popularity as much of the world was housebound.

There were 356 Category A ‘self-generated’ images or videos of 3–6-year-olds hashed this year. Most of the Category A material involved children penetrating themselves or another child. Prosecutors said the site had offered videos of sex acts involving children, infants and toddlers, and had specifically asked users not to upload videos featuring adult-only pornography.

The city of Lancaster, Pennsylvania, was shaken by revelations in December 2023 that two local teenage boys had shared hundreds of nude images of girls in their community over a private chat on the social chat platform Discord. Witnesses said the photos could easily have been mistaken for real ones, but they were fake: the boys had used an artificial intelligence tool to superimpose real photos of girls’ faces onto sexually explicit images.

We know that seeing images and videos of child sexual abuse online is upsetting. It is perhaps surprising that there is not a higher proportion of images showing multiple children in the ‘self-generated’ 3–6 age group. It would be easy to assume that a child of that age would only engage in this type of activity on camera with the in-person encouragement of an older child leading the way, but shockingly this is not what we have seen.