So I had a conversation on Reddit the other day with a misinformed person, and they told me they had gotten this misinformation from ChatGPT. I told them what the actual truth was and that ChatGPT was wrong, and I’m not sure they believed me. But blog readers know me as a source of reliable information, so maybe you guys will believe me.
We were talking about the definition of CSAM, child sexual abuse material. They used to call it child pornography but lately the wording has switched to CSAM. I really hate talking about this but I don’t want people going around relying on ChatGPT’s wrong definition. So here goes, a myth-buster for you:
It is perfectly legal to take photos or videos of children without any clothes on. It is also perfectly legal to post said images onto the internet.
Mere nudity is not enough to make a photo or video CSAM. Context is key: there has to be a sexual element to the image, something meant to arouse the viewer.
To give some examples of photos of nude children which are not CSAM:
The photographer Anne Geddes has taken numerous artsy photos of babies and toddlers who are often naked, and her photos can be seen on calendars and in books. This photo, for example. That’s not CSAM. Or this one, which was on the cover of a magazine.
Photographs taken in a medical or forensic context are also not CSAM. I am still posting medical case reports to Reddit daily and I sometimes encounter such images in the reports.
For example, I found a case report of a very rare condition called Caudal Duplication Syndrome, where structures in the lower part of the body are duplicated, including the gastrointestinal and genitourinary organs. The patient in question was born with multiple sets of genitalia, and the case report described and showed, with photos, how they removed one of his two penises and surgically turned his two scrotums into one. The final image was of the patient as a toddler, standing there nude (and picking his nose), and you would never have guessed he had been born with anything unusual “down there” because it all looked normal. The photos in the case report were necessary to teach other doctors how to fix this particular problem, and there was nothing sexual about them, any more than there would have been in images of surgeons removing a superfluous arm. (Also, I feel obliged to say the patient’s parents would’ve had to consent to the publication of those images.)
Family photos of a child in the bathtub, or a nude baby having some tummy time, or a nude baby nursing? Also not CSAM. There’s a photo like that at my mom’s house, of me. One time I scanned it into my computer and decided I’d send it to any man who sent me an unsolicited request for nudes (something that happens to a lot of women online). Fortunately I’ve never gotten any such requests.
If it’s zoomed in on the child’s genital area, that might be CSAM. Although again, it depends on the context. I read an article, I think in the Washington Post, a few years ago where they interviewed a man whose young son developed a rash down there during covid lockdown, when it was hard to see a doctor in person. So the man took a photo of the rash and emailed it to a telehealth doctor, who prescribed medication for it. Google, like many online platforms, has AI technology designed to automatically check images for CSAM, and Google decided this photo was CSAM, banned the father’s Gmail account, and notified the police in his jurisdiction. The police investigated and quickly decided no crime had taken place, but the guy did not get his Gmail account back. The father said he understood why Google reported him, since it was a close-up photo of a child’s genital area and the modbot didn’t understand it was for medical purposes, but he thinks Google should have given him his Gmail account back once he was exonerated.
Now, if the image shows a child with an erect penis, or if the child in the photo is touching themselves down there, that’s CSAM, because there is a sexual element to the image in that case. If there’s also an adult with an erect penis in the image, that’s CSAM. If there is sexual activity happening in the image, well, obviously that’s CSAM.
I feel really gross having to explain all this. But apparently ChatGPT is telling people, or at least it told that Redditor I was talking to, that taking or posting any image of child nudity in any context whatsoever is a crime, and that’s simply not true. You can make your own decisions as to whether it’s wrong or exploitative to post such images online, but it’s not a criminal offense.
Now I’m going off to take a shower and try to think about anything other than this, for the rest of the day.