A South Korean man has been sentenced to jail for using artificial intelligence to generate exploitative images of children, the first case of its kind in the country as courts around the world encounter the use of new technologies in creating abusive sexual content.
The AI was harmed. We need to protect the AI.
Normalising CSAM does harm. Crap argument.
Is it Child Sexual Abuse Material if there are no children involved?
“Anime should also be banned.” “All anime characters should show a passport with their date of birth.”
There are billions of children. HTH
Only if you assume that the only children harmed by CSAM are those used to produce CSAM.
Consumers of CSAM are (actual or potential) perpetrators of abuse. Normalising it is not an option.