Research by the Lucy Faithfull Foundation, a UK-based organisation that works to prevent sexual abuse and raise awareness of it, found that 40% of people in the UK assumed that sexual abuse material generated by AI (Artificial Intelligence) is legal in the UK, unaware that it is in fact illegal. The research also found that 66% of people in the UK believe AI will have harmful effects on children, while 70% did not know that AI is being used to generate child sexual abuse material (CSAM).
88% of people in the UK said that AI-generated sexual or abusive images of anyone under 18 should be illegal, yet 40% assumed such material is currently legal. In reality, it is entirely illegal to generate, distribute or view sexual images of children in the UK. The foundation is raising awareness of how offenders use pictures of real children to create CSAM, and stresses that these offenders face serious consequences in the UK.
According to the foundation, AI-generated imagery is not the only concern. Offenders also take the faces of children who have previously been abused and turn their images into CSAM, forcing those children to relive the trauma of their past abuse. Donald Findlater, director of the Stop It Now helpline, says the public is not fully aware of how AI is being used to create sexual images of children, and that as AI advances, people should educate themselves about the harm it can cause to others. There is also speculation that certain machine learning models have been trained on CSAM, although such images can also be produced simply by combining two concepts, such as “child” and “explicit content”.
Image: Digital Information World - AIgen
by Arooj Ahmed via Digital Information World