- cross-posted to:
- stablediffusion@lemmit.online
A sex offender convicted of making more than 1,000 indecent images of children has been banned from using any “AI creating tools” for the next five years in the first known case of its kind.
Anthony Dover, 48, was ordered by a UK court “not to use, visit or access” artificial intelligence generation tools without the prior permission of police as a condition of a sexual harm prevention order imposed in February.
The ban prohibits him from using tools such as text-to-image generators, which can make lifelike pictures based on a written command, and “nudifying” websites used to make explicit “deepfakes”.
Dover, who was given a community order and £200 fine, has also been explicitly ordered not to use Stable Diffusion software, which has reportedly been exploited by paedophiles to create hyper-realistic child sexual abuse material, according to records from a sentencing hearing at Poole magistrates court.
Just like how you can’t generate a child without pictures of children to base it on, you can’t generate them naked without pictures of their bodies. There is a reason pedos are attracted to those bodies and not to women with no curves or to small men.
I work with children; I see them every day. The difference is so massive that an AI would not be able to approximate it with just photos of adults. AI doesn’t “know” anything; it just uses the photos in its training data to approximate what is being asked. Even if you kept describing in more detail what those bodies look like, it wouldn’t be able to create them without anything to base them on. It’d be like trying to create a Van Gogh-style picture with no Van Gogh training data: no matter how much you describe the details of his style, you’ll never get the AI to make something like it without the training data.
You can keep disagreeing, keep saying “but with more data”, but AI can’t make anything original; thinking it can is a fundamental misunderstanding of its abilities. If it doesn’t have the data, it can’t accurately do it.