this post was submitted on 03 Nov 2023
285 points (96.1% liked)

Teen boys use AI to make fake nudes of classmates, sparking police probe
Parents told the high school "believed" the deepfake nudes were deleted.

[–] r3df0x@7.62x54r.ru -5 points 1 year ago (3 children)

If you're making porn of real underage people, I have no problem with you being put on the pedo registry.

If no serious harm was done, I'm fine with convicting them and then granting a full expungement after 5-10 years.

[–] wildginger@lemmy.myserv.one 26 points 1 year ago (2 children)

And you're proof that the pedo registry shouldn't exist as it is.

Teenagers being sexually interested in their peers is not pedophilia, and you want to guarantee ruining a decade of their lives, with the "promise" of an expungement that would never actually happen, thanks to the permanent nature of the internet.

This misuse of AI is a crime and should be punished and deterred, obviously. But labeling children who are about to enter the world as pedophiles, basically for the rest of their lives?

You're kind of a monster.

[–] r3df0x@7.62x54r.ru -1 points 1 year ago (1 children)

What about the fact that the girls who are victims of something like this will have to contend with the pictures being online if someone posts them there? What if people who don't know that the pictures depict minors re-post them to other sites, making them very difficult to remove? That can cause very serious employability problems. It doesn't matter how open-minded people are; they don't want porn coming up if someone googles one of their employees.

[–] wildginger@lemmy.myserv.one 4 points 1 year ago

The creation is still a crime, no one said otherwise.

It is just not an act of pedophilia.

[–] r3df0x@7.62x54r.ru -5 points 1 year ago (1 children)

If you produce CP, you should be on a registry for producing and distributing CP. If you create CP, you are enabling pedophilia.

[–] wildginger@lemmy.myserv.one 6 points 1 year ago (1 children)

They are children. Being horny about classmates.

Being sexually aroused by people your own age and wishing to fantasize about it is not enabling pedophilia, you literal psychopath.

[–] r3df0x@7.62x54r.ru -4 points 1 year ago (1 children)

Circulating porn of minors is a crime and enables pedophiles. Not to mention teenage girls could easily commit suicide over something like this.

[–] Fades@lemmy.world 3 points 1 year ago* (last edited 1 year ago) (2 children)

So do yearbooks and any other kind of photos that depict children, for that matter.

You can’t keep moving the goalposts. By your logic, young people should never date or take photos together, because it could enable pedophiles somewhere, somehow.

These are children with brains still in development. They are discovering themselves, and you want to label them as pedophiles forever because they didn’t make a conscious effort to research how their spanking material could potentially enable a pedo (because we all know pedos can only be enabled by things produced by kids… yeah, that’s the real threat).

Instead of suggesting a way to help the victims, you are advocating for the creation of yet more victims.

What a pathetic, brain-dead stance you are defending.

[–] eatthecake@lemmy.world 2 points 1 year ago

Abuse and bullying of their classmates is just 'discovering themselves'? Discovering that they're psychopathic little misogynists, I guess. Their 'spanking material' was created in order to demean and humiliate their victims. There's plenty of porn online and absolutely no need for them to do this. If you actually wanted to help the victims, you would not be trivialising and excusing this behaviour as 'being horny about classmates'.

[–] r3df0x@7.62x54r.ru -2 points 1 year ago (1 children)

A yearbook photo is not porn.

[–] wildginger@lemmy.myserv.one 1 points 1 year ago (1 children)

And an AI image with a face photoshopped over it isn't a photo of a child.

And a teen being sexually interested in other teens isn't a pedophile.

[–] r3df0x@7.62x54r.ru 1 points 1 year ago (1 children)

It's still child porn and someone getting off to child porn is a pedophile.

[–] wildginger@lemmy.myserv.one 0 points 1 year ago

So, to clarify.

You think two 15-year-olds having sex makes them both pedophiles?

[–] Jolteon@lemmy.zip 12 points 1 year ago (1 children)

I'd argue that someone making porn of someone their own age is not pedophilia.

[–] r3df0x@7.62x54r.ru -3 points 1 year ago

They're still making porn of a minor. That is harmful to them and it enables any pedophiles who find it.

[–] 13esq@lemmy.world 4 points 1 year ago (1 children)

That's an easy enough judgement when the perpetrator is an adult. What do you do when the perpetrators are minors themselves, as they are in this article?

Of course there still needs to be some sort of recourse, but for every other crime there is a difference between being tried as a child and being tried as an adult.

I find it tough to consider, myself.

[–] r3df0x@7.62x54r.ru -1 points 1 year ago

Considering the consequences for a high school student if porn of them gets circulated, I'm fine with putting them on the registry. Expungement can happen later based on the aftermath. Teenage girls have killed themselves over this sort of thing.