A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, a case that highlights how readily generative AI can be put to nefarious use.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as officers led McCorkle, still in his work uniform, out of the theater in handcuffs.

  • ReallyActuallyFrankenstein@lemmynsfw.com · 4 months ago

    It’s hard to have a nuanced discussion because the article is so vague. It’s not clear what he’s specifically been charged with (beyond “obscenity,” not a specific child abuse statute?), because to my knowledge simulated-CSAM laws have all been struck down when challenged.

    I completely get the “lock them all up and throw away the key” visceral reaction - I feel that too, for sure - but this is a much more difficult question. There are porn actors over 18 who look younger, do the laws outlaw them from work that would be legal for others who just look older? If AI was trained exclusively on those over-18 people, would outputs then not be CSAM even if the images produced features that looked under 18?

    I’m at least all for a “fruit of the poisoned tree” theory - if AI model training data sets include actual CSAM then they can and should be made illegal. Deepfaking intentionally real under 18 people is also not black and white (looking again to the harm factor), but also I think it can be justifiably prohibited. I also think distribution of completely fake CSAM can be arguably outlawed (the situation here), since it’s going to be impossible to tell AI from real imagery soon and allowing that would undermine enforcement of vital anti-real-CSAM laws.

    The real hard case is producing and retaining images of fully fake people, without real CSAM in the training data, solely locally (possession crimes). That’s really tough. Not only does it not directly hurt anyone in its creation, but there’s a possible benefit: it diminishes the market for real CSAM (potentially saving unrelated children from the abuse flowing from that demand), and it could also divert the impulse of the producer from preying on children around them due to unfulfilled desire.

    Could, because I don’t think there are studies that answer whether those are true.

    • mpa92643@lemmy.world · 4 months ago

      I mostly agree with you, but a counterpoint:

      Downloading and possession of CSAM seems to be a common first step in a person initiating communication with a minor with the intent to meet up and abuse them. I’ve read many articles over the years about men getting arrested for trying to meet up with minors, and one thing that shows up pretty often in these articles is the perpetrator admitting to downloading CSAM for years until deciding the fantasy wasn’t enough anymore. They become comfortable enough with it that it loses its taboo and they feel emboldened to take the next step.

      CSAM possession is illegal because possession directly supports creation, and creation is inherently abusive and exploitative of real people. Generating it from a model that was trained on non-abusive content probably isn’t exploitative, but there’s a legitimate question as to whether we as a society decide it’s associated closely enough with real world harms that it should be banned.

      Not an easy question for sure, and it’s one that deserves to be answered using empirical data, but I imagine the vast majority of Americans would flatly reject a nuanced view on this issue.

      • MagicShel@programming.dev · 4 months ago

        The problem is that empirical data cannot be gathered morally or ethically. You can’t show a bunch of people porn and then make a statistical observation of whether those shown child porn are more likely to assault children. So we have to go forward without that data.

        I will anecdotally observe anal sex, oral sex, and facials have gone up between partners as prevalence in porn has gone up. That suggests but does not prove a direct statistical harm caused by even “ethically produced CSAM.”

        • usualsuspect191@lemmy.ca · 4 months ago

          I will anecdotally observe anal sex, oral sex, and facials have gone up between partners as prevalence in porn has gone up. That suggests but does not prove a direct statistical harm caused by even “ethically produced CSAM.”

          Can we look at trends between consenting adults (who are likely watching porn of real people by the way) as an indicator of what pedophiles will do? I’m not so sure. It’s not like step sibling sex is suddenly through the roof now with it being the “trend” in porn.

          Looking specifically at fake rape porn maybe and seeing if it increases rates of rape in the real world might be a better indicator.

          • MagicShel@programming.dev · 4 months ago

            That’s fair. I tried to make clear that my interpretation is not in any way scientific or authoritative. Better correlations are probably possible.

            ETA on further thought: I wonder if prevalence of incest porn has had an effect on actual incest rates. That might be a much closer correlation due to the similar social taboo. But I’m not sure we have good data on that, either.

          • LustyArgonian@lemmy.world · 4 months ago

            Do you think people like Andrew Tate have caused more rapes to occur? Like do you think his rhetoric encourages a rapist mindset in his listeners?

        • mpa92643@lemmy.world · 4 months ago

          True, it wouldn’t be ethical to conduct an experiment, but we can (and probably do) collect lots of observational data that can provide meaningful insight. People are arrested at all stages of CSAM-related offenses, from mere possession to distribution, solicitation, and active abuse.

          While observation and correlations are inherently weaker than experimental data, they can at least provide some insight. For example, “what percentage of those only in possession of artificially generated CSAM for at least one year go on to solicit minors” vs. “real” CSAM.

          If it seems that artificial CSAM is associated with a lower rate of solicitation, or if it ends up decreasing overall demand for “real” CSAM, then keeping it legal might provide a real net benefit to society and its most vulnerable even if it’s pretty icky.

          That said, I have a nagging suspicion that the thing many abusers like most about CSAM is that it’s a real person and that the artificial stuff won’t do it for them at all. There’s also the risk that artificial CSAM reduces the taboo of CSAM and can be an on-ramp to more harmful materials for those with pedophilic tendencies that they otherwise are able to suppress. But it’s still way too early to know either way.

          • HelixDab2@lemm.ee · 4 months ago

            the thing many abusers like most about CSAM is that it’s a real person and that the artificial stuff won’t do it for them at all.

            Perhaps. But what about when they can’t tell the difference between real and virtual? It seems like the allure of all pornography is the fantasy, rather than the reality. That is, you may enjoy extreme BDSM pornography, and enjoy seeing a person flogged until they’re bleeding, or see needles slowly forced through their penis, but do you really care that it’s a real person that’s going to end the scene, take a shower, and go watch a few episodes of “The Good Place” with their dog before bed? Or is it about the power fantasy that you’re constructing in your head about that scene? How important is the reality of the scene, versus being able to suspend your disbelief long enough to get sexual gratification from it? If the whole scene was done with really good practical effects and CG, would your experience, as a user–even if you were aware–be different?

          • Cryophilia@lemmy.world · 4 months ago

            True, it wouldn’t be ethical to conduct an experiment

            I think it would be ethical for researchers to go onto the boards of these already-existing CP distribution forums and conduct surveys. But then the surveyors would be morally obligated to report that board to the authorities to get it shut down. Which means that no one would ever answer surveyor questions because they knew the board would be shut down soon so they’d just find a new site ugh…

            Yeah nvm I don’t see any way around this one

      • HelixDab2@lemm.ee · 4 months ago

        CSAM possession is illegal because possession directly supports creation

        To expound on this: until now, the creation of CSAM has required that children be sexually exploited. You could not have CSAM without children being harmed. But what about when no direct harms have occurred? Is lolicon hentai ‘obscene’? Well, according to the law and case law, yes, but it’s not usually enforced. If we agree that drawings of children engaged in sexual acts aren’t causing direct harm–that is, children are not being sexually abused in order to create the drawings–then how much different is a computer-generated image that isn’t based on any specific person or event? It seems to me that whether or not a pedophile might decide that they eventually want more than LLM-generated images is not relevant. Treating a future possibility as a foregone conclusion is exactly the rationale behind Reefer Madness and the idea of ‘gateway’ drugs.

        Allow me to float a second possibility that will certainly be less popular.

        Start with two premises: first, pedophilia is a characteristic that appears to be an orientation. That is, a true pedophile–a person exclusively sexually attracted to pre-pubescent children–does not choose to be a pedophile, any more than a person chooses to be gay. (My understanding is that very few pedophiles are exclusively pedophilic though, and that many child molesters are opportunistic sexual predators rather than being pedophiles.) Secondly, the rates of sexual assault appear to have decreased as pornography availability has increased. So the question I would have is, would wide availability of LLM-generated CSAM–CSAM that didn’t cause any real, direct harm to children–actually decrease rates of child sexual assault?

        • RandomlyNice@lemmy.world · 4 months ago

          With regards to your last paragraph: pedophiles can indeed be straight, gay, or bi. Pedophiles may also not become molesters, and molesters of children may not be pedophilic at all. It seems you understand this. I mentioned ITT that I read a newspaper article many years ago that was commissioned to show that access to CP would increase child abuse; it seemed to show the opposite.
          If persons could use AI to generate their own porn of their own personal fantasies (whatever those might be) and NOT share that content, what then? Canada allows this for text (maybe certain visuals? Audio? IDK). I don’t know about current ‘obscenity’ laws in the USA; however, I do recall reading about an art exhibit in NY which featured an upside-down urinal that was deemed obscene, then later deemed a work of art. I also recall seeing (via an internet image) a sculpture of what seemed to be a circle of children with penises as noses. Porn? Art? Comedy?

          • HelixDab2@lemm.ee · 4 months ago

            My understanding was that ‘pure’ pedophiles–ones that have no interest at all in post-pubescent children or any adults whatsoever–tend to be less concerned with sex/gender, particularly because children don’t have defined secondary sex characteristics. I don’t know if this is actually correct though. I’m not even sure how you could ethically research that kind of thing and end up with valid results.

            And honestly, not being able to do solid research with valid results makes it really fuckin’ hard to find solutions that work to prevent as many children from being harmed as possible. In the US at least, research about sex and sexuality in general (much less deviant sexualities) seems to be taboo, and very difficult to get funding for.

        • 2xsaiko@discuss.tchncs.de · 4 months ago

          Hard to say. I generally agree with what you’ve said though. Also, lots of people have other fantasies that they would never enact in real life for various reasons (e.g. it’s unsafe, illegal, or both; edit: I should also absolutely list non-consensual here). I feel like pedophilia isn’t necessarily different.

          However part of the reason loli/whatever is also illegal to distribute (it is, right? I assume it is at least somewhere) is that otherwise it helps people facilitate/organize distribution of real CSAM, which increases demand for it. That’s what I’ve heard at least and it makes sense to me. And I feel like that would apply to AI generated as well.

          • HelixDab2@lemm.ee · 4 months ago

            It’s obvs. very hard to get accounts of what pedophiles are doing; the only ones that you can survey are ones that have been caught, which isn’t necessarily a representative sample. I don’t think that there are any good estimates on the rate of pedophilic tendencies.

            the reason loli/whatever is also illegal to distribute

            From a cursory reading, it looks like possession and distribution are both felonies. Lolicon hentai is pretty widely available online, and prosecutions appear to be very uncommon when compared to the availability. (Low priority for enforcement, probably?)

            I’m not sure that increasing the supply of CSAM would necessarily increase demand for CSAM in people that aren’t already pedophiles though. To put it another way, I’m sure that increasing the supply of gay porn would increase consumption of gay porn, but I am pretty sure that it’s not going to make more people gay. And people that aren’t gay (or at least bi-) aren’t going to be interested in gay porn, regardless of how hard up (heh) they might be for porn, as long as they have any choices at all. There’s a distinction between fetishes/paraphilia, and orientations, and my impression has been that pedophilia is much more similar to an orientation than a paraphilia.

            • 2xsaiko@discuss.tchncs.de · 4 months ago

              I’m not sure that increasing the supply of CSAM would necessarily increase demand for CSAM in people that aren’t already pedophiles though.

              No, but allowing people to organize increases demand because then those who would want CSAM have a place to look for it and ask for it where it’s safe for them to do so, and maybe even pay for it to be created. It’s rather the other way around, the demand increases the supply if you want to put it like that. I’m not saying lolicon being freely available turns people into pedophiles or something like that, at all.

              • HelixDab2@lemm.ee · 4 months ago

                I guess where I come down is that, as long as no real people are being harmed–either directly, or because their likeness is being used–then I’d rather see it out in the open than hidden. At least if it’s open you can have a better chance of knowing who is immediately unsafe around children, and easily using that to exclude people from positions where they’d have ready access to children (teachers, priests, etc.).

                Unfortunately, there’s also a risk of pedophilia being ‘normalized’ to the point where people let their guard down around them.

      • ObjectivityIncarnate@lemmy.world · 4 months ago

        Downloading and possession of CSAM seems to be a common first step in a person initiating communication with a minor with the intent to meet up and abuse them.

        But this is like the arguments used to say that weed is a “gateway drug,” by talking about how people strung out on harder drugs almost always have done weed as well, while ignoring everyone who uses only weed. This case is even hazier because we literally have no real idea how many people consume that stuff but don’t ‘escalate’.

        I remember reading once in some research out of Japan that child molesters consume less porn overall than the average citizen, which seems counter-intuitive, but may not be, if you consider the possibility that maybe it (in this case, they were talking primarily about manga with anime-style drawings of kids in sexual situations) is actually curbing the incidence of the ‘real thing’, since the ones actually touching kids in the real world are reading those mangas less.

        I’m also reminded of people talking about sex dolls that look like kids, and if that’s a possible ‘solution’ for pedophiles, or if it would ‘egg on’ actual molestation.

        I think I lean on the side of ‘satiation’, from the limited bits of idle research I’ve done here and there. And if that IS in fact the case, then regardless of if it grosses me out, I can’t in good conscience oppose something that actually reduces the number of children who actually get abused, you know?

        • LustyArgonian@lemmy.world · 4 months ago

          It’s less that these materials are like a “gateway” drug and more like these materials could be considered akin to advertising. We already have laws about advertising because it’s so effective, including around cigarettes and prescriptions.

          Second, the role that CP plays in most countries is difficult. It is used for blackmail. It is also used to generate money for countries. And it’s used as advertising for actual human trafficking organizations. And similar organizations exist for snuff and gore btw. And ofc animals. And any combination of those 3. Or did you all forget about those monkey torture videos, or the orangutan who was being sex trafficked? Or Daisy’s Destruction and Peter Scully?

          So it’s important to not allow these advertisers to combine their most famous monkey torture video with enough AI that they can say it’s AI generated, but it’s really just an ad for their monkey torture productions. And they do that with CP, rape, gore, etc, too.

          • tamal3@lemmy.world · 4 months ago

            People, please don’t just downvote with no comment. Why is this being downvoted? The comparisons to advertisements have validity. And, if you disagree, be productive and tell us why.

            • LustyArgonian@lemmy.world · 4 months ago

              Because a huge percentage of Lemmy is sexist and I am openly a woman. You’ll know because this comment will get nuked also.

      • Cryophilia@lemmy.world · 4 months ago

        but there’s a legitimate question as to whether we as a society decide it’s associated closely enough with real world harms that it should be banned.

        Why should that be a question at all? If it causes harm, ban it. If not, don’t. Being “associated with” should never be grounds for a legal statute.

      • 9bananas@lemmy.world · 4 months ago

        generally a very good point, however i feel it’s important to point out some important context here:

        the pedophiles you’re talking about in your comment are almost always members of tight knit communities that share CSAM, organize distribution, share sources, and most importantly, indulge their fantasies/desires together.

        i would think that the correlation that leads to molestation is not primarily driven by the CSAM itself, but rather the community around it.

        we clearly see this happening in other similarly structured and similarly isolated communities: nazis, incels, mass shooters, religious fanatics, etc.

        the common factor in radicalization and development of extreme views in all these groups is always isolation and the community they end up joining as a result, forming a sort of parallel society with its own rules and ideals, separate from general society. over time people in these parallel societies get used to seeing the world in a way that aligns with the ideals of the group.

        nazis start to see anyone not part of their group as enemies, incels start to see “females” instead of women, religious fanatics see sinners…and pedophiles see objects that exist solely for their gratification instead of kids…

        I don’t see why molesters should be any different in this aspect, and would therefore argue that it’s the communal aspect that should probably be the target of the law, i.e.: distribution and organization (forums, chatrooms, etc.)

        the harder it is for them to organize, the less likely these groups are to produce predators that cause real harm!

        if on top of that there is a legally available outlet where they can indulge themselves in a safe manner without harming anyone, I’d expect rates of child molestation to drop significantly, because, again, there’s precedent from similar situations (overdoses in drug addicts, for example).

        i think it is a potentially fatal mistake to think of pedophiles as “special” cases, rather than just another group of outcasts, because in nearly all cases of such pariahs the solutions that prove to work best in the real world are the ones that make these groups feel less like outcasts, which limits avenues of radicalization.

        i thought these parallels are something worth pointing out.

    • snooggums@midwest.social · 4 months ago

      Even worse, you don’t need CSAM to start with. If a learning model has regular porn and nude reference model photography of people under 18 that are used for drawing anatomy, then they have enough information to combine the two. Hell, it probably doesn’t even need the people under 18 to actually be nude.

      Hell, society tends to assume any nudity under 18 to be CSAM anyway, because someone could see it that way.

    • ObjectivityIncarnate@lemmy.world · 4 months ago

      I don’t know if it’s still a thing, but I’m reminded of some law or regulation that was passed a while back in Australia, iirc, that barred women with A-cup busts from working in porn, the “reasoning” being that their flatter chests made them look too similar to prepubescent girls, lol…

      Not only stupid but also quite insulting to women, imo.

    • HelixDab2@lemm.ee · 4 months ago

      Because any simulated CSAM laws have been, to my knowledge, all struck down when challenged.

      To the best of my knowledge, calling drawn works obscene has been upheld in courts, most often because the artist(s) lack the financial ability to fight the charges effectively. The artist for the underground comic “Boiled Angel” had his conviction for obscenity upheld–most CSAM work falls under obscenity laws–and ended up giving up the fight to clear his name.

      • ReallyActuallyFrankenstein@lemmynsfw.com · 4 months ago

        Oh, for sure. I’m talking about laws specifically targeted to minors. “Obscenity” is a catch-all that is well-established, but if you are trying to protect children from abuse, it’s a very blunt instrument and not as effective as targeted abuse and trafficking statutes. The statutory schemes used to outlaw virtual CSAM have failed to my knowledge.

        For example: https://en.wikipedia.org/wiki/Ashcroft_v._Free_Speech_Coalition

        That case was statutorily superseded in part by the PROTECT Act, which attempted to differentiate itself by…relying on an obscenity standard. So it’s a bit illusory that it does anything new.

        • HelixDab2@lemm.ee · 4 months ago

          The PROTECT Act has been, so far, found to be constitutional, since it relies on the obscenity standard in regards to lolicon hentai. Which is quite worrisome. It seems like it’s a circular argument/tautology; it’s obscene for drawn art to depict child sexual abuse because drawings of child sexual abuse are obscene.

    • brbposting@sh.itjust.works · 4 months ago

      simulated CSAM

      When I used this phrase, someone told me it described a nonexistent concept, and that the CSAM term existed in part to differentiate between content where children were harmed to make it versus not. I didn’t wanna muddy any waters but do you have an opposing perspective?

      Deepfaking intentionally real under 18 people is also not black and white

      Interesting. Sounds real bad. See what you mean about harm factor though.

    • Cryophilia@lemmy.world · 4 months ago

      I’m at least all for a “fruit of the poisoned tree” theory - if AI model training data sets include actual CSAM then they can and should be made illegal.

      Now all AI is illegal. It’s trained via scraping the internet, which will include CP as well as every other image.