• KillingTimeItself@lemmy.dbzer0.com · 6 days ago

    I have no problem with AI porn, assuming it's not based on any real identity; I think that should be considered identity theft or impersonation or something.

    Outside of that, it's more complicated, but I don't think it's a net negative. People will still thrive in the porn industry; porn has been around for as long as it's been possible, and I don't see why it wouldn't continue.

    • Dr. Moose@lemmy.world · 6 days ago

      Identity theft only makes sense for businesses. I can sketch a naked Johnny Depp in my sketchbook and do whatever I want with it, and no one can stop me. Why should an AI tool be any different if distribution is not involved?

      • KillingTimeItself@lemmy.dbzer0.com · 6 days ago

        Revenge porn, simple as that. Creating fake revenge porn of real people is still, to some degree, revenge porn, and I would argue it's stealing someone's identity/impersonation.

        To be clear, your example is a sketch of Johnny Depp; I'm talking about an entirely manufactured video of a person that resembles the likeness of another person. Those are, fundamentally, two different things.

          • iAvicenna@lemmy.world · 5 days ago (edited)

            I guess the point is that this enables the mass production of revenge porn at a person-on-the-street level, which makes it much harder to punish and to prevent distribution. When relatively few sources produce the unwanted product, punishing only the distribution might be a viable method. But when the production method becomes available to the masses, the only feasible control mechanism is to regulate the production method itself. It is all a matter of where the most efficient place for the bottleneck is.

            For instance, once 3D printing allows people to produce automatic rifles in their homes, saying "civilian use of automatic rifles is illegal, so that is fine" becomes useless.

            • Dr. Moose@lemmy.world · 5 days ago

              I think that's a fair point, and I wonder how this will affect freedom of expression on the internet. If you can't find the distributor, it'll be really tough to get a handle on this.

              On the other hand, the sheer overabundance could simply break the entire value of revenge porn, a "nothing is real anyway, so it doesn't matter" sort of thing, which I hope would be the case. No one will be watching revenge porn because they can generate any porn they want in a heartbeat. That's the ideal scenario, anyway.

              • iAvicenna@lemmy.world · 5 days ago (edited)

                It is indeed a complicated problem with many intertwined variables; I wouldn't want to be in the shoes of policymakers (assuming they are actually searching for an honest solution and not trying to turn this into profit, lol).

                For instance, too much regulation in fields like this would essentially kill high-quality open-source AI tools and make most of them proprietary software, leaving the field at the mercy of tech monopolies. This is probably what these monopolies want, and they will surely try to push things this way to kill competition (talk about capitalism spurring competition and innovation!). They might even don the cloak of some of these bad actors to speed up the process. Given the possible application range of AI, this is probably even more dangerous than flooding the internet with revenge porn.

                100% freedom with no regulation would essentially lead to a mix of creative, possibly groundbreaking uses of the tech versus many bad actors using it for things like scamming, disinformation, etc. How it would balance out in the long run is probably very hard to predict.

                I think two things are clear: (1) neither extreme is ideal, and (2) of the two extremes, 100% freedom is still the better option (the former just exchanges many small bad actors for a couple of giant bad actors and chokes off any possible good outcomes).

                Based on this, starting with a solution closer to the "freedom edge" and improving it step by step based on results is probably the most sensible approach.

          • KillingTimeItself@lemmy.dbzer0.com · 6 days ago

            Sort of. There are arguments that private ownership of these videos is also weird and shitty; however, I think impersonation and identity theft are going to be the two most broadly applicable instances of relevant law here. Otherwise I can see issues cropping up.

            Other people do not have any inherent rights to your likeness; you should not simply be able to pretend to be someone else. That's considered identity theft/fraud when it's done with legally identifying papers, and I think it's a similar case here.

            • Dr. Moose@lemmy.world · 6 days ago

              But the thing is, that law isn't relevant here at all, since nothing is being distributed and no one is being harmed. Would you say the same thing if AI weren't involved? Sure, it can be creepy and weird and whatnot, but it's not inherently harmful, or at least it's not obvious how it would be.

              • KillingTimeItself@lemmy.dbzer0.com · 2 days ago

                The only perceivable reasons to create these videos are private consumption, in which case, who gives a fuck, or public distribution; otherwise you wouldn't create them. And you'd have to be a bit of a weird breed to create AI porn of specific people for private consumption.

                If AI isn't involved, the same general principles would apply, except they might now cover more people.

                • Dr. Moose@lemmy.world · 2 days ago

                  I've been thinking about this more, and I think one interesting argument here is "toxic culture growth": even if the thing is not distributed, it might grow undesired toxic cultures through indirect exposure (like social media or forum discussions), even without direct sharing.

                  I think this is slippery to the point of government mind control, but maybe there's something valuable to research here either way.

                  • KillingTimeItself@lemmy.dbzer0.com · 17 hours ago (edited)

                    I've been thinking about this more, and I think one interesting argument here is "toxic culture growth": even if the thing is not distributed, it might grow undesired toxic cultures through indirect exposure (like social media or forum discussions), even without direct sharing.

                    This is another big potential issue as well: does it perpetuate cultural behaviors that you want to see in society at large? Similar questions have come up around misogyny and the relevant history of feminism.

                    It’s a whole thing.

                    I think this is slippery to the point of government mind control, but maybe there's something valuable to research here either way.

                    I think there is probably a level of government regulation that is productive; I'm just curious how we would even define where that line starts and ends, because there is no explicitly clear point unless you have solid external inferences to start from.

    • ubergeek@lemmy.today · 5 days ago

      I have no problem with AI porn, assuming it's not based on any real identity

      With any model currently in use, that condition is impossible to meet. All models are trained on images of real people.
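
      To make that concrete, here's a toy sketch of what "trained on real images" means mechanically. The tiny model, the sizes, and the random "photos" are all made up for illustration, not any real system's code; the point is just that the training loss is computed directly against real photos, so every weight update encodes their statistics.

      ```python
      # Toy sketch (PyTorch): the training target for a generative image model
      # is a batch of real photos, so the weights absorb real faces' statistics.
      import torch
      import torch.nn as nn

      model = nn.Sequential(               # stand-in for a generator/denoiser
          nn.Flatten(),
          nn.Linear(64 * 64 * 3, 256), nn.ReLU(),
          nn.Linear(256, 64 * 64 * 3),
      )
      opt = torch.optim.Adam(model.parameters(), lr=1e-4)

      real_batch = torch.rand(8, 3, 64, 64)   # placeholder for scraped photos
      noisy = real_batch + 0.1 * torch.randn_like(real_batch)

      pred = model(noisy).view_as(real_batch)
      loss = nn.functional.mse_loss(pred, real_batch)  # the target IS the real image
      loss.backward()
      opt.step()
      ```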

      • KillingTimeItself@lemmy.dbzer0.com · 5 days ago

        With any model currently in use, that condition is impossible to meet. All models are trained on images of real people.

        Yes, but if I go to thispersondoesnotexist.com and generate a random person, is that going to resemble the likeness of any given real person closely enough to perceptibly be them?

        You are literally using the schizo argument right now: "if an artist creates a piece depicting no specific person, but his understanding of persons is based inherently on the facial structures of other people that he knows and recognizes, then he must be stealing their likeness."
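
        For what it's worth, here's a rough sketch of how a thispersondoesnotexist-style generator produces a face (the tiny network is a stand-in I made up, not StyleGAN's actual architecture): it samples a fresh random latent vector and maps it through the network, so the output is drawn from a learned distribution rather than retrieved from any stored photo.

        ```python
        # Toy sketch: sampling a "person who does not exist".
        import torch
        import torch.nn as nn

        torch.manual_seed(0)

        generator = nn.Sequential(       # pretend this was trained on face photos
            nn.Linear(512, 1024), nn.ReLU(),
            nn.Linear(1024, 64 * 64 * 3), nn.Tanh(),
        )

        z = torch.randn(1, 512)          # a brand-new random "identity" each call
        fake_face = generator(z).view(3, 64, 64)
        # By construction no z maps to a specific training subject; whether an
        # output lands perceptibly "close" to a real person is a statistical
        # accident, which is exactly the likeness question at issue here.
        ```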

        • ubergeek@lemmy.today · 4 days ago

          No, the problem is the lack of consent from the person being used.

          And now it's being used to generate depictions of rape and CSAM.

          • KillingTimeItself@lemmy.dbzer0.com · 2 days ago (edited)

            Yeah, but like, legally, is this even a valid argument? Sure, technically there is probably something like 0.0001% of the average person in any given AI-generated image, but I don't think that gives anyone explicit rights to that portion.

            That's like arguing that a photographer who captured you in a random public photo that became super famous is now required to pay you royalties for being in that image, even though you are literally just a random fucking person.

            You can argue about consent all you want, but at the end of the day, if you're posting images of yourself online, you are consenting to other people looking at them at a minimum, and arguably implicitly consenting to other people being able to use those images (because you can't stop people from doing that, except through copyright, which isn't very strict in most cases).

            And now it's being used to generate depictions of rape and CSAM.

            I don't see how this is even relevant unless the person in question is a minor, a victim, or becoming a victim; otherwise it's no different from me editing an image of someone to make it look like they got shot in the face. Is that shitty? Sure. But I don't know of any laws that prevent you from doing that unless it explicitly involves something like blackmail, extortion, or harassment.

            The fundamental problem here is that you're in an extremely uphill position to even begin the argument of "well, it's trained on people, so therefore it uses the likeness of those people."

            Does a facial-structure recognition model use the likeness of other people? Even though it can detect any person that meets the requirements established by its training data? There is no suitable method to even begin breaking down at what point a person's likeness begins and at what point it ends; it's simply an impossible task.
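
            To illustrate the point (a toy sketch with made-up vectors, not any real system's code): recognition models reduce faces to points in an embedding space, and "same person" is just a distance threshold you pick, so there is no principled point where one person's likeness begins or ends.

            ```python
            # Toy sketch: identity as a distance threshold in embedding space.
            import numpy as np

            def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
                return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

            rng = np.random.default_rng(0)
            alice = rng.normal(size=128)                     # a known face's embedding
            probe = alice + rng.normal(scale=0.3, size=128)  # a similar-looking face

            THRESHOLD = 0.8  # arbitrary: nudge it and the "same person" verdict flips
            sim = cosine_similarity(alice, probe)
            print(f"similarity={sim:.3f} ->", "match" if sim > THRESHOLD else "no match")
            ```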

            • ubergeek@lemmy.today · 1 day ago

              Yeah, but like, legally, is this even a valid argument?

              Personally, I'd say "legal" is only what the law allows the wealthy to do while providing punishments for the working class.

              Morally, that's what you're doing when you use AI to generate CSAM. It's the same idea behind why we ban all previously created CSAM as well: you are victimizing the person every single time.

              I don't see how this is even relevant unless the person in question is a minor, a victim, or becoming a victim

              It makes them a victim.

              But I don't know of any laws that prevent you from doing that unless it explicitly involves something like blackmail, extortion, or harassment.

              The law exists to protect the ruling class while not binding them, and to bind the working class without protecting them.

              Does a facial-structure recognition model use the likeness of other people?

              Yes.

              Even though it can detect any person that meets the requirements established by its training data? There is no suitable method to even begin breaking down at what point a person's likeness begins and at what point it ends; it's simply an impossible task.

              Exactly. So, without consent, it shouldn’t be used. Periodt.

              • KillingTimeItself@lemmy.dbzer0.com · 17 hours ago

                Personally, I'd say "legal" is only what the law allows the wealthy to do while providing punishments for the working class.

                If you have schizophrenia, sure. Legal is what the law defines as OK; whether or not people get charged for it is another thing. The question is "do you have the legal right to do it or not?"

                Morally, that's what you're doing when you use AI to generate CSAM. It's the same idea behind why we ban all previously created CSAM as well: you are victimizing the person every single time.

                Legally, the reasoning behind this is that it's just extremely illegal: there are almost no cases, if any, where it would be OK or reasonable, and the moral framework tends to be developed around that. I don't necessarily agree that it is always victimization, because there are select instances where it just doesn't make sense to consider it that, though there are acts you could commit that would make it victimization. However, I like to subscribe to the philosophy that it is "abusive" material and therefore innately wrong. Like blackmail, I find that a little stricter and more conducive to that sort of definition.

                It makes them a victim.

                At one point in time, yes; and in some capacity they will perpetually exist as having been a victim, having been victimized at one point. But I don't consider it healthy or productive to engage in a "once a victim, always a victim" mentality, because I think it sets a questionable precedent for mental health care. Semantically, someone who was a victim once is still a victim of that specific event, but it's a temporally bounded victimization; I just think people have been getting a little loose with the usage of that word recently.

                I'm still not sure how it makes that person a victim unless it meets one of the criteria I laid out, in which case it very explicitly becomes an abusive work. Otherwise, it's questionable how you would even attribute victimization, because there is no explicit victim to consider. I guess you could consider everybody even remotely, tangentially relevant to be a victim, but that opens a massive black hole of logical reasoning that can't trivially be closed.

                To propose a hypothetical: let's say there is a person we will call Bob. Bob has created a depiction of "abuse" so horrendous that merely laying your eyes upon the work will forever ruin you. We will define the work to be a piece of art depicting no person in particular, arguably barely resembling a person at all; the specifics are left to the reader. You could hypothetically argue, in this instance, that even viewing the work is capable of making someone a "victim" of it, however you want to work that out.

                The problem is that Bob hasn't created this work in complete isolation, because he's just a person: he interacts with people, has a family, friends, acquaintances. He's a normal person, aside from the horrors beyond human comprehension he has created. Therefore, in some capacity, the influence of these people in his life has to have shaped the work. Are the people who know or knew Bob victims of this work as well, regardless of whether they have seen it? Does the very act of being socially related to Bob make them victims of the work? For the purposes of the hypothetical, we'll assume they haven't seen it and that he has only shown it to people he doesn't personally know.

                I would argue, and I think most people would agree with me, that there is no explicit tie between the work Bob has created and the people he knows personally. It would therefore be a stretch to argue that those people, merely for being tangentially relevant to Bob, are now victims, even though they have not been affected by the work. Could it influence them in some external way, possibly causing some sort of external reaction? Yeah, but that's a different story; we're not worried about that.

                This is essentially the problem we have with AI: there is no explicit resemblance to any given person (unless one is specified, which I have already explicitly opted out of), nor to the people whose images it was trained on (which I have also, somewhat explicitly, opted out of). Two fundamental questions need to be answered here. First, how are these people being victimized? By posting images publicly on the internet? It seems they consented to people at least being aware of those images, if not to some degree manipulating them, because there is nothing to stop that from happening (as everyone already knows from NFTs).

                Second, how are we defining these victims? What mechanism do we use to determine the identity of these people? Otherwise we're just schizophrenically handwaving the term around, calling people victims when we have no explicit way of determining that. You cannot begin to call someone a victim if it isn't even known whether they were victimized. You're setting an impossible precedent.

                Even if you can summarily answer those two questions in a decidedly explicit manner, it's still questionable whether that would even matter, because you would then have to demonstrate some form of explicit victimization and the damage resulting from it. Otherwise you're just making the argument "it's mine because I said so."

                The law exists to protect the ruling class while not binding them, and to bind the working class without protecting them.

                Again, if you're schizo, sure.

                Yes.

                On a loosely defined basis, yeah, in some capacity it uses the likeness of that person, but to what degree? How significantly? If the woman in the Mona Lisa were 4% some lady the artist saw three times a week because of their routines, would that suddenly make her entitled to some of that art piece? What about the rest of it? You're running down an endless corridor of infinitely unfalsifiable and falsifiable statements. There is no clear answer here.

                Exactly. So, without consent, it shouldn’t be used. Periodt.

                You need to explicitly define "consent" and "use," because without defining those, it's literally impossible to even begin determining the end position here.

                • ubergeek@lemmy.today · 6 hours ago

                  I refuse to debate ideas on how to make ethical CSAM with you.

                  Go find a pedo to figure out how you want to try to make CSAM; you can "well akshually" all you want.