Greg Rutkowski, a digital artist known for his surreal style, opposes AI art, but his name and style have been frequently used by AI art generators without his consent. In response, Stable Diffusion removed his work from their dataset in version 2.0. However, the community has now created a tool to emulate Rutkowski’s style against his wishes using a LoRA model. While some argue this is unethical, others justify it since Rutkowski’s art has already been widely used in Stable Diffusion 1.5. The debate highlights the blurry line between innovation and infringement in the emerging field of AI art.

  • falsem@kbin.social · 1 year ago

    If I look at someone’s paintings, then paint something in a similar style did I steal their work? Or did I take inspiration from it?

    • Pulse@dormi.zone · 1 year ago

      No, you used it to inform your style.

      You didn’t drop his art on to a screenprinter, smash someone else’s art on top, then try to sell t-shirts.

      Trying to compare any of this to how one individual human learns is such a wildly inaccurate way to justify stealing someone else’s work product.

      • falsem@kbin.social · 1 year ago

        If it works correctly it’s not a screenprinter; the output is something unique.

        • Pulse@dormi.zone · 1 year ago

          The fact that folks can identify the source of various parts of the output, and that intact watermarks have shown up, shows that it doesn’t work like you think it does.

          • FaceDeer@kbin.social · 1 year ago

            They can’t, and “intact” watermarks don’t show up. You’re the one who is misunderstanding how this works.

            When a pattern is present very frequently the AI can learn to imitate it, resulting in things that closely resemble known watermarks. This is called “overfitting” and is avoided as much as possible. But even in those cases, if you examine the watermark-like pattern closely you’ll see that it’s usually quite badly distorted and only vaguely watermark-like.
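The “overfitting” described above can be sketched with a toy model. This is a hypothetical illustration using a polynomial fit rather than an image model, and all numbers are invented for the sketch: a model with enough capacity memorizes its training samples (analogous to a frequently repeated watermark) instead of learning the underlying pattern.

```python
# Toy sketch of overfitting (illustrative only; not how Stable Diffusion works).
# A degree-7 polynomial through 8 noisy points memorizes them exactly,
# then fails to track the true underlying curve on held-out points.
import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 1.0, 8)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0.0, 0.1, 8)

# Degree 7 with 8 points: enough capacity to pass through every sample.
coeffs = np.polyfit(x_train, y_train, deg=7)
train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)

# On held-out points the memorized curve follows the noise, not the signal.
x_test = np.linspace(0.03, 0.97, 50)
test_err = np.mean((np.polyval(coeffs, x_test) - np.sin(2 * np.pi * x_test)) ** 2)

print(f"train error: {train_err:.2e}")  # essentially zero: memorized
print(f"test error:  {test_err:.2e}")   # much larger: poor generalization
```

The analogy to watermarks: a pattern repeated across many training samples gets memorized the same way these noisy points do, which is why trainers try to deduplicate data and otherwise avoid overfitting.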

            • Pulse@dormi.zone · 1 year ago

              Yes, because “imitate” and “copy” are different things when stealing from someone.

              I do understand how it works; the “overfitting” just lays bare what it does. It copies, but tries to sample things in a way that won’t look like clear copies. It has no creativity; it is just finding new ways of making copies.

              If any of this was ethical, the companies doing it would have just asked for permission. That they didn’t says everything you need to know.

              I don’t usually have these kinds of discussions anymore. I got tired of conversations like this back in 2016, when it became clear that people will go to the ends of the earth to justify unethical behavior as long as the people being hurt by it are people they don’t care about.

              • FaceDeer@kbin.social · 1 year ago

                And we’re back to you calling it “stealing”, which it certainly is not. Even if it was copyright violation, copyright violation is not stealing.

                You should try to get the basic terminology right, at the very least.

                • Pulse@dormi.zone · 1 year ago

                  Just because you’ve redefined theft in a way that makes you feel okay about it doesn’t change what they did.

                  They took someone else’s work product, fed it into their machine, then used that to make money.

                  They stole someone’s labor.

                  • FaceDeer@kbin.social · 1 year ago

                    I haven’t “redefined” it, I’m using the legal definition. People do sometimes sloppily equate copyright violation with theft in common parlance, but they’re in for a rude awakening if they intend to try translating that into legal action.

                    Using that term in an argument like this merely begs the question of whether it’s wrong: since most everyone agrees that stealing is wrong, calling it “stealing” casts the act of training an AI as something everyone will, by default, agree is wrong. But it’s not stealing, no matter how much you want it to be, and I’m calling that rhetorical trick out here.

                    If you want to argue that it’s wrong you need to argue against the actual process that’s happening, not some magical scenario where the AI trainers are somehow literally robbing people.

          • jarfil@beehaw.org · 1 year ago

            Does that mean the AI is not smart enough to remove watermarks, or that it’s so smart it can reproduce them?

            • nickwitha_k (he/him)@lemmy.sdf.org · 1 year ago

              LLMs and directly related technologies are not AI and possess no intelligence or capability to comprehend, despite the hype. So, they are absolutely the former, though it’s rather like a bandwagon sort of thing (x number of reference images had a watermark, so that’s what the generated image should have).

            • Swedneck@discuss.tchncs.de · 1 year ago

              It’s like staring at artworks with watermarks for so long that you start seeing artworks with blurry watermarks in your dreams.