• 1 Post
  • 124 Comments
Joined 2 years ago
Cake day: June 23rd, 2023

  • Everything is a remix of a copy of derivative works. People learn from other people, from other artists, from institutions that teach entire industries. Some of that learning had “informed consent”; some of it was “borrowed” from other ideas without anybody’s permission. We see legal battles on a monthly basis over whether these four notes are too similar to those other four notes. Half of today’s movies are reboots, and some of those were themselves reboots a few times over.

    “Good artists copy; great artists steal.”

    No human has ever truly had an original thought. It’s all stolen knowledge, whether we realize it’s stolen or not. In the age of the Internet, we steal more than ever. We pass memes to each other built from stolen material. Journalists copy entire quotes from other journalists, who then write articles about some guy’s Twitter post, which actually got the idea from Reddit, and that article gets posted to Facebook. And when it reaches Lemmy, we post the whole article because we don’t want the paywall.

    We practice Fair Use on a regular basis by remixing images and videos into other videos, but isn’t that similar to an AI bot looking at an image, deriving some weights from it, and throwing it away? I don’t own this picture of Brian Eno, but it’s certainly sitting in my browser cache, and neither Brian Eno nor Getty Images gave me “informed consent” to do anything with it. Did I steal it? Is it illegal or unethical for it to sit on my hard drive? If I photoshop it into a dragon with a mic at a mixing board, and then pass it around, is that illegal or unethical? Fair Use says no.

    It’s folly to pretend that AI should be held to a standard that we ourselves aren’t upholding.




  • Essentially, OpenAI, Google, and the rest of the pack of thieves are lobbying to establish themselves as the rulers of a lawless world.

    Which is why open-source models are so critically important. When OpenAI collectively shat their pants at the announcement of DeepSeek, it exposed an obvious weakness in their plans: they can’t sell what people can do for free.

    They. Are. Fucking. Terrified. Of. Open. Source.

    Same thing with that internal Google memo from a few years ago, fretting about open source right as Stable Diffusion suddenly exploded onto the scene.

    Indeed, that would be the end of copyright law, but only for the oligarchs.

    All it takes is one critical case to establish precedent.