A bipartisan group of senators introduced a new bill to make it easier to authenticate and detect artificial intelligence-generated content and protect journalists and artists from having their work gobbled up by AI models without their permission.

The Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act) would direct the National Institute of Standards and Technology (NIST) to create standards and guidelines that help prove the origin of content and detect synthetic content, such as through watermarking. It also directs the agency to develop security measures to prevent tampering, and requires AI tools used for creative or journalistic content to let users attach information about the content's origin and to prohibit that information from being removed. Under the bill, such content also could not be used to train AI models.
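
As a rough sketch of how attached provenance information can make tampering or removal detectable (a hypothetical illustration only: NIST has not published the standard, and the field names and simple HMAC signing used here are invented for the example, far simpler than real provenance standards such as C2PA):

```python
# Hypothetical sketch of attaching and verifying content provenance metadata.
# Not the bill's or NIST's actual scheme; field names are invented.
import hashlib
import hmac
import json

SECRET_KEY = b"publisher-signing-key"  # placeholder; a real standard would use PKI, not a shared secret


def attach_provenance(content: bytes, metadata: dict) -> dict:
    """Bundle a hash of the content with provenance metadata and sign both."""
    record = {
        "content_sha256": hashlib.sha256(content).hexdigest(),
        "provenance": metadata,
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return record


def verify_provenance(content: bytes, record: dict) -> bool:
    """Return True only if the content and its provenance record are unmodified."""
    unsigned = {k: v for k, v in record.items() if k != "signature"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return (
        hmac.compare_digest(record.get("signature", ""), expected)
        and record.get("content_sha256") == hashlib.sha256(content).hexdigest()
    )


article = b"Example article text."
record = attach_provenance(article, {
    "creator": "Example Newsroom",   # hypothetical fields
    "created_with_ai": False,
    "training_permitted": False,     # the consent flag the bill envisions
})
print(verify_provenance(article, record))           # True
print(verify_provenance(article + b"!", record))    # False: the content was altered
record.pop("provenance")
print(verify_provenance(article, record))           # False: provenance info was stripped
```

A production standard would bind the signature to a verifiable signing identity rather than a shared secret, but the basic idea of detecting edits and stripped provenance markers is the same.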

Content owners, including broadcasters, artists, and newspapers, could sue companies they believe used their materials without permission or tampered with authentication markers. State attorneys general and the Federal Trade Commission could also enforce the bill, which its backers say prohibits anyone from “removing, disabling, or tampering with content provenance information” outside of an exception for some security research purposes.

(A copy of the bill is in the article; here is the important part imo:

Prohibits the use of “covered content” (digital representations of copyrighted works) with content provenance to either train an AI-/algorithm-based system or create synthetic content without the express, informed consent and adherence to the terms of use of such content, including compensation)

  • e$tGyr#J2pqM8v@feddit.nl · 1 month ago

    I don’t like AI but I hate intellectual property. And the people that want to restrict AI don’t seem to understand the implications that has. I am ok with copying, as I think copyright is a load of bollocks. But they aren’t even reproducing the content verbatim, are they? They’re ‘taking inspiration’, if you will, transforming it into something completely different. Seems like fair use to me. It’s just that people hate AI, and hate the companies behind it, and don’t get me wrong, rightfully so, but that shouldn’t stop us all from thinking critically about intellectual property laws.

    • just another dev@lemmy.my-box.dev · 1 month ago

      I’m the opposite, actually. I like generative AI. But as a creator who shares his work with the public for their (non-commercial) enjoyment, I am not okay with a billion-dollar industry training their models on my content without my permission, and then using those models as a money machine.

        • just another dev@lemmy.my-box.dev · 1 month ago

          What are you basing that on?

          Content owners, including broadcasters, artists, and newspapers, could sue companies they believe used their materials without permission or tampered with authentication markers.

          It doesn’t say anything about the right applying only to giant tech companies; it specifically mentions artists among the protected content owners.

          • interdimensionalmeme@lemmy.ml · 1 month ago

            That’s like saying you are just as protected regardless of which side of the moat you stand on.

            It’s pretty clear that, the way things are shaping up, only the big tech elite will control AI, and they will lord it over the rest of us.

            The worst thing that could happen with AI, it falling into the hands of the elites, is happening.

            • just another dev@lemmy.my-box.dev · 1 month ago

              I respectfully disagree. I think small-time AI (read: pretty much all the custom models on Hugging Face) will get a giant boost out of this, since they can get away with training on “custom” data sets, being too small to be held accountable.

              However, those models will be worthless for enterprise use, since businesses wouldn’t be able to account for the legality of the training data. In other words, once you make big bucks off of AI, you’ll have to prove your models were sourced properly. But if you’re just creating a model for small-time use, you can get away with a lot.

              • interdimensionalmeme@lemmy.ml · 1 month ago

                I am skeptical that this is how it will turn out. I don’t really believe there will be a path from $0 to challenging big tech without a roadblock of lawyers shutting you down, with no way out, along the way.

                • just another dev@lemmy.my-box.dev · 1 month ago

                  I don’t think so either, but to me that is the purpose.

                  Somewhere between small-time personal-use ML and commercial exploitation, there should be ethical sourcing of input data, rather than the current method of “scrape all you can find, fuck copyright” that OpenAI & co are getting away with.

                  • interdimensionalmeme@lemmy.ml · 1 month ago

                    I mean, this is exactly the kind of regulation that Microsoft/OpenAI is begging for to cement their position. Then it’s just going to be a matter of digesting their surviving competitors until only one competitor remains, similar to the Intel/AMD relationship. Then they can have a 20-year period of stagnation while they progressively screw over customers and suppliers.

                    I think that’s the bad ending. By desperately trying to keep the old model of intellectual property going, they’re going to bring about the real AI nightmare: an elite few in control of the technology, with an unconstrained ability to leverage its benefits and further solidify their lead over everyone else.

                    The collective knowledge of humanity is not their exclusive property. It also isn’t the property of whoever is the latest person to lay a claim to an idea in effective perpetuity.

    • rekorse@lemmy.world · 1 month ago

      Just because intellectual property laws can currently be exploited doesn’t mean there is no place for them at all.

      • e$tGyr#J2pqM8v@feddit.nl · 1 month ago

        That’s an opinion you can have, but I can just as well hold mine, which is that restricting any form of copying is unnatural and harmful to society.

          • e$tGyr#J2pqM8v@feddit.nl · 1 month ago

            That’s right. They can put their art up for sale, but if someone wants to take a free copy nothing should be able to stop them.

                • rekorse@lemmy.world · 1 month ago

                  That would lead to most art being produced by people who are wealthy enough to afford to produce it for free, wouldn’t it?

                  What incentive would a working person have to work on becoming an artist? It’s not like artists are decided at birth or something.

    • Adderbox76@lemmy.ca · 1 month ago

      They’re ‘taking inspiration’ if you will, transforming it into something completely different.

      That is not at all what takes place with A.I.

      An A.I. doesn’t “learn” like a human does. It aggregates multiple chunks from multiple sources. It’s just really really tiny chunks so it’s hard to tell sometimes.

      That’s why you can ask two AIs to write a story based on the same prompt and some of their lines will be exactly the same. Because it’s not taking inspiration; it’s literally copying bits and pieces of other works, and it just happens that they both chose that particular bit.

      If you do that when writing a paper in university, it’s called plagiarism.

      Get the fuck out of here with your “A.I. takes inspiration…” It copies, nothing more. It doesn’t add anything new to the sum total of the creative zeitgeist because it’s just remixes of things that already exist.

      • LainTrain@lemmy.dbzer0.com · 1 month ago

        It copies, nothing more

        it’s just remixes of things that already exist.

        So it does do more than copying? Because, as you said, it remixes.

        It sounds like the line you’re trying to draw is not only arbitrary, but you yourself can’t even stick with it for more than one sentence.

        Everything new is some unique combination of things that already exist; the elements it draws from are called sources and influences, and the rules according to which they’re remixed are called techniques or structures, e.g. most movies have three acts, and many feature specific techniques like J-cuts.

        Heck even re-arranging elements of just one thing is a unique and different thing, or is your favourite song and a remix of it literally the same? Or does the remix not have artistic value, even though someone out there probably likes the remix, but not the original?

        I think your confusion stems from the fact you’re a top shelf, grade-A Moron.

        You’re an organic, locally sourced and ethically produced idiot, and you need to learn how basic ML works, what “new” is, and glance at some basic epistemology and metaphysics before you lead us to ruin: you don’t even understand what “new” entails, and your reactionary rhetoric will take us all straight down to cyberpunk dystopias.

      • ObliviousEnlightenment@lemmy.world · 1 month ago

        Consider YouTube poop, I’m serious. Every clip in them is sourced from preexisting audio and video, and mixed or distorted in a comedic format. You could make an AI to make YouTube poops using those same clips and other “poops” as training data. What it outputs might be of lower quality, but in a technical sense it would be made in an identical fashion. And, to the chagrin of Disney, Nintendo, and Viacom, these are considered legally distinct entities, because I don’t watch Frying Nemo in place of Finding Nemo. So why would it be any different when an AI makes it?

      • Richard@lemmy.world · 1 month ago

        You just reiterate what other anti-ML extremists have said like a sad little parrot. No, LLMs don’t just copy. They network information and associations and can output entirely new combinations of them. To do this, they make use of neural networks, which are computational concepts analogous to the way your brain works. If, according to you, LLMs just copy, then that’s all that you do as well.

      • afraid_of_zombies@lemmy.world · 1 month ago

        You can do the same thing with the Hardy Boys. You can find the same page word for word in different books. You can also do that with the Bible. The authors were plagiarizing each other.

        It doesn’t add anything new to the sum total of the creative zeitgeist because it’s just remixes of things that already exist.

        Do yourself a favor and never ever go into design of infrastructure equipment or eat at a Pizza Hut or get a drink from Starbucks or work for an American car company or be connected to Boeing.

        Everyone has this super impressive view of human creativity and I am waiting to see any of it. As far as I can tell, the less creative you are, the more success you will have. But let me guess: you ride a Segway, wear those shoes with toes, have gone through every recipe of Julia Child’s, and compose novels that look like Finnegans Wake got into a car crash with E. E. Cummings and Gravity’s Rainbow.

        Now leave me alone while I eat the same burger as everyone else and watch reruns of Family Guy in my house that looks like all the other ones on the street.