THE SENATE UNANIMOUSLY passed a bipartisan bill to provide recourse to victims of porn deepfakes: sexually explicit, non-consensual images created with artificial intelligence.

The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed in Congress’ upper chamber on Tuesday. The legislation has been led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.), as well as by Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive such deepfake pornography if they “knew or recklessly disregarded” the fact that the victim did not consent to the images.

  • @j4k3@lemmy.world · 65 points · 8 months ago (edited)

    Things have already been moving towards nobody models. I think this will eventually have the consequence of nobodies becoming the new somebodies, since it will result in a lot of very well-developed nobodies and move the community into furthering their development instead of the deepfake stuff. You’ll eventually be watching Hollywood quality feature films full of nobodies. There is some malicious potential with deepfakes, but the vast majority are simply people learning the tools. This will alter that learning target and the memes.

    • @aesthelete@lemmy.world · 58 points · 8 months ago

      You’ll eventually be watching Hollywood quality feature films full of nobodies.

      With the content on modern streaming services, I’ve been doing this for a while already.

    • @bradorsomething@ttrpg.network · 10 points · 8 months ago

      Of course, eventually somebody will look exactly like the nobody, so the owners of the nobody will sue that somebody to block them from pretending to be that nobody in videos.

    • @ColeSloth@discuss.tchncs.de · 8 points · 8 months ago

      I see it eventually happening in porn, but it isn’t happening in major motion pictures anytime soon. People won’t follow and actively go see a CGI actor in a movie like they would a real one. Hollywood pays big money for A-list stars because those people get asses in seats. That, along with the tech not being quite there yet and Hollywood being locked under SAG contracts to only do certain things with AI, all adds up to nobodies being a ways off.

      • Flying Squid · 9 points · 8 months ago

        I also remember years ago reading scare headlines about how the CG in Final Fantasy: The Spirits Within was so realistic that it would be the end of Hollywood actors; they would all be CG.

        So I’ll take that claim with a huge grain of salt.

        • @felbane@lemmy.world · 2 points · 8 months ago

          To be fair, The Spirits Within was pretty amazing, especially considering when it was made. I briefly had the same thought while I was still in awe of the realism, so I can definitely understand why people would believe those headlines.

          I can imagine a scenario where a real voice actor is paired with a particular model and used in multiple films/roles. That’s not that dissimilar to Mickey Mouse or any other famous animated character.

          The idea that AICG actors will completely replace live actors is clearly ridiculous, but the future definitely has room for fully visually- and vocally-AICG personalities as film “stars” alongside real people.

          • Flying Squid · 4 points · 8 months ago

            What I am saying is that I think we are much further away from believable, photorealistic CG actors than you think. As far as I know, microexpressions have not even been considered, let alone figured out. Microexpressions are really important.

            https://en.wikipedia.org/wiki/Microexpression

            That’s why I think the uncanny valley problem is a lot harder to overcome than people believe.