shared via https://feddit.de/post/2805371

Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.

      • lloram239@feddit.de · 1 year ago

        I don’t think just giving up and allowing porn deep fakes and stuff of people is really an acceptable answer here.

        It’s the only sensible answer. Anything else would require an extreme violation of everybody’s privacy and the implementation of total surveillance. See France’s recent attempt at giving police full access to peoples phones, that’s the kind of stuff you end with when going down that route.

        This AI is out there today, can be run on any half-decent gaming PC, and can generate new images in about 30 seconds. And it will only get better going forward. Images are as malleable as text now; you can accept that, or keep tilting at windmills.

        but sites absolutely can manage deepfakes

        Of course they can, and most already do. But on the whole, that doesn’t have much of an effect: anybody can make their own site, and you don’t even have to go deep into the dark web for that. It’s the first link on Google when you search for it.

      • davehtaylor@beehaw.org · 1 year ago

        The way people just throw up their hands at every new problematic issue with AI is not a good way of dealing with them. We don’t have to tolerate this stuff, and now is the best time to deal with it and start creating rules and protections.

        Exactly. In another thread on here recently someone said something that basically boiled down to “your protest against AI isn’t going to stop it. There’s too much corporate power behind it. So you might as well embrace it” and I just cannot get my head around that mentality.

        Also, you can absolutely see the models who were used as references in some of the images generated by apps these days. Like that popular one right now that everyone is using to make idealized images of themselves. A few of my family and friends used it recently, and you could clearly see in some of the pics the A-list celebs who were used as pose references, like Gal Gadot, Scarlett Johansson, etc. It’s creepy as hell.

          • davehtaylor@beehaw.org · 1 year ago

            I never said it was. But like the person I was replying to said: we need to take a good hard look at what the hell these tools are doing and allowing and decide as a society if we’re going to tolerate it.

            The real issue here is what things like deepfakes can do. It’s already starting, and it’s going to keep accelerating, generating mis- and disinformation about private citizens, celebs, and politicians alike. While you might say “it’s creepy, but there’s nothing we can do about people deepfaking Nancy Pelosi’s face onto their spank material”, it’s extremely problematic when someone decides to make a video where Joe Biden admits to running a CP ring, or some right-wing chud makes a video of Trump appearing to say something they all want to hear, and it leads to a civil war. Those are the real stakes here. How we react to what’s happening with regular folk and celebs is just the canary in the coal mine.