• NOT_RICK@lemmy.world · 11 hours ago

    Under-moderated and under-administered instances have ended up with child porn on them. If that shit gets federated out, it’s a real mess for everyone. I think screening tools are more advanced now, thankfully; it’s been a while since the last incident.

      • Leraje@lemmy.blahaj.zone · 10 hours ago

        That means the CSAM (it’s not ‘child porn’, it’s child abuse) remains on the server, which means the instance owner is legally liable. Don’t know about you, but if I were an instance owner I wouldn’t want the shame and legal consequences of leaving CSAM up on a server I control.

      • PhobosAnomaly@feddit.uk · 11 hours ago

        Make a reliable way to automate that, and you’ll make a lot of money.

        Rely on doing it for yourself, and… well, good luck with your mental health in a few years’ time.

        • FaceDeer@fedia.io · 10 hours ago

          AI would be able to do a good first pass on it. Except that an AI that was able to reliably recognize child porn would be a useful tool for creating child porn, so maybe don’t advertise that you’ve got one on the job.

        • Spiderwort@lemmy.dbzer0.com (OP) · 11 hours ago

          So that’s the indispensable service that admins provide: child porn filtering.

          I didn’t realize it was such a large job. So large that it justifies the presence of a cop in every conversation? I dunno.

          • PhobosAnomaly@feddit.uk · 11 hours ago

            I’ve read through a few of your replies, and they generally contain a “so, …” followed by an inaccurate summary of what the conversation thread is about. I don’t know whether there’s a language barrier here or you’re being deliberately obtuse.

            It would appear that your desire for a community without moderators is so strong that a platform like Lemmy is not suitable for what you want, and as such you are unlikely to find the answer you want here and will spend your time arguing against the flow.

            Good luck finding what you’re looking for 👍

          • Zak@lemmy.world · 10 hours ago

            If your questions are concrete and in the context of Lemmy or the Fediverse more broadly, admins provide the service of paying for and operating the servers in addition to moderation.

            If it’s more abstract, i.e. “can people talk to each other over the internet without moderators?”, then my experience is that they usually can when the group is small, but things deteriorate as it grows larger. The threshold for where that happens is higher if the group has a purpose or if the people already know each other.

      • partial_accumen@lemmy.world · 11 hours ago

        Surely filtering out childporn is something that I can do for myself.

        Even if that were a viable solution (it isn’t), humans who are employed to filter out this disgusting content (and worse) are frequently psychologically damaged by the exposure. This includes people at online content moderation companies and those in law enforcement who have to deal with that stuff for evidentiary reasons.

        The reason it’s not a viable solution: if YOU block it out because YOU don’t want to see it but it’s still there, it becomes a magnet for those that DO want to see it, because they know it’s allowed. The value of the remaining legitimate content goes down because more of your time is spent blocking the objectionable material yourself, until it’s too much for anyone who doesn’t want that stuff and they leave. Then the community dies.

        • Spiderwort@lemmy.dbzer0.com (OP) · 10 hours ago

          Personal cp filtering automation and a shared blacklist. That would take care of the problem. No moderator required.
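          Roughly what I have in mind, as a minimal sketch only: match incoming media against a shared list of known-bad hashes before it ever gets shown. The list URL and layout below are made up, and real hash lists use perceptual hashes maintained by trusted organisations rather than plain SHA-256, but the shape of the idea is this:

          import hashlib
          import urllib.request
          from pathlib import Path

          # Hypothetical shared, community-maintained blocklist:
          # one hex-encoded SHA-256 digest per line.
          BLOCKLIST_URL = "https://example.org/shared-blocklist.txt"

          def load_blocklist(url: str = BLOCKLIST_URL) -> set[str]:
              """Download the shared list of known-bad file hashes."""
              with urllib.request.urlopen(url) as resp:
                  lines = resp.read().decode().splitlines()
              return {line.strip() for line in lines if line.strip()}

          def is_blocked(path: Path, blocklist: set[str]) -> bool:
              """True if the file's SHA-256 digest appears on the shared list."""
              return hashlib.sha256(path.read_bytes()).hexdigest() in blocklist

          # Client-side usage: silently drop anything on the list before display.
          blocklist = load_blocklist()
          for media in Path("incoming").glob("*"):
              if media.is_file() and is_blocked(media, blocklist):
                  media.unlink()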

          • xmunk@sh.itjust.works · 3 hours ago

            If you can write an automated filter to block CSAM then Apple, Meta, Alphabet and others would happily shovel billions at you. Blocking CSAM is a constant and highly expensive job… and when they fuck up it’s a PR shit storm.

            • Spiderwort@lemmy.dbzer0.com (OP) · 1 hour ago

              Maybe keeping it off the network is a lost cause. If we each block it with personal filtering then that changes the face of the issue.

              • xmunk@sh.itjust.works · 44 minutes ago

                If Lemmy becomes a hub for those who want to trade CSAM, then it will be taken down by the government. This isn’t something that can be allowed onto the system.

          • db0@lemmy.dbzer0.com · 3 hours ago

            Personal cp filtering automation and a shared blacklist

            Oh just those, eh?

            Just goes to show how little idea you have of how difficult this problem is.

            • Spiderwort@lemmy.dbzer0.com (OP) · 1 hour ago

              This is starting to sound like, “we need constant control and surveillance to protect us from the big bad”.

              You know, for the children.

              • db0@lemmy.dbzer0.com · 1 hour ago

                Mate, if you don’t like the way we run things, go somewhere else. You’re not forced to be here.

                  • db0@lemmy.dbzer0.com · 51 minutes ago

                    Of course I see the point you’re trying to make, but I also think you’re naive and don’t understand the repercussions of what you’re suggesting.

                  • sigmaklimgrindset@sopuli.xyz · 2 minutes ago

                    And you are magnanimous for doing so. If someone came into my house and tried to dictate their rules to me, I’d be fuming.

                    This is why I’m not a mod 🤡

                  • sigmaklimgrindset@sopuli.xyz · 7 minutes ago

                    The culture of…having admins to maintain our servers? Dawg, just make your own instance then; problem solved. Why is this even up for discussion? The whole point of the fediverse is that you can set up your own server and connect with others under your own rules. If you want a mod-free space, you can literally make that.

                    If you believe in unmoderated spaces enough to argue about them, but aren’t actually willing to make one yourself when it’s simple enough to do so, you’re just another social media agitator wasting time.

      • NeoNachtwaechter@lemmy.world · 10 hours ago

        filtering out […] I can do for myself.

        It still means too much legal trouble for the admin if the offending data is on the server.