Sorry for the short post, I’m not able to make it nice with full context at the moment, but I want to quickly get this announcement out to prevent confusion:

Unfortunately, people are uploading child sexual abuse images on some instances (apparently as a form of attack against Lemmy). I am taking some steps to prevent such content from making it onto lemm.ee servers. As one preventative measure, I am disabling all image uploads on lemm.ee until further notice - this is to ensure that lemm.ee can not be used as gateway to spread CSAM into the network.

It will not be possible to upload any new avatars or banners while this limit is in effect.

I’m really sorry for the disruption, it’s a necessary trade-off for now until we figure out the way forward.

  • HelloHotel@lemm.ee
    1 year ago
    CSAM sourcing?

    Where do these people get that much CSAM? Somebody once said that, to the best of their understanding, the images were new each time, meaning not many repeats. My collection of reddit memes takes up ~15-30GB, and all of sbubby takes ~5GB. Where is it all pooled from?

    The most fried part of my brain says, “One of the big companies trying to absorb the fediverse is doing this to undermine their competition,” but I have zero evidence.

    Most companies that build CSAM detectors, by the nature of their work, have a lot of it: likely thousands of photos and videos willingly handed over to be put into some vault and used to fight against its existence. If a large corporation is attacking us, that necessarily means a leak from a CSAM vault, whether intentional (an authorized attack) or not (opsec mistakes or insiders). Or it means there was no vault (negligence), or it wasn't transferred securely (opsec mistakes).

    It’s just the only motive I can even think of, beyond it being a rogue crank.

    It’s not hard to build a bot that scrapes a webpage for its images; they could easily aggregate that much content over decades.
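
    To illustrate the point about how simple such a scraper is (nothing about the actual attack tooling is known — this is purely a sketch of the general technique using only Python's standard library, with a made-up example page):

    ```python
    from html.parser import HTMLParser
    from urllib.parse import urljoin

    class ImageLinkCollector(HTMLParser):
        """Collects the src attribute of every <img> tag in a page."""

        def __init__(self, base_url):
            super().__init__()
            self.base_url = base_url
            self.images = []

        def handle_starttag(self, tag, attrs):
            if tag == "img":
                src = dict(attrs).get("src")
                if src:
                    # Resolve relative paths against the page URL
                    self.images.append(urljoin(self.base_url, src))

    # Feed the collector a fetched page body (the HTTP fetch itself is
    # omitted; this HTML string is a stand-in for a downloaded page).
    page = '<html><body><img src="/a.png"><img src="https://cdn.example/b.jpg"></body></html>'
    collector = ImageLinkCollector("https://example.com/page")
    collector.feed(page)
    print(collector.images)
    # → ['https://example.com/a.png', 'https://cdn.example/b.jpg']
    ```

    A crawler just repeats this over many pages and downloads each collected URL — which is exactly why bulk image aggregation takes so little effort.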