Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do, because they will just post from another instance now that we have changed our registration policy.

We keep working on a solution; we have a few things in the works, but they won’t help us right now.

Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.

Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators. If it weren’t his community, it would have been another one. It is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.

Edit 2: removed the bit about the moderator tools; that came out a bit harsher than we meant it. It’s been a long day, and having to deal with this kind of stuff got some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn’t the first time we’ve felt helpless. Anyway, I hope we can announce something more positive soon.

    • ivanafterall@kbin.social · 1 year ago

      This isn’t as crazy as it may sound either. I saw a similar situation, contacted them with the information I had, and the field agent was super nice/helpful and followed up multiple times with calls/updates.

    • Ertebolle@kbin.social · 1 year ago

      This is good advice; I suspect they’re outside of the FBI’s jurisdiction, but they could also be random idiots, in which case they’re random idiots who are about to become registered sex offenders.

  • zzpza@lemmy.world · 1 year ago

    I’m sorry you (and the other admins & mods of the community) have to deal with this shit, it’s disgusting. Thank you for doing what you do.

  • figaro@lemdro.id · 1 year ago

    Lemmy devs NEED to get mod tools out asap. Approved posts only, whitelisting users for communities, etc.
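The two features asked for above can be sketched in a few lines. This is a hypothetical illustration, not Lemmy’s actual API: the class and method names (`CommunityModQueue`, `submit`, `approve_user`) are made up for the example.

```python
# Hypothetical sketch of the requested mod tools: per-community user
# whitelists plus an "approved posts only" queue. All names are
# illustrative, not part of Lemmy's real codebase.

class CommunityModQueue:
    def __init__(self, approved_only=True):
        self.approved_only = approved_only
        self.allowed_users = set()   # users whitelisted to post directly
        self.pending = []            # posts held for moderator review

    def submit(self, user, post):
        """Publish directly for whitelisted users, otherwise hold."""
        if not self.approved_only or user in self.allowed_users:
            return "published"
        self.pending.append((user, post))
        return "held"

    def approve_user(self, user):
        self.allowed_users.add(user)

queue = CommunityModQueue(approved_only=True)
queue.approve_user("trusted_poster")
print(queue.submit("trusted_poster", "meme.png"))  # published
print(queue.submit("new_account", "meme.png"))     # held
```

The key design point is that the default flips from “publish, then clean up” to “hold, then publish”, which is exactly what an attack like this one defeats in the former model.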

    • Haru@lemmy.world · 1 year ago

      Honestly, as a moderator of a community this shit concerns me to no end. We desperately need more tools.

  • Rawdogg@lemm.ee · 1 year ago

    What kind of lowlife piece of shit do you need to be to post some shit like that? Some people will stoop to the most depraved levels just to fuck with strangers. It’s horrifying.

  • carroarmato0@lemmy.world · 1 year ago

    As someone who was a moderator on a notorious website, it can at times feel like shoveling water out of a boat while it’s still leaking. Efficient and robust tooling makes a very big difference, but it’s not watertight. Mods cannot be appreciated enough.

  • Iceblade@lemdit.com · 1 year ago

    Wow, this is awful. Huge kudos to y’all for holding on through this. It’s obviously a deliberate attack on the fediverse by malicious actors.

  • Margot Robbie@lemm.ee · 1 year ago

    We have been fighting the CSAM (Child Sexual Abuse Material) posts all day but there is nothing we can do because they will just post from another instance since we changed our registration policy.

    It’s likely that we’ll be seeing a large number of instances switch to whitelist-based federation instead of the current blacklist-based one, especially niche instances that do not want to deal with this at all (and I don’t blame them).
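The difference between the two federation models can be shown in a few lines. This is a sketch of the policy logic only, with made-up instance names; real Lemmy instances configure this through their admin settings (allowed/blocked instance lists), not with code like this.

```python
# Blocklist federation: open by default, deny listed peers.
# Allowlist ("whitelist") federation: closed by default, permit listed peers.

def federation_allowed(peer, allowlist=None, blocklist=frozenset()):
    """If an allowlist is given, only its members may federate;
    otherwise everyone except blocklisted peers may."""
    if allowlist is not None:
        return peer in allowlist
    return peer not in blocklist

# Blocklist model: a brand-new attacker instance gets through by default.
assert federation_allowed("brand-new-instance.example", blocklist={"bad.example"})

# Allowlist model: the same new instance is rejected until vetted.
assert not federation_allowed("brand-new-instance.example", allowlist={"kbin.social"})
```

This is why instance-hopping defeats the blocklist model: each new attacking instance starts in the allowed state, and admins can only react after the fact.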

  • Striker@lemmy.world · 1 year ago

    I would like to extend my sincerest apologies to all of the users here who liked Lemmy Shitpost. I feel like I let the situation grow too far out of control before getting help. Don’t worry, I am not quitting; I fully intend to stay around. The other two mods deserted the community, but I won’t. DM me if you wish to apply for mod.

    Sincerest thanks to the admin team for dealing with this situation. I wish I had linked in with you all earlier.

    • lwadmin@lemmy.world (OP, mod) · 1 year ago

      @Striker@lemmy.world this is not your fault. You stepped up when we asked you to and actively reached out for help getting the community moderated. But even with extra moderators this cannot be stopped. Lemmy needs better moderation tools.

        • Whitehat Hacker@lemmy.world · 1 year ago

          There’s a Matrix room for building mod tools; maybe we should bring this issue up there, in case they aren’t already aware of it.

        • WhiskyTangoFoxtrot@lemmy.world · 1 year ago

          Or we’ll finally accept that the core Lemmy devs aren’t capable of producing a functioning piece of software and fork it.

            • Bread@sh.itjust.works · 1 year ago

            It’s not easy to build a social media app, and forking it won’t make this particular problem any easier to solve. Joining forces to tackle an inevitable problem is the only solution. The Lemmy devs are more than willing to accept pull requests for software improvements.

  • CantSt0pPoppin@lemmy.world · 1 year ago

    It is seriously sad and awful that people would go this far to derail a community, and it makes me concerned for other communities as well. Since they have succeeded in getting shitpost closed, does this mean they will just move on to the next community? That being said, here is some very useful information on the subject and on what can be done to help curb CSAM.

    The National Center for Missing & Exploited Children (NCMEC) CyberTipline: you can report CSAM online or by calling 1-800-843-5678. Your report will be forwarded to a law enforcement agency for investigation.

    The National Sexual Assault Hotline: if you or someone you know has been sexually assaulted, call 800-656-HOPE (4673) or chat online. The hotline is available 24/7 and provides free, confidential support.

    The National Child Abuse Hotline: if you suspect child abuse, call 800-4-A-CHILD (422-4453). The hotline is available 24/7 and provides free, confidential support.

    Thorn: a non-profit organization that works to fight child sexual abuse and provides resources on how to prevent CSAM and how to report it.

    Stop It Now!: an organization that works to prevent child sexual abuse and provides resources on how to talk to children about sexual abuse and how to report it.

    Childhelp USA: a non-profit organization that provides crisis intervention and prevention services to children and families, with a 24/7 hotline at 1-800-422-4453.

    Some tips to prevent CSAM:

    Talk to your children about online safety and the dangers of CSAM.

    Teach your children the importance of keeping their personal information private, and monitor their online activity.

    Be aware of the warning signs, such as children being secretive or withdrawn or showing changes in their behavior, and report any suspected CSAM to the authorities immediately.

      • CoderKat@lemm.ee · 1 year ago (edited)

        Yeah. A troll might post something like a ton of oversized images of pig buttholes. Who the fuck even has access to CSAM to post? That’s something you only have on hand if you’re a predator already. Nor is it something you can shrug off like “lol I was only trolling”. It’s a crime that will send you to jail for years. It’s a major crime that gets entire police units dedicated to it. It’s a huuuuge deal and I cannot even fathom what kind of person would risk years in prison to sabotage an internet forum.

        • jarfil@lemmy.world · 1 year ago

          “Terrorist”. Having the images doesn’t mean they liked them, they used them to terrorize a whole community though.

      • PM_Your_Nudes_Please@lemmy.world · 1 year ago

        Yeah, this isn’t just joking or shitposting. This is the kind of shit that gets people locked up in federal pound-you-in-the-ass prison for decades. The feds don’t care if you sought out the CSAM, because it still exists on your device regardless of intent.

        The laws about possessing CSAM are written in a way that any plausible deniability is removed, specifically to prevent pedophiles from being able to go “oh lol a buddy sent that to me as a joke” and getting acquitted. The courts don’t care why you have CSAM on your server. All they care about is the fact that you do. And since you own the server, you own the CSAM and they’ll prosecute you for it.

        • gammasfor@sh.itjust.works · 1 year ago

          And it’s not just the instance admins who would be at risk. Any time you view an image, your device makes a local copy of it, meaning every person who viewed the image, even accidentally, is at risk as well.

      • pensivepangolin@lemmy.world · 1 year ago

        Yeah honestly report all of those accounts to law enforcement. It’s unlikely they’d be able to do much, I assume, but these people are literally distributing CSAM.

  • ZestycloseReception8@sopuli.xyz · 1 year ago

    Isn’t this what 8chan is for? Seriously, what the fuck is wrong with people who think CSAM is appropriate shitpost material? The Internet really is a fucked-up place. Is there a way to set something up to automatically remove inappropriate posts, similar to YouTube’s system for automatically removing inappropriate content?
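The automated systems big platforms use for this are built around hash matching against databases of known material. A toy sketch of the idea follows; real systems (Microsoft’s PhotoDNA, Meta’s open-source PDQ) use perceptual hashes that survive resizing and re-encoding, whereas the plain SHA-256 used here only matches byte-identical files, and the “known bad” entry is fabricated for the example.

```python
import hashlib

# Toy sketch of hash-matching an upload against a database of known
# illegal material. The hash set here is fabricated; real deployments
# match perceptual hashes supplied by clearinghouses such as NCMEC.
KNOWN_BAD_HASHES = {hashlib.sha256(b"known-bad-bytes").hexdigest()}

def should_block(upload_bytes: bytes) -> bool:
    """Return True if the upload matches a known-bad hash."""
    return hashlib.sha256(upload_bytes).hexdigest() in KNOWN_BAD_HASHES

assert should_block(b"known-bad-bytes")
assert not should_block(b"harmless cat picture")
```

The catch for small fediverse instances is access: the authoritative hash databases are only shared with vetted organizations, so a lone Lemmy admin can’t simply download them and wire this up.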

    • ArtisinalBS@lemm.ee · 1 year ago

      Legally, in the US, you would have to remove it from public view, keep it stored on the server, collect identifying information about the uploader, and send all that data to the authorities, all while maintaining a low false-positive rate.
      Kind of a tall order.
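The workflow described above can be sketched as a single quarantine step: hide the post, retain it as evidence, and bundle the uploader metadata into a report payload (in the US, destined for NCMEC’s CyberTipline). All field names here are illustrative assumptions, not any real reporting API.

```python
import datetime
import json

def quarantine(post: dict) -> str:
    """Hide a flagged post, preserve it as evidence, and build a
    report payload for the authorities. Field names are made up."""
    post["visible"] = False      # removed from public view
    post["quarantined"] = True   # retained on the server as evidence
    report = {
        "post_id": post["id"],
        "uploader_account": post.get("uploader"),
        "uploader_ip": post.get("uploader_ip"),
        "flagged_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }
    return json.dumps(report)    # would be submitted to the reporting API

post = {"id": 1, "uploader": "attacker@bad.example",
        "uploader_ip": "203.0.113.5", "visible": True}
report = quarantine(post)
```

Note that deletion is deliberately absent: the comment’s point is that the evidence must be preserved for law enforcement even while it disappears from public view, which is the opposite of what ordinary mod tooling is built to do.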