Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is little we can do: since we changed our registration policy, the attackers simply post from other instances.

We keep working on a solution; we have a few things in the works, but they won’t help us right now.

Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.

Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not be stopped even with 10 moderators, and if it wasn’t his community it would have been another one. It is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.

Edit 2: removed that bit about the moderator tools. That came out a bit harsher than how we meant it. It’s been a long day and having to deal with this kind of stuff got some of us a bit salty to say the least. Remember we also had to deal with people posting scat not too long ago so this isn’t the first time we felt helpless. Anyway, I hope we can announce something more positive soon.

    • ivanafterall@kbin.social · 1 year ago

      This isn’t as crazy as it may sound either. I saw a similar situation, contacted them with the information I had, and the field agent was super nice/helpful and followed up multiple times with calls/updates.

    • Ertebolle@kbin.social · 1 year ago

      This is good advice; I suspect they’re outside of the FBI’s jurisdiction, but they could also be random idiots, in which case they’re random idiots who are about to become registered sex offenders.

  • zzpza@lemmy.world · 1 year ago

    I’m sorry you (and the other admins & mods of the community) have to deal with this shit, it’s disgusting. Thank you for doing what you do.

  • figaro@lemdro.id · 1 year ago

    Lemmy devs NEED to get mod tools out asap. Approved posts only, whitelisting users for communities, etc.

    • Haru@lemmy.world · 1 year ago

      Honestly, as a moderator of a community this shit concerns me to no end. We desperately need more tools.

  • Rawdogg@lemm.ee · 1 year ago

    What kind of lowlife piece of shit do you need to be to post some shit like that? Some people will stoop to the most depraved levels just to fuck with strangers. It’s horrifying.

  • carroarmato0@lemmy.world · 1 year ago

    As someone who was a moderator on a notorious website, it can at times feel like shoveling water out of a boat while it’s still leaking. Efficient and robust tooling makes a very big difference, but it’s not watertight. Mods cannot be appreciated enough.

  • Iceblade@lemdit.com · 1 year ago

    Wow, this is awful. Huge kudos to y’all for holding on through this. It’s obviously a deliberate attack on the fediverse by malicious actors.

  • Margot Robbie@lemm.ee · 1 year ago

    We have been fighting the CSAM (Child Sexual Abuse Material) posts all day but there is nothing we can do because they will just post from another instance since we changed our registration policy.

    It’s likely that we’ll be seeing a large number of instances switch to whitelist-based federation instead of the current blacklist-based one, especially niche instances that do not want to deal with this at all (and I don’t blame them).
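
    For instances that go this route, the switch is mostly a configuration change. As a rough illustration only (details vary by Lemmy version, and newer releases manage federation from the admin UI rather than the config file; the instance names are just examples), an allowlist-based setup in `lemmy.hjson` might look like:

```hjson
{
  federation: {
    enabled: true
    # Federate ONLY with these instances; activity from
    # any instance not on this list is ignored.
    allowed_instances: ["lemmy.world", "kbin.social"]
  }
}
```

    The trade-off is the mirror image of blacklisting: new instances are cut off by default and must ask to be added, which slows growth but stops drive-by abuse from freshly spun-up servers.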

  • Striker@lemmy.world · 1 year ago

    I would like to extend my sincerest apologies to all of the users here who liked lemmy shit posting. I feel like I let the situation grow too out of control before getting help. Don’t worry, I am not quitting. I fully intend on staying around. The other two deserted the community, but I won’t. DM me if you wish to apply for mod.

    Sincerest thanks to the admin team for dealing with this situation. I wish I had linked up with you all earlier.

    • lwadmin@lemmy.world (OP, mod) · 1 year ago

      @Striker@lemmy.world this is not your fault. You stepped up when we asked you to and actively reached out for help getting the community moderated. But even with extra moderators this cannot be stopped. Lemmy needs better moderation tools.

        • Whitehat Hacker@lemmy.world · 1 year ago

          There’s a Matrix room for building mod tools here; maybe we should bring this issue up there, just in case they aren’t already aware.

        • WhiskyTangoFoxtrot@lemmy.world · 1 year ago

          Or we’ll finally accept that the core Lemmy devs aren’t capable of producing a functioning piece of software and fork it.

          • Bread@sh.itjust.works · 1 year ago

            It’s not easy to build a social media app, and forking it won’t make this particular problem any easier to solve. Joining forces to tackle an inevitable problem is the only solution. The Lemmy devs are more than willing to accept pull requests for software improvements.

        • jarfil@lemmy.world · 1 year ago

          “Terrorist.” Having the images doesn’t mean they liked them; they used them to terrorize a whole community, though.

      • PM_Your_Nudes_Please@lemmy.world · 1 year ago

        Yeah, this isn’t just joking or shitposting. This is the kind of shit that gets people locked up in federal pound-you-in-the-ass prison for decades. The feds don’t care if you sought out the CSAM, because it still exists on your device regardless of intent.

        The laws about possessing CSAM are written in a way that any plausible deniability is removed, specifically to prevent pedophiles from being able to go “oh lol a buddy sent that to me as a joke” and getting acquitted. The courts don’t care why you have CSAM on your server. All they care about is the fact that you do. And since you own the server, you own the CSAM and they’ll prosecute you for it.

        • gammasfor@sh.itjust.works · 1 year ago

          And it’s not just the instance admins who would be at risk. Any time you view an image, your device makes a local copy of it, meaning every person who viewed the image, even accidentally, is at risk as well.

      • pensivepangolin@lemmy.world · 1 year ago

        Yeah honestly report all of those accounts to law enforcement. It’s unlikely they’d be able to do much, I assume, but these people are literally distributing CSAM.

      • CoderKat@lemm.ee · edited · 1 year ago

        Yeah. A troll might post something like a ton of oversized images of pig buttholes. Who the fuck even has access to CSAM to post? That’s something you only have on hand if you’re a predator already. Nor is it something you can shrug off like “lol I was only trolling”. It’s a crime that will send you to jail for years. It’s a major crime that gets entire police units dedicated to it. It’s a huuuuge deal and I cannot even fathom what kind of person would risk years in prison to sabotage an internet forum.

  • bigkix@lemm.ee · 1 year ago

    Saw that one post (unfortunately). How come people who spread content like that on the “open” internet (not the dark web) don’t get arrested?

    • cubedsteaks@lemmy.today · 1 year ago

      I’m in the US and grew up here. My dad is a piece of shit pedophile who exploited me for several websites. None on darkweb but at the time, they didn’t really need to be.

      Word got out, and cops came to interrogate ME, the person who was the victim in this situation. They also blamed me for what was going on (how? I don’t know; I was a teen being exploited underage, but cops gonna cop), and they basically intimidated me into dropping out of school and taking the blame for my dad, because the other option was that I be sent to a home as an orphan.

      My dad got away with a slap on the wrist basically because cops in the US don’t do their job. They even cover for people. I ended up literally running away from home.

      I think the issue is that people see this topic as all outrage. I mean, look at the comments here: everyone is so mad they can’t even think straight. And I’m personally noticing that some of the outrage doesn’t even seem to be directed at the right people. Everyone is super willing to shit on pedos left and right, but I wonder if they would be willing to listen to someone who has been knee-deep in this shit before and was innocent, because they were being exploited by a pedophile.

      Like a lot of comments say “disgusting” but then don’t say anything about how the person that it happened to must feel. Everyone’s upset they saw something but they don’t seem to be upset about who they saw it happening to.

      Like I wonder who they think hurts more in this situation. The person who was made into a victim or the people who just saw it happen.

  • ZestycloseReception8@sopuli.xyz · 1 year ago

    Isn’t this what 8chan is for? Seriously, what the fuck is wrong with people who think CSAM is appropriate shitpost material? The internet really is a fucked-up place. Is there a way to set up something to automatically remove inappropriate posts, similar to YouTube’s system for automatically removing inappropriate content?
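
    Systems like YouTube’s work by matching uploads against databases of hashes of known-bad material (services like PhotoDNA or PDQ), which are not publicly available. As a hedged sketch of the general idea only, with an exact SHA-256 match standing in for a real perceptual hash and a made-up `KNOWN_BAD_HASHES` set:

```python
import hashlib

# Hypothetical blocklist of hex digests of known-bad files, e.g. synced
# from an industry hash-sharing service. Real systems use perceptual
# hashes (PhotoDNA, PDQ) so re-encoded or resized copies still match;
# plain SHA-256 only catches byte-identical files.
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def should_block(upload: bytes) -> bool:
    """Return True if the uploaded bytes match a known-bad hash."""
    return hashlib.sha256(upload).hexdigest() in KNOWN_BAD_HASHES

print(should_block(b"test"))   # True: the digest above is sha256(b"test")
print(should_block(b"hello"))  # False: not on the blocklist
```

    An upload hook would call a check like this before the file is ever written to public storage; the hard part isn’t the matching but getting access to the hash databases, which are restricted to vetted organizations.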

    • ArtisinalBS@lemm.ee · 1 year ago

      Legally, in the US, you would have to remove it from public view, keep it stored on the server, collect identifying information about the uploader, and send all that data to the authorities, all while maintaining a low false-positive rate.
      Kind of a tall order.