• bleistift2@sopuli.xyz · +102/−2 · 16 days ago

    10% false positives is an enormous rate, given how police like to ‘find’ evidence and ‘elicit’ confessions.

    • snooggums@lemmy.world · +40 · 16 days ago

      It isn’t predicting individual crimes, just doing pattern recognition and extrapolation, the same way weather is predicted.

      “There are on average 4 shootings in November in this general area so there probably will be 4 again this year.” is the kind of prediction that AI is making.
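A minimal sketch of the kind of extrapolation described above, using made-up counts (all names and numbers here are illustrative, not from any real system):

```python
# Naive seasonal extrapolation: predict this November's shooting count
# in an area as the mean of the counts from previous Novembers.
# The historical counts below are invented illustration data.
november_shootings = [4, 3, 5, 4]  # last four years, hypothetical

prediction = sum(november_shootings) / len(november_shootings)
print(prediction)  # 4.0
```

This is the weather-forecast analogy in its crudest form: an area-level average, saying nothing about any individual.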

        • rottingleaf@lemmy.world · +1/−2 · 14 days ago

          That’s also how communist attempts at building a better civilization work. They can’t avoid that base of their ideology where one human classification of reality takes precedence over life. So they have plans. Plan for steel production, plan for grain production, plan for convicted criminals.

          Falling behind the plan? Have to arrest someone. Anyone. There’s a weird teen walking by; let’s tie him to a radiator and beat him till he signs a paper saying he stole some shit.

          The plan is overshot? Then they won’t bother even if there’s a gang rape and murder in front of the police station, with some policemen participating.

          What I don’t understand is why people want to do that again, just with clueless (not possessing the necessary information) planners replaced with clueless (for the same reason) machines.

          Even the USSR’s problems with planning were mostly not due to insufficient computational resources (people today think those were miserable, but let’s please remember that they were programmed by better and more qualified people than most of today’s programmers), but due to the power balance in the hierarchy, which meant planning was bent to the wishes of power. In other words, plans were made for what people in important posts wanted to see, and didn’t account for what people in other important posts didn’t want to share. Just like it’s going to be with any such system. Tech doesn’t solve power balance by itself.

            • rottingleaf@lemmy.world · +1/−2 · 14 days ago

              Yes, I am.

              There are a few associations between the subject of this thread and marxism.

              The most obvious being the opinion that one can create a machine that will transform society’s hierarchy, and using AI in law enforcement is actually part of that.

              The less obvious being that the USSR’s catching-up game, where they’d blindly copy (sometimes mostly in appearance) things first popularized in the West via market processes, is similar to what “AI”s do.

              Another being that preventing a crime via some predictor can’t be verified against reality, because, ahem, it’s preventive. Just like conviction plans for police can’t be checked against reality.

              Another being that police discomforting or even surveilling people based on some predictor is almost a punishment, for something they haven’t done. It’s like a thought crime. It brings one’s rights to the level of what a USSR citizen had, basically.

              • Glytch@lemmy.world · +3 · 14 days ago

                Another being that police discomforting or even surveilling people based on some predictor is almost a punishment, for something they haven’t done. It’s like a thought crime. It brings one’s rights to the level of what a USSR citizen had, basically.

                Or to the level of a black person in modern Capitalist America. Your point isn’t about communism, it’s about authoritarianism.

                • rottingleaf@lemmy.world · +1/−2 · 14 days ago

                  Of course.

                  However, one can recall that the Democrats before the “switch” were kinda racist and kinda liberal at the same time, so it wasn’t a complete switch between the parties. And that’s because that kind of “liberal” is, similarly to Marxism, about having some “more fair” society, but only the way you want it, while impeding your opponents trying to change their parts of society the way they want.

                  My point is more about fragmentation. Say, from what I’ve read, it seems that somewhere in the 1970s you could find towns and districts where you’d be absolutely abused for being Black there, but you could also find normal ones.

                  The issue with trying to solve the problem of the former by some centralized pressure is that it creates the same instruments that could be used to “solve” the “problem” of the latter, and, worse, introduce new problems.

                  More than that, structurally it averages to the worse, because crowds are as smart as the dumbest and worst person in them.

      • Echo Dot@feddit.uk · +2 · 15 days ago

        So they are using it to try to decide on deployment?

        If that is all they’re using it for, I guess it isn’t too bad. As long as it isn’t accusing individuals of planning to commit a crime with zero evidence.

        • snooggums@lemmy.world · +6 · edited · 15 days ago

          It is probably going to be used to justify the disproportionate police attention paid to minority communities and to justify activities similar to stop and frisk.

          • Echo Dot@feddit.uk · +2 · 15 days ago

            The thing is, don’t they already have crime stats? Presumably they’re already using those as justification, so this won’t change much.

        • rottingleaf@lemmy.world · +2 · 14 days ago

          No, it’s bad, because ultimately it’s not leading anywhere. Such tools can’t be used by unqualified people who don’t understand how they work (not many qualified people do either; my team lead at work, for example, is enthusiastic and just doesn’t seem to hear arguments against, at least those I can make with my ADHD, that is, avoiding detailed explanations to the bone).

          If ultimately it’s not applicable where people want to apply it, it shouldn’t even be tested.

          This is giving such applications credibility.

          It’s the slippery slope that some people think doesn’t exist. Actually, slippery slopes exist everywhere.

    • lugal@sopuli.xyz · +5/−2 · 16 days ago

      That’s in a punitive system. Used in a transformative/preventive manner (which it will not be), this could actually save lives and help people in need.

  • Pilferjinx@lemmy.world · +28/−1 · 16 days ago

    Israel and China implement sophisticated algorithms to suppress Palestinians and Uyghurs with severe effectiveness. Don’t take this tech lightly.

    • ProgrammingSocks@pawb.social · +27 · 16 days ago

      If it were my choice I’d have it banned. “90%” accuracy? So 10 out of 100 predictions result in an innocent person getting surveilled for literally no reason? Absolutely the fuck not.
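A back-of-envelope base-rate check makes the objection above concrete. All numbers here are assumed for illustration (no real accuracy figures are given in the thread beyond the quoted “90%”):

```python
# Base-rate sketch: even at "90% accuracy", if only a small fraction of
# the population would actually offend, most flagged people are innocent.
population = 100_000
offender_rate = 0.01        # assume 1% would actually offend
sensitivity = 0.90          # assume 90% of offenders get flagged
false_positive_rate = 0.10  # assume 10% of innocents get flagged

offenders = population * offender_rate          # 1,000 people
innocents = population - offenders              # 99,000 people

true_flags = offenders * sensitivity            # 900 correct flags
false_flags = innocents * false_positive_rate   # 9,900 wrongful flags

precision = true_flags / (true_flags + false_flags)
print(round(precision, 3))  # 0.083
```

Under these assumptions, over 90% of the people the system flags for surveillance are innocent, which is far worse than the headline “10% false positives” suggests.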

    • aeshna_cyanea@lemm.ee · +7/−1 · edited · 15 days ago

      Idk about China, but Israel carpet-bombs apartment buildings. You don’t need precision AI for that.

  • psycho_driver@lemmy.world · +23/−2 · 15 days ago

    I’m pretty sure the current techBrocracy will implement it like this:

    if (isBlack) { willCrime = true; }

    • 0laura@lemmy.dbzer0.com · +7/−1 · 15 days ago

      AI is AI. Not all AI is AGI, but Stable Diffusion, LLMs, and all the other ones are real AI. The only reason people disagree is that they watched too much sci-fi and think AI is supposed to be sentient or whatever. Hell, even the code controlling the creepers in Minecraft is called AI, in the game. You can spawn a creeper with the NoAI flag and the creeper won’t do anything. Quite a silly take to say it’s not AI just because you don’t like it. There are many things to dislike about the modern state of AI; your argument is just shooting yourself in the foot.

    • Dragon Rider (drag)@lemmy.nz · +3/−1 · 15 days ago

      Yeah! Real AI is expert systems and fuzzy logic! Generative AI’s capabilities and intelligence pale in comparison to Fuji Electric’s advanced temperature control systems!

  • dgmib@lemmy.world · +7 · 14 days ago

    If crime is that predictable, that would mean crime isn’t caused by people’s choices but by something else… like, say, mental illness, poverty, hunger, lack of social supports… and that lots of cops locking people up in prisons as a deterrent won’t work to reduce crime… hmmm, wait a second…

  • kubica@fedia.io · +3 · 16 days ago

    Hey, could you just scan me before I continue putting effort into earning money?

  • stupidcasey@lemmy.world · +2 · 14 days ago

    As we can see from this advanced simulation, the perpetrator had 13 fingers. You are the only person who has 13 fingers; the evidence is obvious.

    Mr. Thirteen Fingers, I simply do not understand how an innocent man like yourself could take a dark turn and suddenly commit over 300 crimes scattered across every country on the globe. You had every reason not to commit them, but you did anyway. How do you plead?

    Would it matter if I said not guilty?

  • Shardikprime@lemmy.world · +5/−4 · 15 days ago

    I mean, you can train AI to look for really early signs of multiple diseases. It can predict the future, sort of.