• RotaryKeyboard@lemmy.sdf.org · 5 months ago

    Using AI to flag footage for review by a person seems like a good time-saving practice. I would bet that without some kind of automation like this, a lot of footage would just go unreviewed. This is far better than waiting for someone to lodge a complaint first, since you could conceivably identify problem behaviors and fix them before someone gets hurt.

    The use of AI-based solutions to examine body-cam footage, however, is getting pushback from police unions, which are pressuring departments not to make the findings public in order to protect potentially problematic officers.

    According to this, the unions oppose it because they want to shield badly behaving officers. That tells me the AI review is working!

    • Null User Object@programming.dev · 5 months ago (edited)

      Exactly, and this also contradicts the “few bad apples” defense. If there were only a few bad apples, then the police unions should be bending over backwards to eradicate them sooner rather than later to protect the many good apples, not to mention improve the long-suffering reputation of the police.

      Instead, they’re doing the exact opposite, making it clear to anyone paying attention that it’s mostly, if not entirely, bad apples.

      • Rai@lemmy.dbzer0.com · 5 months ago

        You’ve got it backwards.

        The phrase is “a few bad apples spoil the bunch”. It means everyone around the bad apples is also bad, because they’re right there and do nothing about it. It’s not a defense; it’s literally describing what your comment says.

    • jaybone@lemmy.world · 5 months ago

      I bet if they made all footage publicly available, watchdog-style groups would be reviewing the shit out of that footage. But yeah, AI might help too, maybe.

      • Scubus@sh.itjust.works · 5 months ago

        While I agree wholeheartedly, that’s unrealistic because of the law. You can’t reveal certain suspects’ identities, because for certain crimes, like pedophilia, people will try to execute the suspect before anyone knows whether or not they actually did it.

        • LarmyOfLone@lemm.ee · 5 months ago

          I mean police footage would be privacy invading as hell for victims and even just bystanders.

    • BearOfaTime@lemm.ee · 5 months ago

      Yep.

      Y’all like surveillance so much, let’s put all government employees under a camera all the time. Of all the places I find cameras offensive, that one, not so much.

      • mndrl@lemmy.world · 5 months ago

        I sure hope you get your daily dose of enjoying people’s misery, watching the substitute teacher crying in the teachers’ lounge.

        • lolcatnip@reddthat.com · 5 months ago

          Cameras in a teachers’ lounge would be ridiculous, but in principle, cameras in classrooms make a lot of sense. Teachers are public officials who exercise power over others, and as such they need to be accountable for their actions. Cameras only seem mean because teachers are treated so badly in other ways.

          • mndrl@lemmy.world · 5 months ago (edited)

            Sure thing, buddy. They exert such power that they can barely make teens stay put for ten minutes without fucking around with their phones. So much power.

              • mndrl@lemmy.world · 5 months ago

                I checked just in case. Exactly as I said: most government workers have no power, nor the means to exert it. You must be thinking of something else.

                Although I can recognize when someone has silly power fantasies. It is wild, man.

  • Dizzy Devil Ducky@lemm.ee · 5 months ago (edited)

    I have a sneaking suspicion that if police in places like America start using AI to review bodycam footage, they’ll just “pay” someone to train their AI so that it always says the police officer was in the right when killing innocent civilians, and the footage never gets flagged. That, or they’ll do something equally shady and suspicious.

    • UnderpantsWeevil@lemmy.world · 5 months ago

      These algorithms already have a comical bias towards the folks contracting their use.

      Case in point, the UK Home Office recently contracted with an AI firm to rapidly parse through large backlogs of digital information.

      The Guardian has uncovered evidence that some of the tools being used have the potential to produce discriminatory results, such as:

      An algorithm used by the Department for Work and Pensions (DWP) which an MP believes mistakenly led to dozens of people having their benefits removed.

      A facial recognition tool used by the Metropolitan police has been found to make more mistakes recognising black faces than white ones under certain settings.

      An algorithm used by the Home Office to flag up sham marriages which has been disproportionately selecting people of certain nationalities.

      Monopoly was a lie. You’re never going to get that Bank Error In Your Favor. It doesn’t happen. The House (or, the Home Office, in this case) always wins when these digital tools are employed, because the money for the tool is predicated on these agencies clipping benefits and extorting additional fines from the public at large.

      • butterflyattack@lemmy.world · 5 months ago

        Bank errors in your favour do happen, or at least they did - one happened to me maybe twenty five years ago. I was broke and went to the bank to pay in my last £30-something of cash to cover an outgoing bill. Stopped at the cash machine outside my bank to check my balance was sufficient now, and found that the cashier had put an extra 4 zeros on the figure I’d deposited. I was rich! I was also in my early 20s and not thinking too clearly I guess because my immediate response was to rush home to get my passport with the intention of going abroad and opening an account into which to transfer the funds, never coming back. I checked my balance again at another machine closer to home and the bank had already caught and corrected their mistake. Took them maybe thirty minutes.

        After a bit it occurred to me that I was lucky, really, because I didn’t know what the fuck I was doing; the funds would have been traced very easily and I’d have been in deep shit.

        But yeah, anecdotal, but shit like that did happen. I assume it’s more rare these days as fewer humans are involved in the system, and fewer people use cash.

    • Kalkaline @leminal.space · 5 months ago (edited)

      AI can’t be the last word in what gets marked as misconduct, but using it as a screening tool for potentially problematic moments in a law enforcement officer’s encounters would be useful. Screening those hours upon hours of video is an enormous task, and probably prohibitively expensive for humans to work through.
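      The screening-tool idea above can be sketched in a few lines. This is a hypothetical toy, not any department’s actual system: the clip IDs, scores, and the 0.7 threshold are all made up, and a real pipeline would compute scores with a video model rather than take them as given. The point is only that the model never issues a verdict; it just orders a human review queue.

```python
def triage(clips, threshold=0.7):
    """Split clips into (needs_human_review, auto_deferred) by model score.

    Nothing is decided automatically: flagged clips only enter a human
    review queue, ordered most-suspect first; the rest can still be
    sampled or audited later.
    """
    flagged = sorted(
        (c for c in clips if c["score"] >= threshold),
        key=lambda c: c["score"],
        reverse=True,  # highest-scoring clip reviewed first
    )
    deferred = [c for c in clips if c["score"] < threshold]
    return flagged, deferred

# Hypothetical clips with made-up model scores.
clips = [
    {"id": "cam1_0930", "score": 0.92},
    {"id": "cam2_1045", "score": 0.12},
    {"id": "cam1_1400", "score": 0.71},
]
flagged, deferred = triage(clips)
# flagged: cam1_0930 then cam1_1400; deferred: cam2_1045
```

      Keeping the threshold as a tunable parameter matters here: set it too high and the tool recreates the “unreviewed footage” problem; too low and reviewers drown in false positives.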