Most police departments don't have the resources to sift through all their body-cam footage, meaning most of it remains unreviewed and unexamined. According to Axon, a company...
The use of AI to examine body-cam footage, however, is getting pushback from police unions, which are pressuring departments not to make the findings public in order to shield potentially problematic officers.
Using AI to flag footage for review by a person seems like a good time-saving practice. I would bet that without some kind of automation like this, a lot of footage would just go unreviewed. This is far better than waiting for someone to lodge a complaint first, since you could conceivably identify problem behaviors and fix them before someone gets hurt.
According to this, the unions are against this because they want to shield bad-behaving officers. That tells me the AI review is working!
I bet if they made all footage publicly available, watchdog-style groups would be reviewing the shit out of that footage. But yeah, AI might help too.
While I agree wholeheartedly, that is unrealistic due to laws. You can’t reveal certain suspects’ identities because, for certain crimes like pedophilia, people will attempt to execute the suspect before knowing whether or not they actually did it.
I mean police footage would be privacy invading as hell for victims and even just bystanders.
Exactly, and this also contradicts the “few bad apples” defense. If there were only a few bad apples, the police unions should be bending over backwards to eradicate them sooner rather than later, both to protect the many good apples and to improve the long-suffering reputation of police.
Instead, they’re doing the exact opposite, making it clear to anyone paying attention that it’s mostly, if not entirely, bad apples.
You’ve got it backwards.
The phrase is “a few bad apples spoil the bunch.” It means everyone around the bad apples is also bad, because they’re all around them and do nothing about it. It’s not a defense; it’s literally describing what your comment says.