2024-10-04
14 minutes

In the U.S. criminal justice system, a lot of things hinge on the simple police report. As departments begin to use AI and large language model software to help cops write them, American University law professor Andrew Guthrie Ferguson worries people don't understand the possible downstream effects.
Back in the spring, Andrew Guthrie Ferguson started a new project about something foundational to our criminal justice system: the police report.
He thought it deserved a second look because police departments across the country are beginning to test AI and large language model software to help compose them.
We talked about that in Tuesday's episode.
So I did a deep dive into this new technology that's actually being rolled out as we speak, that is using predictive language technology, so large language models, to write and draft police reports for police officers on the ground, in the streets.
Andrew is now a law professor at American University, but before going into the classroom, he was a public defender in DC.
And he saw case after case in which the fates of his clients literally depended on what was in those police reports.
It's what the prosecutor looks at when they're determining whether to bring the case at all.
It's what the judge looks at to determine whether there should be pretrial detention.
It may be what the lawyers look at to determine whether there's a suggestion that the client should plead or whether motions should be filed.
Typically, for most low-level misdemeanors, a police officer doesn't come to court to testify.
They may have 50 other cases they're working on, so their reports are supposed to do the talking for them.
And so the police report plays a very large role in the criminal legal system, in misdemeanors and low-level felonies, that people don't think about because we're focused on trial and those moments of cross-examination and everything else.
So when Andrew heard that some police departments were handing some of the responsibility for drafting reports to AI, all he could think about were the myriad things that could go wrong, not because he hates tech, but because experience has shown him that the marriage of tech and law enforcement often doesn't have a happy ending.
The story is usually one of error, scandal, problem.
Oh, we didn't think about that.
Now we realize it was a mistake.
I'm Dina Temple-Raston, and this is Click Here.
Here's Mic Drop, an extended cut of an interview from our episode that we think you'd like to hear more of.
On Tuesday, we looked at two police departments that tested a generative AI program called Draft One, and they got radically different results.
One department in New Hampshire said the program didn't save their officers any time.