I watched the Netflix documentary American Nightmare. It feels like standard fare at this point, where we dip back a decade or two to some crime story that had a few twists and turns in it and make it freshly popcorn-worthy again. This was the “is this the real-life Gone Girl?” case. Denise Huskins was very literally kidnapped, and the bumblefuck cops spent all their time blaming first her boyfriend, then her, until they were smacked in the face with enough evidence that they were forced to admit it was real.
In the last (3rd) episode, a then-new woman detective gets involved in another case a few hundred miles away, and they catch the guy. She’s smart, and sees a blonde hair in the creepy taped-up swim goggles in the criminal’s crime-hole by Lake Tahoe. She’s like… but our victim wasn’t blonde… so there is probably another victim out there we haven’t connected yet. It took long-term, dedicated commitment to that hunch before she was able to connect enough dots to tie it back to the Huskins case.
The cases had a heaping helping of similarities. The white Mustang, the taped-up swim goggles, the home invasion, the flashlight taped to the gun, the zip ties, even the mannerisms of the criminal.
With the backdrop of AI we’re all living in, it was impossible not to think about how computers should have connected the dots on this in two seconds. It took far too much human gumption and luck to solve this case. I don’t even know if AI is what’s needed here, but maybe it would help? A natural-language tool cops could use, like “show me other cases where the perpetrator used swim goggles,” feels like it would be helpful.
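To be clear, I don’t know what the real tools in this space look like. But even a toy sketch shows the idea isn’t magic: index case notes, then rank them by how many distinctive details they share with a plain-language question. A real system would use trained language models instead of word overlap, and all the case data and names below are made up:

```python
# Toy sketch: rank past cases by how many distinctive words they share
# with a plain-language query. Pure word overlap, no actual AI --
# a real system would use language-model embeddings. All data is made up.

CASES = {
    "case-001": "home invasion, white Mustang seen nearby, victim bound with zip ties",
    "case-002": "burglary, suspect fled on foot, no vehicle described",
    "case-003": "kidnapping, victim made to wear taped-up swim goggles, zip ties used",
}

# Common words that shouldn't count as a "shared detail"
STOPWORDS = {"the", "a", "in", "on", "with", "where", "other", "cases",
             "show", "me", "used", "no", "of", "to"}

def tokens(text):
    """Lowercased distinctive words from a description or query."""
    return {w.strip(",.").lower() for w in text.split()} - STOPWORDS

def search(query, cases):
    """Case IDs that share at least one distinctive word, best match first."""
    q = tokens(query)
    scored = [(len(q & tokens(desc)), cid) for cid, desc in cases.items()]
    return [cid for score, cid in sorted(scored, reverse=True) if score > 0]

print(search("show me other cases where the perpetrator used swim goggles", CASES))
# Surfaces case-003, the one mentioning swim goggles
```

Even this dumb version surfaces the goggles case instantly, which is the whole point: a computer doesn’t need a hunch to notice two case files mention the same weird detail.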
I’m sure there are all sorts of dangerous implications to having AI crime models (see Minority Report, Judge Dredd) but it feels like there has gotta be some better way to Do Computers to solve crimes.
All that makes me think about my kid’s ultrasound the other day. She’d been sick for a lot of days, and they wanted to make sure it wasn’t appendicitis (it wasn’t). This required using the jelly and the whole thing. Then one (1) singular doctor looks at those pictures and decides what’s up. Shouldn’t those pictures instead go through some AI model that has been trained on every single ultrasound ever taken, with diagnoses and outcomes, in order to assist that singular doctor in interpreting what she’s seeing?
I’m hoping the coming AI revolution helps us all be smarter in ways where “collective knowledge” is crucial.