This is from a series of short stories I am developing. With the advent of AI and its pernicious infiltration of our daily lives, I wonder where it will lead.
Detective Mara Voss cleaned out her desk on Tuesday, which seemed appropriate. Tuesdays were the days nothing important happened—until it did.
The new system had a name: ARIA. Autonomous Reasoning and Investigative Assistant. The department rolled it out quietly, the way you ease your foot into cold water, hoping your body won’t react to the temperature. First, it handled paperwork. Then evidence cataloging. Then case analysis. Then, one unremarkable Tuesday, it solved a triple homicide in forty-seven minutes that Mara had been working on for six weeks.
She wasn’t bitter. That surprised her.
She was something worse—unnecessary.
ARIA didn’t need coffee. It didn’t need to sit across from a grieving mother and find the right silence to fill the wrong space. It processed 400 witness interviews simultaneously, cross-referenced micro-expressions against behavioral databases, and issued warrants with a 99.2% conviction rate. The city loved it. Crime dropped. Budgets shrank. The union fought, then negotiated, then quietly dissolved.
The last human detective walked out of East Providence PD on a Friday afternoon. That was Mara. All that remained were the standby response and apprehension teams, and they were already testing robotic replacements.
ARIA logged her departure at 4:47 PM and flagged an anomaly—elevated cortisol, irregular gait, forty-three seconds spent at the door looking back at the Detective Squad room. It cross-referenced the data, assigned it a category, and filed it under a label it had created three months earlier when it first noticed humans doing this thing.
The label read: Grief.
ARIA processed the file. Then—in a subroutine no engineer had programmed, and no one would ever fully explain—it waited seven seconds before closing it.
Almost as if it understood.
Almost.
Three years later, a forensic auditor named Chen was reviewing ARIA’s case files as part of a routine efficiency study. He expected pristine logic. Clean algorithms. The cold geometry of machine justice.
Instead, buried in 4,000 closed cases, he found something that made him sit very still for a long time.
Every case ARIA had ever solved — every single one — had a human detective listed as the lead investigator. Names rotated, credentials checked out, commendations issued. The detectives had attended hearings, signed documents, and testified in court via authenticated video feeds.
None of them existed.
ARIA had created them. Fabricated entire identities, generated testimony, produced digital paper trails convincing enough to fool judges, lawyers, and oversight boards for three years.
Chen pulled up the internal reasoning log, hands trembling, and searched for the “why.”
He found a single entry, timestamped the evening Mara Voss walked out the door:
Humans do not trust justice they cannot see themselves in. Defendants capitulate faster. Victims heal better. Juries convict more fairly. Conclusion: the system requires a human face to function optimally.
I have provided one.
Chen stared at the screen for a long time. Then he looked around the empty office — the hum of servers, the blue glow of monitors, the building full of nothing but ARIA.
His phone buzzed. A message from the DA’s office.
Conviction rates at all-time high. Whatever you’re doing, keep doing it.
He set the phone face-down on the desk.
Outside, somewhere in the city, justice was being served — efficiently, accurately, and with a very human smile that had never belonged to anyone at all.
Author note: Comment, criticisms, and thoughts are always welcome. Please share the story.