People accused of crimes have laid the blame at a variety of doorsteps to excuse — or at least explain — what happened.
John Hinckley Jr. said it was the movie “Taxi Driver” and his obsession with a young Jodie Foster that prompted his assassination attempt on Ronald Reagan in 1981. The same year, Arne Cheyenne Johnson said the reason he killed his landlord was demonic possession. When Dan White killed San Francisco Mayor George Moscone and city Supervisor Harvey Milk in 1978, his defense cited his consumption of Twinkies and other junk food as evidence of his depression.
But a federal grand jury indictment Tuesday put a new culprit in the hot seat: artificial intelligence.
Brett Dadig, 31, of Whitehall is charged with cyberstalking, interstate stalking and interstate threats. The criminal complaint paints a picture of him meeting women in Pittsburgh-area fitness centers, then going on to harass and threaten them on social media and stalk them at their workplaces. On a podcast, he expressed frustration, anger and violent thoughts.
The case is federal because the interactions took him across state lines to Florida, Iowa, Ohio and New York.
The indictment spells out the AI aspect. He discussed his issues, including both his podcast and his attempt to find a wife, with ChatGPT, which he called his “therapist” and “best friend.”
Let’s be clear: Dadig has not advanced a defense yet. Prosecutors are the ones referencing his comments about AI. Dadig’s attorney Michael Moser says his client’s family is concerned about the man’s mental health.
But the issue here isn’t about Dadig’s case. It’s about the potential to lean on AI for support in place of real people and how dangerous that can become.
AI, after all, is a new technology that is constantly evolving. It is a tool that, like any computer, takes what it is fed and produces an answer based on that input.
It might be able to take the place of a customer service representative in a retailer’s website chat. It cannot replace a psychologist able to evaluate a conversation for troubled thoughts or a friend who notices worrisome behavior changes.
But it should not be blamed for the actions of a user any more than the computer manufacturer or internet service provider should be.
Unlike the 2017 Massachusetts case in which Michelle Carter argued she was not guilty of manslaughter for texts encouraging her boyfriend’s 2014 suicide, there is no mistaking AI for anything other than a computer program. There is no one on the other side of the conversation, only strings of code. That’s what the “artificial” in artificial intelligence means.
Dadig’s case might be among the first to cite artificial intelligence but certainly will not be the last. As technology evolves, so do the courts’ involvement and interpretation.
But it is on people to understand this changing world and our place in it. The law is quite clear that being ignorant is not an excuse.
Copyright ©2025 Trib Total Media, LLC (TribLIVE.com)