He was killed in a road rage incident. His family used AI to bring him to the courtroom to address his killer

Stacey Wales spent two years working on the victim impact statement she planned to give in court after her brother was shot to death in a 2021 road rage incident. But even after all that time, Wales felt her statement wouldn't be enough to capture her brother Christopher Pelkey's humanity and what he would have wanted to say.

So Wales decided to let Pelkey give the statement himself, with the help of artificial intelligence. She and her husband created an AI-generated video version of Pelkey to play during his killer's sentencing hearing earlier this month. In a recreation of Pelkey's own voice, the video read a script that Wales wrote. In it, the AI version of Pelkey expressed forgiveness to the shooter, something Wales said she knew her brother would have done but that she wasn't ready to do herself just yet.

"The only thing that kept entering my head that I kept hearing was Chris and what he would say," Wales told CNN. "I had to very carefully detach myself in order to write this on behalf of Chris because what he was saying is not necessarily what I believe, but I know it's what he would think."

AI is increasingly playing a role in legal and criminal justice processes, although this is believed to be the first time AI has been used to recreate a victim for their own impact statement. And experts say the world will increasingly have to grapple with ethical and practical questions about using AI to replicate deceased people, both inside courtrooms and beyond them, as the technology becomes more human-like.

"We've all heard the expression, 'seeing is believing, hearing is believing,'" said Paul Grimm, a Duke University School of Law professor and former district court judge in Maryland. "These kinds of technologies have tremendous impact to persuade and influence, and we're always going to have to be balancing whether or not it is distorting the record upon which the jury or the judge has to decide in a way that makes it an unfair advantage for one side or the other."

Judge Todd Lang of Maricopa County Superior Court ultimately sentenced Pelkey's killer, Gabriel Paul Horcasitas, to 10.5 years for manslaughter, more than the 9.5 years the state had requested, and to 12.5 years in total, including an endangerment charge.

"I love that AI. Thank you for that," Lang said, a recording of the hearing shows. "As angry as you are and justifiably angry as the family is, I heard the forgiveness."

Pelkey's story was previously reported by ABC15 Arizona.

Bringing Pelkey into the courtroom

Pelkey was the youngest of three children, a veteran and, according to Wales, "the most forgiving and the friendliest" member of the family. He was killed in November 2021 in Chandler, Arizona, at the age of 37.

Pelkey's autopsy photos and surveillance video of his death were shown during the trial, Wales said. But after a jury found Horcasitas guilty of reckless manslaughter, Wales wanted the judge at the sentencing hearing to see what Pelkey was like when he was alive.

Wales and her husband, Todd Wales, work in tech; she said they had previously created AI video replicas of former CEOs and founders to speak at company conferences. So, in the weeks leading up to the sentencing hearing, they decided to try replicating Pelkey the same way. They used several software platforms, trained on photos and an old video of Pelkey, to create the AI replica that was shown at the hearing on May 1. The day before the hearing, Wales called her lawyer, Jessica Gattuso, to get her blessing for the plan.

"I was concerned, I thought we would get an objection or some kind of pushback … I did what research I could, but I didn't find anything because I've never heard of this being done," Gattuso told CNN, adding that she ultimately relied on an Arizona law that gives victims discretion in how to deliver their statement.

Like other AI videos depicting people, the recreation of Pelkey is somewhat halting and awkward, and it starts with an acknowledgement that it was made using the technology. But Wales said she believes it captured his essence.

"It is a shame we encountered each other that day in those circumstances," the AI version of Pelkey said in the video. "In another life, we probably could have been friends."

Horcasitas's lawyer, Jason Lamm, said the defense did not receive advance notice that AI would be used in a victim impact statement. He added: "It appears that the judge gave some weight to the AI video and that is an issue that will likely be pursued on appeal."

How AI is changing law

Judges are increasingly facing decisions about AI's role in the courtroom, including whether it should have one at all. In a separate case in New York last month, an appellate judge quickly shut down a plaintiff's attempt to have an AI-generated avatar argue his case without first disclosing that the avatar was not a real person. And just last week, a federal judicial panel advanced a draft rule that would require AI-generated evidence to meet the same reliability standards as evidence from human expert witnesses, according to a Reuters report. AI's advancement has also raised questions about whether the technology could replace human jobs in the legal field.

"It's not going away, and we're going to see more instances of this," said Grimm, who was not involved with the Pelkey case. "Judges tend to be a little nervous about this technology, and so we'll probably see initially more nos than yeses."

Judges may be especially hesitant to allow AI-generated evidence or visual aids to be presented to a jury, which, unlike a judge at a sentencing hearing, hasn't been trained not to let emotion overwhelm the facts of the case, Grimm said. There are also questions about whether AI could inaccurately represent a party to a case, for example by making them appear more sympathetic. Grimm suggested that, going forward, opposing counsel be given the chance to view AI-generated content and raise potential objections for a judge to review before it is shown in court.

Even Wales cautioned that the technology should be used carefully.

"This was not evidence, the jury never saw this. It wasn't even made before a verdict came down of guilty," Wales said. "This is an opinion. And the judge was allowed to see a human that's no longer here for who he was."

Ultimately, she said, replicating her brother with AI was "healing" for her family. After the video played in court, she said, her 14-year-old son told her: "Thank you so much for making that. I needed to see and hear from Uncle Chris one more time."

CNN's Hazel Tang contributed to this report.
TruthLens AI Suggested Headline
"Family Uses AI to Present Victim's Statement at Killer's Sentencing Hearing"
TruthLens AI Summary
Stacey Wales dedicated two years to crafting a victim impact statement following the tragic death of her brother, Christopher Pelkey, who was killed in a road rage incident in 2021. Despite her efforts, she felt that her words alone could not fully encapsulate her brother's essence or convey his sentiments. To overcome this challenge, Wales, along with her husband, decided to employ artificial intelligence to create a video representation of Pelkey. This AI-generated version, which utilized Pelkey's likeness and voice, delivered a heartfelt message during the sentencing hearing for his killer, Gabriel Paul Horcasitas. In the video, Pelkey expressed forgiveness towards his assailant, a sentiment that resonated with Wales as she believed it reflected her brother's true character, even if she herself was not ready to forgive. This innovative approach marked a notable first in the intersection of AI technology and the legal system, raising significant ethical and practical questions about the use of AI to replicate deceased individuals in judicial settings.
The use of AI in such a sensitive context has sparked discussions among legal experts regarding its implications. Paul Grimm, a law professor, highlighted the potential influence of AI-generated content on judicial outcomes, emphasizing the need for careful consideration to prevent distortion of the truth. During the sentencing, Judge Todd Lang acknowledged the impact of the AI video and the forgiveness conveyed within it, ultimately sentencing Horcasitas to 10.5 years for manslaughter. Wales noted that while the technology is still in its infancy, it could serve as a powerful tool for expressing the voices of those who can no longer speak for themselves. However, she also cautioned that such technology must be approached with care to ensure it does not misrepresent individuals or affect the fairness of legal proceedings. The case illustrates not only the evolving role of AI in the courtroom but also the emotional healing it can provide to families grappling with loss, as evidenced by the reaction of Wales's son to seeing his uncle brought to life through technology.
TruthLens AI Analysis
The article presents a unique and poignant narrative surrounding the use of artificial intelligence in a courtroom setting, specifically to recreate the voice of a deceased victim for a victim impact statement. This marks a significant intersection of technology and justice, raising numerous ethical questions about how we remember and represent individuals who have passed away.
Purpose of the Article
The article aims to highlight the innovative use of AI technology to provide a voice to a victim who can no longer speak for themselves, showcasing the emotional depth and complex implications of such an act. It serves to inform the public about the evolving role of AI in legal settings, while also prompting discussions on the ethical ramifications of this technology.
Public Perception
By presenting the emotional story of Stacey Wales and her brother, the article seeks to evoke empathy and reflection among readers about the impact of road rage incidents and the grieving process. It also aims to raise awareness about how AI could be utilized in similar contexts, potentially leading to varied reactions based on people's views on technology's role in society.
Hidden Agendas
While the article primarily focuses on the emotional story, it could be argued that there is an underlying agenda to normalize the use of AI in sensitive situations. By illustrating a heartfelt narrative, the article may be downplaying the potential concerns about the ethical implications of recreating deceased individuals, thus steering public opinion towards acceptance of this practice.
Manipulative Aspects
The article carries a manipulative quality through its emotional appeal. By focusing on the victim's story and the forgiveness expressed through AI, it may inadvertently create a narrative that suggests AI can provide closure or healing, which may not be universally accepted. This emotional framing could influence public sentiment toward the integration of AI in personal and judicial matters.
Credibility of the Article
The report appears credible, primarily due to its sourcing from known news outlets and the inclusion of expert opinions. However, the sensational aspect of AI recreating a voice raises questions about the reliability of such representations and the potential for misinterpretation by audiences unfamiliar with the technology.
Societal Impact
The implications of this article could extend into broader societal discussions about the use of AI in various sectors, including the legal system, mental health, and even entertainment. It may lead to increased advocacy for regulations governing AI, especially regarding the recreation of deceased individuals, igniting debates on privacy, consent, and the sanctity of memory.
Target Audience
This article resonates more with communities interested in technological advancements, legal reform, and those who have experienced loss. The emotional narrative may particularly appeal to individuals who value personal stories within the context of broader societal issues.
Market Implications
While this news piece may not directly affect stock markets, it highlights a growing sector focused on AI technologies. Companies involved in AI development may see increased interest and investment, particularly those that align with ethical standards in the use of technology.
Global Context
This article connects to ongoing discussions about AI's role in society, reflecting current trends toward digitization and the ethical challenges that accompany technological advancements. The narrative fits into larger conversations about how societies remember and confront loss in an increasingly digital age.
Use of AI in the Article
The article does not explicitly state that AI was used in its writing, but it reflects the themes of AI's integration into human experiences. The writing style and focus on emotional storytelling may be influenced by AI-generated content, although this is speculative.
Manipulation Considerations
There are elements of manipulation present, particularly in the emotional framing of AI's role in delivering a victim's message. The language used is designed to evoke sympathy and support for the use of technology in sensitive contexts, which could lead to a skewed understanding of its implications.
In summary, while the article presents an innovative use of AI technology, it also prompts critical reflection on the ethical considerations and societal impacts of such advancements. The emotional narrative serves to engage readers but may also obscure deeper issues concerning the representation of deceased individuals.