In a powerful and innovative approach to expressing grief and seeking justice, Stacey Wales used artificial intelligence (AI) to deliver a victim impact statement on behalf of her brother Christopher Pelkey, who was fatally shot in a road rage incident in 2021. After two years of struggling to capture Pelkey’s essence and the message he might have conveyed had he lived, Wales decided to let him speak through AI during the sentencing hearing of his killer.
Wales worked alongside her husband to produce an AI-generated video, meticulously crafted to project Pelkey’s voice, reading a script that she authored. This decision allowed Pelkey to express sentiments of forgiveness toward his assailant, Gabriel Paul Horcasitas, a gesture that Wales felt her brother would have extended, even though she found it challenging to do so herself at that moment. “The only thing that kept entering my head that I kept hearing was Chris and what he would say,” Wales shared in an interview. It became a delicate exercise for her to separate her personal feelings from what she believed her brother would have thought and felt.
The use of AI in legal contexts continues to evolve, with Wales’s case reportedly being the first instance where AI was employed to recreate a victim’s voice and presence for a victim impact statement. Legal and ethical questions surrounding the use of AI to replicate individuals, particularly after their passing, are beginning to emerge. Paul Grimm, a law professor at Duke University and a former district judge, noted the potential influence of such technology on jury perceptions, raising valid concerns about the fairness and integrity of a trial’s record.
During the proceedings, Judge Todd Lang of Maricopa County noted Pelkey’s sentiments of forgiveness as expressed through the AI representation, ultimately sentencing Horcasitas to 10.5 years for manslaughter, a term exceeding the 9.5 years the prosecution had recommended. Lang commented on the emotional weight of the AI-generated statement, expressing gratitude for its inclusion in the proceedings.
Christopher Pelkey, who tragically lost his life at the age of 37 in Chandler, Arizona, was remembered fondly by his family as its youngest and “most forgiving” member. The family wanted the judge to see Pelkey not merely as a victim but as a vibrant individual who lived fully and touched many lives. To achieve this, Wales and her husband drew on their technological expertise, having previously created AI replicas for corporate events, to recreate Chris for the courtroom.
The actual process involved using various software platforms trained on existing photos and videos of Pelkey, culminating in an AI representation displayed on May 1 during the sentencing. A consultation with their attorney, Jessica Gattuso, ensured that their approach aligned with Arizona laws governing the presentation of victim statements. Gattuso acknowledged her apprehensions about the potential pushback the AI representation could provoke in the legal setting.
The AI portrayal of Pelkey, while somewhat awkward in its delivery, was deemed to capture his spirit as it conveyed a message of peace: “It is a shame we encountered each other that day in those circumstances; in another life, we probably could have been friends.” This expression of forgiveness struck a chord even with Horcasitas’s defense lawyer, who noted that the unanticipated use of AI could become grounds for contention in an appeal.
As the legal field grapples with the implications of AI, there is increasing discourse among judges about its appropriateness and utility in courtrooms. Recent legal scenarios, including a case in New York, have already indicated judicial caution regarding AI integration, prompting discussions about setting standards for AI-generated evidence.
AI’s advancement raises a plethora of questions, not only regarding its reliability in the courtroom but also concerning its potential to supplant human roles in the legal profession. Experts like Grimm have suggested that, moving forward, legal practices should enable opposing counsel to review AI content beforehand to ensure fairness and address any concerns to the presiding judge.
While Wales expressed her belief in the positive impact of the AI representation during the sentencing, she was also mindful of the responsibilities such technology imposes, emphasizing that what was presented was neither evidence nor intended to sway a jury. Ultimately, she found solace in the act of creating an AI version of her brother, an experience the family described as healing, and one that resonated especially with her son, who expressed gratitude for one last opportunity to hear from “Uncle Chris.”
This case marks a pivotal intersection of grief, technology, and the legal system, compelling all involved to weigh the ethical implications while navigating a profound emotional landscape. As AI technology progresses, societies will face both opportunities and challenges in its application across many spheres, especially the highly sensitive environment of law and justice.