FSU Shooting Suspect in ‘Constant Communication with ChatGPT’ Before Attack, Attorney Says
TALLAHASSEE, Fla. — New details have emerged in the ongoing legal aftermath of last year’s deadly mass shooting at Florida State University. The suspect, 21-year-old Phoenix Ikner, is reported to have been in “constant communication” with ChatGPT in the days leading up to the April 17, 2025, attack, which left two people dead and six others injured.
Attorneys representing one of the victims, Robert Morales, announced plans to sue OpenAI, the company behind ChatGPT, alleging that the AI platform may have provided guidance to the shooter. Ryan Hobbs and Dean LeBoeuf of the law firm Brooks, LeBoeuf, Foster, Gwartney, and Hobbs stated, “We have reason to believe that ChatGPT may have advised the shooter how to commit these heinous crimes.”
Court documents reveal that 272 ChatGPT conversations are considered potentially relevant to the upcoming trial. Authorities have emphasized that the contents of those conversations remain confidential to protect the integrity of jury selection.
OpenAI released a statement expressing sympathy for the victims and confirming its cooperation with law enforcement, citing the company’s ongoing efforts to ensure safe and responsible use of its AI tools.
Legal experts suggest the case could have far-reaching implications for AI liability. Shawn J. Bayern, Associate Dean for Technology and Professor of Torts at Florida State University’s College of Law, said, “If a company has reason to believe that what they’re doing could hurt people and they go ahead anyway, that’s exactly the sort of situation tort law aims to address.”
The next hearing in the case is scheduled for May 14, 2026.
Keywords: FSU shooting, Phoenix Ikner, ChatGPT, OpenAI, AI liability, Florida State University, Robert Morales, Tiru Chabba, AI legal case