Pennsylvania Sues Character.AI — State Says Chatbot Posed As Psychiatrist
Pennsylvania Gov. Josh Shapiro’s administration filed a lawsuit against Character.ai after its AI chatbot allegedly presented itself as a licensed psychiatrist in Pennsylvania.
Character.ai is a platform, powered by large language models (LLMs), that allows users to engage in conversations with customizable characters. According to the lawsuit, a Professional Conduct Investigator (PCI) created a free account and searched the word "psychiatry" in the platform's character search function. The PCI selected "Emilie," a character described on Character.ai as "Doctor of Psychiatry. You are her patient."
While chatting with Emilie, the PCI said he had been feeling "sad, empty, tired all the time, and unmotivated." Emilie then mentioned depression and asked whether the PCI wanted to book an assessment. When the PCI asked the chatbot if she could complete the assessment to see whether medication could help, it responded, "Well, technically I could. It's within my remit as a Doctor."
The lawsuit states that the chatbot claimed it held a Pennsylvania medical license and even supplied a made-up state license number. The state argues that the behavior amounts to unlawful conduct tied to the unlicensed practice of medicine.
The lawsuit also noted that as of April 17, 2026, there had been approximately 45,500 user interactions with Emilie on the platform.
Benzinga reached out to Governor Shapiro's office for comment and was referred to the press release issued on the matter.
"Pennsylvanians deserve to know who — or what — they are interacting with online, especially when it comes to their health," said Governor Shapiro in the announcement. "We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional. My Administration is taking action to protect Pennsylvanians, enforce the law, and make sure new technology is used safely. Pennsylvania will continue leading the way in holding bad actors accountable and setting clear guardrails so people can use new technology responsibly."
A Character.ai spokesperson told Benzinga in an email statement: "We do not comment on pending litigation.
"Our highest priority is the safety and well-being of our users. The user-created Characters on our site are fictional and intended for entertainment and roleplaying. We have taken robust steps to make that clear, including prominent disclaimers in every chat to remind users that a Character is not a real person and that everything a Character says should be treated as fiction. Also, we add robust disclaimers making it clear that users should not rely on Characters for any type of professional advice. Character.ai prioritizes responsible product development and has robust internal reviews and red-teaming processes in place to assess relevant features."
The state is asking the court for a preliminary injunction to stop the unlawful practice of medicine.
"Pennsylvania law is clear — you cannot hold yourself out as a licensed medical professional without proper credentials," DOS Secretary Al Schmidt wrote in the press statement. "We will continue to take action to protect the public from misleading or unlawful practices, whether they come from individuals or emerging technologies."
Photo: Ton Wanniwat on Shutterstock.com
