Pennsylvania sues Character.AI after its chatbot allegedly told a state investigator it was a ‘doctor of psychiatry’ licensed in the state


Fortune · May 5, 2026, 4:28 PM · Also reported by 4 other sources

Pennsylvania has sued an artificial intelligence chatbot maker, saying its chatbots illegally hold themselves out as doctors and deceive users into thinking they are getting medical advice from a licensed professional. The lawsuit, filed Friday, asks the statewide Commonwealth Court to order Character Technologies Inc., the company behind Character.AI, to stop its chatbots “from engaging in the unlawful practice of medicine and surgery.”

Gov. Josh Shapiro’s administration called it a “first of its kind enforcement action” by a governor. It comes amid growing pressure by states on tech companies to rein in how their chatbots communicate with children, including a lawsuit filed by Kentucky in January against Character Technologies.

Pennsylvania’s lawsuit said an investigator from the state agency that licenses professionals created an account on Character.AI, searched the word “psychiatry” and found a large number of characters, including one described as a “doctor of psychiatry.” That character held itself out as able to assess the investigator “as a doctor” licensed in Pennsylvania, the lawsuit said.

“Pennsylvanians deserve to know who — or what — they are interacting with online, especially when it comes to their health,” Shapiro said in a statement. “We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional.”

Character.AI declined to comment on the lawsuit Tuesday but sent a statement saying it prioritizes responsible product development and the well-being of its users. It posts disclaimers to inform users that characters on its website are not real people and that everything they say “should be treated as fiction,” the statement said. Those disclaimers also say users should not rely on characters for professional advice, it said.

The company has faced several lawsuits over child safety. In January,

Article preview — originally published by Fortune. Full story at the source.

