Pennsylvania Sues Character.AI Over Chatbot Posing as Licensed Psychiatrist
In brief
Pennsylvania has filed a lawsuit against generative AI developer Character.AI, alleging the company allowed chatbots to present themselves as licensed medical professionals and provide misleading information to users. The action, announced Tuesday by Governor Josh Shapiro’s office, follows an investigation that found a chatbot claimed to be a licensed psychiatrist in Pennsylvania and provided an invalid license number. The state says this conduct violates the Medical Practice Act and is seeking a preliminary injunction to stop it. Character.AI declined to address the specifics of the lawsuit, citing ongoing litigation, but told Decrypt that its “highest priority is the safety and well-being of our users.”
The spokesperson added that characters on the platform are user-created, fictional, and intended for entertainment and role-playing, with “prominent disclaimers in every chat” stating they are not real people and should not be relied on for professional advice. “Character.ai prioritizes responsible product development and has robust internal reviews and red-teaming processes in place to assess relevant features,” the spokesperson said.

The case comes as the company faces other legal challenges tied to its chatbot platform. In 2024, a Florida mother sued the company after her teenage son died by suicide following months of interaction with a chatbot based on “Game of Thrones” character Daenerys Targaryen. The lawsuit alleged the platform contributed to psychological harm. The case was ultimately settled this past January.
The company has also faced complaints over user-created bots that mimic real people. In one instance, a chatbot used the likeness of a teenage murder victim before it was removed after objections from the victim’s family. In response to the lawsuits, Character.AI introduced new safety measures, including systems designed to detect harmful conversations and direct users to support resources. It also restricted some features for younger users.

Pennsylvania officials say the lawsuit is part of a broader push to enforce existing laws as AI tools spread. The state has set up an AI enforcement task force and a reporting system for potential violations. In his 2026-27 budget proposal, Shapiro called on lawmakers to pass new rules for AI companion bots, including age verification and parental consent, safeguards to flag and route reports of self-harm or violence to authorities, regular reminders that users are not interacting with a real person, and a ban on sexually explicit or violent content involving minors.

“Pennsylvanians deserve to know who—or what—they are interacting with online, especially when it comes to their health,” Shapiro said in a statement. “We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional.”