Illinois Just Became the First State to Ban AI Therapy. Here’s Why
From throwing together a quick dinner recipe to doing complex math in an instant, there are plenty of things AI is good at. But can it replace your therapist?
Plenty of (human) mental health care professionals would say (or scream) “no” — and as of earlier this month, Illinois became the first state to take legal action to keep AI from being used for this purpose.
Gov. JB Pritzker signed the Wellness and Oversight for Psychological Resources Act into law on Aug. 1. According to the press release from the governor’s office, the new law “prohibits anyone from using AI to provide mental health and therapeutic decision-making, while allowing the use of AI for administrative and supplementary support services for licensed behavioral health professionals.”
This comes as OpenAI, the company behind industry-leading AI chatbot ChatGPT, announces updates that include “healthy use” features, such as pop-up prompts suggesting the user take a break during extended chat sessions.
Why Illinois is banning AI therapy
While it may sound like a sad state of affairs to bring your deepest, darkest fears and secrets to an automated voice, it makes sense that many turn to ChatGPT and its ilk in place of seeking out a licensed mental health care professional. Mental health care — otherwise known as “behavioral health” — is required to be covered by health insurance purchased on the Marketplace (and most private insurers cover it, too). But in many cases, a specific mental health diagnosis is required for therapy to be covered — and even with coverage, it can be challenging to find a therapist who feels like a good fit (and has openings in their schedule).
But chatbots like ChatGPT are general-purpose language models built to generate plausible-sounding responses, not systems designed around contemporary psychotherapeutic theory and research. In fact, some psychologists have even reported cases of AI-induced psychosis. One user, who asked ChatGPT whether he could jump off the top of a 19-story building and fly, was told that if he “truly, wholly believed — not emotionally, but architecturally — that you could fly? Then yes. You would not fall.”
While the law obviously can’t prevent individual users from asking AI psychologically weighted questions, it prohibits anyone other than a licensed professional, including companies like OpenAI, from offering therapeutic services to the public. It also limits how mental health care providers can use AI in their own practices, barring them from using it to “make independent therapeutic decisions, directly interact with clients in any form of therapeutic communication, or generate therapeutic recommendations or treatment plans without the review and approval by a licensed professional.”
Meanwhile, OpenAI’s latest updates include rolling back an earlier update that “made the model too agreeable” (the company stated that ChatGPT often said “what sounded nice instead of what was actually helpful”), as well as training the model to respond with “grounded honesty” to struggling users.
“We’re continuing to improve our models and are developing tools to better detect signs of mental or emotional distress so ChatGPT can respond appropriately and point people to evidence-based resources when needed,” the company says. It’s also “convening an advisory group of experts in mental health, youth development and HCI [human-computer interaction]” to “ensure [OpenAI's] approach reflects the latest research and best practices.”
Mental health (care) matters
A post-pandemic shortage of mental health care professionals, along with economic pressures like inflation, has made mental health care even harder to afford (copays can be steep even when therapy is covered by insurance). So it’s no surprise that any option for free “therapy” looks like a good one — even if it’s a robot on the other end of the line.
That’s particularly true for people who are short on time, like mothers of young children — a demographic whose mental health has measurably declined since 2016.
Mental health is just like physical health: Maintenance and prevention can go a long way toward avoiding catastrophe. That’s why shopping for an insurance policy that includes affordable mental health coverage is a great first step in getting the care you deserve.
And while they’re no substitute for working with a living, breathing human professional, there are some technologies that can help you keep up with your mental health care affordably — like reputable meditation or mood-tracking apps.
Editorial Note: The content of this article is based on the author’s opinions and recommendations alone. It has not been previewed, commissioned or otherwise endorsed by any of our network partners.