The Maine Legislature is considering a bill that would regulate how artificial intelligence can be used in mental health settings.
The bill, LD 2082, would prohibit a licensed mental health provider’s use of AI in making independent therapeutic decisions, interacting with clients or generating therapeutic recommendations. It is sponsored by Rep. Amy Kuhn, D-Falmouth, and Sen. Teresa Pierce, D-Cumberland.
“These are the core tasks that depend upon the licensed professional’s training, experience, judgment, and ethical decision making,” Kuhn said in her testimony, “which should not be replaced by an AI tool.”
The bill would allow providers to use AI for “administrative support,” which includes tasks like managing appointments, processing billing and drafting logistical communications. With consent from a client, providers can also employ AI tools to maintain client records and analyze notes from therapy.
The bill was referred to the Legislature’s Health Coverage, Insurance and Financial Services Committee, which heard testimony at a public hearing Feb. 17. The committee will hold its first work session on the bill Feb. 25.
At the hearing, many residents expressed support for protecting the work of human clinicians. Some emphasized the dangers of relying on AI for therapy by citing multiple lawsuits in which families allege that a chatbot encouraged or assisted in their child’s self-harm.
Others said responsible use of AI can be beneficial to patients.
Sweetser, a Maine mental health services provider with centers in Lewiston, Sanford and Brunswick, has implemented AI software called Eleos that transcribes conversations with patients without recording or storing the data, Kristie Worster, chief program officer, said in an interview Monday.
She said it helps therapists accurately document sessions.
“Imagine a therapist not looking at you, in a session, and typing into a computer. It’s a horrible experience for people,” Worster said. “This really has allowed clients to just have the therapist focused on them and to not have even five to 10 minutes taken away from them in the session.”
Stacy Whitcomb-Smith, president of the Maine Psychological Association, spoke in opposition to the bill, arguing that it restricts AI use that can benefit patient assessments and diagnoses.
Psychologists, she added, are already subject to a code of ethics through the American Psychological Association, making “additional statutory restrictions on AI unnecessary.”
“Restrictive language that precludes psychologists from using AI could hinder innovation, exacerbate workforce shortages, and create unnecessary barriers to high-quality mental health services for Maine citizens,” Whitcomb-Smith said.
The bill imposes civil penalties of up to $10,000 per violation, with the amount determined by the degree of harm and the circumstances of the violation.
Lawmakers are also considering a bill that would regulate and prevent child access to AI chatbots with human-like features but would carve out an exemption for therapy chatbots. A hearing on that legislation is scheduled for Wednesday.