Articles

The Chatbot Will See You Now: Protecting Mental Health Confidentiality In Software Applications

Stiefel, Scott

Chatbots are a form of artificial intelligence that can read text, or translate voice to text, and provide a response. While they have existed since the 1960s, they have recently been used to provide a form of therapy through software applications (“apps”). However, unlike licensed professionals who provide traditional mental health services, chatbots are not subject to confidentiality obligations. Currently, federal and state regulations that impose confidentiality obligations in the healthcare context do not apply to chatbots, and the regulations that do apply to chatbots do not impose such obligations. Because users engaging with these apps disclose information similar to what they would disclose during a therapy session, this Article proposes a new regulatory framework, through new legislation, to limit the use and disclosure of information received by software-based therapy technologies.

Also Published In

Title: Science and Technology Law Review
DOI: https://doi.org/10.7916/stlr.v20i2.4774

More About This Work

Published Here: August 19, 2022