Tech Products Add Value to Field, but Also Raise Questions
We should always ask why and how we intend to use a new product. These questions sometimes reveal problems or concerns: asking “why” can lead to rejecting a product for certain purposes, while asking “how” can expose the difficulty of fitting it into the process of care. Products that do not fit well into clinical workflows have diminished value.
A product’s value is derived largely from solving problems. Many behavioral healthcare products aim to improve clinical work. Technical capabilities often facilitate those solutions. Because our field rests on personal conversations and clinical judgment, computers might seem of little use. Yet computers are shaping important new products, while at the same time stirring some polarizing questions.
Let us highlight these issues with three types of products that rely on computer innovation: measurement systems, digital therapeutics, and artificial intelligence.
Measurement-Based Care
Measurement-based care (MBC) can aid key clinical decisions, especially when a comprehensive system is integrated into a company’s workflow. Ideally, this starts with automated delivery of self-report questionnaires to clients before visits, built-in scoring of the instruments, and clinical alerts as needed.
Some providers use measurement tools without really implementing MBC. Clients may complete the questionnaires, but the results are not analyzed for use during treatment. Moreover, some systems lack the most useful alerts: those generated by comparing a client’s scores to norms for expected change. Such comparisons are feasible, but many systems do not include the statistical calculations. Other missing metrics relate to aggregated clinical change.
We certainly want to know the aggregated change for a given population of patients. It is also possible to establish the average change achieved by each clinician. Do we want to know this? Should we take action to reward high-performing clinicians and counsel low performers? Is this a solution we want?
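To make these mechanics concrete, here is a minimal sketch, in Python, of the two calculations discussed above: flagging a client whose improvement lags an expected-change norm, and aggregating average change by clinician. The norm, margin, and field names are illustrative assumptions, not drawn from any particular MBC product.

from statistics import mean

# Illustrative expected-change norm: average improvement (score-point drop)
# per completed session for clients with a similar intake severity.
EXPECTED_DROP_PER_SESSION = 1.5
ALERT_MARGIN = 2.0  # how far below the norm before an alert fires

def change_alert(intake_score, latest_score, sessions_completed):
    """Return True if the client's improvement lags the expected-change norm."""
    observed_change = intake_score - latest_score
    expected_change = EXPECTED_DROP_PER_SESSION * sessions_completed
    return observed_change < expected_change - ALERT_MARGIN

def average_change_by_clinician(records):
    """Aggregate average pre-post change for each clinician.

    records: list of dicts with 'clinician', 'intake_score', 'latest_score'.
    """
    changes = {}
    for r in records:
        changes.setdefault(r["clinician"], []).append(
            r["intake_score"] - r["latest_score"]
        )
    return {clinician: mean(vals) for clinician, vals in changes.items()}

# Example: a client who started at 18 and now scores 16 after 4 sessions
# triggers an alert, because the norm predicts roughly a 6-point drop.
print(change_alert(18, 16, 4))  # True

In a real system, the expected-change norm would come from benchmark data stratified by factors such as diagnosis and intake severity rather than a single constant, but the alert logic remains a simple comparison of observed against expected change.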
Digital Therapeutics
The past decade has seen the growth of digital products using techniques from cognitive behavioral therapy (CBT), mindfulness, and other clinical models. We should be debating when and why these digital therapeutic tools are used. They were initially marketed as supplements to existing clinical services, able to help manage program waiting lists, help clients practice new skills, and support skill maintenance after discharge.
How might these products address growing demands for more affordable and accessible care? The pandemic compounded our longstanding access problems. Do we want digital therapeutics to become first-line care? Research has validated the clinical efficacy of several digital products, but first-line care would mean digital services replace in-person care, with a therapy referral made only if the digital intervention fails.
Some businesses would endorse this approach, but many individuals would object. Digital companies would see increased profits and health plans could avoid therapy costs. However, digital tools will not help every problem, and some people will dislike using them. How far should we go promoting these tools? Are they a good solution for care access? Should they be ancillary to therapy or a substitute?
Artificial Intelligence
New technologies can exist for years before entrepreneurs design valuable products with them. Artificial intelligence (AI) is now ending such a search for strong “use cases.” We find “conversational AI” in customer service settings, where chatbots or virtual agents interact with us about our needs and take requests. This technology can catalogue the emotions and themes in speech for use by these virtual agents.
The behavioral healthcare field is seeing new products based on AI for solving clinical and administrative tasks. For example, therapy sessions are being analyzed using natural language processing, a branch of AI that gives computers the ability to interpret text and spoken words much as people do. Should we establish parameters for acceptable use cases now?
This technology seems promising for training clinicians in basic therapy skills and for generating notes on therapy sessions to reduce administrative time. Yet this may only scratch the surface of what the technology can do. How far do we want to take it? What business problems do we want to solve? Do we want the therapy experience distilled into a chatbot interaction?
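As a rough illustration of the note-drafting use case, the following Python sketch runs an off-the-shelf summarization model from the open-source Hugging Face transformers library over an invented snippet of session dialogue. It is a generic stand-in for what commercial products do, not a description of any vendor’s method, and any real deployment would require clinician review, privacy safeguards, and models tuned for clinical language.

# Minimal sketch: drafting a session summary with an off-the-shelf model.
# Requires the "transformers" package; the default model downloads on first use.
from transformers import pipeline

summarizer = pipeline("summarization")

# Invented transcript excerpt, for illustration only.
transcript = (
    "Client reported improved sleep this week and practiced the breathing "
    "exercise twice daily. Client described ongoing conflict with a coworker "
    "and rated anxiety as 5 out of 10, down from 7 last session. We reviewed "
    "cognitive restructuring and agreed on a thought record for homework."
)

draft_note = summarizer(transcript, max_length=60, min_length=20, do_sample=False)
print(draft_note[0]["summary_text"])  # a draft for the clinician to review and edit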
Why Create This?
Questioning why is not new in science. Answers are ominous in some areas (e.g., nuclear weapons, human cloning). Our field now faces once-implausible questions. Should next-generation chatbots be created for virtual therapy sessions? Should computers analyze therapy recordings to rate adherence to clinical guidelines? Should we incentivize therapists based on guideline adherence or clinical outcomes?
Many of us might react negatively to these ideas, but we should all become well educated on the nature and limits of new technology. Polarization will remain regardless. Our careers and businesses will surely be impacted by new technology. We should actively debate why we might use a potential new product. Once use is widely approved, we should then ensure that every implementation maximizes its value.
Ed Jones, PhD is currently with ERJ Consulting, LLC and previously served as president at ValueOptions and chief clinical officer at PacifiCare Behavioral Health.
The views expressed in Perspectives are solely those of the author and do not necessarily reflect the views of Behavioral Healthcare Executive, the Psychiatry & Behavioral Health Learning Network, or other Network authors. Perspectives entries are not medical advice.