
Using Apps in Patient Care

Abstract

Mental health apps are another tool that psychiatrists can use to help patients, but caution is needed to avoid patient harm and risk management issues.

Photo: Anne Huben-Kearney, R.N., B.S.N., M.P.A.

Apps designed to run on a mobile device, such as a smartphone, are available for just about everything these days, from shopping and weather to entertainment and travel. Health care is seeing a huge increase in the variety of apps to help manage patient care and provide physicians with valuable information. Use of apps for mental health care is no exception.

There are thousands of apps targeting mental health conditions, including apps to reduce anxiety, improve sleep, improve focus and attention, track moods, assess an individual's well-being, and screen for symptoms of depression. Some apps even screen for suicidality by analyzing text message metadata as well as the content of conversations.

But which app is best for your patient? How do you evaluate an app before recommending it? What if the depression screening signals suicide risk? Does the app respond to suicidal ideation by advising the patient to call his or her psychiatrist or by recommending local mental health resources? Does it respond to a suicide attempt by advising the patient to call an ambulance? Is an alert sent to the psychiatrist based on the level of suicide risk?

Many mental health apps struggle with a challenge that psychiatrists know well: assessing the likelihood that patients will try to harm themselves. An algorithm that decides a given suicide-risk score requires no response carries risk, but responding too aggressively also carries risk, as it may make the patient hesitate to seek help in the future.
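The thresholding tradeoff described above can be sketched as a toy tiered-response function. This is purely illustrative; the function name, score scale, thresholds, and response tiers are assumptions for the sake of the example, not any real app's algorithm:

```python
# Hypothetical sketch of tiered-response logic a mental health app
# might apply to a suicide-risk score. All thresholds and response
# tiers here are illustrative assumptions.

def triage_response(risk_score: float) -> str:
    """Map a 0-1 suicide-risk score to an app response tier.

    Setting the escalation thresholds too high means genuine risk
    receives no response; setting them too low triggers aggressive
    escalation that may discourage patients from seeking help later.
    """
    if risk_score >= 0.8:
        return "alert psychiatrist and advise contacting emergency services"
    if risk_score >= 0.4:
        return "advise patient to contact psychiatrist"
    return "no response"
```

Wherever the thresholds are set, some patients will fall on the wrong side of them, which is why a response plan agreed on with the patient in advance (see the recommendations below) matters more than the specific cutoffs.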

Mental health apps are classified as wellness rather than medical apps, which means they are not regulated by the Food and Drug Administration and do not need to comply with HIPAA. One consequence is that mental health apps may not be secure, so information can be accessed and improperly disclosed. Based on an IP address or digital identifier, a data aggregator can pull information from the multiple apps a person uses and create a user profile that includes age, gender, geographic location, interests, and even income. App developers can sell this information to other companies looking to appeal to a similar clientele. One risk is that someone with an eating disorder could receive ads for stimulants or laxatives, or a person with a gambling disorder could be targeted with casino ads. In addition, many apps may sell patient-collected information for products that have nothing to do with health care, without making this obvious in their terms of service.

These are some risk management recommendations to consider:

  • Determine whether the mental health app is appropriate for the specific patient condition and supported by evidence-based research. This reduces the risk that false or misleading information or ineffective therapeutic interventions are offered to patients.

  • Check for HIPAA compliance, especially privacy and security.

  • Evaluate concerns with therapeutic boundaries and billable time (for example, for data collection and interpretation of the information collected).

  • Evaluate whether the app is customizable to the specific patient and is easy to use for both the patient and the psychiatrist.

  • Decide how to integrate the apps into the clinical sessions, ideally for enhanced clinical communication.

  • Develop a response plan in advance with the patient in case the patient expresses or scores high for suicidal ideation. The response plan should include a clinical assessment and evaluation by the psychiatrist.

  • Remind patients that mental health apps supplement, but do not replace, live support from their psychiatrist.

Many mental health apps provide information, help with daily life (especially with distress tolerance and physical activity), and support the care and treatment provided by the psychiatrist. Mental health apps can be useful when tailored to the psychosocial needs of mental health patients. Psychiatrists need to help their patients adopt a responsible and balanced use of technology, avoiding social disengagement while making the most of the tools (apps) available to promote patient engagement and support. ■

APA’s App Evaluation Form assists psychiatrists in evaluating mental health apps for their patients.

This information is provided as a risk management resource for Allied World policyholders and should not be construed as legal or clinical advice. This material may not be reproduced or distributed without the express, written permission of Allied World Assurance Company Holdings, Ltd, a Fairfax company (“Allied World”). Risk management services are provided by or arranged through AWAC Services Company, a member company of Allied World. © 2019 Allied World Assurance Company Holdings, Ltd. All Rights Reserved.

Anne Huben-Kearney, R.N., B.S.N., M.P.A., is the assistant vice president of the Risk Management Group, AWAC Services Company, a member company of Allied World.