Perspective

Closing the Research-to-Practice Gap in Digital Psychiatry: The Need to Integrate Implementation Science

Jessica Lipschitz, PhDa,b,*; Timothy P. Hogan, PhDc,d; Mark S. Bauer, MDb,e; and David C. Mohr, PhDf

Digital mental health interventions, which consist of web and mobile applications intended to monitor and treat mental illness, have been met with tremendous enthusiasm over the past decade, and rightfully so. Around the world, mental health treatment is in crisis: we cannot accommodate the majority of those in need with our current service infrastructure. Digital mental health interventions offer a potential solution. They can be widely disseminated at virtually no marginal cost; they promote patient autonomy; they offer convenience, requiring neither transportation nor daytime appointments; and they can be highly responsive, accessible when patients most need support. As seen in the recent meta-analysis by Wells et al1 published in The Primary Care Companion for CNS Disorders, as well as the review article by Apaydin et al2 and meta-analysis by Wright et al3 published in The Journal of Clinical Psychiatry, the efficacy of digital mental health interventions for common psychiatric concerns such as depression is well supported by many randomized controlled trials.

Why, then, are digital mental health interventions not part of routine psychiatric treatment? One possible answer is touched on in the three articles mentioned above: the literature base establishing the efficacy of these interventions has emerged only over the last 5 years, and patient engagement in the context of routine care has not yet been adequately addressed. Put another way, innovation and efficacy alone do not result in adoption in real-world clinical settings. A study conducted by Gilbody and colleagues4 illustrates this point. Primary care patients were randomized to receive evidence-based computerized cognitive-behavioral therapy (cCBT) for depression or treatment as usual, and the impact on depressive symptoms was evaluated. The authors concluded that “while cCBT has been shown to be efficacious in developer led trials, it was not effective in usual NHS [National Health Service] care settings. The main reason for this was low adherence and engagement with treatment, rather than lack of efficacy.”4(p11) Fewer than 20% of participants assigned to cCBT completed the full course of treatment. Importantly, subsequent studies of digital interventions deployed widely in intended-use settings have found similarly poor levels of engagement.5,6 Perhaps the most important insight from these findings is that deploying digital mental health interventions outside of research settings will require careful consideration and systematic evaluation not just of clinical outcomes, but also of uptake.

The field of implementation science, defined as the study of methods to promote the systematic uptake of research findings and evidence-based practices into routine care, provides a guide for this line of inquiry.7 It is a corrective to the false assumption that if a digital therapeutic works, stakeholders will adopt it. Efficacy and effectiveness studies ask whether digital mental health interventions can produce worthwhile clinical outcomes when recruitment and delivery are supported by a research team.8 In contrast, implementation studies use increasingly well-defined methods and frameworks to ask how to integrate these interventions into health care systems. For example, working within the integrated Promoting Action on Research Implementation in Health Services (i-PARIHS) framework,9 we may examine how the recipients (eg, attitudes and workflows of clinicians, patients, and medical staff), context (eg, clinic culture, sociopolitical climate, reimbursement structures), characteristics of the innovation (ie, the digital mental health intervention itself), and facilitation (ie, clinic roles and processes established to support implementation) affect uptake.
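Purely to make these four constructs concrete, a deployment team might record its pre-implementation assessment as a structured object. The sketch below is our illustration, not part of the framework: i-PARIHS prescribes constructs, not a data schema, and all field names and example values are hypothetical.

```python
# Hypothetical record of the four i-PARIHS constructs for a planned
# deployment. i-PARIHS itself defines constructs, not a data schema;
# all field names and example values here are illustrative.
from dataclasses import dataclass, field
from typing import List


@dataclass
class IParihsAssessment:
    innovation: str                                        # the digital intervention
    recipients: List[str] = field(default_factory=list)    # clinician/patient/staff factors
    context: List[str] = field(default_factory=list)       # clinic, system, policy factors
    facilitation: List[str] = field(default_factory=list)  # roles and processes for support


assessment = IParihsAssessment(
    innovation="cCBT program for depression",
    recipients=["clinicians unsure how to fit app review into visits",
                "patients with limited smartphone access"],
    context=["no reimbursement pathway for between-visit monitoring"],
    facilitation=["0.5 FTE facilitator to train staff and track enrollment"],
)
```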

Studies like that of Gilbody and colleagues4 suggest that, as a community, we must work toward building core knowledge about what facilitates uptake of digital mental health interventions. Some features of successful implementation in health care—engaging stakeholders (especially leadership),10,11 adapting the new practice to the clinical context and patient population,12,13 and using multicomponent (as opposed to single-component) implementation strategies to support change at multiple levels14,15—are likely to be important for digital mental health interventions.

However, digital mental health interventions also present some unique challenges. Traditionally, implementation research has focused on integrating procedures, tests, and clinical practice guidelines into the clinic. Uptake of digital mental health interventions, however, relies heavily on patient behavior outside of the clinic. To the extent that these interventions are designed to be used by patients in their daily lives, studying and encouraging uptake calls for expanding current implementation frameworks to weigh patient characteristics, needs, and behaviors more heavily. Here we propose some methods derived from implementation science that could be integrated into clinical care and research to build our understanding of how to improve uptake.

First, to enhance our understanding of how to effectively implement digital mental health interventions, we need to adopt a process-oriented perspective. Although descriptions of patient demographics (recipients), clinic setting (context), and the intervention (innovation) are routinely reported in clinical trials, key details about the process of implementation often go undocumented. Process is multifaceted. It involves

  • how the intervention is introduced, such as what is said to patients by clinic staff, what is written in patient handouts, and where patients’ questions are directed
  • the specifics of adoption and use patterns, such as how patients enroll, how the intervention impacts clinicians’ workflow, and when and how patients use it
  • any follow-up procedures, such as whether clinicians or research staff follow up with patients and whether patients receive automated or personalized messages if they disengage
  • a detailed account of any financial or personnel resources provided by research or program evaluation teams.

Monitoring and reporting these types of details requires relatively little added effort but can provide substantial and sometimes generalizable information. One low-burden option, sketched below, is to capture them as structured records from the outset.
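As a purely illustrative sketch (ours, not a published schema; all field and function names are hypothetical), such records might look like the following in Python:

```python
# Hypothetical sketch of structured records for the implementation process
# details listed above. Field names are illustrative, not a validated schema.
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional


@dataclass
class ImplementationProcessRecord:
    """One patient's path through a digital mental health intervention."""

    patient_id: str
    # How the intervention was introduced
    introduced_by: str                       # eg, "psychiatrist", "front-desk staff"
    materials_given: List[str] = field(default_factory=list)  # eg, ["handout v2"]
    questions_directed_to: Optional[str] = None               # eg, "clinic nurse line"
    # Adoption and use patterns
    enrollment_date: Optional[date] = None
    sessions_completed: int = 0
    last_use_date: Optional[date] = None
    # Follow-up
    staff_followup_contacts: int = 0         # clinician or research staff check-ins
    automated_messages_sent: int = 0         # reengagement prompts after disengagement
    # Resources supplied by the research or evaluation team
    support_staff_hours: float = 0.0


def uptake_summary(records: List[ImplementationProcessRecord]) -> dict:
    """Aggregate simple uptake indicators across a clinic's patients."""
    n = len(records)
    enrolled = sum(1 for r in records if r.enrollment_date is not None)
    used = sum(1 for r in records if r.sessions_completed > 0)
    return {
        "patients": n,
        "enrollment_rate": enrolled / n if n else 0.0,
        "any_use_rate": used / n if n else 0.0,
    }
```

Even a flat spreadsheet with these columns would serve; the point is that process details are recorded prospectively rather than reconstructed after the fact.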

Second, we need to systematically collect qualitative data to deepen our understanding of how contextual factors, process dynamics, and other nuances impact intervention adoption and use. Conducting semistructured interviews with patients (or subsets of patients), for example, will shed light on the reasons underlying low engagement. Similarly, interviewing clinicians and clinic administrators will generate insights on ways organizations can support improved adoption and sustained engagement. While such interviewing requires significant effort, the resulting data will be invaluable.

Third, we need to apply implementation strategies intentionally.16 To date, in digital mental health, we have limited knowledge of which techniques might improve adoption and use. Many have been suggested, from facilitation to contingency management (ie, paying participants and/or providers for engaging). Specifying the implementation strategies used in clinical or research settings will be important, as will comparing alternative strategies. We may, for example, evaluate the impact of employing a staff member with 50% dedicated time to follow up with patients and providers who are using the digital intervention and examine how such follow-up affects use patterns (an example of facilitation); a simple version of such a comparison is sketched below.
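To illustrate (this is our sketch, not a design from any of the cited trials, and the counts are invented), comparing course completion between a facilitation arm and standard deployment could be as simple as a two-proportion test on uptake itself:

```python
# Hypothetical comparison of completion rates under two implementation
# strategies: facilitation (dedicated staff follow-up) vs standard
# deployment. The counts below are invented for illustration.
from math import erf, sqrt


def two_proportion_ztest(success_a: int, n_a: int,
                         success_b: int, n_b: int) -> tuple:
    """Two-sided z-test for a difference in two proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
    return z, p_value


# Invented example: 62 of 150 patients completed cCBT with facilitation
# vs 28 of 150 under standard deployment.
z, p = two_proportion_ztest(62, 150, 28, 150)
print(f"z = {z:.2f}, p = {p:.4f}")  # completion: 41% vs 19%
```

In practice, a cluster-randomized or stepped-wedge design at the clinic level would usually be more appropriate than patient-level randomization, but the analytic idea is the same: treat uptake itself as the outcome.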

Digital mental health has reached a critical juncture. We must ask ourselves whether our almost exclusive focus on intervention effectiveness research will advance (and ultimately improve) mental health treatment for patients and public health more broadly. Extending our focus to include more implementation research will require asking different questions, modifying study designs, and striking a more equal balance between clinical and implementation outcomes. It will require us to report the minutiae of how interventions are introduced, leverage qualitative data, specify the implementation strategies used, and explore ways to test them rigorously. Only then will we be able to make progress toward the real-world potential that many of us believe exists for digital interventions to transform the treatment of mental illness.

Published online: May 14, 2019.

Potential conflicts of interest: Dr Lipschitz has received research support from Actualize Therapy LLC and has served as a consultant for Pear Therapeutics, Inc. Dr Mohr has received consulting fees from Apple Inc and has ownership interest in Actualize Therapy LLC. Drs Hogan and Bauer have no potential conflicts of interest relevant to the subject of this article.

Funding/support: None.

REFERENCES

1. Wells MJ, Owen JJ, McCray LW, et al. Computer-assisted cognitive behavior therapy for depression in primary care: a systematic review and meta-analysis. Prim Care Companion CNS Disord. 2018;20(2):17r02196.

2. Apaydin EA, Maher AR, Raaen L, et al. The use of technology in the clinical care of depression: an evidence map. J Clin Psychiatry. 2018;79(5):18r12118.

3. Wright JH, Owen JJ, Richards D, et al. Computer-assisted cognitive-behavior therapy for depression: a systematic review and meta-analysis. J Clin Psychiatry. 2019;80(2):18r12188.

4. Gilbody S, Littlewood E, Hewitt C, et al; REEACT Team. Computerised cognitive behaviour therapy (cCBT) as treatment for depression in primary care (REEACT trial): large scale pragmatic randomised controlled trial. BMJ. 2015;351:h5627.

5. Anguera JA, Jordan JT, Castaneda D, et al. Conducting a fully mobile and randomised clinical trial for depression: access, engagement and expense. BMJ Innov. 2016;2(1):14-21.

6. Gilbody S, Brabyn S, Lovell K, et al; REEACT collaborative. Telephone-supported computerised cognitive-behavioural therapy: REEACT-2 large-scale pragmatic randomised controlled trial. Br J Psychiatry. 2017;210(5):362-367.

7. Eccles MP, Mittman BS. Welcome to implementation science. Implement Sci. 2006;1(1):1.

8. Bauer MS, Damschroder L, Hagedorn H, et al. An introduction to implementation science for the non-specialist. BMC Psychol. 2015;3(1):32.

9. Harvey G, Kitson A. PARIHS revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice. Implement Sci. 2016;11(1):33.

10. Lukas CV, Holmes SK, Cohen AB, et al. Transformational change in health care systems: an organizational model. Health Care Manage Rev. 2007;32(4):309-320.

11. Leeman J, Baernholdt M, Sandelowski M. Developing a theory-based taxonomy of methods for implementing change in practice. J Adv Nurs. 2007;58(2):191-200.

12. Carvalho ML, Honeycutt S, Escoffery C, et al. Balancing fidelity and adaptation: implementing evidence-based chronic disease prevention programs. J Public Health Manag Pract. 2013;19(4):348-356.

13. Cohen DJ, Crabtree BF, Etz RS, et al. Fidelity versus flexibility: translating evidence-based research into practice. Am J Prev Med. 2008;35(suppl):S381-S389.

14. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8(1):139.

15. Forsner T, Wistedt AA, Brommels M, et al. Supported local implementation of clinical guidelines in psychiatry: a two-year follow-up. Implement Sci. 2010;5(1):4.

16. Powell BJ, Waltz TJ, Chinman MJ, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10(1):21.

aDepartment of Psychiatry, Brigham and Women’s Hospital, Boston, Massachusetts

bDepartment of Psychiatry, Harvard Medical School, Boston, Massachusetts

cCenter for Healthcare Organization and Implementation Research, Edith Nourse Rogers Memorial Veterans Hospital, Bedford, Massachusetts

dDivision of Health Informatics and Implementation Science, Population and Quantitative Health Sciences, University of Massachusetts Medical School, Worcester, Massachusetts

eCenter for Healthcare Organization and Implementation Research, Veterans Affairs Boston Healthcare System, Boston, Massachusetts

fCenter for Behavioral Intervention Technologies, Northwestern University Feinberg School of Medicine, Chicago, Illinois

*Corresponding author: Jessica M. Lipschitz, PhD, Department of Psychiatry, Brigham and Women’s Hospital, 221 Longwood Ave, Boston, MA 02115 ([email protected]).

J Clin Psychiatry 2019;80(3):18com12659

To cite: Lipschitz J, Hogan TP, Bauer MS, et al. Closing the research-to-practice gap in digital psychiatry: the need to integrate implementation science. J Clin Psychiatry. 2019;80(3):18com12659.

To share: https://doi.org/10.4088/JCP.18com12659

© Copyright 2019 Physicians Postgraduate Press, Inc.
