Psychotherapy training workshops improve skills considerably more if they are linked to subsequent ongoing supervision
Last updated on 12th October 2012
Rinad Beidas & colleagues from the University of Pennsylvania published a paper earlier this year entitled "Training and consultation to promote implementation of an empirically supported treatment: a randomized trial." The abstract reads "OBJECTIVE: The study evaluated the efficacy of three training modalities and the impact of ongoing consultation after training. Cognitive-behavioral therapy (CBT) for anxiety among youths, an empirically supported treatment, was used as the exemplar. Participants were randomly assigned to one of three one-day workshops to examine the efficacy of training modality: routine training (training as usual), computer training (computerized version of training as usual), and augmented training (training that emphasized active learning). After training, all participants received three months of ongoing consultation that included case consultation, didactics, and problem solving. METHODS: Participants were 115 community therapists (mean age of 35.9 years; 90% were women). Outcome measures included the Adherence and Skill Checklist, used to rate a performance-based role-play; a knowledge test; and the Training Satisfaction Rating Scale. RESULTS: All three training modalities resulted in limited gains in therapist adherence, skill, and knowledge. There was no significant effect of modality on adherence, skill, or knowledge from pretraining to posttraining. Participants were more satisfied with augmented and routine training than with computer training. Most important, number of consultation hours after training significantly predicted higher therapist adherence and skill at the three-month follow-up. CONCLUSIONS: The findings suggest that training alone did not result in therapist behavior change. The inclusion of ongoing consultation was critical to influencing therapist adherence and skill. Implications for implementation science and mental health services research are discussed."
Now this issue of what forms of psychotherapy training actually make a difference to therapist effectiveness cries out for better research. As someone whose main psychotherapy qualification is in the proudly-announced "evidence-based" discipline of cognitive behavioural therapy, I'm quietly ashamed of the lack of hard data supporting the helpfulness of the endless round of CBT conferences & workshops. And, as I've written before, this extends to health professional (and other) education more generally. So I commented: "It is of course a worthwhile empirical question - is this (providing lecture-based training workshops) the best way of helping participants become more effective at helping clients, or would outcomes improve more if BABCP workshops were more genuinely "workshops"? My understanding is that the jury is still at least partially out on this one. A major 2007 systematic review - "Effectiveness of continuing medical education" - ended with the usual "More research is needed" comment, although the authors did point out that "Live media was more effective than print. Multimedia was more effective than single media interventions. Multiple exposures were more effective than a single exposure." and that "Based on previous reviews, the evidence indicates that simulation methods in medical education are effective in the dissemination of psychomotor and procedural skills." More recent work continues to underline that lecture-based teaching is very much improvable - see, for example, the 2010 paper "Training therapists in evidence-based practice: A critical review of studies from a systems-contextual perspective" or last year's fascinating "Improved learning in a large-enrollment physics class" with its links to Anders Ericsson's work on the acquisition of expert performance through deliberate practice."
The recent Beidas et al paper is a welcome addition to this limited evidence pool. The authors wrote "Improving the efficacy of brief workshops (for example by encouraging active learning) is desirable but may not be sufficient to change therapist behavior. Ongoing consultation after training may be the critical element in increasing effectiveness of training. For example, one study demonstrated that coaching and performance feedback on cases posttraining changed therapist behavior. The mechanism by which consultation works is unknown, but consultation likely provides therapists with a venue for clarification, and practice of concepts, learning concepts and practicing over time, case consultation, and using problem solving to overcome implementation barriers." Consultation involved the following: "Participants ... were provided weekly consultation via the WebEx virtual conferencing platform for three months after training. Participants could call in via telephone or computer to attend the one-hour weekly virtual group meeting. Those who opted to consult via computer were able to view a whiteboard and the consultant via Web camera. The consultation curriculum was designed with participant input and included case consultation, didactic topics (such as treating a client with comorbid depression), practice with concepts (such as relaxation), and assistance in implementation of the treatment within context (in a psychiatric clinic or school, for example)."
Ideally the research would have looked at client results as the key outcome ... as was done in the fascinating earlier study "Dissemination of cognitive therapy for panic disorder in primary care". What was measured here instead was change in therapist adherence to an evidence-based treatment protocol and in CBT skills. They reported "With regard to clinical significance, after a six-hour workshop, approximately 38% of participants were trained to criterion in adherence and 65% were trained to criterion in skill. Ongoing consultation of approximately seven hours over three months brought those numbers up further, to 61% and 85%, respectively. This finding is particularly noteworthy given that other studies have demonstrated only minimal to moderate gains in adherence and skill after training and often a decline at follow-up when training was not followed by consultation." They go on to make the following, fairly game-changing recommendations: "Although a one-day workshop can be effective in changing provider knowledge, it is not effective in changing provider behavior. We recommend that training no longer be used as a stand-alone implementation strategy. However, with the combined implementation strategies of brief training and ongoing consultation, community clinicians can be effectively trained in clinical innovations that are complex and multistep. Brief training may be best provided through computer delivery given that it is a permanent product that an agency can purchase that is effective, accessible, and cost efficient and can be used in situations where there is high therapist turnover. Most important, consultation merits as much emphasis as brief training, and we suggest that whatever limited resources are available be used to provide consultation." Interesting and challenging suggestions. They ignore other benefits of training workshops & conferences, such as the value of meeting with one's colleagues ... but nevertheless this research is of real importance.
It calls out for replication and extension ... and for evidence-based therapists & organizations to sit up and take notice.