The survivability of dialectical behaviour therapy programmes: a mixed methods analysis of barriers and facilitators to implementation within UK healthcare settings

Abstract

Background

Dialectical Behaviour Therapy (DBT) is an evidence-based intervention that has been included in the National Institute for Health and Care Excellence guidelines as a recommended treatment for Borderline Personality Disorder in the UK. However, implementing and sustaining evidence-based treatments in routine practice can be difficult to achieve. This study compared the survival of early and late adopters of DBT, as well as of teams trained via different training modes (on-site versus off-site), and explored factors that aided or hindered the implementation of DBT in routine healthcare settings.

Methods

A mixed-method approach was used. Kaplan-Meier survival analyses were conducted to quantify and compare survivability as a measure of sustainability between early and late implementers and those trained on- and off-site. An online questionnaire based on the Consolidated Framework for Implementation Research was used to explore barriers and facilitators in implementation. A quantitative content analysis of survey responses was carried out.

Results

Early implementers were significantly less likely to survive than late implementers, although the effect size was small. DBT teams trained off-site were significantly more likely to survive, and the effect size for this difference was large. An unequal amount of censored data between groups in both analyses means that these findings should be considered tentative. Practitioner turnover and financing were the most frequently cited barriers to implementation. Individual characteristics of practitioners and the quality of the evidence base were the most commonly reported facilitators of implementation.

Conclusions

A number of common barriers and facilitators to successful implementation were found among DBT programmes. The location of DBT training may influence programme survival.

Background

Dialectical Behaviour Therapy (DBT) [1] is a comprehensive cognitive-behavioural treatment originally developed for adult women who meet criteria for Borderline Personality Disorder (BPD), particularly those who engage in suicidal or non-suicidal self-injury. Traditionally, this client group has been perceived as “treatment resistant” and considered unsuitable candidates for psychotherapeutic intervention [2]. DBT was the first psychological therapy to challenge the culture of therapeutic rejection for individuals with BPD and has become one of the best evidenced treatments for this client group.

Numerous DBT efficacy trials [3,4,5,6,7,8,9,10,11] have demonstrated reductions in suicide attempts, intentional self-injury, anger, depression, and hopelessness, and improvements in global functioning [12]. Recent meta-analyses have found moderate to large effect sizes indicating a beneficial effect of DBT when compared to treatment as usual on outcomes such as anger, parasuicidality, and mental health [13, 14]. Furthermore, several randomised controlled trials (RCTs) have examined the application of DBT with other client groups such as older adults with major depressive disorder, eating disorders, and forensic populations [15,16,17,18,19]. Thus, the data on DBT clearly indicate its efficacy for the treatment of BPD, and the approach holds promise for a host of other disorders.

In 2009, DBT was included in the National Institute for Health and Care Excellence (NICE) guidelines as a recommended treatment for females with a diagnosis of BPD and a history of repetitive self-harm [20]. Since then, a number of healthcare providers within the United Kingdom (UK) have included the provision of DBT as a quality improvement indicator in an effort to meet national targets in health outcomes for individuals with serious mental illness [21]. Preliminary economic research also suggests that DBT has the potential to be a cost-effective treatment for individuals presenting with parasuicidal behaviour [22, 23]. Indeed, the potential benefits of DBT appear to be gaining traction within routine healthcare settings.

Notwithstanding NICE recommendations, demonstrable treatment efficacy, and potential cost efficiencies, concerns have been raised about the sustainability of DBT programmes within the UK National Health Service (NHS) [24]. Diffusion of Innovations Theory [25] suggests that innovations must be widely adopted in order to self-sustain. Widespread adoption of a new practice depends initially on innovators and early adopters and how quickly the subsequent late majority can be persuaded to shift. Furthermore, it is proposed that ideas not sustained by early adopters are unlikely to spread elsewhere [26]. Thus, effective implementation is relevant not only to long-term sustainability but also subsequent spread of an innovation.

Other factors that can impact sustainability are those directly related to the innovation itself, such as the ease with which it can be implemented and how well treatment effects observed in efficacy trials will generalise to routine healthcare settings. The DBT model entails a comprehensive programme that structures the treatment environment across different modalities to enhance clients’ capabilities (skills training groups), improve their motivation (individual therapy), aid generalisation of new skills (telephone skills coaching), and supervise DBT therapists (a consultation team model) [27]. All of the treatment modalities are informed by a coherent theoretical model with associated therapeutic strategies based on cognitive behavioural principles and mindfulness [1, 28]. The programme is delivered by a team of mental health professionals, all trained in the DBT model; the rationale for a team-based approach is to alleviate the stress and anxiety of working with a high-risk client group in which change is often slow [27]. Nevertheless, the requirement of a specialist trained team usually involves significant reorganisation of existing services and an ongoing commitment to delivering an intensive specialist intervention. This is likely to have an impact on how well DBT is implemented or, indeed, whether it is even considered viable for adoption within a service.

Deciding to implement a new practice is not a discrete event but a set of interactive dynamic processes. The difficulties of translating evidence-based research into real-world settings are widely acknowledged [29], which has led to a growing body of literature examining the various factors involved in the implementation and sustainability of evidence-based practices (EBPs) [30,31,32]. Historically, more attention has been paid to the efficacy of interventions. Whilst such information might help a consumer or agency to select a particular type of intervention, evidence of efficacy alone does not lead to more successful implementation [29], in the same way that simply training practitioners in a new approach does not sufficiently ensure behaviour change [33]. Thus, transfer of innovation needs to be considered within organisational and wider system contexts to ensure that desired change is disseminated, implemented and sustained [34]. However, because organisational restructuring requires changes in service provider behaviour and transformation of systems, translating an EBP into routine practice remains an unquestionably complex and often daunting task.

A number of conceptual frameworks have been developed to aid the process of implementation [29, 31, 35,36,37]. Whilst these frameworks differ somewhat in areas of emphasis and terminology, influences on implementation generally relate to the context (outer and inner), the innovation itself (fit, training, efficacy), implementation processes (planning, selection, evaluation), individual characteristics (motivation, skill), and sustainability factors (fidelity monitoring, penetration, outcomes, etc.). These components are considered to be interrelated, and a change in one may result in change to others. Therefore, due to the dynamic nature of healthcare systems and their external contexts, a given programme or practice may require more or less of each component at any one time in order to be successfully implemented. This represents a challenge for the implementation and sustainability of innovations, as the relative contribution of each component to overall outcome can change, resulting in the need for ongoing monitoring of processes. Such tasks can be greatly supported by the application of a guiding theoretical framework. Only recently have distinct models for the sustainability of evidence-based programmes been produced [38, 39]; however, most of the elements of these models (Inner and Outer Contextual Factors, Characteristics of the Interventionists and of the Intervention) are already incorporated in conceptual frameworks of implementation [32, 36].

Considering the above, implementing a comprehensive DBT programme in routine healthcare settings is unlikely to be a straightforward endeavour. Preliminary research into the sustainability of UK DBT programmes that underwent an intensive training programme between 1995 and 2007 confirmed that some teams had difficulty surviving [27]. Highest failure rates were found shortly after training ended (i.e. the second year of the programme) and again in the fifth year. Participants identified a number of challenges associated with implementing DBT in their service, which were generally characterised by an absence of organisational support. Conversely, for teams that had implemented successfully and managed to sustain, the presence of organisational support was identified as a facilitating factor.

In an effort to increase organisational support and promote effective implementation strategies, British Isles DBT (biDBT) have begun to offer an alternative training modality. Typically, training involves teams of practitioners participating in two five-day DBT intensive training events delivered off-site, known as the ‘open-enrolment route’. The two training events are separated by eight months, during which teams commence the process of setting up and starting a DBT programme. With the new mode, the content and structure of the training are the same; however, organisations wishing to deliver DBT programmes are encouraged to host the intensive training on-site. This requires a greater financial investment and consideration of how to adapt staff roles in order to successfully deliver treatment, the idea being that greater organisational investment will have a positive influence on the implementation process. This change in training delivery warrants investigation to examine whether it improves the implementation of programmes.

The aims of the present study are threefold: 1) to investigate whether early and late adopters of DBT have differential sustainability, 2) to investigate whether change in training method delivery impacts the sustainability of DBT programmes, and 3) to examine factors that act as a barrier or facilitator to implementation by using a theoretical implementation framework to guide assessment.

Method

Participants

All biDBT programmes that underwent Intensive Training™ between January 1995 and February 2016 were eligible for this study. During this period, whether at on-site or off-site trainings, both the structure and content of the DBT Intensive Training™ remained constant, with only minor modifications to the order of topics taught. All trainings were delivered by two or three members of a six-person team, all of whom had been trained to a consistent standard and were adherent DBT therapists. For the sustainability analyses, the unit of analysis was the DBT team. For the survey arm of the study, only one team member from each DBT programme was invited to participate. In the first instance, all DBT team leaders were invited to participate. If a team leader was unavailable, another current member of an active team, or any former member of an inactive team, was invited to participate.

Design & Procedure

A concurrent mixed-methods approach was employed [40]. Sustainability of DBT programmes was quantified using Kaplan-Meier (K-M) [41] survival analysis. biDBT maintain a database that systematically records programme start date, activity status (i.e. active or inactive programme), cessation date, and site of training delivery. During the period of the study, all programmes were contacted by telephone to establish whether they were still active, i.e. delivering a DBT programme to clients, consistent with one of Scheirer’s [42] definitions of sustainability. These data were used to analyse survival rates as a proxy for sustainability.

Survival data were triangulated with responses from an online survey to identify factors that may aid or hinder implementation of DBT in routine settings. Initial contact to participate in the survey was made via email to all DBT team leaders registered on the biDBT training database. If an email was returned as undeliverable, an alternative team member was contacted. Participants were provided with information on the purpose of the study and were offered the opportunity to be entered into a prize draw following completion of the survey. A link to the online survey was contained within the body of the initial email.

Measures

A 70-item online questionnaire (Additional file 1) was designed to elicit information regarding DBT teams’ experiences of implementing DBT in their service. The questionnaire consisted of three types of questions (closed, free response, and rating scales) and was conceptually divided into six separate domains. The first domain relates to factors considered relevant to practice sustainability and is adapted from Swain and colleagues’ [43] study on the sustainability of EBPs in routine mental health agencies. The remaining five domains are based on Damschroder and colleagues’ [36] Consolidated Framework for Implementation Research (CFIR). The CFIR is an overarching theoretical framework that incorporates common constructs from a range of published theories on implementation and comprises five major domains: Intervention Characteristics; Inner Setting; Outer Setting; Individual Characteristics; and Implementation Processes. Each domain includes a constellation of interactive constructs that are purported to influence the implementation process; for a detailed discussion, see [36]. Demographic information was also collected.

Analysis

Kaplan-Meier (K-M) [41] survival analyses were carried out to estimate the cumulative survival rates of DBT programmes. Based on the biDBT database, teams were classified as either active or inactive. Teams that could not be contacted were considered lost to follow-up. Whilst including teams lost to follow-up as censored data is standard practice in K-M analysis, the analysis makes no distinction within the censored data between teams that cannot be contacted (i.e. lost to follow-up) and those that are still functioning. Because including teams lost to follow-up as censored (i.e. assuming they are still active) may make the survival estimate unreliable, we excluded them from the survival analyses.
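
The paper does not state which statistical software was used. Purely as an illustration, the sketch below (Python with pandas; the column names start_date, cessation_date and status are assumptions, not the actual biDBT database schema) shows how team records could be coded into survival durations and event indicators, with teams lost to follow-up excluded as described above.

```python
# Illustrative sketch only: not the authors' analysis code.
import pandas as pd

def prepare_survival_data(teams: pd.DataFrame, analysis_date: str) -> pd.DataFrame:
    """Code each team's survival time (days) and event indicator.

    status is assumed to take the values 'active', 'inactive', or
    'lost_to_follow_up'. Teams lost to follow-up are dropped, as in the
    reported analyses.
    """
    df = teams[teams["status"] != "lost_to_follow_up"].copy()
    # Active teams have no cessation date; follow them up to the analysis date.
    end = df["cessation_date"].fillna(pd.Timestamp(analysis_date))
    df["duration_days"] = (end - df["start_date"]).dt.days
    # event = 1 when the programme closed (inactive); active teams are censored.
    df["event"] = (df["status"] == "inactive").astype(int)
    return df
```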

Study aim 1

To investigate whether there were differences in sustainability between early and late adopters, a K-M analysis comparing survival rates of teams trained between January 1995 and March 2007 (12 years) with teams trained between April 2007 and February 2016 (9 years) was carried out (N = 468). Programme start and cessation dates were used to calculate survival rate. To reduce the potential for unequal amounts of censored data between groups due to differences in duration of cohort timeframes (12 versus 9 years), only the first seven years of a programme within these time frames were analysed. Programmes that survived for at least 2555 days were censored regardless of whether they later became inactive. Teams active at the time of analysis (or active for at least 2555 days) were categorised as censored data. A chi-squared test was used to check for differences in the amount of censored data between groups. A log-rank test was used to test whether the rate of programme closure varied between groups. A Cox regression model was also fitted to estimate a hazard ratio between groups, as log-rank analyses do not yield effect sizes.
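
As a minimal sketch of the aim 1 analysis, the snippet below applies the same steps (administrative censoring at 2555 days, K-M curves, log-rank test, Cox hazard ratio) to the data frame produced in the earlier sketch. The lifelines package, the cohort labels, and the function name are illustrative assumptions; this is not the authors' actual code.

```python
import numpy as np
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

def compare_cohorts(df, cutoff_days: int = 2555):
    # Administrative censoring: follow-up truncated at seven years (2555 days),
    # so programmes surviving beyond the cutoff are treated as censored.
    d = df.copy()
    over = d["duration_days"] > cutoff_days
    d.loc[over, "duration_days"] = cutoff_days
    d.loc[over, "event"] = 0

    early = d[d["cohort"] == "pre_2007"]   # trained Jan 1995 - Mar 2007
    late = d[d["cohort"] == "post_2007"]   # trained Apr 2007 - Feb 2016

    # Kaplan-Meier curves for each cohort.
    km = KaplanMeierFitter()
    for label, grp in [("pre-April 2007", early), ("post-April 2007", late)]:
        km.fit(grp["duration_days"], grp["event"], label=label)
        print(label, "median survival (days):", km.median_survival_time_)

    # Log-rank test for a difference in closure rates.
    lr = logrank_test(early["duration_days"], late["duration_days"],
                      event_observed_A=early["event"],
                      event_observed_B=late["event"])
    print("log-rank chi2:", lr.test_statistic, "p:", lr.p_value)

    # Cox regression supplies a hazard ratio, which the log-rank test does not.
    d["late_cohort"] = (d["cohort"] == "post_2007").astype(int)
    cph = CoxPHFitter().fit(d[["duration_days", "event", "late_cohort"]],
                            duration_col="duration_days", event_col="event")
    print("hazard ratio (late vs early):", np.exp(cph.params_))
    return cph
```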

Study aim 2

To examine whether training method delivery influenced the sustainability of DBT programmes, a K-M analysis comparing teams trained on-site with teams trained via open-enrolment was carried out. Teams were allocated to their respective study group based on the site of training delivery. This information was extracted from the biDBT database. Survival rates were calculated using programme start and cessation dates. Programmes active at the time of analysis were categorised as censored data. Only DBT programmes that commenced training from January 2009, the date at which the on-site training model was introduced, were included in this analysis. A chi-squared test was used to check for differences in the amount of censored data between groups. A log-rank test was used to test whether the rate of programme closure varied between training methods. A Cox regression model was also fitted to estimate a hazard ratio between groups, as log-rank analyses do not yield effect sizes.
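
As a worked illustration of the censoring-balance check, the snippet below runs a chi-squared test of independence on the active/inactive counts reported in the Results for this comparison. The use of scipy is an assumption; its default continuity correction for 2 × 2 tables approximately reproduces the reported statistic of 10.802.

```python
# Sketch only: the counts come from the published results (active = censored,
# inactive = closed); scipy is one possible tool, not necessarily the one used.
from scipy.stats import chi2_contingency

# rows: on-site, off-site; columns: active (censored), inactive (event)
table = [[35, 17],
         [187, 27]]
chi2, p, dof, expected = chi2_contingency(table)  # Yates correction by default
print(f"chi2 = {chi2:.3f}, df = {dof}, p = {p:.4f}")
```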

Study aim 3

A descriptive content analysis of survey data was carried out by the first author to investigate the frequency with which individual implementation and sustainability constructs were identified as an aid or barrier to a programme’s ability to successfully implement and sustain.
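
The descriptive content analysis is essentially a frequency count of endorsed constructs. A minimal sketch, assuming survey responses are held in a table with one row per respondent and one column per CFIR construct coded as 'barrier', 'aid', or blank (an assumed format, not the actual survey export):

```python
import pandas as pd

def construct_frequencies(responses: pd.DataFrame, role: str) -> pd.DataFrame:
    """Return n and % of respondents endorsing each construct in a given role."""
    n_total = len(responses)
    counts = (responses == role).sum().sort_values(ascending=False)
    return pd.DataFrame({"n": counts,
                         "percent": (100 * counts / n_total).round(0)})

# e.g. construct_frequencies(survey_df, role="barrier")
```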

Results

Survival analyses

Study aim 1: Early versus late cohort comparison

A total of 468 teams were included for analysis. Of these, 160 teams were from the pre-April 2007 cohort (inactive n = 55, active n = 46) and 308 teams (inactive n = 157, active n = 55) were from the post-April 2007 cohort. A chi-squared test indicated significant differences in the distribution of active and inactive teams between the pre- and post-April 2007 groups (χ2 = 23.164, df = 1, p = 1.488e-06), in that the post-April 2007 group had more censored and fewer inactive teams than the pre-April 2007 group. K-M survival curves (Fig. 1) and a log-rank test indicated that the pre-April 2007 group had a faster rate of closure than the post-April 2007 group (χ2 = 6.819, p = .009). Cox regression indicated that the hazard ratio was 0.607 (95% CI = 0.416–0.886, reference category = pre-April 2007 group), with a Cohen’s d approximation of −.389. The highest programme failure rates were found in the second year for both cohorts.

Fig. 1 Comparison of survival curves between DBT programmes trained prior to and post April 2007

Study aim 2: Training method comparison

A total of 266 teams were included for analysis. Fifty-two teams (active n = 35, inactive n = 17) were trained on-site and 214 teams (active n = 187, inactive n = 27) were trained off-site. A chi-squared test indicated greater levels of censored data in the on-site group (χ2 = 10.802, p = .001). K-M survival curves (Fig. 2) and a log-rank test showed that teams trained off-site had a significantly higher probability of survival than teams trained on-site (χ2 = 9.801, p = .002). Cox regression indicated that the hazard ratio was 2.554 (95% CI = 1.392–4.688, reference category = off-site), with a Cohen’s d approximation of 0.731. The highest failure rates were found in the second year for teams trained on-site, compared to the third year for teams trained via open-enrolment.
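
The paper reports a Cohen’s d approximation alongside each hazard ratio but does not state the conversion used. The reported values are consistent with scaling the log hazard ratio by the standard deviation of the extreme-value error distribution implicit in proportional-hazards models (π/√6); the worked check below rests on that assumption only.

```latex
% Assumed conversion (not stated in the paper): log hazard ratio divided by
% the SD of the extreme-value error distribution, \pi/\sqrt{6}.
d \;\approx\; \ln(\mathrm{HR}) \cdot \frac{\sqrt{6}}{\pi}
\qquad
\ln(0.607) \times 0.7797 \approx -0.389,
\qquad
\ln(2.554) \times 0.7797 \approx 0.731
```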

Fig. 2 Comparison of survival curves between DBT programmes trained off-site and on-site

Barriers and facilitators to implementation

Study aim 3

The online questionnaire was completed by 68 respondents. Sixty-two (91%) were from active teams and six (9%) were from inactive teams. Of the active teams, the majority of respondents were located in England (n = 38, 61%) and the remainder were located in Wales (n = 8, 13%), Scotland (n = 2, 3%), and Ireland (n = 8, 13%). The proportions of teams containing the following professions were: clinical psychologists (n = 56, 83%), nurses (n = 52, 77%), social workers (n = 22, 33%), psychological therapists (n = 22, 33%), and occupational therapists (n = 13, 21%). The most frequently reported number of DBT-trained clinicians within a service was between 4 and 5 (n = 23, 37%), with a range of 2 to 12 trained clinicians. Twenty-nine (46%) respondents worked within community adult mental health services, 12 (19%) within child and adolescent mental health services (CAMHS), and the remainder across a range of learning disability (n = 3, 5%), eating disorder (n = 2, 3%), forensic (n = 7, 10%), youth mental health (n = 1, 2%), personality disorder (n = 1, 2%) and inpatient settings (n = 9, 13%). Fifty-three (85%) active teams fell within the statutory service sector and 9 (15%) within the private sector.

Of the six inactive teams who completed the online survey, the median survival time was 2015 days (5.5 years), range 635–4405 days. All respondents from inactive teams were asked to provide three reasons why they thought their DBT programme discontinued. The most frequently cited reason for programme failure was lack of management support (n = 5, 83%) either due to lack of understanding of how DBT works, insufficient time allocated to deliver DBT, or priority given to competing service demands. Lack of funding (n = 3, 50%), lack of colleague support (n = 3, 50%), and staff turnover (n = 2, 33%) were other reasons reported for programme failure. One respondent also cited high dropout rates as a reason for their programme ending but reflected that this may have been as a result of “overly rigid referral criteria”.

Content analysis

Response frequencies and percentages for each implementation construct were counted for the total online survey sample. Respondents were also invited to leave comments to further elaborate on their responses within each implementation domain. All comments were analysed by the lead author and grouped according to the implementation category to which they referred. Due to the small number of responses from inactive teams, comparative analyses of response differences between active and inactive programmes could not be carried out.

Barriers to implementation

The most frequently endorsed barrier to implementing DBT was practitioner turnover (n = 40, 59%) followed closely by financing (n = 35, 52%). Other common barriers were availability of resources (n = 28, 41%), the perceived difficulty of implementing DBT (n = 27, 40%), and external change events (n = 23, 34%). No constructs within the Individual Characteristics or Outer Setting domains were strongly endorsed as barriers to implementation. Table 1 provides illustrative comments to the most commonly reported barriers to implementing DBT.

Table 1 Barriers to Implementing DBT

Aids to implementation

There were a number of constructs strongly endorsed as aiding the implementation process, the most common being the quality of the DBT evidence base (n = 60, 88%). Other frequently endorsed constructs were practitioner skills (n = 56, 82%), acceptability of DBT by clients (n = 54, 79%), the perceived advantage to implementing DBT into practice (n = 53, 78%), practitioner attitudes (n = 53, 78%), DBT training (n = 52, 77%), practitioner readiness (n = 51, 75%), and shared willingness among DBT clinicians to implement the programme (n = 51, 75%). All constructs within the Individual Characteristics domain were strongly endorsed as aiding the implementation process. Illustrative comments are provided in Table 2.

Table 2 Aids to Implementing DBT

Sustainability

Frequency and percentage data were collected on a number of factors considered to be related to the sustainability of interventions, such as collection of client outcome data, extent of programme penetration, ongoing training and consultation, and treatment fidelity. Of the active teams, 51 (82%) collected client outcome data, which was mainly used for tracking client progress and auditing the effectiveness of the programme. Seven (11%) respondents indicated that they were serving considerably fewer clients than when they initially commenced DBT training. Twenty-nine (47%) reported that they were serving approximately the same number of clients and 26 (42%) said they were serving a lot more clients since initial training. Thirty-seven (60%) respondents had received external consultation. However, only 24 (39%) reported accessing DBT expert supervision. The majority of teams, 43 (69%), carried out new team member training and 34 (55%) had received booster training. With regard to treatment modalities, 61 teams (98%) offered skills training and individual therapy, 60 (96%) ran a consultation group, and 48 (77%) offered telephone support. Finally, 41 teams (66%) had made adaptations to the DBT model and, of these, 20 (32%) reported making changes during the initial training phase.

All six inactive teams collected outcome data. Four teams (67%) used the data to demonstrate clinical outcomes and cost-effectiveness. One respondent (17%) indicated that they had served considerably fewer clients after the initial training phase, with the remaining respondents either having served the same number (n = 2, 33%) or a lot more clients (n = 3, 50%). Only two teams (33%) received booster training and no teams carried out new team member training. Five teams (83%) had offered all four DBT treatment modalities: individual therapy, group skills training, therapist consultation group, and 24-h telephone access. One team (17%) did not offer telephone consultation. Only two teams (33%) reported modifying the DBT model to suit their service needs and, of these, one team made modifications during the initial training period whilst the other implemented one full round of DBT before making adaptations.

Discussion

Consistent with earlier data [27], the highest failure rate for DBT programmes was observed in the second year post-training. Despite this early high failure rate, the survivability of DBT programmes compares well with other evidence-based programmes reported in the literature. Cooper and colleagues [44] reported that 69% of delinquency and violence prevention programmes in a state-wide implementation sustained at two to four years post-initial seed-funding. The National Implementing Evidence Based Practices project reported that 80% of sites sustained at two years post-implementation [45] and 47% at six years, although in the six-year data sustainability rates varied between the five interventions studied, from 25 to 69%. DBT compares favourably with these figures, with survivability rates of 88% at two years and 69% at eight years.

Differences in the survival rates between the early and late implementers are not particularly surprising, although the different rates of censored data between the cohorts mean that the result should be interpreted with caution. Several factors might account for this difference. First, early adopters are known to be psychologically different from their peers and are often in influential positions [46]; whilst they may have adopted DBT early, they may also have been keen to move on to the next innovation. Second, DBT’s place as an evidence-based intervention within the UK became more solid with the publication of the NICE guidance in 2009 [20]. The advocacy for DBT within the guideline may have provided ‘outer context’ support to teams training post-2007, just as publication of the guidance also boosted training in DBT [47].

Traditionally, the translation from science to practice has been a passive process that has usually only involved diffusion and dissemination of EBP information, with the hope that this is sufficient to change practitioner behaviour. There is a current shift towards a more active approach whereby outside experts work alongside organisations to help achieve implementation success and assure benefits to consumers [48]. Results from the present study indicated that on-site training did not increase the probability of survival; indeed, survival curve comparison of training delivery methods indicated that programmes trained off-site had a significantly higher probability of surviving. This is a surprising finding, given that on-site training was designed to increase organisational investment in DBT implementation. However, this finding must be interpreted with caution, as the amount of censored data between the comparison groups was found to be significantly different, limiting the conclusions that can be drawn about differences between groups. Notwithstanding this caveat, a possible explanation for the difference may be that those attending off-site training have engaged in a substantial amount of pre-planning and assessment of organisational readiness and, in efforts to obtain management buy-in, have identified an explicit need for implementing DBT in their service setting. In doing so, they are possibly more likely to have actively considered how an implementation plan may be executed. Addressing organisational funding and resources and aligning the innovation with organisational goals are factors known to be associated with sustainability [39, 43, 45, 49]. Teams attending off-site training have typically had to actively pursue funding and gain agreement from their organisation to attend, which may indicate that individuals in teams pursuing this route possess particular leadership skills that also relate to sustainability [49, 50]. Attending off-site training also provides greater opportunities to network with other teams, allowing for the sharing of experiences and ideas, which may prove beneficial to implementation and sustainability. During the second week of training, teams present their initial implementation efforts for consultation and feedback from trainers and fellow trainees. In off-site trainings, trainees are exposed to a wider range of systems and witness trainers applying the components of the model to these different systems. This more expansive experience may increase knowledge of the core components and principles of DBT. Cooper and colleagues [44] similarly highlighted that greater knowledge of a programme’s logic model increased the likelihood of sustainability.

Practitioner turnover and financing were the most commonly identified barriers to implementing DBT programmes. This is consistent with findings from other studies [43, 45, 50]. Indeed, these constructs may interact, as difficulty financing new team members was one of the main problems identified when practitioner turnover was high. Financing initial training was identified as a key barrier for some programmes, although a few overcame this difficulty by securing initial funding from external sources and then using evaluation and outcome data to secure ongoing funding from their organisations. Other programmes identified difficulties with ongoing financing, whether for training new team members, booster training, or accessing expert supervision or consultation. Whilst securing financing is a common theme both in this study and in others [43, 45, 50], consideration is rarely given to the costs of de-implementation and, in the case of DBT, of failing to provide an intervention that may produce cost savings [22]. Developing models that highlight the costs of failing to sustain may prove useful in influencing leaders in both the inner and outer contexts of organisations to continue to support an evidence-based intervention.

A number of facilitators to implementation were identified. Most notably, all constructs within the Individual Characteristics domain were strongly endorsed as aiding the implementation process. A number of respondents reported highly motivated or skilled practitioners, effective leadership of the DBT team, or the presence of a DBT champion as key to overcoming barriers encountered in the implementation and sustainability of programmes. This finding highlights how a strength in one or more areas can compensate for weaknesses in others [29]. Nevertheless, overreliance on one or more individuals to ensure effective implementation and sustainability leaves a programme particularly vulnerable to practitioner or leadership turnover. Organisations are dynamic, and so the relative contribution of implementation constructs will inevitably wax and wane. This poses a difficulty for organisations because changes in one construct require adjustments in others. Successfully managing such changes will require effective monitoring and feedback systems to keep a programme on track [48], as well as the ongoing availability of resources to do so.

Characteristics of the intervention, a feature in many implementation and sustainability models [31, 36, 38, 39], and in particular the quality of the evidence base for DBT, were strongly endorsed as aiding the implementation process. Whilst efficacy data alone may be insufficient for changing practice, findings from this study indicated that for some programmes research data played a crucial role in securing management commitment to delivering DBT. The quality of the evidence base may be of particular relevance during pre-planning and preparation stages, allowing organisations to weigh up the suitability of DBT for their service and make an assessment of perceived benefits and ‘fit’ with the context [38]. For populations where the evidence base for DBT is less extensive or robust, the lack of efficacy data may present a barrier to implementation. In this instance, the opportunity to trial a DBT programme and collect effectiveness data may prove beneficial.

Over half of survey respondents indicated that their programme engaged in practices which are considered pertinent to sustainability, with the exception of receiving supervision from a DBT expert. This is an encouraging finding and suggests that teams are aware of the need for continuous monitoring and collection of outcome data as an aid to sustainability [43]. Given that the highest failure rates for programmes are found within the active implementation stage (i.e. first two years), programmes should also consider identifying and monitoring implementation outcomes, distinct from service and treatment outcomes. Evaluation of implementation outcomes will provide an indicator of implementation success and yield an index of implementation processes. Also, because treatment effectiveness requires successful implementation, monitoring implementation outcomes is a necessary intermediate step to obtaining desired clinical and service outcomes [51].

There are a number of limitations to the study. The first is the small number of survey respondents from inactive teams, which prevented comparative analyses and limits the conclusions that can be drawn from the findings. Second, the method of data collection prevented exploration of research participants’ interpretation of questions or the opportunity to clarify responses. Although a summary question was included at the end of each survey domain, not all respondents chose to elaborate on their responses, limiting the amount of qualitative data collected. Lastly, the retrospective accounts from individual team leaders/members must be interpreted with caution due to problems inherent in self-report, such as post-hoc rationalisation. Future research should endeavour to recruit multiple respondents from programmes to reduce the likelihood of methodological bias, as well as recruit greater numbers of inactive teams to ensure a representative sample of respondents.

Despite these limitations, the present study possessed a number of strengths. There are few studies in the literature examining sustainability beyond the early stages of implementation (post-two years) and none, to our knowledge, that allow the comparison of two different types of training delivery that may have implications for sustainability. In addition, the use of a concurrent mixed-methods approach allowed quantitative findings to be complemented with qualitative information, providing greater insight into the complexities of implementation and sustainability processes. The existing implementation literature utilises a wide range of definitions and terminologies, rendering extrapolation of constructs across settings difficult. By using the CFIR as a scoping tool to guide assessment of the barriers and facilitators to DBT, a number of constructs salient to implementing DBT in routine healthcare settings were identified, allowing for the refinement of more relevant assessment tools for future research.

Conclusions

Successful implementation and sustainability of healthcare innovations in routine settings poses a challenge; DBT is no exception. However, since the onset of biDBT intensive training in 1995, the sustainability of DBT programmes has remained stable, similar to the rates reported for some other innovations and higher than others. Given the ever-changing landscape and finite resources of healthcare systems, this is an encouraging finding. Nevertheless, a number of programmes struggle to effectively implement and sustain DBT within their organisation. The particular adaptation to the location of training trialled here did not improve the probability of programme survival. Further augmenting on-site training with additional interventions for both inner- and outer-context leadership [49, 50] could potentially improve the outcome of such training. A number of factors hindering or facilitating implementation of DBT were reported. Whilst these factors can vary between and within organisations, comparison with previous research suggests that the main barriers and aids to implementation have remained fairly consistent. Future research should include evaluation of predictive models that allow for testing the relative contribution of each implementation component, in order to identify what works in which contexts.

Abbreviations

biDBT: British Isles Dialectical Behaviour Therapy training team

BPD: Borderline Personality Disorder

CAMHS: Child and Adolescent Mental Health Service

CFIR: Consolidated Framework for Implementation Research

DBT: Dialectical Behaviour Therapy

EBP: Evidence Based Practice

K-M: Kaplan-Meier survival analysis

NHS: National Health Service

NICE: National Institute for Health and Care Excellence

RCT: Randomised Controlled Trial

UK: United Kingdom

References

  1. Linehan M. Cognitive-behavioral treatment of borderline personality disorder. New York: Guilford Press; 1993.

  2. Fonagy P, Bateman A. Progress in the treatment of borderline personality disorder. Br J Psychiatry. 2006;188(1):1–3.

  3. Clarkin JF, Levy KN, Lenzenweger MF, Kernberg OF. Evaluating three treatments for borderline personality disorder: a multiwave study. Am J Psychiatr. 2007;

  4. Koons CR, Robins CJ, Tweed JL, Lynch TR, Gonzalez AM, Morse JQ, Bishop GK, Bastian LA. Efficacy of dialectical behavior therapy in women veterans with borderline personality disorder. Behav Ther. 2001;32(2):371–90.

  5. Linehan MM, Armstrong HE, Suarez A, Allmon D, Heard HL. Cognitive-behavioral treatment of chronically parasuicidal borderline patients. Arch Gen Psychiatry. 1991;48(12):1060–4.

  6. Linehan MM, Schmidt H, Dimeff LA, Craft JC, Kanter J, Comtois KA. Dialectical behavior therapy for patients with borderline personality disorder and drug-dependence. Am J Addict. 1999;8(4):279–92.

  7. Linehan MM, Dimeff LA, Reynolds SK, Comtois KA, Welch SS, Heagerty P, Kivlahan DR. Dialectical behavior therapy versus comprehensive validation therapy plus 12-step for the treatment of opioid dependent women meeting criteria for borderline personality disorder. Drug Alcohol Depend. 2002;67(1):13–26.

  8. Linehan MM, Comtois KA, Murray AM, Brown MZ, Gallop RJ, Heard HL, Korslund KE, Tutek DA, Reynolds SK, Lindenboim N. Two-year randomized controlled trial and follow-up of dialectical behavior therapy vs therapy by experts for suicidal behaviors and borderline personality disorder. Arch Gen Psychiatry. 2006;63(7):757–66.

  9. McMain SF, Links PS, Gnam WH, Guimond T, Cardish RJ, Korman L, Streiner DL. A randomized trial of dialectical behavior therapy versus general psychiatric management for borderline personality disorder. Am J Psychiatr. 2009;166:1365–1374.

  10. Turner RM. Naturalistic evaluation of dialectical behavior therapy-oriented treatment for borderline personality disorder. Cogn Behav Pract. 2000;7(4):413–9.

  11. Verheul R, van den Bosch LM, Koeter MW, De Ridder MA, Stijnen T, Van Den Brink W. Dialectical behaviour therapy for women with borderline personality disorder. Br J Psychiatry. 2003;182(2):135–40.

  12. Lynch TR, Trost WT, Salsman N, Linehan MM. Dialectical behavior therapy for borderline personality disorder. Annu Rev Clin Psychol. 2007;3:181–205.

  13. Kliem S, Kröger C, Kosfelder J. Dialectical behavior therapy for borderline personality disorder: a meta-analysis using mixed-effects modeling. J Consult Clin Psychol. 2010;78(6):936.

  14. Stoffers JM, Völlm BA, Rücker G, Timmer A, Huband N, Lieb K. Psychological therapies for people with borderline personality disorder. Cochrane Database Syst Rev. 2012;(8):Art. No.: CD005652. https://doi.org/10.1002/14651858.CD005652.pub2.

  15. Lynch TR, Morse JQ, Mendelson T, Robins CJ. Dialectical behavior therapy for depressed older adults: a randomized pilot study. Am J Geriatr Psychiatry. 2003;11(1):33–45.

  16. Lynch TR, Cheavens JS, Cukrowicz KC, Thorp SR, Bronner L, Beyer J. Treatment of older adults with co-morbid personality disorder and depression: a dialectical behavior therapy approach. Int J Geriatr Psychiatry. 2007;22(2):131–43.

  17. Masson PC, von Ranson KM, Wallace LM, Safer DL. A randomized wait-list controlled pilot study of dialectical behaviour therapy guided self-help for binge eating disorder. Behav Res Ther. 2013;51(11):723–8.

  18. Robinson AH, Safer DL. Moderators of dialectical behavior therapy for binge eating disorder: results from a randomized controlled trial. Int J Eat Disord. 2012;45(4):597–602.

  19. Shelton D, Sampl S, Kesten KL, Zhang W, Trestman RL. Treatment of impulsive aggression in correctional settings. Behav Sci Law. 2009;27(5):787–800.

  20. National Institute for Health and Clinical Excellence. Borderline personality disorder: recognition and management. [CG78]. London: National Institute for Health and Care Excellence; 2009.

  21. Services registered for CQUIN mental health. (n.d.). https://www.rcpsych.ac.uk/workinpsychiatry/qualityimprovement/cquin/cquinfaq.aspx. Accessed 14 Sept 2018.

  22. Brazier JE, Tumur I, Holmes M, Ferriter M, Parry G, Dent-Brown K, Paisley S. Psychological therapies including dialectical behaviour therapy for borderline personality disorder: a systematic review and preliminary economic evaluation. Health Technol Assess. 2006;10(35):23–48.

  23. Priebe S, Bhatti N, Barnicot K, Bremner S, Gaglia A, Katsakou C, Molosankwe I, McCrone P, Zinkler M. Effectiveness and cost-effectiveness of dialectical behaviour therapy for self-harming patients with personality disorder: a pragmatic randomised controlled trial. Psychother Psychosom. 2012;81(6):356–65.

  24. Pitman A, Tyrer P. Implementing clinical guidelines for self harm –highlighting key issues arising from the NICE guideline for self-harm. Psychol Psychother Theory Res Pract. 2008;81(4):377–97.

  25. Rogers EM. Diffusion of innovations. 5th ed. 2003. https://books.google.co.uk/books?id=9U1K5LjUOwEC&printsec=frontcover&dq=editions:XNqTTc9h0ngC&hl=en&sa=X&ved=0ahUKEwjq0uWRu4HNAhVJDMAKHfKdC_MQ6wEIHTAA#v=onepage&q&f=false. Accessed 30 May 2016.

  26. Buchanan DA, Fitzgerald L, Ketley D. The sustainability and spread of organizational change: modernizing healthcare. Routledge; 2006.

  27. Swales MA, Taylor B, Hibbs RA. Implementing dialectical behaviour therapy: Programme survival in routine healthcare settings. J Ment Health. 2012;21(6):548–55.

  28. Linehan MM. Skills training manual for treating borderline personality disorder: Guilford Press; 2015.

  29. Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F. Implementation research: a synthesis of the literature. Tampa, FL: National Implementation Research Network, Louis de la Parte Florida Mental Health Institute, University of South Florida; 2005.

  30. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82(4):581–629.

  31. Rycroft-Malone J. The PARIHS framework—a framework for guiding the implementation of evidence-based practice. J Nurs Care Qual. 2004;19(4):297–304.

  32. Stirman SW, Kimberly J, Cook N, Calloway A, Castro F, Charns M. The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implement Sci. 2012;7(1):17.

  33. Swales MA. Implementing dialectical behaviour therapy: organizational pre-treatment. Cogn Behav Ther. 2010;3(04):145–57.

  34. Amodeo M, Storti SA, Larson MJ. Moving empirically supported practices to addiction treatment programs: recruiting supervisors to help in technology transfer. Subst Use Misuse. 2010;45(6):968–82.

  35. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health Ment Health Serv Res. 2011;38(1):4–23.

  36. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):50.

  37. Glisson C, Schoenwald SK. The ARC organizational and community intervention strategy for implementing evidence-based children's mental health treatments. Ment Health Serv Res. 2005;7(4):243–59.

  38. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8:117–27.

  39. Shelton RC, Cooper BR, Stirman SW. The sustainability of evidence-based interventions and practices in public health and health care. Annu Rev Public Health. 2018;39:55–76.

  40. Johnson RB, Onwuegbuzie AJ, Turner LA. Toward a definition of mixed methods research. J Mixed Methods Res. 2007;1(2):112–33.

  41. Kaplan EL, Meier P. Nonparametric estimation from incomplete observations. J Am Stat Assoc. 1958;53(282):457–81.

  42. Scheirer MA. Is sustainability possible? A review and commentary on empirical studies of program sustainability. Am J Eval. 2005;26(3):320–47.

  43. Swain K, Whitley R, McHugo GJ, Drake RE. The sustainability of evidence-based practices in routine mental health agencies. Community Ment Health J. 2010;46(2):119–29.

  44. Cooper BR, Bumbarger BK, Moore JE. Sustaining evidence-based prevention programs: correlates in a large-scale dissemination initiative. Prev Sci. 2015;16(1):145–57.

  45. Bond GR, Drake RE, McHugo GJ, Peterson AE, Jones AM, Williams J. Long-term sustainability of evidence-based practices in community mental health agencies. Adm Policy Mental Health. 2014;41:228–36.

  46. Gallo KP, Barlow DH. Factors involved in clinician adoption and nonadoption of evidence-based interventions in mental health. Clin Psychol Sci Pract. 2012;19:93–106.

  47. Swales MA, Hibbs RAB. DBT in the UK: updated dissemination and implementation data. 3rd International Congress on Borderline Personality Disorder & Allied Disorders, Rome, 2014:16–8.

  48. Fixsen DL, Blase KA, Naoom SF, Wallace F. Core implementation components. Res Soc Work Pract. 2009;19(5):531–40.

  49. Aarons GA, Green AE, Trott E, Willging CE, Torres EM, Ehrhart MG, Roesch SC. The roles of system and organizational leadership in system-wide evidence-based intervention sustainment: a mixed method study. Adm Policy Mental Health. 2016;43(6):991–1008.

  50. Aarons GA, Wells RS, Zagursky K, Fettes DL, Palinkas LA. Implementing evidence-based practice in community mental health agencies: a multiple stakeholder analysis. Am J Public Health. 2009;99(11):2087–95.

  51. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Griffey R, Hensley M. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health Ment Health Serv Res. 2011;38(2):65–76.


Acknowledgements

We are grateful to all the teams who participated in the study for providing information on their experience of implementation.

Funding

This research was funded by NWCPP. Survey incentives were funded by biDBT Training.

Availability of data and materials

The dataset supporting the conclusions of this article can be found at North Wales Clinical Psychology Programme (NWCPP), School of Psychology, Brigantia Building, Penarallt Road, Bangor, Gwynedd, LL57 2AS.

The research in this paper is based on a thesis submitted by J. C. King to Bangor University (http://e.bangor.ac.uk/9766/1/King%20thesis%202016.pdf).

Author information

Contributions

JCK is the principal researcher and was responsible for the design of the study, data collection, and analyses. MAS supervised the study. RH and CWNS provided support with statistical analysis of data. JCK drafted the initial manuscript. All other authors (MAS, RH, CWNS) read and contributed to modified drafts. All authors have approved the final manuscript.

Corresponding authors

Correspondence to Joanne C. King or Michaela A. Swales.

Ethics declarations

Ethics approval and consent to participate

Ethical approval was obtained from Bangor University Ethics Committee – Reference number: 2015–15,499-A13485.

Consent to participate was indicated by completion of the survey. Respondents could request for their survey data to be excluded from the study at any point.

Consent for publication

Not applicable.

Competing interests

M. A. Swales is the Director of the British Isles DBT Training Team that trains practitioners in DBT with a licensed training programme. R. A. Hibbs is the Managing Director of Integral Business Support Ltd. that delivers licensed training in DBT. M. A. Swales and R. A. Hibbs are married.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional file

Additional file 1:

Online Survey Questionnaire. (PDF 10203 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

King, J.C., Hibbs, R., Saville, C.W.N. et al. The survivability of dialectical behaviour therapy programmes: a mixed methods analysis of barriers and facilitators to implementation within UK healthcare settings. BMC Psychiatry 18, 302 (2018). https://doi.org/10.1186/s12888-018-1876-7
