January 19, 2017

Fundamental Considerations for the Implementation of Evidence in Practice

Implementation science is the scientific study of methods that support the adoption of evidence-based interventions into a particular setting (e.g., health, mental health, community, education, global development). Implementation methods take the form of strategies and processes designed to facilitate the uptake, use, and ultimately the sustainability (or what I like to call the ‘evolvability’) of empirically supported interventions, services, and policies in a practice setting (Palinkas & Soydan, 2012; Proctor et al., 2009), referred to herein as evidence-based practices (EBPs).

The National Implementation Research Network (NIRN) refers to implementation as a specified set of activities designed to put into practice an activity or program of known dimensions. NIRN makes the point that implementation processes are purposeful: both the implementation activities and the program being implemented should be described in sufficient detail that independent observers can detect their presence and strength, and so that the implementation can be replicated in similar contexts.

Implementation is commonly considered one of the final stages of the intervention research process, following effectiveness studies. It sets out to explore the role of context, how best to prepare for evidence adoption and scale-up, the practical considerations of implementation, and how best to facilitate sustainability.

Implementation focuses on taking interventions that have been found to be effective using methodologically rigorous designs (e.g., randomized controlled trials, quasi-experimental designs, hybrid designs) under real-world conditions, and integrating them into practice settings (not only in the health sector) using deliberate strategies and processes (Powell et al., 2012; Proctor et al., 2009; Cabassa, 2016). Hybrid designs have emerged relatively recently to help us explore implementation effectiveness alongside intervention effectiveness, to varying degrees (Curran et al., 2012).

Cumulative experience and evidence in implementation science suggest several fundamental considerations for success. What follows are my own musings about what these fundamental considerations are; they will likely evolve as I progress in my own research and follow the work of my implementation colleagues worldwide.

Fundamental Considerations for Evidence Implementation

  1. The implementation of empirically supported interventions or practice innovations is a dynamic social process.  One does not manage this in isolation from colleagues, partners, stakeholders, leaders, or champions.
  2. Implementation is shaped by the context in which the practice innovation takes place, by the people involved in the process (those who provide the new intervention and those who receive it), by the characteristics of the intervention, and by the characteristics of the inner and outer systems (see Damschroder et al., 2009).
  3. Implementation unfolds over time through stages, requiring transformation of the practice context and, often, some degree of adaptation of the innovation (Cabassa & Baumann, 2013). Transformation of the setting (readiness or preparation) takes time; the organizational conditions for practice change must be built before training takes place, and when this step is disregarded (as it often is), change efforts fail.  System change has implications for leadership expectations in rolling out and scaling up EBPs across a system. Organizations and individuals reach absorptive capacity for implementation, so timelines and system expectations need to be paced accordingly.
  4. One must consider the costs of implementation and allow organizations and systems to budget for the changes that will take place, often over several years, during and beyond the initial implementation.  The context in which implementation occurs has important implications for cost and sustainability.  Consider who is driving the change and paying for it: a research grant, an organization, a government, or a purveyor.
  5. Adaptation of EBPs requires knowledge of the intervention’s active ingredients and fidelity (competence and adherence) to these elements.
  6. Implementation often requires co-creation (see Metz, 2015), involving the interaction, collaboration, and participation of stakeholders or knowledge users at multiple levels of an organization and system of care (Aarons, Horowitz, et al., 2012). Implementation teams require the engagement of organizational leaders, directors, managers, administrators, service providers, frontline staff, clients, and their family members, as implementation entails a multitude of social processes, including planning, decision making, negotiating, prioritizing, problem solving, service delivery, restructuring, and the allocation of resources (Cabassa, 2016). Implementation complexity calls for greater social interaction and the involvement of stakeholders who can facilitate the process and who bring knowledge and expertise about the intervention, as well as locally grounded knowledge, skills, and understanding of the settings and communities in which the intervention will be used. Implementation science is a collaborative endeavor (Cabassa, 2016).
  7. Implementation is inherently about change management and involves a new way of doing things within an organization or system. It overlaps with, but differs in important ways from, quality improvement (see Bauer et al., 2015). Implementation involves a change in the status quo that requires adaptations and adjustments in attitudes, social norms, practices, procedures, workflow, behaviors, and policies. The change process is guided by a combination of project management and implementation strategies and processes (Powell et al., 2012).
  8. Implementation of evidence-based care/practices fundamentally involves two content areas: the evidence related to the innovation and the evidence on implementation science. Relatedly, this means that evaluating implementation efforts and tracking indicators fundamentally requires assessing both the innovation target (i.e., the health outcome) and implementation outcomes (see Proctor et al., 2012).
  9. Implementation science is an emerging and dynamic field. Current best evidence about IS can be instructive, but it is a moving target, and we are still learning a range of things in this field, including:
    1. Which constructs/factors are important to consider for implementation success
    2. How to measure these key factors
    3. What processes work best in which contexts and for which types of evidence
    4. Which research methods are best suited to IS, and how to train health scientists in them
    5. How best to report IS research (see the StaRI guidelines); this is important so we can learn what works (see Pinnock et al., in press: two papers, in BMJ Open and the BMJ)
    6. How to develop methods and measures for tracking implementation costs
    7. What implementation success looks like, including expectations of sustainability (and what I call ‘evolvability’)
    8. How best to measure implementation outcomes
    9. How to develop resources and tools that support implementation, whether used independently or with technical assistance
  10. There is growing global expertise in implementation practice and science, as manifested in the Global Implementation Initiative, the Global Implementation Conference, the Society for Implementation Research Collaboration, and jurisdictional implementation communities.
  11. We need to build capacity for implementation within systems and within organizations; this includes building organizational competencies and a workforce of implementation specialists, which requires the development of requisite training programs and curricula. Some thinking is already occurring around this issue, for example through the Global Implementation Society hosted by the Global Implementation Initiative.
  12. Implementation of evidence involves embedded implementations at the clinical, organizational, and systems levels. This means implementing change at all levels (i.e., new ways of doing business, changing processes, building transformative leadership and organizational conditions for change), not only implementing the evidence in practice.
  13. Implementation is not a ‘make it so’ proposition; it is complex and time consuming, with the time and complexity varying by situation, and it is in our best interest to recognize these challenges whilst trying to fulfill political and real-world expectations.
  14. Implementation is not a one-size-fits-all enterprise. We should not expect to arrive at a menu of implementation theories/frameworks, approaches, and/or strategies that can be mapped neatly to sector, context, health issue, or population. It will always be an iterative process.
  15. There are likely some factors that are universally important for implementation success across contexts (i.e., health, mental health, community, global health, education), and some that are unique to a particular setting.  Research is working to parse this in a way that can help us to better plan for and engineer successful change (Barwick, 2016; Barwick et al., submitted).
  16. Implementation planning, and ultimately implementation success, will depend in part on the implementation context, that is, on the mechanisms driving the implementation. Whether the initiative is driven by the motivation of an organization, government, funder, EBP purveyor, or research study likely matters for planning, costs, and most especially for sustainability (Barwick, 2016; Barwick et al., submitted).

As mentioned above, this list of implementation fundamentals is ‘evergreen’; it will evolve over time.  Please feel free to contribute a comment: what is missing, and do the fundamentals identified here have face validity?

References

Barwick M. (2016). The Consolidated Framework for Implementation Research: Comparison across contexts. Paper presented at the 3rd Biennial Australasian Implementation Conference, Melbourne, Australia, October 6, 2016.

Barwick M, Kimber M, Akrong L, Johnson S, Cunningham CE, Bennett K, Ashbourne G, Godden T. (Submitted November 3, 2016, PLOS ONE). Evaluating evidence-informed implementation: A multi-case study of motivational interviewing in child and youth mental health.

Bauer MS, Damschroder L, Hagedorn H, Smith J, Kilbourne AM. (2015). An introduction to implementation science for the non-specialist. BMC Psychology, 3, 32.

Brownson RC, Colditz G, Proctor E. (Eds.). (2012). Dissemination and implementation research in health: Translating science to practice. Oxford: Oxford University Press.

Cabassa LJ. (2016). Implementation science: Why it matters for the future of social work. Journal of Social Work Education, 52(S1), S38–S50. http://dx.doi.org/10.1080/10437797.2016.1174648

Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. (2012). Effectiveness-implementation hybrid designs: Combining elements of clinical effectiveness and implementation research to enhance public health impact. Medical Care, 50, 217-226. doi:10.1097/MLR.0b013e3182408812

Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, et al. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4, 50. doi:10.1186/1748-5908-4-50

Metz A. (2015). Implementation Brief: The potential of co-creation in implementation science. Chapel Hill NC: NIRN.  Accessed January 12th 2017 from http://nirn.fpg.unc.edu/sites/nirn.fpg.unc.edu/files/resources/NIRN-Metz-ImplementationBreif-CoCreation.pdf

Palinkas LA & Soydan H. (2012). New horizons of translational research and research translation in social work. Research on Social Work Practice, 22, 85-92.

Pinnock H, Barwick M, Carpenter C, Eldridge S, Grandes G, Griffiths C, Rycroft-Malone J, Meissner P, Murray E, Patel A, Sheikh A, Taylor S. (In press, January 11, 2017). Standards for Reporting Implementation Studies (StaRI): Explanation and elaboration document. BMJ Open.

Pinnock H, Barwick M, Carpenter C, Eldridge S, Grandes G, Griffiths C, Rycroft-Malone J, Meissner P, Murray E, Patel A, Sheikh A, Taylor S. (In press, October 25, 2016). Standards for Reporting Implementation Studies (StaRI) Statement. British Medical Journal.

Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, York JL. (2012). A compilation of strategies for implementing clinical innovations in health and mental health. Medical Care Research and Review, 69, 123-157.

Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C & Mittman B. (2009). Implementation research in mental health services: An emerging science with conceptual, methodological, and training challenges. Administration and Policy in Mental Health and Mental Health Services Research, 36, 24-34.