Capacity Building Archives | Melanie Barwick Consulting
https://melaniebarwick.com/category/capacity-building/
Specialist in Child and Youth Mental Health Systems and Knowledge Translation

Shortcomings of Evidence Dissemination
https://melaniebarwick.com/shortcomings-of-evidence-dissemination/ | Fri, 12 Nov 2021

As we welcome 2022, the healthcare landscape is replete with organizations whose main purpose is to disseminate empirically supported evidence to a wide range of knowledge users. To be sure, getting evidence to the right people, at the right time, and in the right format remains an important goal, and we are still building capacity on this front. People cannot make informed decisions in a vacuum, and as the world contends with a seemingly insurmountable tide of misinformation compounded by the pandemic, targeted and effective dissemination of evidence-based knowledge remains imperative for achieving optimal health and well-being.

Drivers for improved dissemination include voluntary health sector and philanthropic institutions and health funders who have backed the creation of intermediary and dissemination-focused organizations rooted in specific domains (e.g., pain, mental health, and injury prevention, to name but a few). These endeavours have gone a long way to improve access to evidence-based knowledge. Evidence dissemination has been buoyed by targeted funding (i.e., NCE-KM, now retired), advances in social media and e-health technologies, and capacity building for a multi-skilled knowledge translation practitioner (KTP) workforce that brings the evidence to light. Evidence about a broad range of health topics is now widely accessible on the internet, although the problem of evidence quality and source credibility remains ubiquitous.

Shortcoming One

For some time now, I have observed two shortcomings in the evidence dissemination space. The first is that despite the volume of evidence dissemination, there appears to be very little meaningful evaluation of its outcomes and impacts. If intermediaries are indeed evaluating their dissemination efforts, they’re not sharing what they’re learning. The field seems content with disseminating well-crafted soundbites to targeted audiences, but it’s hard to discern to what effect.

Many intermediaries collect indicators of access or spread because, frankly, this is automated and easy to do. But I surmise, based on years of interacting with KT workshop trainees, that the impact of these efforts is not being optimized due to a lack of evaluation. Content about dissemination evaluation is something our workshop participants can’t get enough of. What appears to be the norm is that the effort is surrendered midway: once a dissemination output is launched, the task is taken to be complete. The result is a missed opportunity to learn whether the dissemination outputs are meaningful and impactful relative to their intended purpose.

The analogy that comes to mind is the dandelion. Some efforts to share evidence are characterized as passive diffusion – the dandelion seed flies with the wind and lands where it may, whether on sterile or fertile soil, to eke out an existence or not. Diffused evidence may travel far but efforts are random, untargeted and unintentional, and thus, impacts are rare. Might as well not have dispersed the knowledge at all.

In the more recent era of evidence dissemination, we have taken care to plant the seeds of knowledge in a more systematic and intentional manner, row upon row, ripe for the picking. Yet, more often than not, we fail to return to the crop to ascertain how it grew, who picked it, and what they did with it. Did it grow as expected (the product), was it seen and by whom (access, reach), and, most importantly, how was it experienced (benefits for the knowledge user)?

The result is a somewhat routine productivity cycle that overemphasizes the volume of products (dissemination outputs), leaves them loosely tethered to their purpose and intended goal, and provides little evidence of goal attainment and subsequent benefits. We are missing the ‘so what’ of the effort, often focusing solely on the production line without assessing the impacts of our dissemination products. This is a huge missed opportunity that leaves few insights as to whether the investment is yielding impact. Evaluation insights inform future dissemination efforts by highlighting strategies that were effective for specific audiences and aims. Evaluation provides those all-important impact stories. Failure to evaluate leaves us shooting in the dark.

What to do?

Comprehensive frameworks for evidence dissemination lay out intentional, explicit, and systematic steps that, if followed, will ensure dissemination outputs are linked to the audience, the main message, a clear purpose, appropriate dissemination strategies, and an evaluation of the dissemination effort relative to its purpose: where the message landed, who saw it, what they thought of it, and how they benefitted. Sadly, the evaluation component is often unplanned (no identified purpose for the communication and no subsequent indicator of whether it was achieved), aborted, or captured solely as reach using web analytics.

It is common to share evidence-based knowledge on the web, as this offers greater visibility, interactivity, and access for a wider audience. Web analytics can tell you such things as how many users are on your site right now, what cities and countries your users are visiting from, what devices your audience uses, the channels that are driving the most traffic to your site, and how users navigate your site and content. What web analytics can’t tell you is how users experienced and subsequently benefitted from the content, and isn’t this also what we want to know?

A couple of years ago, the editorial advisory board at AboutKidsHealth (AKH) gathered to review annual analytics. AKH is a web-based health education resource for children, youth and caregivers that is approved by healthcare providers at The Hospital for Sick Children. The resource aims to empower families to partner in their own health care by equipping them with reliable, evidence-based health information. It does this by making complex health information easy to understand for families and making it immediately available whenever and wherever they are in Canada or the world.

I was new to the editorial board and wondered aloud if we couldn’t do more to explore how users experienced and benefitted from the content. It was a lightbulb moment that subsequently gave rise to connecting with Dr Pierre Pluye at McGill University, who had developed and piloted a survey methodology that captures this type of information about web content. We subsequently adopted the Information Assessment Method as a pop-up feedback survey on AKH content pages and are now tracking users’ impressions of content relevance and comprehension, usefulness and intended use of the information, and anticipated benefits. These data will inform content revisions and new content, and will help us better understand user impacts. Pretty neat.

In short, dissemination organizations need more evaluation of their efforts, and they need more opportunities to share what they learn with the field. Some conferences offer appropriate venues for this interchange, but these are few and far between. Years ago, I proposed a practice-based magazine (not a journal) where KTPs and their organizations could share their dissemination and implementation practice-based evaluation work, but I didn’t get any traction. I’m still on the lookout for opportunities to realize this vision.

Shortcoming Two

Most intermediary organizations focus solely on dissemination and do not venture into the implementation space. Dissemination is most definitely an important effort, and for some types of evidence – conceptual and symbolic[1] – this is sufficient if done well. Of late, however, many of my workshop participants have questioned how they can support the implementation of empirically supported innovations when they are not mandated, equipped, or resourced to do so. What, they ask, do we do when we want to go beyond dissemination to support how evidence is applied in practice settings or when it has the potential to inform policy (instrumental use[1])? This seems to be beyond the mandate of many dissemination-focused organizations and represents a need to build knowledge and capacity in implementation science (IS) within the bounds of organizational mandates, resources, and workforce capacity.

What to do

The first solution to this predicament is to ramp up KTPs’ familiarity with implementation science and practice: what it is, what it entails, and how it’s related to but different from dissemination.[2] KTPs have an opportunity to disseminate both the evidence-based intervention and the evidence-based guidance on how it can be taken up, yet this rarely happens. Without implementation guidance, we’re essentially disseminating interventions with the intention that they will be taken up but without the necessary supports to facilitate this.

This is akin to IKEA sending you home with the flat box of bits and pieces only to discover there are no instructions on how to put it together. Exasperation ensues and the box then sits in the corner of a room, untouched. And so goes the outcome potential for reams of evidence-based interventions left languishing because no one thought to include instructions for use. Tsk, tsk.

The intention here is not necessarily to transform KTPs into implementation facilitators[3], but this might be feasible for some organizations that are prepared to build capacity in this area. Herein lies solution two. Dissemination organizations could broaden their mandate to include implementation facilitation. Doing so would be a significant undertaking in workforce development and would take time to do well, but kudos to those who can steer their organization in this direction, for this is what we need. For some types of evidence – that which is ready for use – dissemination is only part of the journey. We can’t arrive at our destination of improved health and well-being without explicitly attending to implementation.

The third solution is to build IS capacity in health research. Researchers can and should do more to consider how their innovations will be used in practice and to incorporate new research designs (hybrid implementation studies), equitable research and engagement practices, and practical implementation facilitation whilst establishing effectiveness rather than after years of randomized trials. Building IS knowledge among health researchers will yield research innovations that are more amenable to application because implementation considerations will be embedded alongside the intervention.

In other words, innovations are only as effective as their complementary implementation guidance, so both must travel the dissemination pathway hand in hand. Effective interventions that are not effectively implemented and optimally used will fail to reap value from existing investments (aka research waste). Implementation considerations are fundamental to intervention development. Leave them too late, and implementation will remain an afterthought that will have you playing catch-up for several more years, further delaying optimal outcomes for the population.

So, there you have it, my thoughts on the shortcomings of evidence dissemination occurring across the world. These are modifiable, and I believe making the recommended shifts will move the field ahead and improve the perceived value of dissemination work and the organizations dedicated to it.


[1] J. M. Beyer (1997) summarizes the three types of research use in the following way: “Research on the utilization of research findings has revealed three types of use: instrumental, conceptual, and symbolic. Instrumental use involves applying research results in specific, direct ways. Conceptual use involves using research results for general enlightenment; results influence actions but more indirectly and less specifically than in instrumental use. Symbolic use involves using research results to legitimate and sustain predetermined positions” (p. 17). Quoted from Amara N, Ouimet M, Landry R. New Evidence on Instrumental, Conceptual, and Symbolic Utilization of University Research in Government Agencies. Science Communication. 2004;26(1):75-106. doi:10.1177/1075547004267491

[2] Dissemination refers to the processes and strategies that make scientific findings accessible and understandable to the knowledge user. Implementation is the use of implementation processes and strategies that promote the adoption, integration, and scale-up of evidence-based interventions and practice change within specific settings. These are related but different methods.

[3] Implementation facilitators are specifically trained to apply implementation science knowledge and interventions to enable others to understand what they need to change, to plan and execute changes, and to address barriers to change efforts. They work with implementation teams within implementing organizations to select innovations, adapt them to the local context, and steer implementation processes, strategies, and evaluation to support implementation. Adapted from Ritchie MJ, Parker LE, Edlund CN, Kirchner JE. BMC Health Serv Res. 2017;17:294. doi:10.1186/s12913-017-2217-0

Thoughts on the Common Challenges of Knowledge Translation Practitioners
https://melaniebarwick.com/thoughts-on-the-common-challenges-of-knowledge-translation-practitioners/ | Thu, 11 Nov 2021

I teach two professional development courses directed at knowledge translation practitioners (KTPs), also commonly known as knowledge brokers. In the decade and a half since these courses launched, it has been interesting to see how the KT workforce has evolved. My workshops have engaged over 3,500 individuals and numerous organizations, and though some participants have been solely in research roles, many have direct responsibility for KT within their organization. There are many more formal KT positions across a range of organization types and sectors than there were a decade ago, yet this profession remains emergent as KT practitioners hone their skills and knowledge, and employers learn how to situate, integrate, and realize value from KT work. In this post, I share some insights and potential solutions for optimizing the KT role within organizations from both the KTP and leadership perspectives.

New Kid on the Block

KTPs are often challenged with defining and clarifying their role and related expectations in organizations in which they are the only KT professional or one of a small few. This often involves carving out their territory while building alliances with more established departments, like communications. Organizations may have the vision to create a KT position but struggle with defining what the KTP will do or how they will know the work has yielded value. Thus, it’s often left to the KTP to establish clarity about how they will do their job, how they will interact with other departments, plan and manage expectations, and demonstrate value. This is a tough road, but it offers some degree of decision latitude that can be used to their advantage. It also requires additional effort to educate colleagues and leaders while doing the more outwardly focused work.

What KTPs Can Do 

You would be hard-pressed to find a KTP who has not faced this situation when newly hired. KTPs report this is a common and daunting challenge, and yet it can be overcome (see our KT Casebook for ideas). I have watched many sole KT positions grow into KT programs with multiple workers and projects, innovations, and impacts (the Evidence-to-Care team at Holland Bloorview Kids Rehab comes to mind here). There are great models and mentors out there.

Networking internally is vital to finding your way and building alliances and integrated workflows with other departments. Take time to meet your colleagues, ask questions, and learn what they do. Tell them how you see your role. Share your KT program plan and look for alignment. Ask how you can support them and work through workflow and timing for these moments. Identify ways they can leverage your goals. Because workplaces are dynamic, internal networking isn’t a one-time occurrence and will require continued effort. Build this time into your workplan.

Networking outwardly is also central to the KTP role. Identify partners and stakeholders and reach out, meaningfully and often. Building relationships that lead to trust, insights, and knowledge will enable you to tap into opportunities for open collaboration (aka integrated KT) and improve your work.

Develop a program plan for how you will work and review it often with your manager and team. The plan should outline how you see your role and how its functions align with the organization’s strategic plan, the work of other departments, and the needs and expectations of internal and external partners and stakeholders. The program plan should reference your KT budget, something you should have confirmed existed before you accepted the position. Think of this as the project management part of your job. The plan for how you will work should include evidence-informed KT methods, frameworks, tools, and resources for how you will approach the planning and execution of dissemination and, possibly, implementation activities, although few KT programs extend beyond dissemination. Useful tools include the Knowledge Translation Planning Template, for instance. A great case example is the work of Renira Narrandes, a former KTPC* alum. Renira is now KTPC faculty and teaches a segment that showcases the KT program and evaluation framework she developed for the Cundill Centre at the Centre for Addiction and Mental Health in Toronto. She’s an innovator.

Develop a career plan that provides opportunities for growing your KT skills and your career within and beyond the organization. Be wary of employers who don’t have a vision for what KT means for their organization or don’t value how it can help them achieve their vision. Ensure there is opportunity for compensation growth over time; organizations that have a narrow or undervalued salary range and/or no opportunities for leadership positions will eventually leave you feeling stuck with nowhere to grow. Ensure there are education funds available to you so that you can cultivate your competencies and attend networking events like the Canadian Knowledge Mobilization Forum. Lastly, don’t underestimate the value of volunteering at KT organizations to help you get a start in the field; this has started some great careers.

Evaluate, evaluate, evaluate! Not only is the evaluation of KT activities essential to your success, but it is also the foundation for your career growth. Without evidence of what you have produced, how it has led to impact, and how these align with what your organization values, your manager will have little knowledge of what you do, why it’s important, and what it’s worth when they do your performance evaluation. They also won’t be able to amplify your successes. Failing to evaluate your work is shooting yourself in the foot. Let me stress here that capturing KT productivity – that is, the KT activities and outputs you produced – is only one of the things you should be doing, and sadly, it’s often the only thing KTPs do. There are countless missed opportunities for evaluating whether your KT activities and outputs met their intended target (Hey, SKTT and KTPC alums, remember the KT Goal?!). If you followed the core components of good KT planning, then you disseminated with purpose. Without intentional evaluation you cannot demonstrate that you achieved KT goals or that your KT activities and outputs had any impact on the intended audiences. Impacts are your KT stories and they provide a direct line to communicating your value and your contribution to the organization. Yes, your manager wants to know that you’ve been busy doing KT, but few will have the foresight to ask you to evaluate and demonstrate whether the KT activities have led to benefits for the knowledge users. KTPs must take the reins here and build evaluation into how they do the work, and then they must showcase their impacts. Yes, I’m saying you must KT your KT. Impacts translate to stories that resonate and to career opportunities and advancement. Good evaluation doesn’t happen as often as it should. Don’t ‘phone it in’.

Insert yourself into the other work of the organization. It’s easy to develop myopia in any job. Cast your eyes and ears to what is going on in other parts of the organization and beyond: in particular, the strategic plan, the needs of the funder and key partners, the system, and the people who are impacted by what you do. Volunteer for working groups, consult with researchers, be helpful. It’s reciprocal.

Manage up and build relationships with your manager and other leaders. This can offer important opportunities to build their understanding of KT, how you can support the organization, and what you have accomplished (impact stories). Look for opportunities to explore how the organization thinks about and values KT and impact. There are few organizations that don’t think about impact but many that don’t know how to capture it; you can do this for them. Check out and share tools that will help organizations assess their impact potential, like Emerald Publishing’s Real Impact Institutional Healthcheck Workbook developed by Dr David Phipps from York University in Toronto and Julie Bayley from the University of Lincoln in the UK. This is innovation in KT.

Identify important drivers for change that can help you build a business case for the value of KT in your organization. Four key drivers of KT relevance are top of mind. First, research funders often require KT plans within proposals. This is good leverage, so provide support for your research colleagues, if you have them. Second, universities now increasingly recognize KT and community engagement activities as bona fide scholarly criteria for promotion – check out Creative Professional Activity in the Temerty Faculty of Medicine promotion manual at the University of Toronto. Support your internal academics in this endeavour. Third, Ontario has instituted performance-based funding for postsecondary institutions, which means Ontario universities will now be assessed on their impacts to determine the level of support they receive from the province. Not a perfect set of metrics, but it’s a start. Lastly, all organizations want their work to be impactful and widely recognized. You can help them shape this.

Check in with reality. Your job is to promote and disseminate research evidence, but those who produce it will always believe their job ends with the discovery. We know they’re only partway to their destination and that discovery without dissemination and implementation is unfinished business. Be gentle but persistent in leading them along the path. Much of this job requires continuous capacity building from the inside.

Find your champions. You can and should build your career on the strength of your own merits. But it doesn’t hurt to find leaders who will toot your horn and shine a light on your star. Work it. They can open doors and help you manage up.

Find your mentors. KT as a specialty and profession is old enough to have more senior experts who are often quite accessible. Seek them out in times of need. There is also a knowledgeable peer group in this profession – the KTE Community of Practice.

Step out of your sandbox. Those who have taken my courses may recall that I’m a strong advocate for the power of open collaboration. Being great at what you do will require you to innovate, and to do that, you need to step outside your sandbox. Read and explore outside your profession and stretch into the boundaries of what others are learning and doing. Innovate and showcase your successes – write, talk, share. 

__________

What Leaders Can Do 

We include KTP supervisors in several ways in our KTPC workshop because they are crucial to enabling KTPs to do the work and to building a KT culture in the organization. Facilitative leadership is an important driver for change. Simply building skills is a useless endeavour unless the organizational conditions enable those skills to be applied and realized. Optimizing the value of the KTP role requires employers to consider several things ahead of the hire and on an ongoing basis, in collaboration with the KTP.

How does KT align with the organization’s mission, vision, and strategic plan? More broadly, how does the organization think about impact, and what impacts does it want to achieve? Emerald’s Impact Workbook, mentioned above, is a great exercise for the leadership team to do in collaboration with their KTPs.

Where should the KT program be relative to the rest of the organization in five and ten years’ time? How will leadership accommodate and promote that growth? What are the desired outcomes for the KT work, and how do you carve a path to enable them?

Is there a KT program budget beyond the KTP salary, and is that salary sufficiently competitive that you can retain your KTPs? As in any other business, if your KTPs are really great at what they do, someone will poach them, and rightly so. Keep them growing.

Lastly, consider that if you’ve not enabled your KTP to succeed with KT that is intentional, explicit, and systematic, and this includes KT evaluation, then they will not be able to demonstrate their impacts. This means the organization as a whole will not be able to either. Impacts start with researchers and KTPs and roll up from there. When it comes time to showcase what you do, you can’t ask people for impact evidence they don’t have.

__________

My hope is that this blog post provides food for thought and some steppingstones for improving capacity for knowledge translation work and the KTP profession. We’ve come a long way, but the journey continues. Glad to be a part of it, and from where I sit, it’s been fascinating to meet and learn from these groundbreakers.

*Knowledge Translation Professional Certificate
