It Takes Two to Tango: Knowledge Mobilization and Ignorance Mobilization in Science Research and Innovation

Gaudet, J. (2013). It takes two to tango: Knowledge mobilization and ignorance mobilization in science research and innovation. Prometheus, 31(3), 169-187. http://dx.doi.org/10.1080/08109028.2013.847604

Abstract

The main goal of this paper is to propose a dynamic mapping for knowledge and ignorance mobilization in science research and innovation. An underlying argument is that ‘knowledge mobilization’ science policy agendas in countries such as Canada and the United Kingdom fail to capture a critical element of science and innovation: ignorance mobilization. The latter draws attention to dynamics upstream of knowledge in science research and innovation. Although perhaps less visible, there is ample evidence that researchers value, actively produce, and thereby mobilize ignorance. For example, scientists and policymakers routinely mobilize knowledge gaps (cf. ignorance) in the process of establishing and securing research funding to argue the relevance of a scientific paper or a presentation, and to launch new research projects. Ignorance here is non-pejorative and by and large points to the borders and the limits of scientific knowing – what is known to be unknown. In addition, processes leading to the intentional or unintentional consideration or bracketing out of what is known to be unknown are intertwined with, yet remain distinct from, knowledge mobilization dynamics. The concepts of knowledge mobilization and of ignorance mobilization, respectively, are understood to be the use of knowledge or ignorance towards the achievement of goals. The value of this paper lies in its conceptualization of the mobilization of knowledge as related to the mobilization of ignorance within a complex, dynamic and symbiotic relationship in science research and innovation: it takes two to tango.

Joanne Gaudet (@gaudetj_99) guest blogged about ignorance mobilization on Mobilize This! in January 2013. Her paper was published later that year and I thought it would be interesting for the journal club to step back from our practices and think about the words we use to describe our work.

Ignorance mobilization needs to be defined, but before we can define ignorance mobilization we need to define ignorance – not in a pejorative sense, but “as the limits and borders of knowing – what scientists know that remains unknown in any given area of science.” Ignorance is the lack of knowledge, and for scientists that means the questions that arise from the discovery of new knowledge. As we discover new knowledge we raise questions about what remains unknown. That is ignorance, and it can be mobilized. When a researcher publishes a scientific paper s/he almost always writes about what questions (whose answers are unknown) arise from the project, or what the next steps will be to learn what is unknown.

In a reference to the title of the article, Joanne explains how knowledge and ignorance are interconnected. “Knowledge remains provisory in relation to ignorance (and vice versa) in a complex, dynamic and symbiotic relationship: it takes two to tango.”

Furthermore, ignorance drives innovation. Research seeks to know what is unknown and develops new approaches to products, policies and services – approaches that can only be improved by mobilizing ignorance to develop new knowledge.

Are you with me?

Knowledge mobilization is mobilizing what is known. Ignorance mobilization is mobilizing what is unknown.

If you are mobilizing knowledge, you are disseminating what is already known by making knowledge accessible in alternative formats to enhance its uptake by wider audiences than traditional scientific publication reaches. CIHR calls that end-of-grant KT. I call it dissemination.

If you are mobilizing ignorance, you are working to develop knowledge about what is unknown. Anyone practicing integrated KT (i.e. collaborations between academic researchers and non-academic partners) is actually practicing ignorance mobilization. Any collaboration starts with a shared interest in a question about something that is unknown. In other words, a collaboration starts with a shared interest in ignorance. Collaborating on ignorance is ignorance mobilization.

We disseminate knowledge but we collaborate on ignorance.

Are you with me now?

Ignorance drives innovation (see above) and is also used to inform policy. Joanne claims that “when policymakers pay attention to expert opinion on knowledge gaps, they are mobilizing ignorance – using active non-knowledge in attempts to reach political goals.” Helping end users achieve a goal is the raison d’être of knowledge and ignorance mobilization.

I find this paper interesting, but I do not see how it helps my knowledge mobilization practice except to give me a new perspective on what I am doing. The models she presents of interactive knowledge and ignorance mobilization are from her thinking in 2011 or 2012 (the paper was published in 2013). Our understanding of pathways from research to impact, and of the role of knowledge brokers in both dissemination and integrated methods of knowledge/ignorance mobilization, has become more sophisticated. I don’t recommend getting lost in her models. But I do encourage all of us to take a step out of the weeds of our daily work and consider other perspectives.

Questions for brokers:

1. What percentage of your effort is spent mobilizing knowledge (dissemination or end-of-grant KT) vs. supporting collaborations on ignorance (integrated KT)?

2. Ignorance mobilization: an interesting theoretical concept or a distraction from getting my job done? Discuss.

3. How much time do you wish you had to think about your work to better inform how you do it?

Research Impact Canada is producing this journal club series as a way to make the evidence and research on KMb more accessible to knowledge brokers and to create online discussion about research on knowledge mobilization. It is designed for knowledge brokers and other knowledge mobilization stakeholders. Read the article, then come back to this post and join the journal club by posting your comments.

The “Dark Side” of Knowledge Brokering

Kislov, R., Wilson, P. & Boaden, R. (2016). The “dark side” of knowledge brokering. Journal of Health Services Research & Policy, 22(2), 107–112. http://journals.sagepub.com/doi/full/10.1177/1355819616653981

Abstract

Deploying knowledge brokers to bridge the ‘gap’ between researchers and practitioners continues to be seen as an unquestionable enabler of evidence-based practice and is often endorsed uncritically. We explore the ‘dark side’ of knowledge brokering, reflecting on its inherent challenges which we categorize as: (1) tensions between different aspects of brokering; (2) tensions between different types and sources of knowledge; and (3) tensions resulting from the ‘in-between’ position of brokers. As a result of these tensions, individual brokers may struggle to maintain their fragile and ambiguous intermediary position, and some of the knowledge may be lost in the ‘in-between world’, whereby research evidence is transferred to research users without being mobilized in their day-to-day practice. To be effective, brokering requires an amalgamation of several types of knowledge and a multidimensional skill set that needs to be sustained over time. If we want to maximize the impact of research on policy and practice, we should move from deploying individual ‘brokers’ to embracing the collective process of ‘brokering’ supported at the organizational and policy levels.

This is not a research article but one grounded in the literature and in the authors’ experiences as knowledge mobilization practitioners. That makes it potentially more interesting, since it is based on lived experience, not detached observation.

The authors critically assess the usual position of the “knowledge broker as an unquestionable enabler of evidence-based medicine, enhancing the flow of knowledge between researchers and practitioners” by examining the “growing evidence about the unintended consequences of deploying knowledge brokers in health care which are often overlooked”. This critical lens is an important juxtaposition to the many articles that report positive outcomes of knowledge brokering.

This journal club has in the past presented articles that take a critical look at brokering, including the challenging career paths of knowledge brokers. In addition, Dobbins et al. published on the lack of evidence for effective knowledge brokering in a randomized controlled trial. The authors of the current article point out a number of challenges brought about by the intermediary nature of knowledge brokers. These include:

• The tendency of knowledge brokers to focus on information management techniques as opposed to linkage/exchange and capacity building, since the latter are more time-intensive and the former are more easily standardized and evaluated in practice

• The focus on explicitly codified knowledge vs. tacit knowledge

• The temptation to slip from facilitating knowledge uptake to doing knowledge uptake on behalf of end users, especially if the broker is a subject matter expert

• The ambiguity and instability of the “in-between-ness” of knowledge brokers

• Lack of role clarity, guidance and career path

The authors propose a shift from individual knowledge brokers to knowledge brokering teams. “We call for a major shift from this perspective towards embracing knowledge brokering as an inherently collective process unfolding at the team level and actively supported by the broader organization. If we want to maximize the impact of research on policy and practice, we should move from deploying ‘brokers’ to embracing ‘brokering’.”

The authors describe what needs to happen to accomplish this:

“The first step in this direction is to foster brokering teams composed of people with different professional backgrounds and having complementary skills. These skills should combine those needed for successful information management, linkage and exchange, and capacity building with broader clinical, managerial and contextual knowledge. This may be achieved by the involvement of academics, clinicians, managers, information scientists and service users.”

“Second, organizations deploying knowledge brokers should recognize brokering as part of their ‘core’ business, providing a range of learning, development and promotion opportunities to staff occupying the ‘in-between’ roles. Supporting the knowledge brokers’ communities of practice and creating regional or national forums for staff occupying intermediary roles can help alleviate their sense of isolation and enable peer-to-peer learning.”

Questions for brokers:

1. Are you a lone broker in your organization/project? Do you experience this dark side?

2. The authors call for communities of practice and regional/national forums. Have you signed up (for free) for the KTE CoP (mainly Canadian) or the Knowledge Brokers Forum (international) or the Knowledge into Practice Learning Network (international)? Do you attend the Canadian Knowledge Mobilization Forum?

3. The authors state, “organizations deploying knowledge brokers should recognize brokering as part of their ‘core’ business.” Is knowledge mobilization a priority in your organization’s strategic plan?

Research Impact Canada is producing this journal club series as a way to make the evidence and research on KMb more accessible to knowledge brokers and to create online discussion about research on knowledge mobilization. It is designed for knowledge brokers and other knowledge mobilization stakeholders. Read the article, then come back to this post and join the journal club by posting your comments.

Faculties of education and institutional strategies for knowledge mobilization: An exploratory study

Sá, C.M., Li, S.X. & Faubert, B. (2011). Faculties of education and institutional strategies for knowledge mobilization: An exploratory study. Higher Education 61(5), 501-512. doi:10.1007/s10734-010-9344-4.

You can request a copy of the article from ResearchGate here

Abstract

The goal to enhance the impacts of academic research in the ‘real world’ resonates with progressive visions of the role of universities in society, and finds support among policy makers who have sought to enhance the ‘transfer’, ‘translation’, ‘uptake’, or ‘valorization’ of research knowledge in several areas of public services. This paper reports on an exploratory study of the strategies used by selected Canadian and international faculties of education to mobilize research knowledge. Drawing on data from semistructured interviews with senior administrators of 13 faculties of education, the analysis reveals several themes. Academic leaders recognize knowledge mobilization as a desirable institutional mission, but they identify a number of barriers to greater efforts in this area. Although a number of strategies are employed, changes across multiple organizational dimensions to encourage and support knowledge mobilization were reported at only two institutions. These results are relevant to faculty administrators, scholars, and policymakers interested in understanding the role of academic institutions in the mobilization of research knowledge to the broader education community.

This is another journal club about an article on institutional roles in knowledge mobilization. You can see recent others here and here. I have been writing about these recently because I think we have lots of literature about the functions of researchers, partners and knowledge brokers but less on what institutions can do to support the work of the people who actually (co-)create the evidence and translate that into impact.

Should academic institutions play a role in knowledge mobilization? The authors acknowledge the dissenting opinions. “The notion that universities should take deliberate steps in this area is also subject to criticism. Some construe these efforts as an encroachment of utilitarian and instrumental views of the role of universities in society; others believe that emphasizing the external impact and uptake of research threatens forms of inquiry that do not lend themselves to immediate applications.” And then they move on. Nicely done.

If we accept that knowledge mobilization and related activities are functions of every SSHRC, CIHR, CRC, CFI (see Benefits to Canada) and health charity grant application, and of most NSERC grant applications, then I believe it is incumbent on institutions to support their faculty and students in these endeavours.

The authors quote literature identifying potential roles of institutions. Knowledge mobilization can be facilitated in academic institutions through changes in five areas: revising promotion and tenure guidelines to encourage and reward knowledge mobilization; providing funding and organizational resources such as opportunities for networking, skills training and administrative support; developing facilitating internal structures, such as dedicated offices; enhancing organizational orientation towards knowledge mobilization; and standardizing knowledge mobilization practices within the institution.

Five areas…how is your institution doing?

One area that didn’t come up in this research is the role of institutional planning. If knowledge mobilization and the impacts of research are not in the university academic plan or strategic research plan (or the equivalent in your institution), then knowledge mobilization will only be a marginal activity on soft money. Planning drives resources, which then enable activities like the five areas above. This is mentioned obliquely under “institutional priority and supports”, but the paper then goes on to describe supports and not the role of planning in identifying institutional priorities.

There were five Canadian Faculties of Education among the 13 interviewed. Only two (Melbourne and London, UK) had structures to support knowledge mobilization.

Sorry, but this is wrong because the methodology was focused solely on the Faculties of Education. York University (a participating university) has a centralized knowledge mobilization unit under the VP Research & Innovation that provides services across the campus, including to the Faculty of Education. Asking the Faculties to speak about their broader organizational contexts would have picked this up.

Table 2 identifies the barriers to Faculty-level supports for knowledge mobilization, including money, time, divided attitudes, difficulty assessing outcomes and setting targets, and the fact that it is hard, among others. Wait…we don’t do it because it is hard? Give me a break. That’s a justification for needing professional supports, not a justification for not doing it when every SSHRC grant (which funds education research) requires it.

Rant over. Journal club almost over.

Questions for brokers:

1- The literature identifies five areas where institutions can support knowledge mobilization. How is your institution doing?

2- Is knowledge mobilization part of your institution’s planning? If not, how might you move this into institutional planning?

3- Institutional supports for knowledge mobilization: Faculty-based or a central service unit. Which is better?

Research Impact Canada is producing this journal club series as a way to make the evidence and research on KMb more accessible to knowledge brokers and to create online discussion about research on knowledge mobilization. It is designed for knowledge brokers and other knowledge mobilization stakeholders. Read the article, then come back to this post and join the journal club by posting your comments.

Institutionalising Evidence-Based Policy: International Insights into Knowledge Brokerage

Lenihan, A. T. (2015). Institutionalising evidence-based policy: International insights into knowledge brokerage. Contemporary Social Science, 10(2), 114-125.
http://www.tandfonline.com/doi/full/10.1080/21582041.2015.1055297

Abstract

Numerous organisations act as ‘evidence brokers’, providing and translating research for use by decision-makers. The relationship between the supply and demand for evidence is far from linear, and whether these organisations are self-professed evidence brokers or government appointed bodies, they face similar challenges in their quest to impact policy. This paper analyses the strategies of two organisations considered ‘exemplars’ of institutional knowledge brokerage: the CPB Netherlands Bureau for Economic Policy Analysis and the Washington State Institute for Public Policy. The author posits that three primary factors help these organisations connect evidence successfully to policy-makers: the institution’s credibility, based on independence, neutrality, reputation, trust, transparency and the quality of its methods and evidence; the utility of its research, based on transferability, timing, stakeholder involvement and resonance with policy-makers; and the communication of that research, in terms of effectiveness, dissemination, presentation and translation for policy-makers. Findings, and the possibility of applying these insights internationally, are then discussed and contextualised.

This article isn’t open access, sorry. If you want a copy you can request one from Dr. Lenihan directly at http://lse.academia.edu/AshleyThomasLenihan

This is another article that discusses the role of organizations as knowledge brokers. I recently reviewed another article on this topic. As opposed to that previous article, which was about networks as knowledge brokers, this article is about discrete organizations working as knowledge brokers to inform policy decisions. It is good to see literature about the role of knowledge brokering organizations to complement the literature on human knowledge brokers.

Critique #1: The author frames the status quo as the “two communities” approach (Caplan, 1979). We have long abandoned such reductionist views. Policy and research might sit in different organizational constructs, but researchers and policy makers share spaces, collaborate and exchange.

Studying two different knowledge brokering organizations, the author concludes that there are three organizational factors that define successful knowledge brokering organizations:

1. The institution’s credibility
2. The utility of its research
3. The communication of that research

Makes sense, no argument here. What I find interesting is to compare that to the three elements of the PARIHS framework that speak to the factors that enable evidence use:

1. The credibility of the evidence itself
2. The context of its use
3. The facilitation dedicated to supporting uptake of evidence

It seems there are elements of our work that transcend context: the factors that support evidence use are similar to the factors that support institutional knowledge brokering. I recently blogged on an article I published about context being important, but not for the reasons you think.

Critique #2: “Effective communication of research to policy makers can be paramount to its utilisation”. Not wrong, but certainly incomplete. Metrics of communications (e.g. social media analytics, downloads, media impressions) are at best proxies of use. You can have use without communication, but you can also communicate the heck out of your evidence and never get it used. We know from the evidence on evidence use that integrated (i.e. collaborative) forms of knowledge brokering are more effective than dissemination (i.e. communication) forms. Don’t just communicate evidence; actively facilitate its uptake by, for example, running engaged workshops with decision makers.

Here’s a fun fact: both organizations studied were inside government. They appear to be working as independent (“We are specifically non-partisan. We do what we are assigned and we don’t make recommendations”) research organizations (i.e. think tanks) generating evidence for policy decision making. This doesn’t affect the utility of the evidence or how it is communicated, but I feel it calls into question the organizations’ claims to credibility. If it is by government for government, then government has to claim it is credible. If it isn’t credible, then government will shut it down.

Questions for brokers

1. Why do you think that certain enabling qualities can apply to evidence and organizations? What underpins this commonality?

2. Both organizations are independent evidence brokering organizations. If they are inside government, how independent can they be (don’t forget to reference Stephen Harper in your discussion)?

3. How might these three factors underpinning successful knowledge brokering organizations map onto universities seeking to also provide evidence to policy makers? How does a university ensure credibility, utility and communication of evidence when it isn’t the university, but the researcher, who is generating the evidence?

RIR is producing this journal club series as a way to make the evidence and research on KMb more accessible to knowledge brokers and to create online discussion about research on knowledge mobilization. It is designed for knowledge brokers and other knowledge mobilization stakeholders. Read the article, then come back to this post and join the journal club by posting your comments.

The Impact of Higher Education Leadership, Management and Governance Research

Morrow, S. (2016). The Impact of Higher Education Leadership, Management and Governance Research: Mining the 2014 Research Excellence Framework Impact Case Studies. Report from Leadership Foundation for Higher Education.

Elizabeth Morrow is an independent researcher who produced this report for the Leadership Foundation for Higher Education. It is an exhaustive review (121 pages) of 1,309 Research Excellence Framework (REF) impact case studies from leadership, governance and management (LGM) research. This report is important not for LGM research (well it is…but that’s not why I find it important) but because it is a different look at the REF. The REF is the only system-wide (every UK university participated) research impact assessment process in the world, so we can learn a lot about the opportunities and challenges of research impact assessment by studying the REF.

Why should knowledge mobilizers care about the REF? We care because impact is what we want to achieve and knowledge mobilization is one way to maximize impact. When evaluating our work, impact is the dependent variable and knowledge mobilization is the independent variable (see more on this here). So we should care about the REF if we want to measure the effects of our knowledge mobilization efforts.

This is such a big report it is impossible for me to pull out all the good and interesting sections so I will use the limited space of this journal club to present the Adaptive Systems Framework for Advancing Research (AS-FAR) model. AS-FAR appears on page 86 of the report and is summarized on this HEFCE blog post. Click that link now and take a look at the model.

I wrote to Elizabeth Morrow. I thanked her and observed that the report, and hence AS-FAR, is derived from retrospective cases, and that I struggled to think about applying AS-FAR in a prospective manner despite the report claiming this was possible. My world is mostly about helping researchers and partners collaborate to co-produce research that both has academic merit and is useful for the partners who create the impact. I struggled to work through AS-FAR in this prospective, co-production fashion. I found AS-FAR more easily understood retrospectively: collecting evidence and creating the story of impact using transfer and exchange mechanisms.

I frequently critique academic frameworks since they describe a gold standard process or model yet I work in a world that never hits a gold standard. I find many frameworks describe what should be and I work in messier realities of what is. I struggle with thinking how I would implement AS-FAR sitting across from a research partnership helping them develop their impact strategy for their grant application.

Elizabeth responded acknowledging the complexity and indicated that her current project is developing a Research Leader’s Impact Toolkit, focusing more on what research leaders can do to support researchers and future impact. We need this toolkit because AS-FAR as a model is challenging. We need tools to cut through complexity, working with it but not letting it become a barrier. As practitioners we accept imperfection and work with it to get to impact. I am not certain AS-FAR helps, but the tools Elizabeth is developing might.

Stay tuned!

Questions for brokers:

1. You’re probably deep in the trenches helping to create impact. How do you assess it? Do you have a particular framework that guides you (and don’t say the Knowledge to Action Framework, for all the reasons here)?

2. Would REF work in Canada? We better figure this out because my bet is Canadian universities will be held accountable for impact at some point.

3. Do we have enough frameworks? Don’t we need more practical guidance on their use (and some tools would be nice)?

ResearchImpact-RéseauImpactRecherche is producing this journal club series as a way to make the evidence and research on knowledge mobilization more accessible to knowledge brokers and to create online discussion about research on knowledge mobilization. It is designed for knowledge brokers and other knowledge mobilization stakeholders. Read the article. Then come back to this post and join the journal club by posting your comments.

Knowledge Mobilization/Transfer and Immigration Policy: Forging Space for NGOs – the Case of CERIS – The Ontario Metropolis Centre

Shields, J., Preston, V., Richmond, T., Sorano, Y., Gasse-Gates, E., Douglas, D., Campey, J. & Johnston, L. (2015). Knowledge mobilization/transfer and immigration policy: Forging space for NGOs – the case of CERIS – The Ontario Metropolis Centre. Journal of International Migration & Integration, 16(2), 265–278.

http://link.springer.com/content/pdf/10.1007%2Fs12134-015-0425-1.pdf

Abstract

The role of evidence-based knowledge and research in informing immigration and settlement policy is an important but under-examined area of inquiry. Knowledge for evidence-based policy-making is most likely to be useful to policymakers when it is produced collaboratively through sustained engagement between academic and non-academic stakeholders. This paper seeks to explore the role of non-governmental organizations in evidence/research-centred knowledge mobilization/transfer by a case study of CERIS—The Ontario Metropolis Centre, one of five immigration research centres in Canada that promoted partnerships to facilitate ongoing, systematic and timely exchange of social science knowledge. We explore the strategies and outcomes of establishing and maintaining relationships among academic researchers, representatives from non-governmental organizations and government policymakers. The experience at CERIS underscores the potential benefits from partnerships with non-governmental organizations that have detailed local knowledge of immigration and settlement issues and highlights the persistent challenges of funding and power imbalances that impede equitable and effective partnerships. The CERIS experience offers valuable insights into successful knowledge exchange from which the local, national and international immigration policy community can learn.

This article arises from a long-standing government-academic-NGO collaboration on immigration and settlement. While the dedicated Metropolis funding has sadly dried up, the legacy of the collaboration continues to generate impact for academic and non-academic partners. This article is about the role of NGO partners in KT/KM and illustrates not only what they get from a collaboration but also what expertise they bring to the table. CERIS is itself a knowledge brokering organization – check out last month’s journal club post for more on institutional knowledge brokering.

This is a story about power and the “power imbalances that impede equitable and effective partnerships” – although on this very important point I would encourage the authors to go further than they did. The conclusion is correct, in my opinion, but I would like to have seen more in the analysis, which was devoted mainly to the research symposia (see below).

The role of local NGOs is best summed up by the authors. “Locally, each centre was also required to establish partnerships with representatives from local NGOs that are service providers and advocates for newcomer populations. Since each centre was to concentrate on issues in its own region, community partners provided critical familiarity with local immigration and settlement issues and crucial contacts to facilitate the research, its dissemination and its use.” But even here the authors go one step further than working with NGOs simply for access to communities, yet they modestly do not mention this in the article itself. Check out the author list.

Two of the authors are from the Ontario Council of Agencies Serving Immigrants. CERIS is predicated on a collaboration that involves non-academic partners throughout, including in governance, decision making, dissemination in peer reviewed literature and in the use of the evidence by NGO partners. It is this mutual involvement in all stages of the research to impact process that makes this an authentic partnership.

This is particularly justified in policy research. “It is virtually impossible to draw a straight-line link between research and policy decision. Rather, ongoing knowledge exchange in which all participants contribute to the identification of research questions, their investigation, the interpretation and presentation of research findings and their dissemination is most likely to result in relevant knowledge and its actual utilization in policy-making.”

In an authentic partnership, explicit efforts are made to share power. Authentic partnerships inform the research that is conducted, helping to co-create evidence that can have academic integrity and inform decisions about public policy and social services. As previously covered in this journal club, it is engaged scholarship, not knowledge transfer, that helps get research evidence used by non-academic partners.

My only complaint about the paper is that the authors set up this amazing collaboration with CERIS as a knowledge brokering organization. They go to great lengths to balance power differentials and then spend the majority of the analysis (4 pages) on the annual Community Research Symposium. I’m sure it’s a great symposium. But it’s only the public and most visible example of the collaboration that is CERIS. There is so much more to CERIS that accounts for its success as a knowledge brokering organization. In their own words, “In the end, KT and KM are about relationship building, not simply about doing and disseminating research.”

Questions for brokers

It’s not about academic supply of and community demand for knowledge. It’s about valuing the different yet complementary expertise of academics, policy makers, community partners and people with lived experience. How do you tell an academic researcher that s/he doesn’t know it all?

Assessing the impact of a symposium (a discrete activity) is far easier than assessing the impact of a long standing collaboration. What advice would you give John Shields and his colleagues in the event they are considering the next paper about assessing their efforts to balance power in the collaboration?

What methods do you use to balance power between academic and non-academic (especially NGO) partners?

ResearchImpact-RéseauImpactRecherche is producing this journal club series as a way to make the evidence and research on knowledge mobilization more accessible to knowledge brokers and to create online discussion about research on knowledge mobilization. It is designed for knowledge brokers and other knowledge mobilization stakeholders. Read the article. Then come back to this post and join the journal club by posting your comments.

Collective action for implementation: a realist evaluation of organisational collaboration in healthcare

Rycroft-Malone, J., Burton, C.R., Wilkinson, J., Harvey, G., McCormack, B., Baker, R., Dopson, S., Graham, I.D., Staniszewska, S., Thompson, C., Ariss, S., Melville-Richards, L., & Williams, L. (2016). Collective action for implementation: A realist evaluation of organisational collaboration in healthcare. Implementation Science, 11(17). DOI: 10.1186/s13012-016-0380-z

http://implementationscience.biomedcentral.com/articles/10.1186/s13012-016-0380-z

Abstract

Background: Increasingly, it is being suggested that translational gaps might be eradicated or narrowed by bringing research users and producers closer together, a theory that is largely untested. This paper reports a national study to fill a gap in the evidence about the conditions, processes and outcomes related to collaboration and implementation.

Methods: A longitudinal realist evaluation using multiple qualitative methods case studies was conducted with three Collaborations for Leadership in Applied Health Research in Care (England). Data were collected over four rounds of theory development, refinement and testing. Over 200 participants were involved in semi-structured interviews, non-participant observations of events and meetings, and stakeholder engagement. A combined inductive and deductive data analysis process was focused on proposition refinement and testing iteratively over data collection rounds.

Results: The quality of existing relationships between higher education and local health service, and views about whether implementation was a collaborative act, created a path dependency. Where implementation was perceived to be removed from service and there was a lack of organisational connections, this resulted in a focus on knowledge production and transfer, rather than co-production. The collaborations’ architectures were counterproductive because they did not facilitate connectivity and had emphasised professional and epistemic boundaries. More distributed leadership was associated with greater potential for engagement. The creation of boundary spanning roles was the most visible investment in implementation, and credible individuals in these roles resulted in cross-boundary work, in facilitation and in direct impacts. The academic-practice divide played out strongly as a context for motivation to engage, in that ‘what’s in it for me’ resulted in variable levels of engagement along a co-operation-collaboration continuum. Learning within and across collaborations was patchy depending on attention to evaluation.

Conclusions: These collaborations did not emerge from a vacuum, and they needed time to learn and develop. Their life cycle started with their position on collaboration, knowledge and implementation. More impactful attempts at collective action in implementation might be determined by the deliberate alignment of a number of features, including foundational relationships, vision, values, structures and processes and views about the nature of the collaboration and implementation.

With the use of the term “collective action” I first thought (i.e. hoped) this article might be linking implementation science with collective impact (a theoretical framework from the social innovation literature) – now that would have been cool! Small moment of disappointment…nonetheless, this is an article about what organizations can do to create a culture of effective evidence use. The paper examined three CLAHRCs (Collaborations for Leadership in Applied Health Research in Care), which are funded by the UK government to expressly link academic research with clinical practice. They are essentially big experiments in implementation science. I like this paper because it focuses on organizations as opposed to individuals – an interesting contribution.

The paper argues that much practice has moved beyond creation, translation and transfer of knowledge products (e.g. clinical practice guidelines) to integrated or collaborative processes. “In theory, collaborations could blur this academic-practice boundary and the evidence would be co-produced within communities of practice, increasing the relevance to that community and its potential use.” This calls for organizational collaboration and the need for boundary-spanning functions (like knowledge brokers; “these roles and individuals were an essential CLAHRC component”) to overcome barriers between research and practice organizations.

The authors conclude, “collective action in implementation might be determined by the deliberate alignment of a number of features, including foundational relationships, vision, values, structures and processes and views about the nature of the collaboration and implementation.” Wait…doesn’t this sound familiar?

Think about your own efforts at getting evidence into use. They probably also depend on factors like relationships, shared vision, values, structures and processes, etc. At the end of the day these are the same determinants of successful evidence use/knowledge mobilization/implementation derived from studies of individual interventions as well as from this study at the organizational level. And this shouldn’t be a surprise, since an organization is composed of individuals, so the characteristics of successful implementation at the individual level will translate up to the organizational level.

However, some of the recommendations are more applicable to organizations, including leadership, governance and investing in boundary spanners.

One key determinant was the “quality of existing relationships between higher education and health service”. Essentially, your implementation efforts will be more successful if you have pre-existing relationships among the partners, because this bases the efforts on trust. “A history of working together catalysed collective action (and therefore, impacts) in a shorter time frame.”

This sounds obvious for individuals but possibly not for organizations. For some more thinking on organizational relationships read about the institutional knowledge mobilization collaboration between York University and United Way of York Region (now United Way Toronto & York Region).

Questions for brokers:

1. If the determinants of successful implementation are similar for individuals and organizations what might be different in how these play out (i.e. incentives, complexity, concepts of individual vs organizational collaborations; competencies and qualities)?

2. How might you apply organization-specific determinants to your individual efforts? For example, how might you apply leadership, governance of your collaboration and boundary spanners to your implementation interventions?

3. If we agree that much practice has moved beyond creation, translation and transfer of knowledge products (e.g. clinical practice guidelines) to integrated or collaborative processes, why do we still see a focus on “bridging the gap” between evidence produced in research and its use in practice settings? Should we be “closing the loop”, or using some other metaphor that overcomes the siloes of research evidence and its use in practice? Why do we have repositories of “what works” when what works in your collaboration cannot be easily implemented in mine?

ResearchImpact-RéseauImpactRecherche is producing this journal club series as a way to make the evidence and research on knowledge mobilization more accessible to knowledge brokers and to create online discussion about research on knowledge mobilization. It is designed for knowledge brokers and other knowledge mobilization stakeholders. Read the article. Then come back to this post and join the journal club by posting your comments.

Top 5 Most Popular KMb Journal Club Posts of 2016

Here’s a recap of the five most read KMb Journal Club posts of 2016.

#1 – 184 views – 9 comments

Knowledge Translation Through Evaluation: Evaluator as Knowledge Broker

#2 – 121 views – 1 comment

Incorporating Community Engagement Language into Promotion and Tenure Policies: One University’s Journey

#3 – 98 views

Listening to the Stars: The Constellation Model of Collaborative Change

#4 – 77 views

Creating Research Impact: The Roles of Research Users in Interactive Research Mobilisation

#5 – 58 views – 1 comment

Reinventing the Civic University

Knowledge Translation Through Evaluation: Evaluator as Knowledge Broker

Donnelly, C., Letts, L., Klinger, D. & Shulha, L. (2014). Knowledge translation through evaluation: Evaluator as knowledge broker. Canadian Journal of Program Evaluation, 29(1), 36–61. doi: 10.3138/cjpe.29.1.36

http://www.evaluationcanada.ca/secure/29-1-036.pdf

Abstract

The evaluation literature has focused on the evaluation of knowledge translation activities, but to date there is little, if any, record of attempts to use evaluation in support of knowledge translation. This study sought to answer the question: How can an evaluation be designed to facilitate knowledge translation? A single prospective case study design was employed. An evaluation of a memory clinic within a primary care setting in Ontario, Canada, served as the case. Three data sources were used: an evaluation log, interviews, and weekly e-newsletters. Three broad themes emerged around the importance of context, efforts supporting knowledge translation, and the building of KT capacity.

I usually post a summary of an article I think makes a valuable contribution to the knowledge mobilization literature and hence the practice of knowledge mobilization. Not so in this case. This article creates false dichotomies between evaluators/evaluation and knowledge brokers/knowledge translation. This article might be news to evaluators, but there is nothing new for knowledge brokers. Nonetheless, this article raises the question: why is this news to evaluators, and what can we do to help them realize they are already an important part of the various worlds of knowledge mobilization?

Here’s a quick summary of the article.

Evaluators want to evaluate knowledge translation.
Evaluators implement knowledge translation activities in a memory clinic.
Evaluators assess knowledge translation success.
Conclusion: evaluators can function as knowledge brokers, a new role for evaluators.

Duh. This is not a “drop the mic” moment for knowledge brokers.

If you ask an evaluator to undertake knowledge translation roles then they are knowledge brokers. If you asked a plumber to undertake knowledge translation s/he would be a knowledge broker.

I don’t see what this adds to the literature.

But this allows us to examine the intrinsically interlinked roles of knowledge translation and evaluation.

Every good knowledge broker establishes an evaluation of their knowledge translation intervention to assess if their work made a difference. Knowledge brokers are evaluators, so it should come as no surprise that evaluators working in a knowledge translation study are knowledge brokers.

When planning a knowledge translation intervention, we know that the impact of the intervention is the dependent variable (that thing we measure) and knowledge translation is the independent variable (that thing we change to observe an effect on the dependent variable). The two roles of knowledge broker (affecting the independent variable) and evaluator (assessing the dependent variable) are intrinsically linked.

Much evaluation happens along the way (measuring process indicators) and at the end (ex post, measuring outcome indicators). But since knowledge brokers plan the knowledge translation (including how to evaluate it) at the beginning of the process, knowledge translation planning is ex ante research impact assessment.

See a blog I wrote about this in September 2015.

When you connect the dots like this, knowledge mobilization/translation embraces evaluation, and knowledge brokers can be evaluators and vice versa. The paper adds nothing new to this, but it lets us step back and realize how knowledge translation and evaluation are intimately linked.

Questions for brokers:

1- Was this a moment of “doh, of course I knew that” or was this a revelation to you? Why, and how differently do you see your role now?
2- If knowledge brokers and evaluators both work on a spectrum from planning for impact to assessing impact why do we present them as an artificial duality with artificially distinct roles?
3- What can the disciplines of knowledge mobilization/translation and evaluation/research impact assessment do to create a shared space of “research impact practitioners” (thank you @JulieEBayley)?

ResearchImpact-RéseauImpactRecherche is producing this journal club series as a way to make the evidence and research on knowledge mobilization more accessible to knowledge brokers and to create online discussion about research on knowledge mobilization. It is designed for knowledge brokers and other knowledge mobilization stakeholders. Read the article. Then come back to this post and join the journal club by posting your comments.

Listening to the Stars: The Constellation Model of Collaborative Change

Surman, M. & Surman, T. (2008). Listening to the stars: The constellation model of collaborative change. Social Space: 24-29. https://marksurman.commons.ca/publications/listening-to-the-stars-the-constellation-model-of-collaborative-social-change/

“The Constellation Model of Collaborative Governance is a complexity-inspired framework designed to ‘hold’ collaborations within dynamic systems. Balancing chaos and order, energy and structure, the governance model supports multi-organization partnerships and networks within complex systems.”

This is a brief description found on an update to the Constellation Model posted by the Centre for Social Innovation. At this link you will find additional documents and updates on the Constellation Model and its application in the Canadian Partnership for Children’s Health and Environment. You will also see examples of the model used by the Ontario Nonprofit Network and the Ontario Literacy Coalition.

This is an interesting article for anyone who is working in and/or supporting collaborations for social change but who is struggling with heavy governance and asking questions like:

How would collective goals be set?
Would they have to agree on everything?
How could autonomy and diversity be preserved?
Who would be ‘in charge’?
How could they best leverage each other’s talents?

Lightweight governance
• Develop a stewardship group with simple governance documents, including: 1) guiding principles; 2) terms of reference; and 3) a strategic plan that establishes goals
• This is light-touch governance. Day-to-day coordination of the partnership is managed by a third party (see below)

Action-focused work teams
“Within the broader strategic vision of the partnership, constellations take the form of clusters of activity in which a subset of the partners voluntarily participate. They can be formal projects, opportunistic initiatives or working groups. They must however act consistently with the partnership’s overall vision”.
• Funding for the work of different constellations is shared among that constellation’s membership, balancing power and funding the member that has the skills and capacity

Third-party coordination
• This is the most interesting element to me. Someone else (not the constellations or the stewardship group) manages the day-to-day. The rationale is: “When non-profits set up collaborative projects, they typically address this need by creating a secretariat within the partner who has the most capacity. This is seldom an ideal solution. Placing the coordination function within one of the partners permanently alters the power dynamic of the group. One partner takes power. The others defer responsibility and lose energy.”

These three elements help to overcome the old models where distrust and competition are at the core of collaboration. But the article recognizes that too much trust can also be detrimental, as it risks creating a clique that excludes new members.

This is also a cautionary tale. It took the Canadian Partnership “five years for the partners to fully grow into and understand” the Constellation Model. But once they did they were able to “maintain organizational independence and collaborate nimbly with others.”

Questions for brokers:

1. Compare and contrast the Constellation Model with Collective Impact.
2. The third-party coordinator manages day to day activities like facilitating meetings, supporting new constellations, fundraising, communications, planning, dispute resolution. How do you find a third-party coordinator who is not a member of a constellation but is invested enough in the issue to do all the work?
3. How are you supporting the capacity development of others in your own constellations?

ResearchImpact-RéseauImpactRecherche is producing this journal club series as a way to make the evidence and research on knowledge mobilization more accessible to knowledge brokers and to create online discussion about research on knowledge mobilization. It is designed for knowledge brokers and other knowledge mobilization stakeholders. Read the article. Then come back to this post and join the journal club by posting your comments.

Incorporating Community Engagement Language into Promotion and Tenure Policies: One University’s Journey

Pelco, L.E. & Howard, C. (2016). Incorporating community engagement language into promotion and tenure policies: One university’s journey. Metropolitan Universities, 27(2), 87-98, DOI: 10.18060/21129

https://journals.iupui.edu/index.php/muj/article/view/21234

Abstract

This case study describes the campus context and process for successfully including community engagement language into promotion and tenure policies at Virginia Commonwealth University, a high research, urban public university. The paper also describes barriers our campus faced during the promotion and tenure policy revision process, especially myths that emerged surrounding community-engaged work in the academy. We describe key supports that facilitated a successful process, including the important champions who played roles on our campus.

The lack of incentives and rewards for community-engaged scholarship is frequently cited as a barrier to academic researchers becoming meaningfully engaged with community partners. “It’s not recognized as part of tenure & promotion” (T&P). There is a concern that time spent with community is time that is not spent on students, research or teaching, all of which are rewarded in traditional T&P structures.

Well, Virginia Commonwealth University has addressed this barrier.

The authors present a case study of their efforts to recognize community engaged activities in research, teaching and service. An early accomplishment was to agree on definitions that would apply across the campus. Check out their definitions of Community Engaged Scholarship and Community Engaged Research and tell me if you understand the difference – beyond the debatable proposition that scholarship also includes service – because I don’t.

The case study illustrates the importance of five “supports” for changing T&P. These include:

• High quality community engaged work already visible
• A pan-university committee to tackle the issue
• Leadership (Provost, VP Research)
• Well-timed strategy
• Expert support (in the form of a consultant in this case)

And there is discussion of a critical piece: assessment. The case calls for developing methods for assessing community-engaged scholarship (research, teaching and service), especially in comparison to traditional scholarship. It also recognizes the need to train academics and administrators in documentation and assessment.

A small point, almost a footnote, also stands out. “When evaluating and rewarding faculty work, it will also be important for universities…to increase the value of local impact so that it is afforded the same credibility in promotion and tenure reviews as national and international impact”. This echoes a previous journal club post about the moral obligation for universities to work locally for local benefit. International reputations, like that of the Community University Partnership Program of the University of Brighton, are built on local community engagement.

However, note this about Virginia Commonwealth. Since 2006, this university has had a Division of Community Engagement under the Provost. “The Division provides support and coordination for community-engaged teaching, research and outreach activities across all academic units on both campuses and currently employs more than 20 full time staff members.” Virginia Commonwealth has an undergraduate student population of 24,051. That’s half the size of my university. I don’t know of any university in Canada that has even five full-time staff members handling community (not industry) engagement, even at universities that are much larger.

So what to do? I wrote about this on our LinkedIn page in 2014 in a post titled “It’s time to stop complaining about tenure and promotion and do something about it”. In that post I challenged passionate faculty members to organize and mobilize for T&P. This article shows them how to do it.

Questions for brokers:

Is there a correlation between the size of a university’s engagement office and its willingness to commit to tackling T&P? If both of these signal the degree of commitment to community-engaged scholarship, then what hope do Canadian universities without such support have?

Check out some Canadian work that looked at rewards and incentives for community engaged scholarship. Is your university there? What are some shining Canadian examples?

What this paper fails to do is consider incentives and rewards for community members partnering on these activities. How can the campus and community partners collaborate on incentives and rewards for all involved in the engagement efforts?

ResearchImpact-RéseauImpactRecherche is producing this journal club series as a way to make the evidence and research on knowledge mobilization more accessible to knowledge brokers and to create online discussion about research on knowledge mobilization. It is designed for knowledge brokers and other knowledge mobilization stakeholders. Read the article. Then come back to this post and join the journal club by posting your comments.

Reinventing the Civic University

Goddard, J. (2009). Reinventing the civic university.
http://www.nesta.org.uk/sites/default/files/reinventing_the_civic_university.pdf

From the summary of this report done for Nesta, a U.K. innovation charity:

In this provocation I argue that all publicly-funded universities in the UK have a civic duty to engage with wider society on the local, national and global scales, and to do so in a manner which links the social to the economic spheres. Engagement has to be an institution-wide commitment, not confined to individual academics or projects. It has to embrace teaching as well as research, students as well as academics, and the full range of support services. All universities need to develop strategies to guide their engagement with wider society, to manage themselves accordingly and to work with external partners to gauge their success….I argue that civic engagement should move beyond being a third or separate strand of activity for universities, with less prestige and fewer resources than teaching or research. It should become a guiding principle for their organisation and practice.

This report was written in 2009 in the throes of the global economic meltdown. What is striking about this report is that even with this backdrop the author, John Goddard, does not retrench to the narrow view of universities as economic drivers producing technologies and talent for companies. He takes the broad view that universities need to engage their research and teaching portfolios with local and global communities to create social and economic benefits for the country.

A key perspective differentiates this report from much of the knowledge mobilization literature: its focus is the institution, not the researcher. The report examines what institutions should do to support researchers, students, partners, knowledge brokers and public engagement officers (collectively, I refer to this group as “research impact practitioners”).

Seven years ago, this report set up all the arguments for research impact, which is the focus of much attention today. The only significant element of this report that seems to have materialized in the UK is the Research Excellence Framework (www.ref.ac.uk), which is the basis for allocating £1.6 billion annually.

Even though UK universities are assessed on the impacts of their research, there is little funding for institutional activities that support the creation of research impact outside of research commercialization. One exception is the Impact Acceleration Accounts from the Economic and Social Research Council, although much of this funding will go to support the collaborative activities of researchers and partners. There is still a need in the UK – and, of course, in Canada – to seed the development of infrastructure to support research impact. In Canada, all CIHR and SSHRC grant applications require a knowledge translation or knowledge mobilization (respectively) strategy, yet few institutions invest in institutional supports for those strategies.

The 12 ResearchImpact universities are each supporting knowledge mobilization through different organizational constructs, working with partners from the public, private and non-profit sectors. And we are sharing across our institutions so that everyone has the opportunity to adapt promising practices to their own local context. This 2009 report calls for what our network has been working towards since 2006 and continues to share among members today.

Much of this report is timely even seven years later as Canada consults to develop a new innovation policy. The committee reviewing options for the six priority areas for innovation in Canada would do well to remember their 2009 history and consider all 11 recommendations in this report, two of which are:

-This is not solely a matter for universities. Companies, local government, development organisations, NGOs and the public have much to gain from thinking about how to interact more effectively with local universities.

-Universities should be asked to bid for civic status, with access to substantial amounts of money, in exchange for demonstrating their ability to generate worthwhile impact.

Build a knowledge mobilization unit. Support the creation of research impacts. Work well with partners from the public, private and non-profit sectors. Bid for funding (if there was any!). And become a Civic University.

ResearchImpact-RéseauImpactRecherche is producing this journal club series as a way to make the evidence and research on knowledge mobilization more accessible to knowledge brokers and to create online discussion about research on knowledge mobilization. It is designed for knowledge brokers and other knowledge mobilization stakeholders. Read the article. Then come back to this post and join the journal club by posting your comments.

Mobilising Knowledge to Improve UK Health Care: Learning From Other Countries and Other Sectors

Davies, H. T. O., Powell, A. E. & Nutley, S. M. (2015). Mobilising knowledge to improve UK health care: Learning from other countries and other sectors – a multimethod mapping study. Health Services and Delivery Research, 3(27). www.journalslibrary.nihr.ac.uk/__data/assets/pdf_file/0003/146766/FullReport-hsdr03270.pdf

Abstract

Background: The past two decades have seen rich conceptual development and a wide variety of practical initiatives around research use or ‘knowledge mobilisation’, but so far there has been little systematic effort to map, conceptualise and learn from these initiatives, or to investigate the degree to which they are underpinned by contemporary thinking as set out in the literature. This gap is particularly apparent when looking at knowledge mobilisation at the ‘macro’ level, that is the strategies and activities of major research funders, major research producers and key research ‘intermediaries’.

Aims and objectives: The study had three key objectives with associated research questions: to map the knowledge mobilisation landscape in health care (in the UK and internationally) and in social care and education within the UK; to understand the models, theories and frameworks that underpin the approaches to knowledge mobilisation; and to learn from the success or otherwise of the strategies and approaches in use.

Methods: The study was multimethod and multiphased, with considerable interactivity between the different strands. Data were collected through a review of 71 published reviews on knowledge mobilisation; website review of the knowledge mobilisation activities of 186 agencies; in-depth interviews (n = 52) with key individuals in agencies; a web survey (response rate 57%; n = 106); and two stakeholder workshops (at months 6 and 16).

Findings: We identified a wide range of models, theories and frameworks used to describe knowledge mobilisation and created a conceptual map that highlights six domains of thinking and debate in the literature. The interview and survey data showed three broad, overlapping roles undertaken by agencies: developing and sharing research-based products; emphasising brokering; and focusing on implementation. The knowledge mobilisation approaches in use had been shaped by many factors but there was only limited use of the models, theories and frameworks from the literature. Participants saw formal evaluation of knowledge mobilisation activities as important but highly challenging. Rich formative experience was described but formal evaluation was relatively rare. Few agencies involved service users or members of the public in knowledge mobilisation activities. Working inductively from the study data we derived eight key archetypes or ‘bundles of knowledge mobilisation activities’ that could be used by agencies to explore their knowledge mobilisation activities, future strategies and stakeholder perspectives.

Conclusions: Knowledge mobilisation could be enhanced by providing support to enable cross-sector and interagency learning, reflection on the conceptual basis of approaches and increased evaluation of knowledge mobilisation activities. Further research is needed to evaluate approaches to assessing research use and impact, on systems approaches to knowledge mobilisation, on sustaining and scaling-up approaches, and on applying a wider range of literatures to knowledge mobilisation. Further research would also be useful on the knowledge mobilisation archetypes and how they can work in complementary ways.

This report, while not peer reviewed, nonetheless serves as the raw material for publications such as this article by the authors. There is much detail in the report, but I will focus on two items: 1) the six domains from the literature; and 2) the eight archetypes of knowledge mobilization organizations/activities.

The authors identify Six Domains of Interest:

  1. purpose(s) and goals (implicit or explicit)
  2. knowledge (of all kinds)
  3. connections and configurations (between people; between organisations)
  4. people, roles and positions
  5. actions and resources available and
  6. context of operation (different in kind from the other five domains, but influential and interactive with each of them).

These six map onto many of the elements of the knowledge mobilization planning guide of the Centre of Excellence for Child & Youth Mental Health (Ottawa, Canada) which uses the format of: who (#4), what (#2), where (#6), when, how (#3, #5), why (#1).

These six are brought back later in the report when the eight archetypes are discussed. “Archetypes may be thought of as idealised types or configurations of agencies (i.e. not necessarily actual or real). They provide accounts of an idealised agency that can be used as interpretive heuristics, allowing us to assemble and interpret observations.”

The eight archetypes (some similar enough to be clustered in pairs) are:

  1. archetype A: producing knowledge (product push)
  2. archetypes B and C: brokering and intermediation (own research; wider research)
  3. archetype D: advocating evidence (proselytisers for an evidence-informed world)
  4. archetypes E and F: researching practice (research into practice; research in practice)
  5. archetype G: fostering networks (building on existing networks; developing new ones)
  6. archetype H: advancing knowledge mobilisation (building knowledge about knowledge and knowing)

The authors indicate that these are not intended to describe real-life knowledge mobilization practice. “Real organisations rarely display all of the features of ideal types like our archetypes: instead, they are much more likely to show different features to varying degrees.” Nonetheless, these eight archetypes are useful as they describe knowledge mobilization organizations, not the more common research on knowledge mobilization practice. For other work on knowledge mobilization organizations, see this article by Amanda Cooper.

Starting on page 113, the report analyzes these eight archetypes using the six domains of interest identified from the literature to illustrate how each of the archetypes has a unique domain profile.

Questions for brokers:

  1. Look at your organization. Which archetype(s) does it resemble?
  2. Next, assess your organization’s archetype(s) against the six domains of interest. Is there alignment with the domain analysis in the table starting on page 113? What does this tell you about your organization?
  3. How is your organization (or how are you) using evidence from the literature to inform your practice? Can you cite the research evidence on which you are basing your practice?

ResearchImpact-RéseauImpactRecherche is producing this journal club series as a way to make the evidence and research on knowledge mobilization more accessible to knowledge brokers and to create online discussion about research on knowledge mobilization. It is designed for knowledge brokers and other knowledge mobilization stakeholders. Read the article. Then come back to this post and join the journal club by posting your comments.

ResearchImpact-Réseau Impact Recherche Journal Club


Welcome to the Knowledge Mobilization Journal Club!

The journal club presents a summary of KMb related academic journal articles in a standard format. This space is designed to make KMb research more accessible to KMb practitioners.

Please let us know what you think of the review by leaving us a comment.

If you are reading a journal article that you think would be relevant to KMb practice, you are invited to submit a journal club summary to us for consideration by emailing kmbunit@yorku.ca

Knowledge Mobilization Journal Club Evaluation – Results

Please keep it going, even if we have to clone David

Journal Club Word Cloud

In 2011, we published our first knowledge mobilization journal club post to help make knowledge mobilization research more accessible to knowledge mobilization practitioners. Since then we have published 55 journal club posts, approximately one each month. The evaluation was conducted using an online survey disseminated in June 2016. The results help us to understand what readers like about the journal club and how we might improve it.

There were 37 respondents to the online survey, which was posted on the journal club and disseminated on Twitter and via listservs.

  • The majority identified as knowledge brokers
  • Employer: 10/37 non-profit; 9/37 academic; 7/37 public sector; 6/37 self-employed
  • The majority had been in their current role, and in the field of knowledge mobilization, for between one and five years

Quantitative results

How often do you read a journal club post?

  • The vast majority read the journal club monthly; very few read more frequently

How many of the 55 posts have you read?

  • Few have read all of the journal club posts and many have read fewer than half, although the qualitative comments (see below) suggest that the posts, when read, are valued by readers

How much of each post did you read?

  • 32% dip in and out, 32% skim it all and 24% read the whole post, so the posts are getting read

Length:

  • The majority of respondents would like to see the length cut in half (to about 450 words)

Original article:

  • Some respondents never read the original article, some always do, and many open the original article about half the time – posting a link to the original article (when available) is therefore useful for at least some respondents

Utility:

  • Most use the posts for individual reading, a few for group discussion, and others as references for their own articles and grant applications

Open access articles:

  • Responses were very mixed, ranging from “it’s not important” to “only review open access articles”, with the majority of respondents landing in the middle, not feeling strongly one way or the other

Route of access:

Qualitative results

Respondents were asked two questions:

  1. What does the journal club provide that is of value or important to you?
  2. Do you have any additional comments or feedback about the journal club?

See the word cloud (attached to this post) for the frequency of words used in responses. Some key words that stand out are: helpful, good, access, keep, important, useful. A sample of responses is below (a small illustrative sketch of how such word frequencies can be tallied follows the sample):

  • An honest review of the utility of an article from a trusted source
  • David is good at picking articles that I also find interesting/useful among all the chaff, so I like the journal club because it keeps me current without me having to do all the work to sift through everything that is out there.
  • Sometimes I first learn of an important publication through the club. Great summaries. Useful questions for consideration/discussion.
  • linking to articles that I haven’t read in depth as yet and framing them in a very helpful way
  • saving me considerable effort
  • This is extremely important for beginners in the field to get their feet wet and read what the pros are reading.
  • it allows me to get exposure to the research done in this field, and learn things that can help me directly in my role
  • Love David’s insightful thoughts and critical thinking/questioning of the article. It gives you something to think about. And, he always has very good points that resonate with me because they make sense.
  • It might be good to also have some kind of scheduled live chat about the article – would encourage me to read it
  • Please continue! This is great and so helpful. I’m working hard to try to set time aside to be able to read it more often and more thoroughly.
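For readers curious how the frequencies behind a word cloud come about, here is a minimal, purely illustrative sketch in Python: it tallies the words in free-text responses after dropping common stopwords, which is essentially the counting a word cloud visualizes. The sample responses and stopword list below are placeholders, not the actual survey data.

```python
from collections import Counter
import re

# Hypothetical sample of free-text survey responses (not the real data)
responses = [
    "An honest review of the utility of an article from a trusted source",
    "Great summaries. Useful questions for consideration and discussion.",
    "Please continue! This is great and so helpful.",
]

# Illustrative stopword list; a real analysis would use a fuller one
STOPWORDS = {"an", "of", "the", "a", "and", "for", "this", "is", "so", "from"}

words = []
for response in responses:
    # Lowercase and split on non-letter characters, dropping stopwords and empty strings
    words.extend(
        w for w in re.split(r"[^a-z]+", response.lower())
        if w and w not in STOPWORDS
    )

# The most frequent words are the ones that would appear largest in a word cloud
for word, count in Counter(words).most_common(10):
    print(f"{word}: {count}")
```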

And the following comment is perfect: “helps bring evidence into practice”. This is exactly why we started the journal club in the first place: to make evidence on knowledge mobilization accessible to knowledge mobilization practitioners.

So what now? Considering the quantitative and qualitative responses we will undertake the following actions:

  • Length: we will experiment with shorter posts
  • Open access: we will prioritize open access articles but will review an article that is not open access if it is important to our practice
  • Live chat: we will explore a tweet chat on a post, especially if the author is available to join
  • Access: we will maintain the email distribution

Thanks to the 37 respondents who helped inform these actions. Thanks to everyone else for reading and commenting on the journal club posts. Look for the next journal club post early August.