Evaluative Inquiry III: Mixing methods for evaluating research

Critiques of research metrics have produced an unhelpful divide between quantitative and qualitative research methods. In this blog post, we explain why research evaluation can benefit from the strengths of both.

Since 2017 we have been developing a new approach to research evaluation, which we call Evaluative Inquiry (EI). In a series of blog posts, we discuss the four principles of EI, based on our experiences in projects with the Protestant Theological University and with the University of Humanistic Studies. In these projects we analyzed the value of these institutes' work, and our analyses subsequently informed the self-evaluation documents submitted to the evaluation committees. In our previous two posts we discussed the EI's focus on value trajectories and the EI's contextualization of research value. This third post focuses on our methods for evaluating research.

Many in academic and professional environments have discussed and criticized the reliance on metrics and quantitative methods in research evaluation. The Leiden Manifesto, the Metric Tide, and DORA have all offered careful considerations on how to measure and represent research value. The Evaluative Inquiry contributes to this effort with a portfolio of quantitative and qualitative methods, used in complementary ways to make visible the complexity of academic research.

The Dutch evaluation protocol requires academic organizations to provide proof of use and recognition of both academic and societal production. In many fields this distinction between the societal and the academic is artificial, which makes it difficult for scientific boards and managers to put together the self-evaluation document required by the evaluation committee. Mixing methods provides different pieces of this complex puzzle, allowing for a less dichotomized and more contextualized approach.

We have used bibliometric methods, drawing on Web of Science, Google Scholar and Microsoft Academic, to gain insight into co-authorship relations, citation scores and visibility in journals. As fields such as Anthropology and Theology produce not only journal articles but also books, monographs and edited volumes, we like working with Google Scholar, as it allows us to trace the use of books both in the academic domain and beyond. In addition to these analyses, we have used Ad Prins' Contextual Response Analysis, which makes it possible to get a sense of the users, and the patterns of use, of a research organization's output. The VOSviewer tool developed at CWTS makes it possible to map the discursive space in which research organizations operate, as well as changes in that space over time.

Besides these more quantitative methods, we have used interviews and focus group discussions to get a more fine-grained understanding of the organizational context and of people's perceptions of the organization's strengths and weaknesses. These interviews and focus groups allow us to probe scholars on the relevance of their work for social environments beyond academia, which remains their main focus. Our portfolio includes other advanced scientometric analyses that we will put to use in the future, such as the Strength, Potential and Risk analysis and the Area Based Connectedness analysis.
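To give a concrete, if simplified, impression of what the bibliometric side can look like, the sketch below builds a small co-authorship network from publication records. The records, author names and the use of Python's networkx library are our own illustration, not part of any specific EI project; in practice such records would be exported from Web of Science or collected via Google Scholar.

```python
import itertools

import networkx as nx  # pip install networkx

# Hypothetical publication records; in practice these would be
# exported from Web of Science or collected via Google Scholar.
publications = [
    {"title": "Ritual and community", "authors": ["A. Smit", "B. Jansen"]},
    {"title": "Theology in public debate",
     "authors": ["B. Jansen", "C. de Vries", "D. Bakker"]},
    {"title": "Humanist chaplaincy", "authors": ["C. de Vries", "A. Smit"]},
]

# Co-authorship network: authors are nodes, an edge means at least
# one joint publication, and edge weights count joint publications.
graph = nx.Graph()
for pub in publications:
    for a, b in itertools.combinations(sorted(set(pub["authors"])), 2):
        if graph.has_edge(a, b):
            graph[a][b]["weight"] += 1
        else:
            graph.add_edge(a, b, weight=1)

# Simple indicators of the kind an evaluative analysis might report.
print("Collaborators per author:", dict(graph.degree()))
most_central = max(nx.degree_centrality(graph).items(), key=lambda kv: kv[1])
print("Most central author:", most_central)
```

A network built this way can subsequently be loaded into a tool such as VOSviewer for visual exploration of the collaborative and discursive space, including how it changes over time.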

The Evaluative Inquiry uses its methods in a complementary way. Google Scholar analyses and user studies make it possible to get a sense of the different elements of the academic value trajectory: the kind of output, its reception, theoretical and topical developments, and people's engagements and perceptions. The more qualitative interviews and focus groups facilitate the collection of organizational and contextual information, allowing us to situate these value trajectories in direct relation to the academic organizations and stakeholder networks that scholars work within. Moreover, one method's insights can corroborate or dispute claims made through another. Theologians, for example, claimed in interviews that their work reaches multiple audiences, while the user analysis and Google Scholar data showed a more homogeneous user group. Discussing these dissonances yielded important information about the theologians' value trajectories.
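One way to make such a dissonance explicit is to quantify how evenly a body of work's users spread over audience categories, for instance with a normalized Shannon entropy, as in the minimal sketch below. The choice of measure, the categories and the counts are all our own illustrative assumptions, not data from the projects.

```python
import math

def normalized_entropy(counts):
    """Shannon entropy of a category distribution, scaled to [0, 1].

    Values near 1 mean users spread evenly over audience categories;
    values near 0 mean one category dominates (a homogeneous audience).
    """
    total = sum(counts.values())
    probs = [c / total for c in counts.values() if c > 0]
    if len(probs) < 2:
        return 0.0
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(len(probs))

# Invented numbers, purely for illustration.
claimed_in_interviews = {"academia": 10, "churches": 10, "media": 10, "policy": 10}
observed_in_user_data = {"academia": 46, "churches": 3, "media": 1}

print("claimed diversity:", round(normalized_entropy(claimed_in_interviews), 2))   # 1.0
print("observed diversity:", round(normalized_entropy(observed_in_user_data), 2))  # ~0.29
```

The gap between the two numbers does not settle who is right; it gives the evaluators and the scholars something precise to discuss.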

An important part of our methodological strategy is to work closely with the academic institute, provide regular updates on our analyses, and allow for questions and input. In the project with the humanists we moreover invited researchers to actively contribute to the analyses. An example is the user analysis we did for the University of Humanistic Studies. When we presented preliminary results, the scientific board commented on our classification of the topics users expressed interest in. We therefore invited them to do the labelling themselves, involving them in the nitty-gritty of our analysis and classification process. Allowing them into the kitchen of our analyses not only improved their understanding of the user analysis but also created ownership of the results. We achieved a similar effect with the workshop we organized towards the end of the project. We invited the whole academic organization to this event, where we shared our preliminary findings and gave everyone the opportunity to weigh in and fine-tune our results.

A portfolio of quantitative and qualitative methods provides complementary insights. Metrics can give powerful insights into collaboration patterns, disciplinary orientation and relevant audiences, but they are insufficient for understanding and representing value and relevance in context; qualitative insights are needed as well. The Evaluative Inquiry uses methods in a versatile and complementary way. This entails letting go of the belief that any one method gets it right and is capable of providing the most accurate representation of academic work. We argue, instead, that a multiplication of methods and close collaboration with clients allows for more interesting insights into academic realities. This approach, lastly, fits the requirements of the new Dutch Strategy Evaluation Protocol (SEP) 2021–2027, which calls for a narrative focus in the self-evaluation document.
