National and International Initiatives Discussion Group Archives | DORA
https://sfdora.org/category/national-and-international-initiatives-discussion-group/

DORA Initiatives Meeting: Academic evaluation in Uruguay and updates from group members
https://sfdora.org/2024/11/11/dora-initiatives-meeting-academic-evaluation-in-uruguay-and-updates-from-group-members/
Mon, 11 Nov 2024

Each quarter, DORA holds a Discussion Group meeting for National and International Initiatives working to address responsible research assessment reform. This community of practice is a space for initiatives to learn from each other, make connections with like-minded organizations, and collaborate on projects or topics of common interest. Meeting agendas are shaped by participants. If you lead an initiative, coalition, or organization working to improve research assessment and are interested in joining the group, please find more information here.

During the DORA National and International Initiatives Discussion Group meeting on August 13, 2024, members of the group discussed changes in research policy happening at each of their respective organizations. The group also heard a presentation from Fernanda Beigel, Chair of the UNESCO advisory committee for Open Science (2020-2021), Principal Researcher at the National Scientific and Technical Research Council (CONICET), Argentina, and Chair of Argentina’s National Committee for Open Science. Beigel, who was joined by Celia Quijano Herrera, Maria Soledad Gutierrez Parodi, and Erika Teliz Gonzalez of the Uruguay National Council for Innovation, Science and Technology (CONICYT), presented on her 2024 report Un estudio de la evaluación académica en Uruguay en perspectiva reflexiva, or A study of academic evaluation in Uruguay from a reflective perspective. The report and supporting materials are available in Spanish, and the executive summary is available in English here.

A study of academic evaluation in Uruguay from a reflective perspective was commissioned through a public call for proposals by CONICYT, out of an interest in evaluating how researchers in Uruguay are evaluated and in supporting new institutional assessment practices. This work provides: 1) an important foundational understanding of academic incentives (or disincentives) in Uruguay; and 2) a set of evidence-based recommendations for implementing more responsible research assessment practices in Uruguay.

Beigel began her presentation by providing context: Uruguay is a relatively small country of just over 3 million inhabitants, with 1.84 researchers per 1,000 inhabitants. One public university (the University of the Republic) accounts for 75% of the national research output, and there is extensive overlap between national-level and institution-level research assessment. Beigel’s work provides unique insight into the landscape of assessment and reform in the context of a small country with extensive national oversight of research assessment. This is important because the degree of interconnection between institutional and national assessment policy determines the key challenges, logistical considerations, and approach that must be taken into account when advocating for or reforming assessment practices.

Beigel outlined the major systems of research assessment in Uruguay and their relationship to one another: the University of the Republic Full Dedication Regime (a university-level assessment system known as RDT) and the National System of Researchers (a national-level assessment system known as SNI), the latter of which is integrated into the Uruguayan Curriculum Vitae system (CVUy). Within these assessment systems, reviewers retain a high degree of autonomy to make decisions regarding faculty promotion. The report analyzes the relationships between the evaluation systems, professional trajectories, and methods of knowledge circulation (e.g., publications, books, etc.), combining those data with interviews and focus groups with “members of evaluation committees, officials, academic-scientific referents, and researchers.”

The report includes twenty recommendations for each of these systems across a range of topics. The recommendations concerning production indicators and the circulation of knowledge include:

  • 9. Broaden the notion of scientific production to include diverse profiles, valuing traditional publications alongside technological production, technical contributions, artistic productions, and social reports with public policy recommendations.
  • 10. Promote publication in quality scientific journals, in diamond open access, edited in the country and in Latin America, stimulating quality communication circuits and the expansion of audiences.
  • 11. Value the tasks of academic editing (journal management, participation in editorial teams) in permanence and promotion decisions within the country’s academic evaluation systems.

Beigel, Quijano Herrera, Gutierrez Parodi, and Teliz Gonzalez also discussed the challenges inherent to implementing reform when assessment standards are set at a national level. While a long-term goal is changing national policy to incorporate the report recommendations, they highlighted that an immediate and critical starting point would be the autonomous reviewer panels that make assessment decisions. Sharing resources on the misuse of quantitative indicators and best practices in research assessment with these reviewer panels will be an important step towards building buy-in and support for reform.

We also heard updates from other discussion group member organizations, which are briefly summarized below:

Haley Hazlett was DORA’s Program Manager.

The post DORA Initiatives Meeting: Academic evaluation in Uruguay and updates from group members appeared first on DORA.

DORA Initiatives Meeting: CLACSO-FOLEC on responsible assessment and open science
https://sfdora.org/2024/06/18/dora-initiatives-meeting-updates-from-clacso-folec/
Tue, 18 Jun 2024


During the DORA National and International Initiatives Discussion Group meeting on May 14, 2024, members of the group discussed changes in research policy happening at each of their respective organizations. The group also heard a presentation from Laura Rovelli, coordinator of the Latin American Forum for Research Assessment (FOLEC) from the Latin American Council of Social Sciences (CLACSO). Rovelli discussed the importance of open infrastructures and evaluation systems in research, noting the challenges faced in Latin America in accessing information from the US and Europe due to the privatization of research information.

Rovelli stressed that open research information should be built on transparent evidence and inclusive data, values that are supported by the Barcelona Declaration on Open Research Information. The declaration was drafted in 2023 by a group of over 25 research information experts and promotes openness in research and the sustainability of infrastructures for open research information.

Drawing on a recent collective blog post published on Leiden Madtrics, Rovelli discussed why inclusion and diversity require multiple information sources, arguing for:

  • Increasing the number of relevant documents indexed, especially from under-represented areas
  • Ensuring “richness of metadata,” as no single source can support all research information needs
  • Preserving a variety of information sources reflecting different histories, languages, etc.

Rovelli also touched on how responsible research assessment and open science are interconnected, and how strengthening one can help improve the other. She discussed FOLEC’s support for instituting a decentralized and federated research information source, noting that data sources should span both regional/local and international repositories.

Other ongoing projects at CLACSO include the AGoRRA Project with RoRI UKRI, and a new publication with UNESCO, Debates actuales y reformas en curso en la evaluación responsable de la investigación en América Latina y el Caribe (English translation: “Current debates and ongoing reforms in the responsible evaluation of research in Latin America and the Caribbean”), discussing responsible research assessment reforms in national evaluation systems in Latin American countries.

We also heard updates from other organizations, which are briefly summarized below:

  • INORMS Research Evaluation Group (INORMS REG) recently held a forum event on responsible research assessment and has been holding community calls for the More Than Our Rank initiative.
  • University of Leeds highlighted that it has signed the Knowledge Equity Network, a declaration that promotes collaborative sharing of knowledge globally, aims to address pressing challenges, and supports a culture of openness, diversity, and inclusion within the higher education sector.
  • The organization Recognition & Rewards conducted its Recognition & Rewards culture barometer survey, which yielded data on “the state of affairs concerning a different way of recognising and rewarding academic work within institutions.” A report with the results will be written and published. The organization has also planned a recognition and rewards festival with workshops, to be held in November 2024.
  • Universities Norway is part of a CoARA working group that is conducting a survey on how academic careers are assessed. They also highlighted NOR-CAM: “A toolbox for recognition and rewards in academic careers.”
  • Projeto Metricas presented on DORA principles to the Ministry of Education (MEC) as part of the celebration of 20 years of the current evaluation system, noting that they are providing consultancy to the Ministry for the redesign of the system. They also presented on DORA values at Fiocruz, the largest public network of biomedical institutions in Brazil.
  • As for DORA, we have released a guidance document on research indicators, a report on supporting responsible assessment practices in pre-award funding processes, and a report on monitoring the effectiveness of narrative CVs.


Casey Donahoe is DORA’s Policy Associate

The post DORA Initiatives Meeting: CLACSO-FOLEC on responsible assessment and open science appeared first on DORA.

DORA Initiatives Meeting: Reformscape
https://sfdora.org/2024/03/07/dora-initiatives-meeting-reformscape/
Thu, 07 Mar 2024


During this quarter’s initiatives meeting, DORA’s own Haley Hazlett, Program Manager, presented Reformscape, a searchable collection of assessment criteria and standards from academic institutions. Reformscape is a tool developed by the TARA team at DORA, shaped and improved over the past two and a half years through community discussions and user testing. It is meant to show institutions how reform occurs by providing information on reform policies, processes, action plans, announcements, etc., across multiple disciplines and career levels. The tool has been live since January 2024 and will continue to grow in information, materials, and geographic coverage.

During the round table discussion we heard updates from multiple organizations including the following:

  • HELIOS Open organized a workshop in Miami in which 50 senior leaders (presidents, VPs, provosts, etc.) discussed modernizing tenure and promotion procedures.
  • The Southern African Regional Universities Association (SARUA) is creating a community of practice for responsible research assessment at academic institutions in South Africa and noted its interest in having experts participate to help drive the discussion.
  • The Natural Sciences and Engineering Research Council of Canada (NSERC) has published guidelines on research assessment and is developing an evaluation framework for research, training, and mentoring.


Casey Donahoe is DORA’s Policy Associate

The post DORA Initiatives Meeting: Reformscape appeared first on DORA.

Multilingualism and Language Bias in Research Assessment: Supporting Non-English Speaking Scientists
https://sfdora.org/2024/01/18/multilingualism-and-language-bias-in-research-assessment-supporting-non-english-speaking-scientists/
Thu, 18 Jan 2024


During the DORA National & International Initiatives CoP Discussion Group meeting on November 14, 2023, members of the group and DORA staff members met to discuss changes in research policy happening at each of their respective organizations. The group heard a presentation from Janne Pölönen, Secretary General of the Federation of Finnish Learned Societies, on multilingualism and language biases in research assessment.

Pölönen discussed the efforts of the Coalition for Advancing Research Assessment (CoARA) working group on multilingualism and language biases, highlighting their mission to raise awareness and provide guidelines for recognizing and rewarding science in all languages. Pölönen first outlined the importance of recognizing research in all languages, noting the challenges many non-English speaking scientists face when they publish papers in their native languages. English-language journals and universities are often treated as the standard or default for the scholarly community, putting pressure on researchers to translate their work and try to publish in English-language journals. As a result, publications are largely skewed towards English, leaving non-English publications less cited. Non-English publications may also be passed over because search engines ignore the special characters of other languages, subsequently omitting information or meaning.

Although a large majority of indexed publications are in English (upwards of 90%), preliminary data from the Federation of Finnish Learned Societies demonstrate that, of 152,644 journals included in a global analysis, 53% are multilingual or in languages other than English. This suggests that multilingualism in publications is more established than once thought. The issue then becomes accessibility. However many local-language publications there may be, if they are not indexed in major databases they essentially become “invisible” to evaluators and their impact is not recognized. The CoARA working group aims to address these challenges by raising awareness about multilingualism in science; addressing language biases in metrics, expert assessment, and rankings; and providing institutions with guidelines and tools to implement equitable recognition and incentivization for research in all languages.

Image credit: Mikael Laakso and Janne Pölönen (2023). Mapping the global landscape of journals.
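The 53% figure comes from classifying journals by the languages they publish in. A toy sketch of that kind of tally is shown below; the journal records and language codes are invented for illustration and are not drawn from the actual Laakso and Pölönen dataset:

```python
# Classify journals as English-only, multilingual, or non-English from
# (hypothetical) publication-language metadata.
journals = [
    {"name": "Journal A", "languages": ["en"]},
    {"name": "Journal B", "languages": ["fi", "en"]},
    {"name": "Journal C", "languages": ["es"]},
    {"name": "Journal D", "languages": ["en"]},
    {"name": "Journal E", "languages": ["pt", "es"]},
]

def classify(journal):
    """Label a journal by its publication-language profile."""
    langs = set(journal["languages"])
    if langs == {"en"}:
        return "English only"
    if "en" in langs:
        return "multilingual"
    return "non-English"

counts = {}
for j in journals:
    label = classify(j)
    counts[label] = counts.get(label, 0) + 1

# Share of journals that are multilingual or publish in languages other
# than English (the category reported in the presentation).
beyond_english = counts.get("multilingual", 0) + counts.get("non-English", 0)
share = beyond_english / len(journals)
print(counts, f"{share:.0%}")  # 3 of 5 toy journals -> 60%
```

With real data the same three-way split would be computed over all 152,644 journals in the analysis.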

In addition to his role with the CoARA working group, Pölönen is on the management committee of the Helsinki Initiative on Multilingualism in Scholarly Communication, of which the Federation of Finnish Learned Societies is a founding signatory. Created in 2019, the Helsinki Initiative works to address the challenges faced by non-English speaking scientists around the world. The Helsinki Initiative provides recommendations for promoting language diversity in research assessment, evaluation, and funding systems: “Make sure that in the process of expert-based evaluation, high quality research is valued regardless of the publishing language or publication channel.” Pölönen stressed the importance of ensuring the availability of high quality research in all languages. The dissemination of research beyond academia and the promotion of interacting with heritage, culture, and society are crucial components of the Helsinki Initiative.

Pölönen’s presentation was followed by a group discussion that touched on a range of key issues related to supporting multilingualism in publishing and scholarly communications. For example, small journals that serve local communities often have difficulty being visible to the scientific community beyond the one they serve. While publishing locally serves the community and can prevent the loss of geographically and culturally important elements, such journals face difficulties reaching a broader audience, and these discrepancies are large across disciplines.

Both Pölönen and DORA promote the recognition of research across all disciplines and cultures, valuing openness and transparency. In his presentation, Pölönen discussed the importance of these values and emphasized the importance of equitable and inclusive scholarly communication. Promoting multilingualism in research can help to ensure scientists from around the world not only have access to knowledge and research, but also the ability to share results in their native language.

Suggested Reading List

Multilingual publishing in the social sciences and humanities: A seven-country European study

Helsinki Initiative on Multilingualism in Scholarly Communication

Multilingual publishing across fields of science: analysis of data from Finland and Poland

Mapping the global landscape of journals

Beyond the Web of Science: An overview of Brazilian papers indexed by regionally relevant databases (slides from ISSI 2021 Conference)

The patterns of publication languages used by STEM research in China (slides from ISSI 2023 Conference)

Who are the users of national open access journals? The case of the Finnish Journal.fi platform

A longitudinal analysis of university rankings

The manifold costs of being a non-native English speaker in science

Using Narrative CVs: Process Optimization and bias mitigation

Multilingualism of social sciences

Casey Donahoe is DORA’s Policy Associate

The post Multilingualism and Language Bias in Research Assessment: Supporting Non-English Speaking Scientists appeared first on DORA.

A Quality Assessment Framework for Research Design, Planning, and Evaluation: Updates from the Sustainability Research Effectiveness Program in Canada
https://sfdora.org/2023/09/06/a-quality-assessment-framework-for-research-design-planning-and-evaluation-updates-from-the-sustainability-research-effectiveness-program-in-canada/
Wed, 06 Sep 2023


Socially impactful research is multi-faceted, requiring transdisciplinary approaches that involve a wide range of actors, approaches, and pathways to influence change. However, addressing the epistemological and methodological variability of transdisciplinary research (TDR) has been challenging, and there is a need for standards and methods to define and assess quality in TDR. To address this, Brian Belcher and his colleagues from the Sustainability Research Effectiveness Program at Royal Roads University, Canada, conducted a systematic literature review that provides several definitions and ways of assessing research quality in a transdisciplinary context. During our May CoP meeting, Brian Belcher, Rachel Claus, and Rachel Davel presented excerpts of their study to highlight how their Transdisciplinary Research Quality Assessment Framework (QAF) has advanced and been refined through testing.

The Linear Model of Research Impact assumes that the production and dissemination of new knowledge will eventually reach those who will find it useful, leading to uptake, scaling, and inevitable impact for society at large, where the world becomes a better place. In reality, there are multiple pathways to research influence. Researchers aiming for social impact take on a more active role in relationship-building through partnerships and networking, knowledge co-generation, capacity-building (particularly to do and use research), public discourse and policy processes, lobbying and negotiation, and more. Transdisciplinary approaches facilitate and generate value through these processes to support greater research impact. Yet, Belcher noted that there are limits to a project’s influence as it progresses through the research process. When passing the boundary from a project’s sphere of control (i.e., activities and outputs) into its sphere of influence (i.e., actors whom the project works with and through to influence changes in behaviours), there are more external processes and influences at play that can affect the realization of a project’s intended outcomes and, ultimately, its social, economic, and environmental impacts.

As background to the team’s latest revisions: the QAF originates from a systematic review of the literature to determine the most appropriate principles and criteria for defining and assessing TDR quality, and to structure these into a comprehensive assessment framework. The review identified dominant themes that should be considered in TDR evaluation: engagement with the problem context, collaboration and inclusion of stakeholders, a heightened need for explicit communication and reflection, integration of epistemologies and methods, recognition of diverse outputs, a focus on outcomes and impact, and reflexivity and adaptation throughout the process. The QAF was published in 2016, organized around four principles: Relevance, Credibility, Legitimacy, and Effectiveness.

Over the past several years, the team has conducted a series of case study evaluations of completed research projects where they applied the QAF to learn lessons about how TDR qualities can support the realization of outcomes. Through this testing, the team also identified several aspects of the QAF that could be improved. Some of the main changes include revisions to the principle and criteria names and definitions to support clarity, the addition of missing criteria, the removal of overlap in criteria definitions that led to double-counting, and replacement of the former rubric with practical guidance. The revised principles are defined as:

  • Relevance: the appropriateness of the problem framing, research objectives, and approach for intended users and audiences;
  • Credibility: the rigor of the research design and process to produce dependable and defensible conclusions;
  • Legitimacy: the perceived fairness and representativeness of the research process; and
  • Positioning for use: the degree to which research is likely to be taken up and used to contribute to outcomes and impacts.

A key emphasis of the latest version of the QAF is to assess and score each criterion against the project’s purpose. QAF scores can be visualized in spidergrams to support further analyses, such as identification of TDR qualities that are present or absent in projects (Image 1) as well as comparisons between projects (Image 2).

For example, Project A addresses a strongly socially relevant problem but scores lower on the criteria pertaining to research objectives, communication, and an explicit theory of change (Image 1). Project A satisfied most of the positioning-for-use criteria and relatively few of the credibility and legitimacy criteria.

Image 1: Spidergrams used to visualize how a project scores against the QAF’s principles of relevance, credibility, legitimacy, and positioning for use

This visualization can also be used to compare projects. For instance, Project A might be strong in addressing a socially relevant research problem, but Project B is much stronger in terms of 1) relevant research objectives, 2) a clearly defined problem context, and 3) engagement with the problem context (Image 2).
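To make the scoring and spidergram idea concrete, here is a minimal sketch of averaging criterion scores within each principle and placing the averages on evenly spaced radar axes. The project names, criterion counts, and scores are all invented for illustration; the published QAF defines its own criteria and scoring guidance, and an actual spidergram would typically be drawn with a polar plotting library:

```python
import math

# Hypothetical QAF criterion scores (0-5) grouped by the four revised
# principles; values are illustrative only.
qaf_scores = {
    "Project A": {
        "Relevance": [5, 2, 3, 4],
        "Credibility": [3, 3, 2],
        "Legitimacy": [2, 3],
        "Positioning for use": [4, 5, 4],
    },
    "Project B": {
        "Relevance": [4, 5, 5, 5],
        "Credibility": [4, 3, 4],
        "Legitimacy": [3, 4],
        "Positioning for use": [3, 3, 4],
    },
}

def principle_means(project):
    """Average the criterion scores within each principle."""
    return {p: sum(v) / len(v) for p, v in project.items()}

def spider_coords(means):
    """Place each principle mean on an evenly spaced radar-chart axis."""
    n = len(means)
    coords = []
    for i, (principle, value) in enumerate(means.items()):
        angle = 2 * math.pi * i / n  # one axis per principle
        coords.append((principle, value * math.cos(angle), value * math.sin(angle)))
    return coords

for name, project in qaf_scores.items():
    means = principle_means(project)
    print(name, {p: round(m, 2) for p, m in means.items()})
```

Overlaying the resulting polygons for two projects gives the kind of comparison shown in Image 2.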

When asked how this framework could be more impactful, Belcher said that involving stakeholders in the research process would help them understand how the research contributes to change as well as their role within the change process.

In addition to ex-post evaluation, the QAF enables a structured assessment of project proposals ex-ante to identify and allocate funding to projects that are relevant, credible, legitimate, and well positioned for use. It can also be used to appraise the weaknesses in a proposal from a transdisciplinary perspective. Belcher also mentioned that the QAF has been applied to some projects within a single traditional discipline that had several strong elements and contributed to social change processes. Additionally, some Canadian and international organizations have incorporated these principles and criteria into their research proposal guidelines. These include (1) the Pacific Institute for Climate Solutions (a collaboration of four universities in British Columbia), (2) the Natural Sciences and Engineering Research Council of Canada (NSERC; a Canadian federal funding agency), and (3) CGIAR (an international agricultural research group aiming to deliver science for food security).

Image 2: Comparison of two projects’ QAF scores for the Relevance principle

Conclusion

Belcher highlighted that applying the QAF to research evaluation can encourage more transdisciplinary qualities in research, which support more impactful outcomes. However, more widespread application and testing are needed to build the evidence base, which becomes possible as the framework is shared with other users. Expanded application would enable users to identify challenges and opportunities to inform revisions.

A copy of the presentation can be accessed on the Sustainability Research Effectiveness team’s website.


Queen Saikia is DORA’s Policy Associate


References

  • Belcher, B. M., Rasmussen, K. E., Kemshaw, M. R., & Zornes, D. A. (2016). Defining and assessing research quality in a transdisciplinary context. Research Evaluation, rvv025. http://doi.org/10.1093/reseval/rvv025
  • Cash, D., Clark, W., Alcock, F., Dickson, N., Eckley, N., & Jager, J. (2002). Salience, credibility, legitimacy and boundaries: Linking research, assessment and decision making. KSG Working Paper Series RWP02-046.
  • Guthrie, S., Wamae, W., Diepeveen, S., Wooding, S., & Grant, J. (2013). Measuring research: A guide to research evaluation frameworks and tools. RAND Europe.
  • Ofir, Z., Schwandt, T., Duggan, C., & McLean, R. (2016). Research Quality Plus: A holistic approach to evaluating research. IDRC, Canada.
  • Stokes, D. (1997). Pasteur’s Quadrant: Basic science and technological innovation. Brookings Institution Press.


The post A Quality Assessment Framework for Research Design, Planning, and Evaluation: Updates from the Sustainability Research Effectiveness Program in Canada appeared first on DORA.

Empowering fair evaluation and collective impact: efforts to drive change from INORMS and CoARA
https://sfdora.org/2023/07/18/empowering-fair-evaluation-and-collective-impact-inorms-and-coaras-efforts-to-drive-change/
Tue, 18 Jul 2023

The post Empowering fair evaluation and collective impact: efforts to drive change from INORMS and CoARA appeared first on DORA.

]]>
Each quarter, DORA holds a Community of Practice (CoP) meeting for National and International Initiatives working to address responsible research assessment reform. This CoP is a space for initiatives to learn from each other, make connections with like-minded organizations, and collaborate on projects or topics of common interest. Meeting agendas are shaped by participants. If you lead an initiative, coalition, or organization working to improve research assessment and are interested in joining the group, please find more information here.

Status quo of research evaluations

Research evaluation practices have the power to affect academic careers and reputations, positively or negatively, given their far-reaching implications for funding, awards, career advancement, and other prospects. Although the way research is evaluated plays a critical part in academic life, the traditional quantitative approaches to research assessment are increasingly recognized as inappropriate. The indicators presently in use rely too heavily on narrow publication-based quantitative metrics that fail to capture the full range of research contributions while limiting equity and innovation. They do not consider factors such as the quality of the research, the rigor of the methodology, the impact on society, or openness and interoperability. Such metrics can disadvantage those working in less mainstream or interdisciplinary fields, or those who come from less-privileged demographic backgrounds. Relying solely on such indicators can result in an incomplete and potentially unfair assessment of a researcher's contributions to their field.

To support more responsible research assessment processes, guidelines and recommendations such as the Declaration on Research Assessment (DORA), the Leiden Manifesto for Research Metrics, the Metric Tide, and individual university guidelines emerged to set standards for responsible assessment. In 2018, the International Network of Research Management Societies (INORMS), a collective of research management associations and societies worldwide, set out to build a structured framework to support the shift toward more responsible approaches to assessment. INORMS runs various initiatives, projects, toolkits, and guidelines to promote evaluation practices that prioritize and foster fairness, openness, inclusivity, transparency, and innovation. Two of its key contributions are 1) the SCOPE Framework for Research Evaluation and 2) the More Than Our Rank (MTOR) initiative.

During the first quarterly DORA National and International Initiatives discussion group call of 2023, Elizabeth Gadd, Chair of the INORMS Research Evaluation Group (REG), provided updates on the work of INORMS, MTOR, and the Coalition for Advancing Research Assessment (CoARA), where she serves as Vice-Chair.

SCOPE Framework: a structured approach for evaluations

In 2019, the first version of the SCOPE Framework was released by the INORMS Research Evaluation Working Group. This framework was developed to be used as a practical guide to successfully implement responsible research assessment principles and to facilitate the use of more thoughtful and appropriate metrics for evaluations in institutions and organizations.

In 2021, INORMS REG published an updated version of the SCOPE framework that included a five-step process to be followed by organizations during evaluations: Start with what is valued, consider Context, explore Options for measuring, Probe deeply, and Evaluate the evaluation. Each of these five steps has been elaborated on in the practical guide for easy adoption. The operating principles behind this five-stage process are:

  1. Evaluate only where necessary
  2. Evaluate with the evaluated
  3. Draw on evaluation expertise

To help research leaders and practitioners drive robust evaluation processes in their institutions, the working group has publicly shared several resources online. The guidebook also details the process of change through multiple case studies to learn from. Use cases mentioned by Gadd in her presentation included the joint UK higher education funding bodies, which made deliberate efforts to redesign the Research Excellence Framework (REF), and Emerald Publishing, which is consciously ensuring more diversity in its editorial boards. Additionally, there are SCOPE workshops where institutional research administrative leaders can learn how to adopt this framework systemically.

Some of the strengths of the SCOPE framework are its holistic step-by-step approach, flexibility, and adaptability to different disciplinary and institutional contexts. The framework can be customized to reflect the specific goals and priorities of different stakeholders and can be used to evaluate research at various levels of the research evaluation food chain.

More Than Our Rank (MTOR) Initiative: an opportunity to recalibrate university rankings

Because evaluation processes can also profoundly impact the “reputation” and funding of academic organizations, INORMS launched the MTOR initiative, which is closely linked to the SCOPE framework and seeks to provide institutions with a means by which they can surface all their activities and achievements not captured by the global university rankings.

Gadd published “University rankings need a rethink” in 2020, which highlighted the key findings from the REG’s work evaluating the ranking agencies against community-designed criteria. They found that most “flagship” university rankings barely incorporated open access, equality, diversity, sustainability, or other society-focused agendas into their criteria for global rankings. This work sparked many discussions on how the current university ranking system may be inadequate and harmful because it does not “meet community’s expectations of responsibility and fairness”. A change was therefore needed to bring a sense of accountability to rankers and to let each university determine what matters most to it.

The INORMS REG believes that any institution, even a top-ranked one, has more to offer than can be captured by the parameters used in current ranking systems. This was the foundational drive behind launching the MTOR initiative in October 2022, which encourages institutions to declare, in narrative form, their unique missions, activities, contributions to society, teaching, and more, and to explain why they are more than the overhyped university rankings. Gadd emphasized that signatory institutions are not required to boycott rankings altogether.

The REG has also provided guidelines on its website for Higher Education Institutions (HEIs) on how to participate in the MTOR movement, and for individuals in the community on how to encourage their universities to take part in the initiative. Organizations such as Loughborough University, Keele University, Izmir Institute of Technology, and Queensland University of Technology (QUT) are among the early adopters of MTOR. However, it was also discussed that a major barrier to institutional participation in MTOR is institutions’ financial dependence, in the current system, on the global rankings that influence their funding.

Finally, Gadd shared updates from CoARA, a network bringing together stakeholders in the global research community (research funders, universities, research centers, learned societies, etc.) to enable systemic reform toward responsible and effective research assessment practices.

CoARA: a common direction for reforming research assessment

After its international, iterative creation, facilitated initially by the European University Association (EUA), Science Europe, and the European Commission, the Agreement on Reforming Research Assessment was published in July 2022. Organizations willing to publicly commit to improving their research assessment can sign the Agreement and become eligible to join the Coalition, taking an active part in CoARA’s decision-making processes. The Agreement, which builds on the progress made by earlier responsible research assessment guidelines and principles (DORA, the Leiden Manifesto, the Hong Kong Principles, etc.), consists of four core commitments: 1) recognizing the diversity of contributions and careers during assessments; 2) basing assessment primarily on qualitative evaluation, with peer review at its center, supported by the responsible use of quantitative indicators; 3) abandoning inappropriate uses of journal- and publication-based metrics; and 4) avoiding the use of rankings of research organizations during assessments. These four core commitments are accompanied by six supporting commitments related to building and sharing new knowledge, tools, and resources, and raising awareness within the community. Within five years of becoming signatories, organizations must demonstrate the changes they have made to reform research assessment at their institutions.

As a newly established association, CoARA held its first General Assembly meeting on December 1, 2022, at which point the secretariat role was handed over to the European Science Foundation – Science Connect (ESF-SC). CoARA opened its first call for Working Groups and National Chapters in March 2023, and will announce further General Assembly meetings and other activities, such as webinars and conferences, to strengthen the network and initiate dialogue among the different CoARA stakeholders, relevant evaluation initiatives, and communities of practice. Gadd’s talk was followed by discussion of the Working Groups’ potential focus areas, including peer review, responsible metrics, and funding disparities.

United, collaborative efforts from across the research community, including individuals, universities, funders, and initiatives, are vital to advancing responsible research assessment at a systemic level.

Sudeepa Nandi is DORA’s Policy Associate

The post Empowering fair evaluation and collective impact: efforts to drive change from INORMS and CoARA appeared first on DORA.

Projeto Métricas/Fapesp: A Collaborative Roadmap for DORA Implementation in Brazil https://sfdora.org/2023/01/24/projeto-metricas-brazil/ Tue, 24 Jan 2023 22:00:22 +0000 https://sfdora.org/?p=157064
Each quarter, DORA holds a Community of Practice (CoP) meeting for National and International Initiatives working to address responsible research assessment reform. This CoP is a space for initiatives to learn from each other, make connections with like-minded organizations, and collaborate on projects or topics of common interest. Meeting agendas are shaped by participants. If you lead an initiative, coalition, or organization working to improve research assessment and are interested in joining the group, please find more information here.

The scope for change in evaluation systems in Brazil

To make science work for national betterment, a country’s research institutions need to align themselves with societal impact, environmental responsibility, open science, and responsible research practices. These are often the values that academic institutions espouse. However, a narrow overreliance on inappropriate metrics as proxy measures of quality can fail to recognize and reward open practices, rigor, reproducibility, locally relevant research, and more. Toward the goal of creating and supporting more responsible evaluation systems in Brazil, the São Paulo Research Foundation (FAPESP) founded and funded Projeto Métricas, a multi-institutional project involving the six major public universities in São Paulo. The aim of Projeto Métricas is to boost awareness about the appropriate use of bibliometrics and to improve university governance and science communication. The project offers a number of awareness programs, workshops, and courses for professionals and researchers across Brazil.

On November 8, 2022, Justin Axel-Berg and Jacques Marcovitch presented the work of Projeto Métricas at DORA’s National and International Initiatives for Research Assessment Community of Practice meeting. On the call, Axel-Berg highlighted that although a few departments and institutions are improving their assessment practices responsibly, the majority of evaluation methods in Brazil remain highly orthodox. Broadly, they are quantitative and restrictive, both at the level of the Council (i.e., the national governing body of research and research funding agencies) for institutional-level evaluation and in institutions’ own hiring and promotion policies. In light of this, Projeto Métricas is currently working on monitoring and identifying the gaps in research assessment practices at public universities, and on mapping the impacts of ongoing reform efforts. As part of its project for the DORA Community Engagement Grant program, Projeto Métricas held a workshop and conducted a survey to inform the development of a roadmap for implementing DORA’s responsible research assessment principles at Brazilian institutions. Axel-Berg pointed out during the discussion that, although the number of individual and organizational DORA signatories in Brazil is comparable to that in Europe and the UK, only two research-focused Brazilian organizations have signed so far: the University of São Paulo (USP) and the State University of Campinas (UNICAMP). To gain a broader understanding of the obstacles to committing to responsible research evaluation practices and to identify areas for intervention, Projeto Métricas took a three-step approach. The details of their work can be found online.

Course of action

In their first DORA Community Engagement Workshop, Projeto Métricas conducted an exploratory quantitative and qualitative survey based on the five pillars of the SCOPE rubric. Respondents included individual DORA signatories from USP and UNICAMP across a diverse range of academic career stages, which allowed Projeto Métricas to gain a holistic understanding of the challenges to implementing responsible evaluation practices. Survey results suggested that the major barriers were: i) institutional resistance to change, ii) lack of training for evaluators, iii) a dearth of societal engagement and consultation in evaluations, and iv) senior colleagues being unaware of advances in responsible research. The workshop was followed by a public event open to the wider Brazilian research community, during which the key points from the preliminary survey were discussed to identify the main action areas and possible ways to resolve the issues. Lastly, a consolidated report with the key findings from the workshop, the survey, and the public event was evaluated by field experts within Brazil, who helped narrow the results down to three concrete target areas of action:

  1. Awareness of responsible evaluation practices: The community needs to be made more aware of responsible evaluation practices and initiatives (e.g., DORA, Leiden, etc.) that are working towards reform. Informing and empowering the community through discussions about impact-driven and responsible assessment would be the first step to understanding, interpreting, and reflecting on what performance indicators are useful and what indicators are being misused in Brazil. Reforming the traditional, fixed, and limited evaluation models that are currently in use will require a collective broadening of mindsets. Therefore, awareness programs should actively engage community members at various career stages. It is also important for universities and research institutes to incorporate community ideas in developing frameworks that are suitable for a wide variety of evaluations, including career advancement, recruitment, grants, etc. while aligning with personal, institutional, local, and national values.
  2. Training and capacity building for evaluators and participants in processes: Workshops, training programs, debates, and discussions should be conducted on meaningful and conscious evaluation tools, for everyone from evaluators to those being evaluated.
  3. Planning and execution of new evaluation practices: Evaluation methods must be flexible and accommodate the dynamic evolution of the academic system. Developing a well-structured plan of action for short-, medium-, and long-term goals, with consistent monitoring, is essential to the successful implementation of newer practices. This would give universities and institutions agility into the future.

Underlying challenges and possibilities in the Brazilian context

Axel-Berg elaborated on some of the underlying features that are unique to Brazil and hinder its progress toward responsible research assessment reform. The Brazilian higher education system is highly heterogeneous in terms of the nature of its institutions, fields of study, and so on, which complicates the development of uniform evaluation policies. Compounding this, cultural factors contribute to resistance to changing mindsets and a lack of participation from government bodies and academic faculty and staff. Because universities and academic institutions are bound by strict federal laws, there is currently little room for change other than on an individual basis. Another unique challenge Axel-Berg highlighted was the language and digitalization gaps in disseminating scholarly knowledge from Brazilian researchers. Strategies for overcoming some of these issues were discussed during the meeting, and a number of suggestions were made, including: involving learned societies in the reform process to facilitate wide-ranging communication in a context- and discipline-specific manner; enhancing training on structuring, substantiating, and popularizing narrative CVs; retaining staff to support a strong institutional memory; and promoting institutional autonomy so that changes can be made more easily.

Projeto Métricas is taking on these challenges one step at a time, and its immediate goal is to provide guidance and assistance at the individual level. Axel-Berg concluded his presentation by reiterating their long-term goals to bring about a more holistic change by incorporating not only Open Science and research integrity but also society-centric metrics and environmental responsibility as parameters into evaluations.

Sudeepa Nandi is DORA’s Policy Associate

The post Projeto Métricas/Fapesp: A Collaborative Roadmap for DORA Implementation in Brazil appeared first on DORA.
