Funder discussion groups Archives | DORA
https://sfdora.org/category/dora-funder-discussions/

Updates from the Asia-Pacific Funder Discussion Group 18/09/24
https://sfdora.org/2024/10/28/updates-from-the-asia-pacific-funder-discussion-group-18-09-24/
Published 28 October 2024

Nineteen representatives from seven research funder organisations participated in the most recent quarterly Asia-Pacific Funder Discussion Group hosted by DORA.

New DORA staff members Liz Allen and Janet Catterall were introduced to the group. The position of DORA Program Manager is currently vacant.

DORA updates to the group included the upcoming implementation guide and guidance document, anticipated for early next year, and the imminent release of three toolkits, which emerged from the workshops DORA held in May 2024 with the Elizabeth Blackwell Institute/MoreBrains project. These workshops sought to find ways to increase equality, diversity, inclusion, and transparency in funding applications. The toolkits will focus on simplifying funding call structures, changing application processes to reduce the likelihood of bias in outcomes (e.g., recognising a broader range of research outputs, narrative CV formats, etc.), and improving training for reviewers and evaluators. The team is also producing three case studies about the funding application process.

Participants then engaged in a roundtable discussion where members shared their work from the past year, including collaborations, asked questions of the group and suggested topics they would like the group to cover in the future. Common themes that emerged from these discussions included:

  • Trialling new selection processes, fellowships and panels to foster greater inclusivity of underrepresented communities, particularly Aboriginal and Torres Strait Islander, Māori and Pasifika applicants and assessors, and exploring ways to engage with these communities.
  • Experimenting with the inclusion of the narrative CV option in grant applications.
  • Exploring alternative metrics for research assessment and new KPIs.
  • Investigating different models that support strategic discretion in decision making to further equity and fairness.
  • Condensing and simplifying the application process.
  • Fostering a greater understanding of how artificial intelligence tools can be or are being utilised by applicants and assessors.

The utility of artificial intelligence was further discussed in terms of the assessment process itself: can these technologies be used for the initial screening? To find new peer reviewers? To summarise panel results? How can an agency build such tools into the review process? Funders reported that AI tools had already been trialled for assigning peer reviewers and for matching applications with assessors. Confidentiality is a major consideration, so it is recommended that only an application's title and keywords, not its abstract, be used in prompts.

The last quarterly meeting for 2024 will feature a presentation from Global Research Council RRA Working Group members Joanne Looyen (MBIE) and Anh-Khoi Trinh (NSERC).

A call was also issued for member presentations in 2025.

DORA Newsletter May 2024
https://sfdora.org/2024/05/07/dora-newsletter-may-2024/
Published 7 May 2024


Announcements


Reformscape in Full Swing

Since its release in January, Reformscape has been serving the research community by providing 230 documents encouraging openness and transparency. New documents and information continue to be added and are always publicly available. Upgrades have also made the platform more user-friendly. You can now search for responsible research assessment resources based on type, making it easy to find:

  • Action plans
  • Policies to reform hiring, promotion or tenure
  • Outcomes of new policies

New guidance released

DORA’s opposition to the overuse of the Journal Impact Factor is well known, but the original declaration did not specifically address other indicators that are sometimes used as proxy measures of quality in research assessment. In a new guidance document, we examine the potential problems not only of the Journal Impact Factor, but also of the h-index, altmetrics, and various other citation measures. None of these indicators are without problems, but the guidance provides five principles to help reduce some of the concerns.

This guidance is available on the DORA website and Zenodo. For questions about this guidance, email info@sfdora.org.

New Narrative CV Report

DORA is pleased to announce a new report on the implementation and monitoring of narrative CVs for grant funding. This report was created in collaboration with FORGEN CoP, Science Foundation Ireland, the Swiss National Science Foundation, UK Research and Innovation, and the University of Bristol, Elizabeth Blackwell Institute for Health Research. The report summarizes takeaways and recommended actions from a joint workshop held in February 2022 on identifying shared objectives for and monitoring the effectiveness of narrative CVs for grant evaluation. More than 180 people from over 30 countries and 50 funding organizations participated.

Read the report

New report on improving pre-award processes

The processes that take place before research is submitted for funding (pre-award processes) serve as important scaffolding to support equitable and transparent research assessment. This report summarizes the key recommendations from DORA’s Funder Discussion Group symposia and workshops to improve pre-award processes, which were held in collaboration with the Elizabeth Blackwell Institute for Health Research (EBI) at the University of Bristol and the MoreBrains Cooperative.

Read the report

Building on this work, we are pleased to also announce that DORA, EBI, and MoreBrains are continuing their collaboration and are developing a new project to look at how three of the recommendations could be implemented. In May 2024, we will host two workshops that will bring DORA’s Funder Discussion Groups together with research administrators and managers to generate tools and guidance that address practical implementation of these recommendations.

DORA seeks new steering committee member from Asia

DORA is looking for Steering Committee members based in Asia. To be considered for this position, please complete this self-nomination form by May 31, 2024.

Approaching 25,000 signatories

Nearly 25,000 individuals and organizations have recognized the need to promote responsible research assessment. Every day the number of DORA signatories increases, and we anticipate reaching 25,000 by summer. Signing the San Francisco Declaration on Research Assessment signifies a dedication to the principle that scientists should be evaluated on their individual achievements and the quality of their work rather than on journal-based metrics, such as the Journal Impact Factor. We are excited to reach the 25,000 mark and share this milestone with the research community!

Improving pre-award processes for equitable and transparent research assessment
https://sfdora.org/2024/04/26/improving-pre-award-processes-for-equitable-and-transparent-research-assessment/
Published 26 April 2024

[Figure: diagram showing how different actors (researchers, publishers, funders, research institutes, etc.) in the scholarly ecosystem are connected with each other. Source: Chhajed, DeMeester, Lee, Wong, and Schmidt (2020), “System and Tensions Mapping”.]

Click here to download this report. This report may also be found in DORA’s Resource Library.

Background

Research funding organizations are well positioned to be drivers of change towards more responsible research assessment practices. They are in a unique position to shape research culture, determining the allocation of resources, shaping research priorities, and influencing who receives funding and for what projects. Funders around the world are experimenting with formats and processes, like the narrative CV, that move beyond the use of journal impact factor or journal prestige to assess research quality. These new formats aim to better assess research on its own merits, address unconscious biases, and value a wider range of research outputs. The design and supporting processes that go into creating funding calls or programmes influence factors such as who has access to funding, the applicant population, the formatting of proposal materials, the types of work outputs that applicants highlight, and how proposals are assessed by reviewers.

DORA’s Asia-Pacific (A-P) and Africa, Americas, Europe (AAE) funder discussion groups were created in 2020 to support communication between funding organizations about research assessment reform, and to accelerate the development of new policies and practices that lead to positive changes in research culture. During their September 2022 meetings, these groups participated in a mapping exercise to identify existing research assessment interventions and areas to align on for future work. One of the key themes that emerged from this exercise was the importance of considering all steps in the funding process as part of responsible research assessment. This included the processes that take place before research is submitted for funding (pre-award processes), such as the timing of proposal calls and deadlines, transparency and guidance, phrasing and language use, selection and training of reviewers, and planning for “evaluating the evaluators” or “evaluating the evaluation process”.

In January 2023, the Elizabeth Blackwell Institute for Health Research (EBI) at the University of Bristol, in collaboration with the MoreBrains Cooperative, organized a symposium of researchers to analyze how pre-award processes can present obstacles to researchers and unintentionally reinforce biases like the Matthew effect, in which resources flow to those who have them. The findings from this symposium, and from subsequent discussions with research funders, resulted in a report with general recommendations on how the scholarly community can support transparency and equality*, diversity, and inclusion in pre-award processes. 

A natural next step for the EBI report was to identify which recommendations were most practically applicable. Given the interest of the DORA research funders discussion groups in this topic, we partnered with EBI and MoreBrains to identify which of the report recommendations were most actionable for research funders to undertake. In September and October 2023, DORA, EBI, and MoreBrains organized a set of parallel symposia and workshops for each funder discussion group. The groups used the EBI report recommendations as a tool to develop and prioritize areas for intervention.

Key takeaways

Feasibility and impact

During the virtual symposia held in September 2023, Ian Penton-Voak (EBI at University of Bristol), Josh Brown and Alice Meadows (MoreBrains) presented the report rationale and findings to the groups. This was followed by a presentation from Gearoid Maguire (Wellcome) on the Equitable Funding Practice Library to prime the subsequent breakout discussions. Members of the research funder groups then conducted a deep dive into select recommended actions from the EBI report and identified 1) the feasibility and 2) the anticipated impact(s) of each recommendation. Here, the term “impact” was used in the context of the anticipated benefit that researchers might gain from a particular recommendation, in the form of reduced barriers to funding applications and/or greater transparency. Funders in both groups discussed several mechanisms by which these recommendations could be implemented. The actions and potential ways to address them practically included:

  • Avoid training becoming yet another burden. 
  • Diversify the idea of research careers, and break down cultural silos. 
  • Explore and experiment fairly and transparently. 
  • Consider the impact of how funding opportunities are designed, structured, and shared. 
  • Provide applicants with reviewer feedback to support reapplication and reviewer integrity. 

During the large group discussion, several important points of consideration were highlighted by the group: 

  1. Communication is a key facet of pre-award processes that can either encourage or discourage applicants. 
  2. Funders should focus on inclusivity and reach (e.g., gender-neutral language, opportunities to apply in other languages, broadening the communications channels used to share funding calls, targeting smaller organizations and early career researchers). Of note, the value of protected or targeted opportunities in various forms was mentioned in the majority of Asia-Pacific breakout sessions. 
  3. Unconscious biases around what constitutes research excellence can present a barrier to more inclusive call design. For example, if a call is co-created with researchers, the biases and priorities of those academics influence call design and limit flexibility on the part of the funder. 

Reform in action

The work from the September symposia informed the design of the October workshops, which featured recorded lightning talks from members of each discussion group. These lightning talks highlighted practical examples of how funders have worked to reduce barriers for applicants and support transparency and/or EDI in their funding calls. We heard from: 

Shea Robin-Underwood of the Ministry of Business, Innovation & Employment (MBIE), Aotearoa New Zealand. Robin-Underwood’s talk Supporting Indigenous Research focused on MBIE’s He aka ka toro investment fund for projects that advance iwi, hapū, hapori, and Māori research, science, and innovation (RSI). Robin-Underwood highlighted the importance of involving Indigenous peoples in every aspect of the fund design process, which was done for the RSI investment fund, and shared that there is a strong appetite and interest in the Indigenous researcher community for this type of protected funding.

Shomari Lewis-Wilson of Wellcome, United Kingdom. Lewis-Wilson’s talk Research culture and Communities focused on Wellcome’s values-based approach to supporting research culture in all of its programming: supporting ethical, open, and engaged research; enabling equitable, diverse, and supportive research cultures; and being an inclusive partner. There are several ways that Wellcome puts this approach into practice: embedding “Research Environment” in applications, recently launching the Institutional Fund for Research Culture, implementing a narrative CV, and supporting Black-led grassroots organizations.

Julie Glover of the Australian National Health and Medical Research Council (NHMRC). Glover’s talk Examples of Equity, Diversity and Inclusion Initiatives focused on initiatives to support NHMRC’s goals for equitable funding for female and non-binary chief investigators and for Aboriginal and Torres Strait Islander researchers. Glover highlighted that NHMRC uses structural priority funding as a mechanism for awarding funds to meet these goals, and this has resulted in more women and Indigenous researchers receiving funding. To address the attrition of more senior female applicants, the NHMRC has also recently prioritized funding equal numbers of grants to men and women.

Sarah Smith of the Natural Sciences and Engineering Research Council (NSERC), Canada. Smith’s talk NSERC’s Approach to Equity, Diversity and Inclusion in the Pre-Award Process highlighted that transparency and EDI are codified as organizational priorities in NSERC’s strategic plans, which also supports NSERC’s goals to support Indigenous researchers. Smith outlined several examples of how NSERC is working to implement these goals, including piloting a narrative CV, regular community engagement, removing procedural barriers within its programs, proportional representation of awardees, and target funding for Black and Indigenous trainees.

Click to watch the lightning talk recordings

Identifying realistic and transformative actions

Following the lightning talks, the attendees reviewed the list of actions that were discussed during the symposia and prioritized which actions they would like to focus on. During the subsequent breakout sessions, participants drafted brief action plans on how best to put each proposed action into practice. Examples include:

“Our group focused on making call structure simpler, clearer, and more inclusive. The activities needed to implement this idea are working with the communications department (including editing and creating plain language), and community consultation on what is needed. The resources needed to implement this idea are time and staff, for inter- and intra-organizational communications, access to experts, and community members. The people or groups who would need to be involved in this are community members, communications department, policy staff, EDI advisors, IT department. We think we should all take this idea forward because it’ll increase applications.”

Funders then voted on the top actions that would be 1) most realistic for funders to implement and 2) most transformative to the pre-award ecosystem. The areas of activity that were most impactful or transformative were:

  • Making assessment more holistic
  • Making call structures simpler, clearer, and more inclusive 
  • Improving training for reviewers and assessors 

The areas of activity that were most feasible and realistically achievable were:

  • Improving training for reviewers and assessors
  • Making call structures simpler, clearer, and more inclusive

Across the two workshops, which represent different geographic regions, the participants felt that making call structures simpler, clearer, and more inclusive was among the actions most realistically achievable for funders and potentially most transformative for applicants and the pre-award landscape.

“It has been so gratifying to see the work we did with EBI to improve equity, diversity, and inclusion in pre-award processes being taken up by the DORA community. It’s inspiring to hear about their existing efforts in this area, and we look forward to continuing to work with them to develop concrete guidance on opportunities for improvement across the wider research funding community.” – Alice Meadows, MoreBrains co-founder 

Next steps

Building on the recommendations identified by the DORA funder discussion groups in the 2023 symposia and workshops, we are pleased to announce that EBI has secured additional funding from the University of Bristol for a new project to look at how three of the recommendations could be implemented. The three areas of focus are: 1) simplification of funding call structures; 2) changes to application processes to reduce the likelihood of bias in outcomes (e.g., recognizing a broader range of research outputs, narrative CV formats); and 3) improvement in training for reviewers and evaluators. This project will include two parallel workshops to bring together members of the DORA funder discussion groups and research administrators. Participants will discuss how interventions relating to the three key areas could be implemented, how practical barriers could be reduced, and how progress could be benchmarked. The outcomes of this project will also include a report and other materials to support research funders in their implementation of evidence-based best practices.

Haley Hazlett is DORA’s Program Manager


* In the report, the EBI and MoreBrains team used the term “equality” (as in the acronym EDI, for equality, diversity, and inclusion) to refer to approaches and values that may lead to equitable outcomes, noting that “the terms ‘equality’ and ‘equity’ are different and [that they] felt… the focus on equality of rights and opportunities was highly pertinent in the context of the pre-award process.”

Updates from the Howard Hughes Medical Institute and Health Research Board Ireland
https://sfdora.org/2024/04/20/dora-funder-discussion-group-meetings-march-2024/
Published 20 April 2024

Each quarter, DORA holds two Community of Practice (CoP) meetings for research funding organizations. One meeting takes place for organizations in the Asia-Pacific time zone and the other meeting is targeted to organizations in Africa, the Americas, and Europe. If you are employed by a public or private research funder and interested in joining the Funder CoP, please find more information on our webpage.

DORA’s Funder Discussion Group meetings bring together members of the scientific community from around the world, connecting research funding organizations that support fair research assessment. During this quarter’s meeting on March 12, 2024, we heard about a wide range of topics relating to research assessment including the use of journal names, narrative-like CVs, and materials offered by DORA.

Africa, the Americas, and Europe

Value is often gauged by association, and academia is no exception. Researchers may unconsciously equate publishing in certain journals with prestige or the lack thereof. Anna Hatch from the Howard Hughes Medical Institute (HHMI) described a new initiative to de-emphasize journal names in researcher assessment at the organization. As a first step, HHMI removed journal names from talk slides and poster presentations at their science meetings and replaced them with PMIDs (PubMed identifiers), persistent identifiers that make articles easy to find without revealing publisher information. This helped HHMI scientists become familiar with the idea. In 2023, HHMI expanded its efforts and began removing journal names from researcher assessment materials, including bibliographies. To support their scientists during the transition, HHMI created a citation style for Zotero, named “Howard Hughes Medical Institute,” that removes journal names and provides persistent identifiers for the journal article and preprint, if available. They also built a digital form for the bibliography that generates citations for assessment without journal names. The approach aims to reduce pressure on scientists to publish their results in highly selective journals, which can lead to delays in the dissemination of research findings, extended training periods for graduate students and postdocs, and demoralized early career scientists. Hatch noted that the move also makes it clearer that HHMI values the discovery itself, not the venue where it was published.

During this meeting we also heard from Annalisa Montesanti from the Health Research Board (HRB), who discussed the HRB’s use of a narrative-like CV in research career funding schemes. The HRB has been using this format since 2017 and revised it in 2019 and 2022 based on user feedback, HRB reflections, and alignment with international best practice. This CV is a hybrid, combining narrative-based sections with some lists. It contains sections that allow researchers to paint a broader picture of themselves beyond metrics. For example, applicants are given the opportunity to write about career breaks, key contributions such as relevant research outputs, training and development, and societal impact. The HRB also includes a “personal declaration” section that sits upstream of the CV. Feedback was obtained from both applicants and reviewers. Most applicant responses indicated that the CV gave a good picture of researchers’ contributions and that the experience of completing it was overall positive, while most reviewer responses said that the format highlighted researchers’ impact well. Reviewers did, however, note that these CVs are more difficult to assess, though a combination of qualitative and quantitative metrics made it easier. Some other challenges were also identified and corrective actions were taken.


Asia-Pacific

DORA’s Program Manager Haley Hazlett presented on the different mechanisms DORA uses to share information and resources with the community. These include DORA’s case study repository, resource library, and Reformscape.

Hazlett highlighted the case study repository, which provides detailed and contextualized examples of how academic institutions and national consortia are working to improve academic career assessment by reforming their culture, policies, and practices.

Within the resource library is a list of over 100 tools, policies, guidance, papers, etc. relating to research assessment, each one giving a quick glance of the resource in the form of a short summary. The resource library also contains DORA-produced tools and resources, some with helpful visuals. The resource library includes material for a wide range of actors in the academic ecosystem, including research funders, academic institutions, and publishers.

Lastly, Hazlett introduced DORA’s new platform, Reformscape, which has a range of information, resources, documents, etc. about implementing reform in regard to responsible research assessment. It is a community-led resource that is continuously evolving based on feedback to better serve those who want to use it. Users can also suggest resources they believe would be valuable to Reformscape.

After this presentation, attendees participated in a written discussion where they collectively gave insight into how DORA can better support responsible research assessment efforts at their institutions and include information from a larger geographic area. Some suggestions included having a standardized presentation about DORA’s resources that is readily accessible, providing funder-specific case studies, co-hosting webinars with other organizations, providing information on RRA-related grants, and more. They also discussed how to approach multilingualism in research and the challenges associated with translating work to make it more broadly available. Although the group did not have an easy answer to this challenge, DORA will continue to seek mechanisms to better incorporate multilingual content into its resources to improve representation of non-English content.

Casey Donahoe is DORA’s Policy Associate

Using Narrative CVs: A shared definition and recommendations for monitoring effectiveness
https://sfdora.org/2024/03/25/using-narrative-cvs-a-shared-definition-and-recommendations-for-monitoring-effectiveness/
Published 25 March 2024

In February 2022, DORA organized the online workshop series “Using Narrative CVs: Identifying shared objectives and monitoring effectiveness” in partnership with FORGEN CoP, Science Foundation Ireland, the Swiss National Science Foundation, UK Research and Innovation, and the University of Bristol, Elizabeth Blackwell Institute for Health Research.

The aim of the workshop was to develop a shared definition for narrative CV and create a list of common objectives for its use in grant evaluation. The workshop took place on two occasions to accommodate global participation, and more than 180 individuals from over 50 funding organizations in more than 30 countries registered for the events. To support alignment of values and collective action towards the iterative adoption of narrative CVs at funding organizations, we worked with attendees to discuss a shared definition for the narrative CV, generate a list of common objectives for its use in grant evaluation, and identify ideas to support the optimization of narrative CVs as a robust tool for research assessment.

The group defined narrative CVs as “a type of CV format that provides structured written descriptions of academics’ or researchers’ contributions and achievements that reflect a broad range of relevant skills and experiences” and identified three primary shared objectives for their use:

1) Increase the recognition of a wider range of research contributions
2) Increase fairness in grant funding decisions
3) Increase flexibility for researchers to demonstrate contributions.

Our short report on the workshop summarizes the takeaways, provides a shared definition and objectives, and includes a one-page tip sheet “Ideas for Optimization: Five things to consider to optimize, evaluate & iterate on the use of narrative CVs” (see report page 6 or resource library entry Ideas for Optimization: Five things to consider to optimize, evaluate and iterate on the use of narrative CVs).

Click here to read the full report

Haley Hazlett is DORA’s Program Manager


A new narrative CV mentorship platform, funder research assessment reform reading list, and updates from the Japan Science and Technology Agency
https://sfdora.org/2024/02/05/a-new-narrative-cv-mentorship-platform-funder-research-assessment-reform-reading-list-and-updates-from-the-japan-science-and-technology-agency/
Published 5 February 2024

Each quarter, DORA holds two Community of Practice (CoP) meetings for research funding organizations. One meeting takes place for organizations in the Asia-Pacific time zone and the other meeting is targeted to organizations in Africa, the Americas, and Europe. If you are employed by a public or private research funder and interested in joining the Funder CoP, please find more information on our webpage.

During the December 2023 funder discussion group meetings, we heard presentations from group members and updates from institutions. The presentation during the Africa, the Americas, and Europe meeting focused on the Peer Exchange Platform for Narrative-style CVs (PEP-CV). During the Asia-Pacific meeting we heard from the Japan Science and Technology Agency on their activities for implementing responsible research assessment.

During the Africa, Americas and Europe meeting, Sean Sapcariu, Programme Manager at the Luxembourg National Research Fund, presented PEP-CV, a platform for “everyone active in the research and innovation sector to engage in simple peer mentoring exchanges around completing narrative-style CVs.” PEP-CV is being developed by a coalition of research funders to help early career researchers partner with mentors, learn to write effective narrative CVs, and better display what they can contribute to the research community. PEP-CV will enable members of the scholarly community to connect with each other to share “experiences, achievements, and career paths in all types of narrative-style CVs.”

Sapcariu fielded several questions about PEP-CV in discussion, explaining that mentors can be at any career stage as long as they have experience with narrative-style CVs. Importantly, he clarified that PEP-CV is meant to be a tool for prospective applicants to learn how to communicate effectively within a narrative CV, rather than a “how-to” for specific narrative CV templates. When asked about the ability for users to submit feedback about PEP-CV, Sapcariu noted that users will be able to rate the platform and send feedback by email. PEP-CV is slated for public release in Spring 2024.

Following Sapcariu’s presentation, several members of the Africa, the Americas, and Europe meeting described recent initiatives:

  • Science Foundation Ireland (SFI) has launched a new EDI strategy and also plans to advance the use of narrative CVs.
  • The Health Research Board (HRB) is using narrative CVs in career schemes.
  • The Natural Sciences and Engineering Research Council of Canada (NSERC) has been engaging with the scientific community about new research assessment guidelines and is also piloting a narrative CV.
  • The Canadian Institutes of Health Research (CIHR) is engaging with external groups about responsible research assessment and promoting DORA values in collaboration with these institutions.
  • Several new organizations have joined the DORA funder discussion groups, including the Canadian Cancer Society, the Canada Foundation for Innovation, and the Chilean National Agency for Research and Development (ANID).

During the Asia-Pacific meeting, Hiroko Tatesawa, Director of the Department of Strategic Basic Research at the Japan Science and Technology Agency (JST), presented on the JST’s recent work to reform research assessment practices for funding decisions. Tatesawa discussed how the JST is piloting a more narrative CV format for the Strategic Basic Research Program (SBRP), whose mission is to fund “top science” through the multiple funding programs under its umbrella. With this more narrative-style CV, the JST aims to modify its practices to support more equitable and responsible assessment. To this end, the JST encourages its funding programs to consider the value and impact of all research outputs and to be explicit in the criteria used to assess applicants, especially those early in their scientific careers. These criteria include project purpose; proposal contents; achievements and ability; and research readiness.

Following Tatesawa’s presentation, several members of the Asia-Pacific meeting also described recent initiatives at their organizations.

DORA’s Funder Discussion Group meetings bring together members of the scientific community from around the world, connecting research funding organizations that support fair research assessment. If you are employed by a public or private research funder and are interested in joining the Discussion Group, please find more information here.

Reading list based on the group discussions

Evaluation of researchers in action: Updates from UKRI and a discussion on the utility of CRediT
https://sfdora.org/2023/08/08/evaluation-of-researchers-in-action-updates-from-ukri-and-a-discussion-on-the-utility-of-credit/ (Tue, 08 Aug 2023)

Each quarter, DORA holds two Community of Practice (CoP) meetings for research funding organizations. One meeting takes place for organizations in the Asia-Pacific time zone and the other meeting is targeted to organizations in Africa, the Americas, and Europe. If you are employed by a public or private research funder and interested in joining the Funder CoP, please find more information here.

Funding organizations play a key role in setting the tone for evaluation standards and practices. In recent years, an increasing number of funders have shifted their evaluation practices away from an outsized focus on quantitative metrics (e.g., h-index, journal impact factors) as proxy measures of quality and towards more holistic means of evaluating applicants. At one of DORA’s March Funder Community of Practice (CoP) meetings, we heard how UK Research and Innovation (UKRI) has implemented narrative-style CVs for choosing promising research and innovation talent. At the second March Funder CoP meeting, we held a discussion with Alex Holcombe, co-creator of the tenzing tool, about how the movement to acknowledge authors for the broad range of roles they play in a research project could also be applied to help funders in decision-making processes.

During the first meeting, we heard from Hilary Noone, Tripti Rana Magar, and Hilary Marshall, who discussed the implementation of narrative-style CVs at funding organizations in the UK, including Cancer Research UK and the National Institute for Health Research (NIHR). “Traditional” CVs can overlook the broad range of a researcher’s achievements and scholarly work, such as contributions to team science, mentorship, and “non-traditional” research outputs. Magar said that the concept of narrative CVs emerged around 2014 from the Nuffield Council on Bioethics with the goal of better recognizing and understanding the holistic contributions of researchers. In 2017, the Royal Society co-created the Résumé for Researchers (R4R). In 2021, a more flexible version of the R4R was released: the Résumé for Research and Innovation (R4RI), which has been implemented by several major funders in the UK. Magar highlighted that the R4RI reduces emphasis on metrics and focuses on the quality of the contribution, provides space for applicants to list their full range of activities, and reduces bureaucracy by encouraging the adoption of a single framework.

The Joint Funders Group (JFG) comprises 54 funders (both UK-based and international) that support the wider adoption of R4RI-like narrative CVs. The speakers also mentioned the Alternative Uses Group (AUG), another group hosted by UKRI that complements the efforts of the JFG by co-creating resources for recruitment, promotions, and professional accreditation with the support of funders and universities. Both groups work together to accelerate cultural change and implement the R4RI-like narrative CV. Additionally, resources relating to the R4RI-like CV are freely available in the Résumé Resources library, which includes several CV templates, training packages, starter guides for reviewers and applicants, and guidance on promotion practices.

The speakers closed their presentation by introducing the Shared Evaluation Framework, a resource that the JFG and AUG have developed to support adoption of the R4RI-like CV. In addition to creating the Framework, UKRI is also developing an Evidence Platform that allows users of the R4RI-like CV to anonymously upload and share data relating to its adoption. The Evidence Platform will help funders and academic institutions monitor the adoption, impact, and results of using the R4RI-like CV format. The JFG and AUG are also developing training packages for applicants, reviewers, and staff, and are working with professional bodies to see if they could offer complementary modules on specific areas/themes. If anyone is interested in testing the training packages, making recommendations, or volunteering to be part of the professional bodies workshop, please contact culture@ukri.org.

During the second meeting, Alex Holcombe from the University of Sydney presented tenzing, a free and open-source tool created to support the adoption of the Contributor Roles Taxonomy (CRediT). CRediT was introduced in 2012 to more accurately and thoroughly assign credit to authors for the specific roles they played on a particular research project. The CRediT system has been adopted by publishers including eLife, The BMJ, and MDPI, and offers a means to recognize scholarly contributions with greater granularity.

The landscape of research is expanding and demands a wide range of diverse technical inputs and ideas. This creates a need for more granular methods to recognize contributions, or “who did what,” on a particular project. Increased granularity in the recognition of author contributions can provide better insight into an author’s expertise and a mechanism to recognize and reward a broader range of contribution types (e.g., insight from non-academic experts). On the call, we discussed how CRediT could be implemented by research funders to better recognize and reward different contributions in grant applications and awards, especially given that Wellcome was one of the founding organizations that created CRediT. Holcombe suggested that CRediT could give funders an easier way to assess whether a proposed research project falls within a researcher’s skill set.

Tenzing was launched in 2020 to help researchers record their roles in a project from the start using the CRediT system. It is particularly helpful at the time of manuscript submission to a journal, when gathering contribution information about every author engaged in the project can be laborious. However, there is no one-size-fits-all solution. Although the application of these tools seems clear for researchers and publishers, there is still much to learn about how CRediT could be implemented directly by funders.

During both meetings, the speakers discussed how different organizations are working to implement culture change, improving the recognition of the wide variety of outcomes and outputs generated by researchers.

Queen Saikia is DORA’s Policy Associate

Research assessment reform in action: Updates from research funders in Canada and New Zealand
https://sfdora.org/2022/08/18/research-assessment-reform-in-action-updates-from-research-funders-in-canada-and-new-zealand/ (Thu, 18 Aug 2022)

Each quarter, DORA holds two Community of Practice (CoP) meetings for research funding organizations. One meeting takes place for organizations in the Asia-Pacific time zone and the other meeting is targeted to organizations in Africa, the Americas, and Europe. If you are employed by a public or private research funder and interested in joining the Funder CoP, please find more information here.

Research funding organizations are often thought of as leaders in the movement toward more responsible research evaluation practices. Often, the perception of “excellence” in research culture is filtered through the lens of who and what type of work receives funding. However, when a narrow set of indicators (e.g., journal impact factor (JIF), h-index) is used to determine who receives funding, the result can be a corresponding narrowing of academia’s perception of research excellence. This places funders in the unique position of being able to help “set the tone” for research culture through their own efforts to reduce reliance on flawed proxy measures of quality and implement a more holistic approach to the evaluation of researchers for funding opportunities. Whether funders are seeking inspiration from their peers or insight on iterative policy development, the ability to come together to discuss reform activity is critical for achieving widespread culture change. At DORA’s June Funder Community of Practice (CoP) meetings, we heard how DORA is being implemented by the Natural Sciences and Engineering Research Council of Canada (NSERC) and the New Zealand Ministry of Business, Innovation and Employment (MBIE).

During the first meeting, Alison Janidlo, Nathalí Rosado Ferrari and Brenda MacMurray presented on the work that NSERC has done to implement DORA. NSERC is a national funding agency that supports researchers and students through grants and scholarships; applications are evaluated through peer review. In alignment with its existing good practices for research assessment (e.g., encouraging applicants to include outputs beyond journal articles and asking peer reviewers to focus on the quality and impact of outputs), NSERC signed DORA in 2019. Building on the previous version of its guidelines, NSERC developed a new assessment guidelines document that is more explicitly aligned with DORA principles. During the call, Janidlo, Rosado Ferrari and MacMurray described how the development of these new guidelines was informed by a literature review and by stakeholder engagement through committees, focus groups, and conference presentations. Key features of the guidelines include the promotion of NSERC’s support for research excellence, the explicit incorporation of DORA principles, and the emphasis that proxy measures of quality (e.g., JIF) must be avoided when evaluating research proposals.

Janidlo, Rosado Ferrari and MacMurray also described what is arguably one of the most important facets of implementing policy change: reflecting on what worked and on lessons learned over the two-year development of their new guidelines. According to them, a key lesson was that it would have been useful for their community to have more clarity about the next steps for implementation at the time of the guidelines’ launch. The agency will also need to address the tricky question of how to track the impact of the new guidelines on funding decisions. In terms of strategies that were successful, the speakers highlighted that the new guidelines were developed iteratively and in consultation with a broad range of stakeholders within the NSERC community, allowing for greater buy-in with the new document. Additional successful strategies included:

  • listing forms of contributions and indicators of quality and impact in alphabetical order, to de-emphasize proxy measures of quality and impact;
  • the language inclusivity of the new document, reflecting Canada’s official English and French bilingualism, the diversity of the postsecondary community, and the technical jargon used in the natural sciences and engineering domain;
  • using the existing NSERC DORA Implementation Working Group to liaise with stakeholders; and
  • building awareness about DORA in both formal and informal meetings with stakeholders, including a dedicated DORA email address for community members to send DORA-related questions and inquiries.

The new guidelines were released May 6, 2022.

During the second meeting, Joanne Looyen and Farzana Masouleh presented on the work that New Zealand’s MBIE has done to implement DORA. MBIE is a funding organization that offers contestable funds for a range of research. Looyen and Masouleh discussed the stewardship role that research funders hold and their responsibility to support greater diversity in the research system. To fulfill this role, they emphasized, funders may need to rethink their existing guidelines for assessing research “excellence.” Here, Looyen and Masouleh pointed out that DORA’s principles can be applied broadly to support more fair and consistent evaluation of funding proposals. To this point, MBIE is currently working with the Health Research Council and the Royal Society of New Zealand to develop a narrative-style CV for New Zealand researchers. The expectation is that researchers will be able to use this CV to apply for funds from any of the three agencies. This matters because interoperability of CV formats between organizations is a concern often voiced by researchers. The speakers highlighted the specific benefits of using a more narrative-style CV for public funding applications, such as broadening the definition of “excellence” for Māori and Pasifika researchers and better capturing the depth of their work. MBIE currently has two ongoing funding calls where it is introducing narrative CVs: the Health Research Council trial for the Ngā Kanohi Kitea (NKK) Community Fund, and the Annual Contestable Endeavour Fund. For example, NKK Community Fund applicants must submit a narrative-style CV that includes experience relevant to the proposed activity, contributions to knowledge generation, and contributions to iwi, hapū, or community.

Key considerations discussed included the importance of designing a narrative CV template that works with and for the New Zealand context. This means testing iterations of the template with research staff from New Zealand universities and research institutions, incorporating stakeholder feedback, and surveying applicants and assessors to gather data about their experiences with the new template. Looyen and Masouleh concluded their presentation by returning the focus to one of the key drivers of MBIE’s reform efforts: increasing understanding of how research can contribute to the aspirations of Māori organizations and deliver benefits for New Zealand.

During both meetings, Funder CoP members discussed the actions that their peers have taken, or are in the process of taking, to implement a more holistic approach to the evaluation of researcher contributions. One of the key takeaways exemplified by both NSERC and MBIE is that there is no “one-size-fits-all” approach. Although examples of change are critical for drawing inspiration and demonstrating feasibility, ultimately the best policies and practices are those that fulfill the context-specific needs and goals of each funding organization’s community.

Haley Hazlett is DORA’s Program Manager.

Changing the narrative: considering common principles for the use of narrative CVs in grant evaluation
https://sfdora.org/2022/06/06/changing-the-narrative-considering-common-principles-for-the-use-of-narrative-cvs-in-grant-evaluation/ (Mon, 06 Jun 2022)

Narrative CVs have emerged as a potential avenue to recognize the broad range of a researcher’s scholarly contributions. They can be used as a tool to move toward a “quality over quantity” mindset in career evaluation, help reduce the emphasis on journal-based indicators, and better accommodate non-linear research career paths. To examine the merits and drawbacks of narrative CVs, DORA and the Funding Organisations for Gender Equality Community of Practice (FORGEN CoP) hosted a joint workshop in September 2021 for research funders. Based on the discussion from this initial workshop, a short report was released in December 2021 that summarized the key takeaways and recommended actions on the adoption of narrative CVs for grant evaluation. A key recommended action of the report was to create a shared definition of what a narrative CV is and to align on what objectives it hopes to achieve.

With this in mind, the February 2022 DORA funder discussion was dedicated to understanding the goals of using narrative CVs for grant evaluation, such as creating space for researchers to demonstrate broader contributions to research and documenting different talents. During the meeting, participants brainstormed on a range of topics: what a shared definition of a narrative CV might look like, objectives for its use in grant evaluation, and the value of a shared approach to monitor its effectiveness.

Toward a shared definition

Over the course of the discussion it became clear that more work is needed to develop a common narrative CV definition that accounts for differences across organizations and funding opportunities. One of the key challenges to developing a common definition was the idea that different organizations might have different understandings about the potential value that using narrative CVs might add to their assessment processes. Importantly, organizations may have distinct visions for where and how to use narrative CVs in their assessment processes. Participants agreed that a definition should be sufficiently flexible to adapt to various local contexts and cater to funders’ diverse needs and goals.

While considering how to address the challenges of creating a shared definition, one participant suggested the need to disentangle three key elements: what is a narrative CV, how might it be used, and how might it be implemented. Addressing the first element could include the creation of a flexible shared definition. From there, funders could address the second element and identify the expectations for use (e.g., what they expect a narrative CV to tell them that a bulleted CV cannot for a specific funding opportunity). Addressing the final element, implementation, could look like creating a practical and concrete plan to incorporate a narrative CV into their practices.

Because there are many types of narrative CV formats, participants also considered whether the conversation should be limited to formats that are completely narrative-based, such as the Résumé for Researchers, or include hybrid-style CVs that feature “narrative elements.” Again, participants highlighted that building such flexibility into a shared definition could help address some of the challenges outlined above.

Brainstorming common objectives

To help build a foundation for effective use, participants were also asked to consider the objectives of a narrative CV. One participant noted that the use of narrative CVs might need to be dependent on the type of funding opportunity. An important foundational consideration might be “what is the most appropriate tool to achieve the goals of a specific funding opportunity.”

During this discussion, it was proposed that the general purpose of a CV is to document an individual’s professional trajectory. Some participants considered that a completely narrative CV may not offer a clear career timeline. One participant stressed that the goal of using alternative methods like narrative CVs for grant evaluation is ultimately to fund better research. Others added that another common objective of a narrative CV would be to allow applicants to provide evidence of a wider range of contributions, and give applicants the ability to explain why certain achievements are relevant to a specific proposal. We also heard that the power of using narrative CVs in grant evaluation lies in the idea that they can minimize the role journal prestige plays in the assessment of a candidate’s profile.

Finally, some participants expressed concern about a common perception that narrative CVs are burdensome to create and review and do not convey the same scientific rigor as quantitative indicators. During this discussion, other participants highlighted the importance of standardization as a mechanism to reduce the burden on applicants and reviewers. These exchanges underscore the value of clear communication and training to ensure that both users and reviewers of narrative CVs understand what they are, how to create them, and how they should be used. A recent example of an initiative to inform approaches to the use of narrative CVs is the UK Research and Innovation Joint Funders Group Résumé Library resources.

Value of collaboration to monitor effectiveness

There was strong consensus from the group that, where relevant, a collaborative approach to monitoring effectiveness would be valuable for funding organizations. For such an approach to work, it is important to take into consideration whether funders are using narrative CVs for grant evaluation in the same way. Building on that point, more than one type of investigation would be needed to accommodate the different uses of narrative CVs in varied types of funding opportunities.

There was also interest in collecting data across organizations with the aim of creating a longitudinal dataset on narrative CV use, which could help funders learn from the compound effect created by implementation across organizations. This dataset could include intra- or interorganizational qualitative and quantitative studies to monitor the use and effectiveness of narrative CVs.

Final takeaways

There is still work to be done to align on a shared definition, common objectives, and parameters for monitoring the effectiveness of narrative CVs. However, the discussions on this call demonstrated that interest is building in using narrative CVs as a qualitative method for research assessment, at least in part due to their potential to support a research culture that emphasizes meaningful research achievements over the use of flawed proxy measures of quality. Moving forward, fostering open and deep discussions with the community about narrative CVs and their implementation will be a key step toward the iterative development and use of narrative CVs as a tool for responsible research assessment.

DORA’s funder discussion group is a community of practice that meets virtually every quarter to discuss policies and topics related to fair and responsible research assessment. If you are a public or private funder of research interested in joining the group, please reach out to DORA’s Program Director, Anna Hatch (info@dora.org). Organizations do not have to be a signatory of DORA to participate.

Amanda Akemi is DORA’s Policy Intern

Cross-funder action to improve the assessment of researchers for grant funding
https://sfdora.org/2022/01/19/cross-funder-action-to-improve-the-assessment-of-researchers-for-grant-funding/ (Wed, 19 Jan 2022)

Funding organizations signal what is valued within research ecosystems by defining the criteria upon which proposals and applicants are assessed. In doing so, they help to shape the culture of research through the projects and people they support. Focusing on a limited set of quantitative criteria favors a narrow view of research and disincentivizes creativity and innovation. To counter this, many funding organizations are shifting to a more holistic interpretation of research outputs and achievements, recognized in the grant evaluation process through narrative CV formats that change what is visible and valued within the research ecosystem. As communities of practice in research and innovation funding, the San Francisco Declaration on Research Assessment (DORA) funders’ group and the Funding Organisations for Gender Equality Community of Practice (FORGEN CoP) partnered to organize a workshop focused on optimizing the use of narrative CVs, an emerging method that research funders use to widen the range of research outputs recognized in assessment.

DORA’s funder discussion group was launched in March 2020 to enable communication about research assessment reform. By learning from each other, funding organizations can accelerate the development of new policies and practices that lead to positive changes in research culture.

The focus on implementing narrative CV formats for grant funding grew organically within the discussion group. In 2019, the Royal Society in the United Kingdom introduced the Résumé for Researchers as an alternative to the traditional CV, recognizing the diversity of research contributions through a concise, structured narrative. At the same time, a handful of research funders began to experiment with the use of narrative CVs in grant evaluation. During the first meeting of DORA’s funders group, the Dutch Research Council and the Swiss National Science Foundation shared information about their pilot projects to implement narrative CVs. At subsequent meetings in 2020 and 2021, the Health Research Board Ireland, the Luxembourg National Research Fund, and Science Foundation Ireland also shared updates on their implementation of narrative CVs.

FORGEN CoP, led by Science Foundation Ireland, was established in October 2019 and aims to share knowledge and best practice on gender equality in research and innovation funding. Mitigating gender bias within the grant evaluation process was identified as a primary area of focus.

Inequalities in grant evaluation processes are more prevalent when assessments focus on the researcher rather than the research. With the conversation shifting to the assessment of people, the discussion naturally led to the use of narrative CVs and their assessment. Narrative CVs can move evaluation away from productivity (quantity over time) toward research achievements, focusing on the quality and relevance of those achievements in relation to the research proposed. They may also support researchers with non-linear career paths or career breaks by removing the focus on productivity, which can be affected by periods of leave for caregiving responsibilities. During subsequent meetings, members shared their experiences implementing narrative CVs and their assessment methods. In a special FORGEN CoP seminar on how funders could mitigate the long-term gendered impacts of the COVID-19 pandemic, narrative CVs were highlighted as a way to address potential gender inequalities and/or the reduction in productivity that some researchers experienced during the pandemic.

In September 2021, DORA and FORGEN CoP joined forces to address current knowledge gaps in the use of narrative CVs for grant evaluation and to improve their usefulness as a tool for responsible research assessment. We jointly organized a workshop, facilitated by Dr. Claartje Vinkenburg, convened as two events to enable global attendance. More than 120 participants from over 40 funding organizations and 22 countries joined us to explore strategies to mitigate the potential biases of narrative CVs and to monitor their effectiveness in the grant funding process. During these events, we heard from organizations implementing narrative CVs and from experts who spoke about the influence of gender bias and language on research assessment. Researchers spoke to us about decision-making and process optimization for grant funding. In breakout sessions, we identified useful areas of alignment for funding organizations and the studies needed to gather evidence to improve the implementation of narrative CVs.

We are delighted to share a short report that captures our learnings from the workshop and identifies a course of action to improve narrative CVs as a tool for assessment. The workshop and report provide a foundation for the optimization of narrative CV formats upon which we plan to build.

We compiled a list of resources leading up to and during the workshop to help us understand the opportunities and challenges of using narrative CVs for grant funding, such as how bias might influence their evaluation (see below for the list of resources). As narrative CVs can be used for purposes other than grant funding, including the hiring and promotion of research and academic staff, we foresee that this report and the collated resources will also be useful to the wider academic community.

Narrative CVs have been the focus of other international funders' fora in tandem with the work of DORA and FORGEN CoP. The Swiss National Science Foundation and the Open Researcher and Contributor ID (ORCID) organization established the CV Harmonization Group (H-group) in 2019 to improve academic CVs as a tool for responsible research assessment. In 2021, UK Research and Innovation (UKRI) launched a Joint Funders Group, an international community of practice of research funders, to develop a shared approach for the adoption of the Résumé for Researchers, which UKRI also plans to implement.

Research assessment reform requires collective action, which is why DORA and FORGEN CoP have now partnered with the Swiss National Science Foundation and UKRI to build on this work. We are excited to announce that a second workshop for research funders is planned for February 2022 that aims to identify shared objectives for the use of narrative CVs in grant funding and determine how funding organizations can work collaboratively to monitor their effectiveness. If you work at a public or private research funding organization and would like to participate, please email info@sfdora.org.

Narrative CVs reduce emphasis on journal-based indicators, allow for the recognition of a variety of research contributions, and help to shift the focus from metricized research productivity to research achievements. Through the activities already completed and those planned, we aim to develop a path to collective action on the optimization and application of narrative CVs as a tool for responsible research assessment.

Anna Hatch is the DORA program director (info@sfdora.org)

Rochelle Fritch is the leader of FORGEN CoP and Scientific Programme Manager at Science Foundation Ireland (diversity@sfi.ie)

Resources

Speakers are highlighted in bold

Narrative CVs: Supporting applicants and review panels to value the range of contributions to research
Elizabeth Adams, Tanita Casci, Miles Padgett, and Jane Alfred

Measuring the invisible: Development and multi-industry validation of the Gender Bias Scale for Women Leaders
Amy B. Diehl, Amber L. Stephenson, Leanne M. Dzubinski, David C. Wang

Quality over quantity: How the Dutch Research Council is giving researchers the opportunity to showcase diverse types of talent
Kasper Gossink-Melenhorst

Getting on the same page: The effect of normative feedback interventions on structured interview ratings
Christopher J. Hartwell, and Michael A. Campion

The Structured Employment Interview: Narrative and Quantitative Review of the Research Literature
Julia Levashina, Christopher J. Hartwell, Frederick P. Morgeson, Michael A. Campion

The predictive utility of word familiarity for online engagements and funding
David M. Markowitz and Hillary C. Shulman

What Words Are Worth: National Science Foundation Grant Abstracts Indicate Award Funding
David M. Markowitz

Engaging Gatekeepers, Optimizing Decision Making, and Mitigating Bias: Design Specifications for Systemic Diversity Interventions
Claartje J. Vinkenburg

Selling science: optimizing the research funding evaluation and decision process
Claartje J. Vinkenburg, Carolin Ossenkop, Helene Schiffbaenker

Are gender gaps due to evaluations of the applicant or the science? A natural experiment at a national funding agency
Holly O Witteman, Michael Hendricks, Sharon Straus, Cara Tannenbaum

Studying grant decision-making: a linguistic analysis of review reports
Peter van den Besselaar, Ulf Sandström, Hélène Schiffbaenker

Gender, Race, and Grant Reviews: Translating and Responding to Research Feedback
Monica Biernat, Molly Carnes, Amarette Filut, Anna Kaatz

When Performance Trumps Gender Bias: Joint vs. Separate Evaluation
Iris Bohnet, Alexandra van Geen, Max Bazerman

Initial investigation into computer scoring of candidate essays for personnel selection
Michael C. Campion, Michael A. Campion, Emily D. Campion, Matthew H. Reider

How Gender Bias Corrupts Performance Reviews, and What to Do About It
Paolo Cecchi-Dimeglio

Inside the Black Box of Organizational Life: The Gendered Language of Performance Assessment
Shelley J. Correll, Katherine R. Weisshaar, Alison T. Wynn, JoAnne Delfine Wehner

The Gender Gap In Self-Promotion
Christine L. Exley, Judd B. Kessler

How Do You Evaluate Performance During a Pandemic?
Lori Nishiura Mackenzie, JoAnne Wehner, Sofia Kennedy

Why Most Performance Evaluations Are Biased, and How to Fix Them
Lori Nishiura Mackenzie, JoAnne Wehner, Shelley J. Correll

The Language of Gender Bias in Performance Reviews
Nadra Nittle

Exploring the performance gap in EU Framework Programmes between EU13 and EU15 Member States
Gianluca Quaglio, Sophie Millar, Michal Pazour, Vladimir Albrecht, Tomas Vondrak, Marek Kwiek, Klaus Schuch

How Stereotypes Impair Women’s Careers in Science
Ernesto Reuben, Paolo Sapienza, Luigi Zingales

Grant Peer Review: Improving Inter-Rater Reliability with Training
David N. Sattler, Patrick E. McKnight, Linda Naney, Randy Mathis

The post Cross-funder action to improve the assessment of researchers for grant funding appeared first on DORA.
