Improving pre-award processes for equitable and transparent research assessment
https://sfdora.org/2024/04/26/improving-pre-award-processes-for-equitable-and-transparent-research-assessment/
Fri, 26 Apr 2024

Diagram showing how different actors (researchers, publishers, funders, research institutes, etc.) in the scholarly ecosystem are connected with each other.
Chhajed, DeMeester, Lee, Wong, and Schmidt (2020). “System and Tensions Mapping”

Click here to download this report. This report may also be found in DORA’s Resource Library.

Background

Research funding organizations are well positioned to be drivers of change towards more responsible research assessment practices. They shape research culture by determining the allocation of resources, setting research priorities, and influencing who receives funding and for what projects. Funders around the world are experimenting with formats and processes, like the narrative CV, that move beyond the use of journal impact factor or journal prestige to assess research quality. These new formats aim to better assess research on its own merits, address unconscious biases, and value a wider range of research outputs. The design and supporting processes that go into creating funding calls or programmes influence factors such as who has access to funding, the applicant population, the formatting of proposal materials, the types of work outputs that applicants highlight, and how proposals are assessed by reviewers.

DORA’s Asia-Pacific (A-P) and Africa, Americas, Europe (AAE) funder discussion groups were created in 2020 to support communication between funding organizations about research assessment reform, and to accelerate the development of new policies and practices that lead to positive changes in research culture. During their September 2022 meetings, these groups participated in a mapping exercise to identify existing research assessment interventions and areas to align on for future work. One of the key themes that emerged from this exercise was the importance of considering all steps in the funding process as part of responsible research assessment. This included the processes that take place before research is submitted for funding (pre-award processes), such as the timing of proposal calls and deadlines, transparency and guidance, phrasing and language use, selection and training of reviewers, and planning for “evaluating the evaluators” or “evaluating the evaluation process”.

In January 2023, the Elizabeth Blackwell Institute for Health Research (EBI) at the University of Bristol, in collaboration with the MoreBrains Cooperative, organized a symposium of researchers to analyze how pre-award processes can present obstacles to researchers and unintentionally reinforce biases like the Matthew effect, in which resources flow to those who have them. The findings from this symposium, and from subsequent discussions with research funders, resulted in a report with general recommendations on how the scholarly community can support transparency and equality*, diversity, and inclusion in pre-award processes. 

A natural next step for the EBI report was to identify which of its recommendations were most practically applicable. Given the interest of the DORA research funder discussion groups in this topic, we partnered with EBI and MoreBrains to determine which recommendations were most actionable for research funders to undertake. In September and October 2023, DORA, EBI, and MoreBrains organized a set of parallel symposia and workshops for each funder discussion group. The groups used the EBI report recommendations as a tool to develop and prioritize areas for intervention.

Key takeaways

Feasibility and impact

During the virtual symposia held in September 2023, Ian Penton-Voak (EBI at the University of Bristol), Josh Brown, and Alice Meadows (MoreBrains) presented the report rationale and findings to the groups. This was followed by a presentation from Gearoid Maguire (Wellcome) on the Equitable Funding Practice Library to prime the subsequent breakout discussions. Members of the research funder groups then conducted a deep dive into select recommended actions from the EBI report and identified the 1) feasibility and 2) anticipated impact(s) of each recommendation. Here, the term "impact" referred to the anticipated benefit that researchers might gain from a particular recommendation, in the form of reduced barriers to funding applications and/or greater transparency. Funders in both groups discussed several mechanisms by which these recommendations could be implemented. The actions and potential ways to address them practically included:

  • Avoid training becoming yet another burden. 
  • Diversify the idea of research careers, and break down cultural silos. 
  • Explore and experiment fairly and transparently. 
  • Consider the impact of how funding opportunities are designed, structured, and shared. 
  • Leverage providing applicants with reviewer feedback to support reapplication and reviewer integrity. 

During the large-group discussion, participants highlighted several important points of consideration:

  1. Communication is a key facet of pre-award processes that can either encourage or discourage applicants. 
  2. Funders should focus on inclusivity and reach (e.g., using gender-neutral language, offering opportunities to apply in other languages, broadening the communication channels used to share funding calls, and targeting smaller organizations and early career researchers). Of note, the value of protected or targeted opportunities in various forms was mentioned in the majority of Asia-Pacific breakout sessions.
  3. Unconscious biases around what constitutes research excellence can present a barrier to more inclusive call design. For example, if a call is co-created with researchers, the biases and priorities of those academics influence call design and limit flexibility on the part of the funder. 

Reform in action

The work from the September symposia informed the design of the October workshops, which featured recorded lightning talks from members of each discussion group. These lightning talks highlighted practical examples of how funders have worked to reduce barriers for applicants and support transparency and/or EDI in their funding calls. We heard from: 

Shea Robin-Underwood of the Ministry of Business, Innovation & Employment (MBIE), Aotearoa New Zealand. Robin-Underwood's talk Supporting Indigenous Research focused on MBIE's He aka ka toro investment fund for projects that advance iwi, hapū, hapori, and Māori research, science, and innovation (RSI). Robin-Underwood highlighted the importance of involving Indigenous peoples in every aspect of the fund design process, which was done for the RSI investment fund, and shared that there is a strong appetite and interest in the Indigenous researcher community for this type of protected funding.

Shomari Lewis-Wilson of Wellcome, United Kingdom. Lewis-Wilson’s talk Research culture and Communities focused on Wellcome’s values-based approach to supporting research culture in all of its programming: supporting ethical, open, and engaged research; enabling equitable, diverse, and supportive research cultures; and being an inclusive partner. There are several ways that Wellcome puts this approach into practice: embedding “Research Environment” in applications, recently launching the Institutional Fund for Research Culture, implementing a narrative CV, and supporting Black-led grassroots organizations.

Julie Glover of the Australian National Health and Medical Research Council (NHMRC). Glover’s talk Examples of Equity, Diversity and Inclusion Initiatives focused on initiatives to support NHMRC’s goals for equitable funding for female and non-binary chief investigators and for Aboriginal and Torres Strait Islander researchers. Glover highlighted that NHMRC uses structural priority funding as a mechanism for awarding funds to meet these goals, and this has resulted in more women and Indigenous researchers receiving funding. To address the attrition of more senior female applicants, the NHMRC has also recently prioritized funding equal numbers of grants to men and women.

Sarah Smith of the Natural Sciences and Engineering Research Council (NSERC), Canada. Smith's talk NSERC's Approach to Equity, Diversity and Inclusion in the Pre-Award Process highlighted that transparency and EDI are codified as organizational priorities in NSERC's strategic plans, which also underpin NSERC's goals to support Indigenous researchers. Smith outlined several examples of how NSERC is working to implement these goals, including piloting a narrative CV, engaging regularly with the community, removing procedural barriers within its programs, ensuring proportional representation of awardees, and targeting funding for Black and Indigenous trainees.

Click to watch the lightning talk recordings

Identifying realistic and transformative actions

Following the lightning talks, the attendees reviewed the list of actions that were discussed during the symposia and prioritized which actions they would like to focus on. During the subsequent breakout sessions, participants drafted brief action plans on how best to put each proposed action into practice. Examples include:

“Our group focused on making call structure simpler, clearer, and more inclusive. The activities needed to implement this idea are working with the communications department (including editing and creating plain language), and community consultation on what is needed. The resources needed to implement this idea are time and staff, for inter- and intra-organizational communications, access to experts, and community members. The people or groups who would need to be involved in this are community members, communications department, policy staff, EDI advisors, IT department. We think we should all take this idea forward because it’ll increase applications.”

Funders then voted on the actions that would be 1) most realistic for funders to implement and 2) most transformative to the pre-award ecosystem. The areas of activity that were most impactful or transformative were:

  • Making assessment more holistic
  • Making call structures simpler, clearer, and more inclusive 
  • Improving training for reviewers and assessors 

The areas of activity that were most feasible and realistically achievable were:

  • Improving training for reviewers and assessors
  • Making call structures simpler, clearer, and more inclusive

Across the two workshops, which represent different geographic regions, participants felt that making call structures simpler, clearer, and more inclusive was one of the actions most realistically achievable for funders and potentially most transformative for applicants and the pre-award landscape.

“It has been so gratifying to see the work we did with EBI to improve equity, diversity, and inclusion in pre-award processes being taken up by the DORA community. It’s inspiring to hear about their existing efforts in this area, and we look forward to continuing to work with them to develop concrete guidance on opportunities for improvement across the wider research funding community.” – Alice Meadows, MoreBrains co-founder 

Next steps

Building on the recommendations identified by the DORA funder discussion groups in the 2023 symposia and workshops, we are pleased to announce that EBI has secured additional funding from the University of Bristol for a new project to look at how three of the recommendations could be implemented. The three areas of focus are: 1) simplification of funding call structures; 2) changes to application processes to reduce the likelihood of bias in outcomes (e.g., recognizing a broader range of research outputs, narrative CV formats); and 3) improvement in training for reviewers and evaluators. This project will include two parallel workshops to bring together members of the DORA funder discussion groups and research administrators. Participants will discuss how interventions relating to the three key areas could be implemented, how practical barriers could be reduced, and how progress could be benchmarked. The outcomes of this project will also include a report and other materials to support research funders in their implementation of evidence-based best practices.

Haley Hazlett is DORA's Program Manager.


* In the report, the EBI and MoreBrains team used the term "equality" (as in the acronym EDI, for equality, diversity, and inclusion) to refer to approaches and values that may lead to equitable outcomes, noting that "the terms 'equality' and 'equity' are different and [that they] felt… the focus on equality of rights and opportunities was highly pertinent in the context of the pre-award process."

Community Event: DORA 10th Anniversary Asia-Pacific Plenary
https://sfdora.org/2023/04/21/community-event-dora-10th-anniversary-asia-pacific-plenary/
Fri, 21 Apr 2023

Recording and materials from this event

The accomplishments of the DORA Community Engagement Grant Awardees (slideshow below).

Event information

Date: This event took place on May 16, 2023
Time: 12:00 AM UTC (time conversion here)

As part of DORA’s 10th Anniversary Celebration, we held two plenary sessions to capture as many time zones as possible and ensure opportunities for community participation worldwide.

During the two-hour Asia-Pacific plenary, we discussed the state of the field, our past decade of work, and the future of DORA and responsible research assessment. The plenary opened with introductory statements from DORA's Vice-Chair, Ginny Barbour (Open Access Australasia), followed by a keynote address by Mai Har Sham (The Chinese University of Hong Kong), co-author of the Hong Kong Principles for assessing researchers, and an audience Q&A.

Keynote
Research Assessment from 2013 to 2032, Mai Har Sham
“Large numbers of institutional and individual DORA signatories support the need of improving our research outcome evaluation practice. However, our systems of research assessment have not changed significantly in the last 10 years. In the Asia-Pacific regions, assessment of research performance for recruitment, funding, promotion and career progression remains reliant on traditional forms of publications and metrics. What are the challenges in implementing the DORA recommendations? In this plenary session I will review the research assessment practices over the last 10 years from the perspectives of different stakeholders; and discuss the challenges and requirements for a paradigm shift in research assessment in the next 10 years.”

Following the opening remarks, we held a panel session with experts from across the Asia-Pacific region. The panel session brought together subject-matter experts to share their unique experiences developing and implementing reform approaches across a range of disciplines and contexts. They provided insightful responses to key questions such as: What has DORA's impact been, and what does responsible research assessment look like in your part of the world? What does responsible research assessment look like in 10 years' time?

Asia-Pacific DORA Plenary Program

  • Opening remarks from DORA’s Vice-Chair
    • Ginny Barbour, Open Access Australasia
  • Keynote Address
    • Mai Har Sham, The Chinese University of Hong Kong
  • Audience Q&A
  • Break
  • Panel session
    • Dasapta Erwin Irawan, Bandung Institute of Technology, Indonesia
    • Moumita Koley, Indian Institute of Science, India
    • Spencer Lilley, Victoria University of Wellington, New Zealand
    • Yu Sasaki, Kyoto University Research Administration Center, Japan
  • Closing remarks from DORA’s Acting Program Director
    • Haley Hazlett, Declaration on Research Assessment

Community Event: DORA 10th Anniversary Africa, Americas, Europe Plenary
https://sfdora.org/2023/04/21/community-event-dora-10th-anniversary-africa-americas-europe-plenary/
Fri, 21 Apr 2023

Recording and materials from this event

The accomplishments of the DORA Community Engagement Grant Awardees (slideshow below).

Event information

Date: This event took place on May 16, 2023
Time: 12:00 PM UTC (time conversion here)

As part of DORA’s 10th Anniversary Celebration, we held two plenary sessions to capture as many time zones as possible and ensure opportunities for community participation worldwide.

During the two-hour Africa, Americas, Europe plenary, we discussed the state of the field, our past decade of work, and the future of DORA and responsible research assessment. The plenary opened with introductory statements from DORA's Chair, Stephen Curry of Imperial College London, followed by a keynote address by Sarah de Rijcke of the Centre for Science and Technology Studies (CWTS) at Leiden University.

Keynote
The Future of Research Evaluation, Sarah de Rijcke
“On DORA’s 10th anniversary, I will place into context the Declaration and other high-profile calls for reforming research assessment that have extended to include movements for open science, research integrity, and diversity, equity and inclusion. I will also address how scientists and scholars are responding to calls to reform their practices, by turning to the roles and values they attach to metrics in academic evaluations and research. Finally, and based on forthcoming work with the GYA, IAP, and ISC, I will also provide regional overviews and national examples of experimentation and reform for further insight.”

Following the opening remarks, we held a panel session with experts from across the region. The panel session brought together subject-matter experts to share their unique experiences developing and implementing reform approaches across a range of disciplines and contexts. They provided insightful responses to key questions such as: What has DORA's impact been, and what does responsible research assessment look like in your part of the world? What does responsible research assessment look like in 10 years' time?

Africa, Americas, Europe DORA Plenary Program

  • Opening remarks from DORA’s Chair
    • Stephen Curry, Imperial College London, England
  • Keynote Address
    • Sarah de Rijcke, Centre for Science and Technology Studies at Leiden University, Netherlands
  • Audience Q&A
  • Break
  • Panel session
    • Andiswa Mfengu, University of Cape Town, South Africa
    • Karen Stroobants, Vice-Chair CoARA, United Kingdom
    • Judith Sutz, Universidad de la República, Uruguay
    • Stephanie Warner, University of Calgary, Canada
  • Closing remarks from DORA’s Acting Program Director
    • Haley Hazlett, Declaration on Research Assessment

DORA at 10: Looking back at the history and forward to the future of research assessment
https://sfdora.org/2023/01/25/dora-at-10-looking-back-at-the-history-and-forward-to-the-future-of-research-assessment/
Wed, 25 Jan 2023

DORA will be 10 years old in May 2023 and we are planning to mark the occasion! We'll be holding a weeklong celebration for DORA's 10th Anniversary and we're inviting you to join in by organizing an event on research assessment for your local community. We want to have conversations about what DORA has done and what we still need to do all over the globe! DORA's 10th Anniversary Celebration will consist of two parts:

  • Two plenary online sessions to discuss the state of the field, our past decade of work, and our future plans.
  • A global program of local or regional events that will allow communities to share insights and challenges in reforming, innovating, and researching responsible research assessment policies and practices.

Plenary sessions

To commemorate DORA’s publication date, we are holding two plenary sessions on May 16, 2023 to capture as many time zones as possible and ensure opportunities for community participation worldwide.

Asia-Pacific Plenary
When: May 16, 2023. 12AM UTC.
Draft Program: Introductory statements from DORA’s Vice-Chair, Ginny Barbour (Open Access Australasia), keynote address by Mai Har Sham (The Chinese University of Hong Kong), followed by a panel discussion with experts from across the globe.

Africa, Americas, Europe Plenary
When: May 16, 2023. 12PM UTC.
Draft Program: Introductory statements from DORA’s Chair, Stephen Curry (Imperial College London), keynote address by Sarah de Rijcke (Leiden University), followed by a panel discussion with experts from across the globe.

A global program of local or regional events

Held over one week in May 2023, the DORA 10th Anniversary Celebration (#DORAat10) aims to feature events that highlight reform efforts, innovation in evaluation, research into evaluation systems, and more! It will be a chance to reflect on how far we have come, and to galvanize ourselves to face the remaining tasks in ensuring that our research assessment practices are the best they can be. 

For our 10th Anniversary Celebration, we want to encourage and elevate community members across the globe to organize their own events on responsible research assessment during one week in May 2023. We will promote these events on our website, and so we are calling for organizers who will lead virtual, hybrid, or in-person events about responsible research assessment.

When: Your event should take place within the week scheduled for the #DORAat10 celebrations: May 15-19, 2023. Details of the event will be featured on DORA's website.

What types of events qualify? Be imaginative! Events can be of any format, including webinars, conferences, seminars, and workshops. If you have a question, contact us at info@sfdora.org!

Examples of the types of events and topics for the 10th Anniversary Celebration:

  • Reforming research assessment policies and practices
    • An event for university faculty that outlines a new promotion path
    • An event held by a publisher on how they are committing to responsible use of metrics, improving peer-review processes (bias mitigation, reviewer engagement, inclusive assessment, etc.), and/or introducing responsible open research assessment
  • Innovation in research assessment policies and practices
    • A funder-led event discussing new models for funding decisions
  • Research on research assessment policies and practices
    • A seminar on recent research on assessment
  • The challenges for researchers in reforming assessment
    • A panel discussion led by early career researchers
    • A cross-disciplinary discussion of research assessment

Who can submit an event proposal? We are seeking event proposals from anyone interested in good research assessment. This includes, for example, academic staff or faculty, researchers studying biases in evaluation, librarians, research managers, early career researchers, bibliometricians, funders, initiatives for responsible research assessment, publishers, and societies.

How event proposals will be evaluated: The evaluation will be light touch, mainly to ensure that the proposal is relevant to DORA’s mission to reform research assessment policies and practices. Creativity is welcome! We are keen to highlight the broad range of ways that research assessment reform has taken place over the past 10 years and to explore ideas for accelerating the pace of reform. We strongly encourage submissions from all global regions and disciplines. Proposals will be evaluated by DORA staff on a rolling basis to ensure they are relevant to research assessment reform. 

Deadline for submission: May 19, 2023.

Note: If you would like to organize an event but are unable to do so during the week of May 15-19, please reach out to us at info@sfdora.org. We are happy to accommodate exceptions to feature local events that are held in May.

What happens when my proposal is accepted? Once your proposal is accepted, we will reach out to you for further information about your event to feature on the DORA website. For example, we will need the registration link for your event and other related links. You will also be invited to submit a blog summary of your event to be featured on the DORA website.

For more information, please see our local event FAQ.

Announcing plans for DORA's 10th Anniversary Celebration
https://sfdora.org/2022/12/14/announcing-plans-for-doras-10th-anniversary-celebration/
Wed, 14 Dec 2022

DORA at 10: Looking back at the history and the future of research assessment

The San Francisco Declaration on Research Assessment (DORA) will celebrate its 10th birthday in May next year and we want to invite the worldwide research community to help us mark the occasion. We aim to coordinate a series of events across the globe that will look back at what the declaration has achieved and forward to how, together, we can address the remaining challenges in reform of research assessment. 

Published on May 16, 2013, the declaration was a bold attempt to address the pervasive use of inappropriate proxy measures of quality for hiring, promotion, tenure, and funding decisions. The declaration is a set of recommendations that individuals and organizations sign to show their commitment to reform assessment practices to better recognize scholarly work. Over the past decade, more than 19,600 individuals and 2,600 organizations across 159 countries have signed DORA, demonstrating an ever-increasing level of awareness, passion, and effort toward reforming the way we think about the evaluation of scholarly work.

A lot has changed in ten years. DORA has evolved from a declaration of recommendations into a global initiative that works with and in the community to raise awareness and support the development of best practices in research assessment. To showcase DORA’s growth and work over the past decade, and to share where we’re going for the next decade, we are proud to announce plans for DORA’s 10th Anniversary Celebration (#DORAat10) in May 2023. 

The program will consist of two parts. DORA will organize two plenary sessions to bring together international experts to discuss progress and future goals. But we are also inviting members of the worldwide community to organize local events that we will build into a week-long program of celebration and reflection.

Plenary sessions: “DORA at 10: A look at our history and the bright future of responsible research assessment”

We will hold two plenary sessions to capture as many time zones as possible and ensure opportunities for community participation worldwide. During each plenary, we will discuss the state of the field, our past decade of work, and the future of DORA and responsible research assessment. Following introductory statements from DORA’s Chair or Vice-Chair, and a keynote address by a renowned speaker, we will hold a panel discussion with experts from across the globe.

The sessions will be held on May 16, 2023 to commemorate DORA's publication date. Registration for the plenaries will open in January 2023.

  • Asia-Pacific Plenary: May 16, 2023. 12AM UTC (time conversion here)
  • Africa, Americas, Europe Plenary: May 16, 2023. 12PM UTC (time conversion here)

A weeklong, worldwide program of events

DORA’s success is critically dependent on the work of many people in many different parts of the world. To elevate those contributions and to encourage further action, we want to include a roster of local or regional events in our 10th Anniversary Celebration. We would therefore like to invite YOU to organize an event on research assessment along with us! Held over the week of May 15-19, 2023, #DORAat10 aims to feature events that highlight reform efforts, innovation in evaluation, research into evaluation systems, or whatever aspect of responsible research assessment YOU think is important. 

We will be delighted to promote these events on our website, and so we will be calling for event submissions from organizers who will lead virtual, hybrid, or in-person events about responsible research assessment. The call for submissions will be launched formally in January 2023 – but please get your thinking caps on now!

Sign up for DORA’s newsletter at sfdora.org and follow us @DORAssessment for all of the latest 10th Anniversary updates and announcements!

Community Call: Introducing the 2022 Project TARA tools to support responsible research assessment
https://sfdora.org/2022/10/25/community-call-introducing-the-2022-project-tara-tools-to-support-responsible-research-assessment-2/
Tue, 25 Oct 2022

Introducing two new tools for debiasing committee composition and recognizing the many facets of “impact”

On October 25, 2022, DORA hosted a community call to introduce two new responsible research evaluation tools and gather feedback on future tool development. The toolkit is part of Project TARA, which aims to identify, understand, and make visible the criteria and standards universities use to make hiring, promotion, and tenure decisions. The interactive call introduced and explored these new tools, which covered two important topics:

  • Debiasing Committee Composition and Deliberative Processes: It is increasingly recognized that more diverse decision-making panels make better decisions. This one-page brief can be used to learn how to debias your committees and decision-making processes.
  • Building Blocks for Impact: Capturing scholarly “impact” often relies on familiar suspects like h-index, JIF, and citations, despite evidence that these indicators are narrow, often misleading, and generally insufficient to capture the full richness of scholarly work. This one-page brief can be used to learn how to consider a wider breadth of contributions in assessing the value of academic achievements.

The tools were created by Ruth Schmidt, Associate Professor at the Institute of Design at Illinois Tech and member of the Project TARA team.

In the above clip from the October 25, 2022 call, Schmidt introduces the 2022 Project TARA tools: “Debiasing Committee Composition and Deliberative Processes” and “Building Blocks for Impact”.

Participants also provided their thoughts and feedback on the next round of Project TARA tools, slated for release in 2023. Sign up for DORA's email list to be notified when the blog summary of the community discussion portion of the call is published: https://sfdora.org/.

The event was moderated by Haley Hazlett, DORA Acting Program Director, and Ruth Schmidt. DORA Policy Associates Sudeepa Nandi and Queen Saikia provided chat moderation and support.

Project TARA is supported by Arcadia, a charitable foundation that works to protect nature, preserve cultural heritage, and promote open access to knowledge.

Building community around institutional change
https://sfdora.org/2022/07/05/building-community-around-institutional-change/
Tue, 05 Jul 2022

By Penny Weber — HuMetricsHSS

On June 1, 2022, the HuMetricsHSS Initiative partnered with the Declaration on Research Assessment (DORA) to bring together the HuMetricsHSS Community Fellows and the DORA Community Engagement Grant Awardees to share knowledge and strategies on their respective—but interconnected—areas of work. The meeting brought together 23 academics working on projects looking to develop values-enacted frameworks or systems for evaluating scholarly work, implementing responsible research assessment practices, or both. After brief introductions, the meeting was split into breakout rooms, each of which contained a mix of HuMetricsHSS Community Fellows and DORA Community Engagement Awardees. 

While there is much overlap in terms of their research interests, the HuMetricsHSS Community Fellows are all based at institutions in the United States, while the DORA Community Engagement Awardees are based at institutions in Africa, Asia, Australia, and Latin America. Because international systems of academic assessment vary widely among countries and global regions, including differences in their level of adoption of alternative, values-based, or responsible metrics, the combination of perspectives presented in this meeting was particularly fruitful for all the participants. 

In previous meetings of the HuMetricsHSS Community Fellows, conversation had centered around opportunities and challenges each of the Fellows has faced at their institutions while working to bring about transformation on concepts of responsible, equitable, and expansive evaluation writ large. Drawing on those conversations, discussion in the breakout rooms for this joint meeting focused on questions of trust: how to build trust in a specific community and context, how to maintain that trust once it's built, and what one might need (in terms of time, resources, networks, etc.) to do both.

One major theme that emerged across the groups was that of transparency and the need for a shared vocabulary of evaluation. Many terms in use when talking about alternative or values-based assessment can translate in different ways to different communities. One example that was discussed among the group was how the use of "qualitative" indicators without clearly defined evaluation guidelines and criteria can introduce the exact "subjective" biases that evaluators are attempting to eliminate. To this point, explicitly articulated and transparent evaluation criteria are a critical factor to help define values and expectations among organizations, evaluators, and applicants. Making evaluation criteria clear can help to foster trust between parties and address unintended biases. Shared vocabulary, as well as shared and transparent guidelines for evaluation, needs to be understood before trust can be built. Transparency of evaluation practices is also necessary for promoting trust within stakeholder communities outside of academia (e.g., patients, local communities, etc.).

Another major theme, although also related to that of transparency, was the importance of mentorship and guidelines for early-career scholars, as well as those stepping into positions for the first time, such as department chairs. Current systems of evaluation tend to treat mentorship as tertiary service work, but mentorship is often where interpersonal trust is built; interpersonal trust is necessary to establish trust in institutions and systems as well. By simultaneously modeling and rewarding excellent mentorship, those in the academy can build a new generation of scholars—and eventual evaluators—with increased trust in their colleagues, themselves, and their institutions. This model also addresses a need that was stressed in the discussion: to rework evaluation on multiple levels at once. Change cannot happen solely from the top down or solely from the bottom up; faculty and staff are conditioned to resist top-down "dictatorial" change, but often feel they have little power themselves to enact change. Effective strategies for change must include offering resources and support to shift norms and guidelines at every level.

Finally, discussion also focused around the need for stable infrastructure regarding new forms of assessment, designed to maintain momentum and foster trust within the academic community. Ideally such infrastructure would be formal — an office of responsible metrics, for example, or an institution paying faculty for their time if they are doing work as champions of responsible evaluation — but formal infrastructure requires funding, campus support, and other resources the academics in the room may not have. In the absence of those resources, informal but fruitful collaborations and communities of practice can allow these champions to feel less alone in their work, provide perspective, and grant some semblance of authority or at least a scholarly community to which one can refer when facing pushback.

We hope, going forward, to foster such scholarly communities of practice and provide a space for champions, like the HuMetricsHSS Community Fellows and the DORA Community Engagement Awardees, to collaborate and learn from like-minded colleagues. To learn more about their work, please check out the HuMetricsHSS Community Fellows' projects and the DORA Community Engagement Awardees' projects.


This post originally appeared on the HuMetricsHSS page and has been reposted to the DORA website with permission.

Haley Hazlett, DORA’s Program Manager, provided input on this piece as an editor.

Reimagining Academic Career Assessment: New case studies from DORA, EUA, and SPARC Europe
https://sfdora.org/2022/03/30/reimagining-academic-career-assessment-new-case-studies-from-dora-eua-and-sparc-europe/
Wed, 30 Mar 2022

In recent years, an increasing number of organizations have started to consider how to implement practical changes toward responsible academic assessment practices. However, reform is a process that takes time and resources, and there is no one-size-fits-all approach. In December 2020, to begin to address the increasing need for clear, structured examples of the processes behind responsible assessment reform, DORA, the European University Association (EUA), and SPARC Europe launched the case study repository “Reimagining academic assessment: stories of innovation and change.”

In 2021, three new case studies were added to the repository to offer the community additional examples of organizations in various stages of academic assessment reform. On December 7, 2021, DORA held a webinar featuring these organizations. The goal of this webinar was for participants to hear about the process of implementing reform, including motivations, methodologies, and challenges encountered. Speakers included Laura Rovelli of the Latin American Forum for Research Assessment (FOLEC), Argentina; Johanna McEntyre of the European Molecular Biology Laboratory (EMBL), United Kingdom; and Richard Holliman of the Open University (OU), United Kingdom. The webinar was moderated by Haley Hazlett, DORA Program Manager, and Anna Hatch, DORA Program Director.

Reform in an international interdisciplinary consortium: FOLEC

FOLEC is an initiative of the Latin American Council of Social Sciences (CLACSO) that aims to facilitate the creation of a collective and improved Latin American and Caribbean standard of practice for academic research assessment. The motivation to create FOLEC stemmed from a desire among researchers in the region to create evaluation practices that were not centered on journal-based metrics and that made sense in their local contexts.

Rovelli explained that CLACSO's structure, which spans Latin America and the Caribbean, allows FOLEC to facilitate systems change on an international level. FOLEC does this by creating a space for knowledge exchange between CLACSO research member institutions and regional policymakers, and by working to translate and contextualize evaluation practices that are meaningful for the region. This dynamic enables FOLEC to promote a bottom-up, participatory, and context-specific approach to implementing change in research evaluation.

Rovelli noted that FOLEC has focused on developing resources to raise awareness and build expertise on research assessment practices in the region. In the first years of its operation, FOLEC developed a diagnosis of the region's evaluation practices and shared its proposals for change. In 2021, FOLEC developed a series of advocacy papers on critical topics for systemic change in Latin America and the Caribbean.

On a systems level, FOLEC’s challenges include developing interoperable platforms that reflect the diversity of research outputs in the region, broadening the scope of scientific outcomes that are rewarded, and promoting the social diversification of evaluators. To address these challenges, foster international dialogue, build consensus, and raise awareness, FOLEC plans to launch a declaration of research assessment principles at the 9th Latin American and Caribbean Conference on Social Sciences in June 2022.

Reform in an intergovernmental research organization: EMBL

EMBL is an intergovernmental life sciences research organization whose mission includes research, training, and services. After signing DORA in 2018, EMBL established a Research Assessment Working Group that created four responsible research assessment recommendations to guide its reform efforts (see graphic below).

EMBL’s 4 Research Assessment Recommendations. Image Credit: Johanna McEntyre, EMBL-EBI

McEntyre explained that EMBL’s motivation for assessment reform came from a mix of internal and external forces. Internally, EMBL recognized the importance of codifying and providing examples of responsible assessment practices for the scientists trained at its sites. EMBL’s nine-year term limit for faculty members creates a unique training opportunity to equip alumni to recognize and support responsible evaluation practices as they progress through their academic careers. External influences for implementing change included the reform efforts of funding agencies like Wellcome.

While EMBL’s reform was initiated by senior leadership, there was also strong support and engagement from EMBL’s researchers and research support staff. For example, when the EMBL formed their DORA Working Group, they sought to compose a group that was representative of the different nationalities, sites, work cultures, and departments involved in the assessment process. After collecting feedback on what their staff understood as “hot areas” for interventions, the working group presented its findings to EMBL’s leadership and received buy-in to move forward. McEntyre said that this joint vision between leadership and staff has been key in promoting engagement from institutional leadership, researchers, and support staff, such as human resources.

In conclusion, McEntyre stressed that EMBL still has a long journey ahead, but she remains confident about the importance of reform, emphasizing that a clear communication strategy is critical to fostering the conversation about research culture change.

Reform in a university: The OU

The OU is a distance learning university in the UK with a strong focus on knowledge exchange. According to Holliman, the OU's motivation for research assessment reform was triggered when the Research Councils UK (now part of UKRI) launched the Concordat for Engaging the Public with Research in 2013. The Concordat recognized the importance of public engagement to help maximize the social and economic impact of UK research. In response to this call to action, the OU carried out the "An open research university" project, which was designed to create conditions in which publicly engaged research could flourish.

Holliman explained that the 2015 project report identified the need to revise faculty promotion criteria, resulting in the creation of a working group to do so. With the National Co-ordinating Centre for Public Engagement’s “EDGE tool” as their conceptual framework, the working group used a combination of strategies to inform the development of new promotion criteria. For example, they interviewed senior leadership and research staff members to identify new ways to recognize research excellence. One key output of these efforts was the creation of an additional promotion route for faculty with strong track records of knowledge exchange in their scholarly work. The addition of this promotion route better allows for the recognition and reward of work that is publicly engaged, enabling faculty to submit promotion portfolios based on what best fits their expertise. 

Here, Holliman noted that one of the challenges encountered during this process was the uncovering of an implicit hierarchy among faculty priorities, with research at the top, teaching in the middle, and knowledge transfer at the bottom. To address this, he emphasized the importance of aligning values and goals using tools like the SPACE rubric.

Lessons learned

The speakers highlighted several important features of organizational change. First, they emphasized that signing DORA is the first step in a long and iterative process toward responsible research assessment. Second, there is no one-size-fits-all reform model. Each organization is working to implement change in its own context and has experienced unique challenges. Finally, a common thread throughout the discussion was that each of these institutions is striving to co-create a reform process that best fits its goals, purposes, and circumstances.

Amanda Akemi is DORA's Policy Intern.

Dashboard development: identifying and categorizing good practice
https://sfdora.org/2022/01/14/dashboard-development-identifying-and-categorizing-good-practice/
Fri, 14 Jan 2022

Introduction to Project TARA

As more institutions begin to examine and refine their approach to academic assessment, there is growing awareness of the value of knowledge-sharing resources that support the development of new policies and practices. In light of this need, DORA is building an interactive online dashboard as part of Project TARA to monitor and track trends in responsible research assessment for faculty hiring and promotion at academic institutions. 

We held a community workshop on November 19, 2021, to identify responsible research assessment practices and categorize source material for the dashboard. This event expanded on our first community call in September, where we gathered early-stage input on the intent and desired use of the dashboard.

Overview of the dashboard and discussion

Institutions are at different stages of readiness for research assessment reform and have implemented new practices in a variety of academic disciplines, career stages, and evaluation processes. The dashboard aims to capture the depth and breadth of progress made toward responsible research assessment and provide counter-mapping to common proxy measures of success (e.g., Journal Impact Factor (JIF), H-index, and university rankings). To do this, dashboard users will be able to search, visualize, and track university policies for faculty hiring, promotion, and tenure.

Participants identified faculty evaluation policies at academic institutions as good practice, using the definition of responsible research assessment from the Research on Research Institute's (RoRI) 2020 working paper as a guide. For example, the University of Minnesota's Review Committee on Community-Engaged Scholarship assesses "the dossier of participating scholar/candidates who conduct community-engaged work, then produce a letter offering an evaluation of the quality and impact of the candidate's community-engaged scholarship."

“Approaches to assessment which incentivize, reflect and reward the plural characteristics of high-quality research, in support of diverse and inclusive research cultures.”

Responsible research assessment definition from The changing role of funders in responsible research assessment: progress, obstacles & the way ahead. RoRI Working Paper No. 3, November 2020.

In addition to identifying good practice, participants were asked to explore and discuss how the practices fell into four broad categories: impact, research integrity, openness, and equity and inclusion. From this exercise, we heard that flexibility will need to be built into the dashboard to account for "cross-categorization" of policies and practices. For instance, the Academic Promotions Framework at the University of Bristol in the United Kingdom, which rewards the production of open research outputs like data, code, and preprints, was originally placed in the openness category. But that prompted the group to examine whether these were the best categories to use, given how closely related openness is to research integrity. Likewise, participants had some difficulty placing policies that recognize public impact within the four categories. While such work may fit in the impact category, it could also be applied across the other three. Indeed, many responsible research assessment policies could fall into multiple categories. In another example, the University of Zurich has assessment policies that include recognition of openness, leadership, and commitment to transparency in hiring or promotion. A flexible, broadly applicable system of categorization, such as meta-tagging, may prove beneficial because it allows a single policy to carry multiple tags, as sketched below.
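
To make the cross-categorization idea concrete, here is a minimal sketch of how a dashboard entry could carry multiple category tags. This is an illustrative example in Python, not DORA's actual data model; the class, field names, and sample records are hypothetical.

    from dataclasses import dataclass, field

    # The four broad categories explored in the workshop.
    CATEGORIES = {"impact", "research integrity", "openness", "equity and inclusion"}

    @dataclass
    class PolicyEntry:
        """One assessment policy, tagged with any number of categories."""
        institution: str
        title: str
        tags: set = field(default_factory=set)

        def add_tag(self, tag: str) -> None:
            # Reject tags outside the agreed category set.
            if tag not in CATEGORIES:
                raise ValueError(f"unknown category: {tag}")
            self.tags.add(tag)

    # A policy rewarding open research outputs can sit in more than one category.
    entry = PolicyEntry("University of Bristol", "Academic Promotions Framework")
    entry.add_tag("openness")
    entry.add_tag("research integrity")

    # Filtering on any single tag still surfaces the cross-categorized entry.
    catalog = [entry]
    print([e.title for e in catalog if "openness" in e.tags])

Because the tags are a set rather than a single field, a policy is never forced into exactly one category, which is the flexibility participants asked for.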

Another key facet of the workshop discussion was that many academic institutions adopt national or international responsible assessment initiatives. An example of this is the university-specific adoption of the nationally developed Norwegian Career Assessment Matrix (NOR-CAM) or the Netherlands Recognition and Rewards Programme. National or regional context is also important to keep in mind, given that organizations might be limited by government regulation in how much they can alter their assessment policies. For example, the University of Catalonia has autonomy in the promotion and hiring of postdoctoral researchers, but is limited by external accreditation systems in its ability to change assessment practices for faculty.

Policies that focus on improving the ability of academic institutions to retain new faculty and staff (e.g., retention policies) were also seen as an important addition to the dashboard. It was pointed out that many of these policies focus on diversity and inclusion, given that policies that make environments more supportive of minoritized faculty and staff also help retain those groups in academia. For example, Western Washington University in the United States created Best Practices: Recruiting & Retaining Faculty and Staff of Color, which offers suggestions such as making leadership opportunities available and keeping good data on why faculty of color leave universities, so that policy changes to entice them to stay are informed by evidence.

Example from the exercise to identify how policies and practices fell into different categories

Participants also examined the distribution of the responsible academic assessment examples after categorization: Was there an over- or under-representation of examples in certain categories? Was there difficulty categorizing certain examples? Through this exercise, we hoped to gain insight into areas of high and low activity in the development of responsible research assessment policies and practices. For example, many of the policies identified by the group focused on faculty promotion, yet fewer policies were related to faculty hiring and retention. As the dashboard design evolves, we will work to ensure that, while findings like this may point to a current lack of activity, highlighting and visualizing these gaps can also reveal opportunities to address oversights and galvanize new institutional activity.

Finally, we heard about the importance of focusing on the process of assessing researchers rather than their products, which was not strongly represented in the examples brought by participants. Such policies, like the Ghent University evaluation framework in Belgium, could provide staff and faculty with the flexibility to be assessed based on their accomplishments in the context of their own goals.

Next steps

The workshop discussion, together with the source material identified, will refine how we visualize and classify content for the dashboard. Feedback on how well different policies "fit" into the four proposed categories will inform how policies are meta-tagged or classified so that users can find them easily.

The next phase of dashboard development is to determine the data structure and visualization. Our goal is to begin web development in the spring of 2022. Updates on dashboard progress and further opportunities for community engagement can be found on the TARA webpage.

The dashboard is one of the key outputs of Tools to Advance Research Assessment (TARA), a DORA project sponsored by Arcadia – a charitable fund of Lisbet Rausing and Peter Baldwin – to facilitate the development of new policies and practices for academic career assessment.

Haley Hazlett is DORA’s Program Manager.

Gathering input for an online dashboard highlighting good practices in research assessment
https://sfdora.org/2021/11/05/gathering-input-for-an-online-dashboard-highlighting-good-practices-in-research-assessment/
Fri, 05 Nov 2021 11:06:22 +0000


Introduction to Project TARA

As institutions experiment with and refine academic assessment policies and practices, there is a need for knowledge sharing and tools to support culture change. On September 9, 2021, we held a community call to gather early-stage input for a new resource: an interactive online dashboard to identify, track, and display good practices for academic career assessment. The dashboard is one of the key outputs of Tools to Advance Research Assessment (TARA), which is a DORA project sponsored by Arcadia – a charitable foundation that works to protect nature, preserve cultural heritage, and promote open access to knowledge – to facilitate the development of new policies and practices for academic career assessment.

Overview of the dashboard

It comes as no surprise that academic assessment reform is complex. Institutions are at different stages of readiness for reform and have implemented new practices across a variety of academic disciplines, career stages, and evaluation processes. The dashboard aims to capture this progress and provide a counterpoint to common proxy measures of success (e.g., the Journal Impact Factor (JIF), H-index, and university rankings). Currently, we envision the general uses of the dashboard to include:

  • Tracking policies: Collecting academic institutions' standards for hiring, promotion, and tenure.
  • Capturing new and innovative policies: Making it easy to share new assessment policies and practices.
  • Visualizing content: Displaying source material so that patterns and trends in assessment reform are easy to identify.

Because the dashboard will highlight positive trends and examples in academic career assessment, it is important to define what constitutes good practice. One idea comes from the 2020 working paper from the Research on Research Institute (RoRI), where the authors define responsible research assessment as "approaches to assessment which incentivize, reflect and reward the plural characteristics of high-quality research, in support of diverse and inclusive research cultures."

Desired use of the dashboard: who will use it and how?

We envision a number of situations where the dashboard can support research assessment efforts. For example, a postdoc entering the academic job market may use the dashboard to identify assessment norms and emerging trends, whereas a department chair may find it helpful for locating examples of how contributions to research integrity and open science can be evaluated in an upcoming faculty search.

Throughout the meeting, we heard from attendees that the dashboard should be useful to as wide a range of individuals as possible. Specifically, it should include source material that is relevant to users at all levels of professional development and in all geographical locations. It is especially important to account for regional context, given that different countries may have national policies that affect the extent of institutional reform.

These points led to a deeper discussion of the need to consider user experience. For example, we heard that it is important for users to be able to apply filters when searching for different types of source material on the dashboard, because filters allow users to identify material that is relevant to their unique context (e.g., stage of professional development or global region).
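
As a rough illustration of how such filtering might work (a minimal sketch with hypothetical field names, not the dashboard's actual design), optional filters can be combined so that any criterion left unset matches everything:

```typescript
// Hypothetical sketch: optional, combinable filters; any filter left unset
// matches everything, so the default view stays broad.
interface SourceMaterial {
  title: string;
  region: string;      // e.g., "Europe", "Asia-Pacific"
  careerStage: string; // e.g., "early-career", "senior-faculty"
}

interface Filters {
  region?: string;
  careerStage?: string;
}

function applyFilters(items: SourceMaterial[], f: Filters): SourceMaterial[] {
  return items.filter(
    (item) =>
      (!f.region || item.region === f.region) &&
      (!f.careerStage || item.careerStage === f.careerStage)
  );
}

// Example: a postdoc in Europe narrowing the dashboard to their own context.
const allMaterials: SourceMaterial[] = [
  { title: "Narrative CV guidance", region: "Europe", careerStage: "early-career" },
  { title: "Tenure portfolio policy", region: "Americas", careerStage: "senior-faculty" },
];
const visible = applyFilters(allMaterials, { region: "Europe", careerStage: "early-career" });
```

Treating each filter as optional keeps the default view broad, which fits the goal of serving as wide a range of users as possible.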

Dashboard functionality: what types of source material can be included in the dashboard?

The next phase in dashboard development is identifying source material. We learned about important considerations to keep in mind when deciding what types of material to include in the dashboard. Moving away from a narrow set of evaluation criteria is essential for holistic research assessment, but we also heard that new evaluation criteria need to be considered carefully to avoid capturing biased or bad practices. This is an instance where applying a shared definition of good practice in academic assessment can be useful in selecting source material. Another idea is to include a wide variety of materials in the dashboard, such as responsible research assessment frameworks, policies, and tools. Here we heard that unintended blind spots may influence the selection of source material: non-traditional work (e.g., teaching, outreach, diversity and equity work, community engagement) is often downplayed by faculty in their tenure portfolios for fear of penalization, which can skew perceptions of institutional norms during evaluation.

Another challenge is the lack of clarity about what the term "impact" means in academic assessment. Impact is difficult to define and is often conflated with citation metrics. A central theme of the conversation was therefore the need to clearly articulate the meaning of "impact" in academic evaluation criteria; in addition to the dashboard, we aim to develop resources that articulate how impact is captured and considered as part of Project TARA's associated toolkit. Participants suggested the dashboard would benefit from highlighting holistic evaluation practices that recognize and reward a variety of researcher contributions, such as policies that recognize preprints, news coverage, white papers, teaching, mentorship, and research that addresses major national, international, or societal issues.

We learned from participants that users may also be interested in understanding how universities account for impact that is specifically local in nature (e.g., contributions to governmental policy formation and local engagement). Here it was reiterated that an important mechanism for capturing impact outside of academia is the use of narrative material (e.g., governmental policies or briefs, articles on local engagement activities and locally focused research, reports on the quality of engagement drafted by patient advocacy groups). While outputs like these are useful for capturing important qualitative information about a researcher's impact, tracking and generating them can be challenging for universities. Indeed, a similar challenge for the dashboard is determining how to capture the rich qualitative information stored in research assessment policies and practices. To gather a more holistic picture of researchers for assessment, institutions must recognize and reward a broader range of academic outputs. Doing so would, by extension, provide the dashboard with more robust source material on assessment practices.

Next steps

The input gathered during the community call is being used to refine DORA's vision and thinking for the dashboard. The next phase of dashboard development is identifying source material and determining data structure and visualization. DORA is organizing a November workshop to examine what constitutes "good" practice in research assessment and to surface source material for the dashboard.

Haley Hazlett is DORA’s Program Manager.
