Project TARA Archives | DORA
https://sfdora.org/category/project-tara/

Encouraging innovation in open scholarship while fostering trust: A responsible research assessment perspective
https://sfdora.org/2024/08/01/encouraging-innovation-in-open-scholarship-while-fostering-trust-a-responsible-research-assessment-perspective/ (Thu, 01 Aug 2024)

The following post originally appeared on the Templeton World Charity Foundation blog. It is reposted here with their permission. 

Emerging policies to better recognize preprints and open scholarship

Research funding organizations play an important role in setting the tone for what is valued in research assessment through the projects they fund and the outputs they assign value to. Similarly, academic institutions signal what they value through how they assess researchers for hiring, promotion, and tenure. An increasing number of research funding organizations and academic institutions have codified open scholarship into their research assessment policies and practices. Examples include Wellcome, the Chan Zuckerberg Initiative, the Ministry of Business, Innovation & Employment of Aotearoa New Zealand, the University of Zurich and the Open University.

This shift is accompanied by policies that recognize preprints as evidence of research activity (e.g., NIH, the Japan Science and Technology Agency, Wellcome, EMBO, and some UKRI Councils). Some funders, such as EMBO and many of the cOAlition S funders, now formally recognize peer-reviewed preprints at the same level as journal articles. A preprint is a scholarly manuscript that the authors upload to a public server but that has not (yet) been accepted by a journal (it is usually the version submitted to a journal, if the authors decide to pursue journal publication). It can be accessed without charge and, depending on the preprint server, is screened and typically posted within a couple of days, making it available to be read and commented on. Because preprints offer a means outside of journals to share research results, they have the potential to support responsible assessment by decoupling journal prestige from assumptions about the quality of research findings. Preprints also enable sharing of a range of outputs and results that may not be attractive to a journal (for example, research that is technically sound but has a limited scope, or null/negative findings). And because they are usually free to post, preprints can help reduce the author-facing costs associated with traditional open access publication, although this means preprint servers are typically reliant on ongoing grant funding to maintain their sustainability.

One of the most recent examples of a substantial policy change is the Bill & Melinda Gates Foundation's March 2024 announcement of its Open Access Policy, which takes effect in 2025. The 2025 policy will introduce two changes for grantees: grantees will have to share preprints of their research, and the Gates Foundation will stop paying article processing charges (APCs). As with many shifts toward open access policies, these changes were motivated by several factors, including the Gates Foundation's desire to provide journal-agnostic avenues for research assessment and to empower its grantee authors to share different versions of their work openly on preprint servers, without the costs of APCs. Reducing the costs associated with publishing research and making research more accessible to readers via preprint servers also supports more equitable access to research products.

The Gates Foundation used ten years of existing data to inform its decision to refine its Open Access Policy, has engaged actively in community dialogue, and has made it clear that this is a first step on a longer path toward evaluating research on its own merits and increasing accessibility to research. Notably, the Gates Foundation took this step after weighing the shortcomings of current open access models that rely on APCs, which effectively limit global access to research outputs. Given that the Gates Foundation is "the wealthiest major research funder to specifically mandate the use of preprints," this approach is groundbreaking in its emphasis on preprints and its shift away from spending on APCs. It has also placed a spotlight on these issues and catalyzed discourse around trust in preprints. Policy changes like this indicate a willingness among research funders to take steps toward change and move away from recognizably flawed processes. This is important, since flawed processes are often retained because fixing them or adopting new processes is perceived as too high effort and too high risk (also known as the status quo bias).

Overcoming the status quo and tackling new challenges

Overcoming the status quo bias is difficult, but not impossible. Indeed, a common concern about changing research assessment processes to include new ways of sharing knowledge is that it means taking a leap into the unknown. Because these new policies are on the leading edge of change, there are gaps in our knowledge about their effects on research culture and assessment. For example, will assessors penalize researchers who include preprints in their CVs, or will research culture shift what it values?

Another key question centers on how preprints will impact traditional manuscript peer review. Traditionally, journals select a panel of peer reviewers, ideally field experts, who voluntarily review manuscript submissions for free. These detailed reviews inform an editor's decision on whether to publish a manuscript. Preprints, by contrast, are generally only lightly checked before being made public, after which anyone can read and comment on them and provide peer feedback. That light screening is a common source of concern, though it is important to note that issues with rigor and reproducibility also exist within the current peer-reviewed publication system. Preprint peer feedback holds the potential for positive change, opening up the opportunity for authors to receive a wide range of community input and making it easier to spot issues early.

One step to foster trust in preprints will be to create a shared understanding of what preprint "review" is. What qualifies as review in the context of a preprint was recently defined via expert consensus as "A specific type of preprint feedback that has: Discussion of the rigor and validity of the research. Reviewer competing interests declared and/or checked. Reviewer identity disclosed and/or verified, for example, by an editor or service coordinator, or ORCID login." Additionally, there is a growing number of preprint review services, for example VeriXiv (created through a partnership between the Gates Foundation and F1000), Peer Community In, and Review Commons, which have all created infrastructure and pipelines to verify preprints and facilitate structured, invited expert peer review of preprints after they are posted. These services provide journal-independent assessment of the preprint, typically using various forms of open peer review, making review more transparent, fostering accountability, and enabling reviewers to be rewarded for their contributions to the field. However, some have raised concerns about whether greater transparency increases the risk of retaliation, particularly for early career researchers, although recent evidence suggests that more research is needed to determine if repercussions occur.

Questions like these are legitimate and highlight the value of the organizations that are actively seeking to answer them, like the Research on Research Institute, which studies the results of research policy reform with the same scholarly rigor that reform efforts are trying to foster in the academic ecosystem. Organizations like ASAPbio are working to address concerns about how readily preprint servers can correct or retract preprints, and to support rigorous and transparent preprint peer review processes.

In the meantime, fear of unintended consequences is not reason enough to avoid trying to improve research incentives and the culture associated with them. The changes that research funders are implementing to recognize and incentivize open scholarship practices are on the leading edge of reform efforts, pushing research culture forward in new ways that aim to address existing burdens caused by APCs and journal prestige. As with all policies that aim to shift assumptions about what can and should be valued in research, gaps in knowledge will need to be filled through iteration, open dialogue with the groups that new policies will impact, and careful study of how new policies change research culture.

Responsible research assessment and open scholarship are interconnected

Responsible research assessment: An umbrella term for “approaches to assessment which incentivise, reflect and reward the plural characteristics of high-quality research, in support of diverse and inclusive research cultures.” –RoRI Working Paper No.3

As well as progress in the reform of research assessment, a further fundamental change in the research ecosystem over the past decade has been the emergence of open scholarship (also known as open science or open research)¹. The UNESCO 2021 Recommendation on Open Science outlined a consensus definition of open science that comprises open scientific knowledge (including open access to research publications), open dialogues with other knowledge systems, open engagement of societal actors, and open science infrastructures. It is an inclusive movement to “make multilingual scientific knowledge openly available, accessible and reusable for everyone, to increase scientific collaborations and sharing of information for the benefits of science and society, and to open the processes of scientific knowledge creation, evaluation and communication to societal actors beyond the traditional scientific community.” This definition captures the broad nature of open scholarship: it is both a movement to change how scholarly knowledge is shared, and to address global inequities in scholarly culture itself.

DORA (the Declaration on Research Assessment) is a global non-profit initiative that actively works with the scholarly community to support responsible research assessment for hiring, promotion, tenure, and funding decisions. DORA is part of a global movement that aims to reform research assessment equitably, including by expanding the definition of what gets assessed and changing the way assessment takes place. Reducing emphasis on flawed proxy measures of quality such as the Journal Impact Factor or h-index, broadening the type of work that is rewarded, and challenging assumptions about quality and excellence are critical facets of the movement toward responsible research assessment. However, these core concepts do not exist in a vacuum (see the Venn diagram by Hatch, Barbour and Curry).


The concepts of (i) research assessment reform, (ii) open scholarship, and (iii) equality and inclusion cannot be treated separately. They interact strongly and in many complex ways – presented only in broad outline here – and are converging to create a research culture that is centred on people (practitioners and beneficiaries) and on values that embody the highest aspirations of a diverse world.

Responsible research assessment is intricately linked with open scholarship, and also with equity and inclusion initiatives. Biases and assumptions about research quality can determine who is assessed and how they are assessed, and decoupling research products from journal prestige is an important step to address these biases.

Greater transparency and accessibility enable the recognition of a broader range of scholarly outputs (including datasets, protocols, and software). In alignment with the aims of open scholarship, DORA seeks to address the "publish or perish" culture by recognizing and rewarding transparency, rigor, and reproducibility. Ultimately, enabling and rewarding rigor and transparency serve to foster trust both within academia and with the broader public.

The intersection between these movements is apparent in many policies and practices being adopted at academic institutions around the world. Reformscape, a database of research assessment reform at academic institutions, contains over twenty examples of how institutions are incorporating aspects of open scholarship into their hiring, promotion, and tenure practices. Many of DORA's in-depth case studies of institutional research assessment reform mention institutions codifying open scholarship into their practices.

Fostering public trust through open scholarship requires a systems approach

A critical part of research culture is trust. There are many ways to build trust. Traditional peer-reviewed journals emphasize building trust by invoking expert opinion. Open scholarship emphasizes building trust through transparency: making all stages of the research process, from conception to data collection to analysis and review, visible to all.

The two approaches are not mutually exclusive. Peer-reviewed journals have created policies to promote openness, and several forms of open scholarship have sought ways to solicit expert opinions. However, peer review has been used as the main signal of trustworthiness for so long that it can be difficult to convince researchers that passing peer review is not necessarily a definitive signal that an article is trustworthy. Consequently, many have not been convinced of the value of sharing research ahead of peer review, and have been concerned that removing that vetting would open the gates to flawed research and increase the risk of misinformation.

In practice, a couple of studies (here and here) have suggested that the distinctions between peer-reviewed journal articles and preprints can be minimal. Preprints have gained increased acceptance from researchers, who are posting more preprints, and from reporters, who are writing more stories based on preprints (although more work needs to be done to ensure that disclaimers about the lack of peer review are always added).

Understanding which scholarly communications can be viewed as trustworthy was, is, and always will be a complex task for experts and non-experts alike. Experts are expected to develop this judgment through advanced training and experience. Non-experts might benefit from increased media literacy, a subject that is taught to less than half of US high school students.

Call to Action

Reforming research assessment requires new policies and practices that embrace diverse scholarly outputs, reduce the emphasis on journal prestige as an implicit indicator of research quality, and evaluate research based on its intrinsic value. As we expand the type of work that we recognize to include preprints and other "non-traditional" outputs, we can foster trust in these new outputs by 1) recognizing and rewarding transparency, rigor, and high-quality review, and 2) developing resources to foster and support responsible preprint review. For example, a growing number of bibliographic indexers are starting to index preprints (e.g., Europe PMC, PubMed Central) and there are several efforts to index and link reviews to preprints (e.g., Sciety, COAR Notify; a sketch of how such a link might be communicated follows below). There are also a number of efforts underway to develop a range of consistent trust signals and markers. Alongside these efforts lies the crucial task of consistently educating and communicating about innovations in publishing and open scholarship practices to cultivate public trust and literacy.
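
To make the linking effort more concrete: COAR Notify builds on W3C Linked Data Notifications, in which one service POSTs a small JSON-LD message to another service's "inbox" to announce, for instance, that a review of a given preprint now exists. The Python sketch below is illustrative only: every URL, identifier, and the inbox endpoint are hypothetical placeholders, and the payload is loosely modeled on the COAR Notify pattern rather than copied from the normative specification.

```python
# Illustrative sketch only: a minimal notification, loosely modeled on the
# COAR Notify pattern (W3C Linked Data Notifications carrying an Activity
# Streams payload), announcing that a review has been linked to a preprint.
# All URLs, identifiers, and the inbox endpoint are hypothetical placeholders.
import json
import urllib.request

notification = {
    "@context": [
        "https://www.w3.org/ns/activitystreams",
        "https://coar-notify.net",
    ],
    "id": "urn:uuid:0370c0fb-bb78-4a9b-87f5-bed307a509dd",
    "type": ["Announce", "coar-notify:ReviewAction"],
    "origin": {"id": "https://review-service.example", "type": "Service"},
    "target": {"id": "https://preprint-server.example", "type": "Service"},
    # The review being announced...
    "object": {
        "id": "https://review-service.example/reviews/42",
        "type": "Document",
    },
    # ...and the preprint it is a review of.
    "context": {
        "id": "https://preprint-server.example/preprints/2024.0001",
        "type": "Document",
    },
}

# Deliver the notification to the preprint server's (hypothetical) inbox.
request = urllib.request.Request(
    "https://preprint-server.example/inbox",
    data=json.dumps(notification).encode("utf-8"),
    headers={"Content-Type": "application/ld+json"},
    method="POST",
)
# urllib.request.urlopen(request)  # commented out: the endpoint is fictional
```

The appeal of this design is that the preprint server, the review service, and any indexer only need to agree on the message format, not on a shared database, which is what makes journal-independent review linking feasible at scale.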

Change on this scale is not immediate, nor should it be. DORA has long advocated for the strategy of iteratively fine-tuning policies over time using data and input from their target communities. As more and more research funders and institutions test new ways of rewarding researchers for their open scholarship practices, it is important to seize the opportunity for careful review, refinement, and open dialogue about what works and what doesn't.

Zen Faulkes is DORA’s Program Director
Haley Hazlett is DORA’s Program Manager

Acknowledgements: The co-authors would like to thank DORA co-Chairs, Ginny Barbour and Kelly Cobey, and DORA Vice-Chair, Rebecca Lawrence for their editorial input on the piece.


¹ The term "open scholarship" will be used throughout this piece for consistency. Different organizations also use the terms "open research" and "open science" to describe broad policies that encourage openness, though all three terms generally share a holistic focus on fostering a culture of openness, transparency, and accessibility. DORA uses "open scholarship" to better encapsulate all scholarly disciplines.

DORA Newsletter May 2024
https://sfdora.org/2024/05/07/dora-newsletter-may-2024/ (Tue, 07 May 2024)


Announcements


Reformscape in Full Swing

Since its release in January, Reformscape has been serving the research community, providing 230 documents that encourage openness and transparency. New documents and information continue to be added and are always publicly available. Upgrades have also been made to make the platform more user-friendly. You can now search for responsible research assessment resources based on type, making it easy to find:

  • Action plans
  • Policies to reform hiring, promotion or tenure
  • Outcomes of new policies

New guidance released

DORA’s opposition to the overuse of the Journal Impact Factor is well known, but the original declaration did not specifically address other indicators that are sometimes used as proxy measures of quality in research assessment. In a new guidance document, we examine the potential problems of not only the Journal Impact Factor but also the h-index, altmetrics, and various other citation measures. None of these indicators are without problems, but the guidance provides five principles to help reduce some of the concerns.
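
To illustrate why a single indicator is such a blunt proxy, consider the h-index, which is simply the largest h such that h of a researcher's papers have at least h citations each. The minimal sketch below (with illustrative citation counts, not real data) shows how two very different publication records collapse to the same value.

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that h of the papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, cites in enumerate(ranked, start=1) if cites >= rank)

# Two very different publication records collapse to the same number:
print(h_index([50, 40, 3, 1, 0]))  # 3
print(h_index([3, 3, 3]))          # 3
```

Everything about the shape of a career beyond that single cutoff point is discarded, which is one reason such indicators cannot stand in for assessment of the work itself.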

This guidance is available on the DORA website and Zenodo. For questions about this guidance, email info@sfdora.org.

New Narrative CV Report

DORA is pleased to announce a new report on the implementation and monitoring of narrative CVs for grant funding. This report was created in collaboration with the FORGEN CoP, Science Foundation Ireland, the Swiss National Science Foundation, UK Research and Innovation, and the Elizabeth Blackwell Institute for Health Research at the University of Bristol. The report summarizes takeaways and recommended actions from a joint workshop held in February 2022 on identifying shared objectives for narrative CVs in grant evaluation and monitoring their effectiveness. More than 180 people from over 30 countries and 50 funding organizations participated.

Read the report

New report on improving pre-award processes

The processes that take place before research is submitted for funding (pre-award processes) serve as important scaffolding to support equitable and transparent research assessment. This report summarizes the key recommendations from DORA’s Funder Discussion Group symposia and workshops to improve pre-award processes, which were held in collaboration with the Elizabeth Blackwell Institute for Health Research (EBI) at the University of Bristol and the MoreBrains Cooperative.

Read the report

Building on this work, we are pleased to also announce that DORA, EBI, and MoreBrains are continuing their collaboration and are developing a new project to look at how three of the recommendations could be implemented. In May 2024, we will host two workshops that will bring DORA’s Funder Discussion Groups together with research administrators and managers to generate tools and guidance that address practical implementation of these recommendations.

DORA seeks new steering committee member from Asia

DORA is looking for Steering Committee members based in Asia. To be considered for this position, please complete this self-nomination form by May 31, 2024.

Approaching 25,000 signatories

There are nearly 25,000 individuals and organizations that have recognized the need to promote responsible research assessment. Every day the number of DORA signatories increases, and we anticipate reaching 25,000 by summer. Signing the San Francisco Declaration on Research Assessment signifies a dedication to the principle that scientists should be evaluated on their individual achievements and the quality of their work, rather than on journal-based metrics such as the Journal Impact Factor. We are excited to reach the 25,000 mark and share this milestone with the research community!

DORA Initiatives Meeting: Reformscape
https://sfdora.org/2024/03/07/dora-initiatives-meeting-reformscape/ (Thu, 07 Mar 2024)

Each quarter, DORA holds a Community of Practice (CoP) meeting for National and International Initiatives working to address responsible research assessment reform. This CoP is a space for initiatives to learn from each other, make connections with like-minded organizations, and collaborate on projects or topics of common interest. Meeting agendas are shaped by participants. If you lead an initiative, coalition, or organization working to improve research assessment and are interested in joining the group, please find more information here.

During this quarter’s initiatives meeting, our very own Haley Hazlett, Program Manager, presented Reformscape, a searchable collection of assessment criteria and standards from academic institutions. Reformscape is a tool developed by the TARA team at DORA and has been molded and improved over the past two and a half years through community discussions and user testing. Reformscape is meant to show institutions how reform occurs by providing information on reform policies, processes, action plans, announcements, and more, across multiple disciplines and career stages. The tool has been live since January 2024 and will continue to grow in the information and materials it contains and the geographies it covers.

During the round table discussion we heard updates from multiple organizations including the following:

  • HELIOS Open organized a workshop in Miami in which 50 senior leaders (presidents, VPs, provosts, etc.) discussed modernizing tenure and promotion procedures.
  • The Southern African Regional Universities Association (SARUA) is creating a community of practice for responsible research assessment at academic institutions in South Africa, and mentioned its interest in having experts participate to help drive the discussion.
  • The Natural Sciences and Engineering Research Council of Canada (NSERC) has published guidelines on research assessment and is developing an evaluation framework for research, training, and mentoring.

Suggested Reading List:

Casey Donahoe is DORA’s Policy Associate

Reformscape in the news
https://sfdora.org/2024/02/26/reformscape-news/ (Mon, 26 Feb 2024)

Since Reformscape launched on January 30, 2024, we have been very grateful for the community response. This blog post will serve as an ongoing link round-up of articles about Reformscape from sources other than DORA. This post will be updated every week that we spot new articles about Reformscape.

January 30, 2024 to February 4, 2024

The Reformscape press release was picked up by Research Information and Information Today.

The Reformscape launch was mentioned in the CARL-ABRC newsletter (Canadian Association of Research Libraries or Association des bibliothèques de recherche du Canada) and the GraspOS newsletter this week.

We thank the contributor who added Reformscape to DORA’s Wikipedia page.

Brian Owens wrote an article about Reformscape for the journal Nature: How to make academic hiring fair: database lists innovative policies. (Originally free to read, now paywalled.) This was also shared in the Nature Daily Briefing for January 31, 2024.

We also want to thank the many people who liked and shared our posts on social media during this first week that Reformscape was available to everyone!

February 2024

Our colleagues in CLACSO shared news about Reformscape’s launch (in Spanish).

March 2024

Social Science Space (S3) picked up the Reformscape press release.

April 2024

Illinois Tech featured the contributions of Project TARA’s Ruth Schmidt to Reformscape.

Announcing Reformscape: a new online tool to explore responsible academic career assessment and drive positive change
https://sfdora.org/2024/01/30/announcing-reformscape/ (Tue, 30 Jan 2024)

Washington DC, USA – The San Francisco Declaration on Research Assessment (DORA) is delighted to announce the launch of Reformscape – a new online resource enabling the global academic community to explore and share examples of how to make hiring, promotion and tenure fairer, more robust and more diverse.

For years, the traditional way in which academic careers are assessed has been criticized as being unfair, biased and unfit for purpose. An excessive focus on narrow criteria and publication metrics, with an overreliance on journal impact factors and quantity of output rather than the quality and diversity of research, has left talented people overlooked and held back progress in diversity, equity and inclusion.

Free to use through a user-friendly online portal, Reformscape is a rich, organized dataset bringing together hundreds of real-life examples of how universities and other academic institutions around the world are introducing fairer, more responsible and more informative approaches to academic career assessment. Reformscape is populated with policies, action plans and other documents from more than 200 institutions all over the world, together with trends and expertly curated insights.

Administrators, faculty and others in the academic community can explore Reformscape for ideas and inspiration around how to implement new approaches to career assessment and progression in their own institution, and also share their own policies and plans with the wider world. 

For example:

  • Read institutional profiles with detailed summaries of each institution’s actions and progress. 
  • Access source materials to dig into the details of announcements, action plans, policies and published practices. 
  • Search and filter examples by location, career stage, discipline and date.  
  • Read expertly curated insights and commentary by the DORA team drawn straight from the data.
  • View graphs and visualizations showing trends in responsible academic career assessment over time and by country.
  • Share institutional profiles and insights with your networks to spark conversations and celebrate progress. 
  • Keep up to date with new materials added over time by using meta tags as well as the latest insights shared by the DORA team.  

Reformscape has been developed as part of Project TARA (Tools to Advance Research Assessment) – a collaboration between Sarah de Rijcke, Alex Rushforth and Marta Sienkiewicz at the Centre for Science and Technology Studies (CWTS) at Leiden University, Netherlands, Ruth Schmidt at the Institute of Design at Illinois Tech, Stephen Curry at Imperial College London, and Anna Hatch, Haley Hazlett and Zen Faulkes at DORA. The project was co-created with members of the academic community and is supported by Arcadia, a charitable foundation that works to protect nature, preserve cultural heritage and promote open access to knowledge.

Professor Stephen Curry is the former chair of DORA who worked on the Project TARA team. He said, “DORA is very much a community effort to discover, develop and share solutions to the knotty problems of research assessment and this approach is very much at the heart of our new Reformscape tool. We are immensely proud of what we’ve achieved and excited to see how it will be used to foster the uptake of fairer and more robust career assessment in institutions across the world.”

Sarah de Rijcke, Professor of Science, Technology and Innovation Studies at Leiden University, said, “Reformscape is a significant step in evolving the way we assess academic careers. Its launch is an encouraging development towards more equitable and inclusive academic evaluations. I’m hopeful about the potential impact of this tool, especially in expanding our perspective on academic achievements beyond conventional measures. I’m keen to see how it grows to include more institutions and diverse career stages, offering a more nuanced view of academic merit.”

Notes and resources:

About DORA

The San Francisco Declaration on Research Assessment (DORA) is a worldwide initiative to improve how scholarly research is evaluated, which started at the American Society for Cell Biology’s 2012 meeting in San Francisco. DORA covers all scholarly disciplines and all key stakeholders including funders, publishers, professional societies, institutions, and researchers. We encourage all individuals and organizations who are interested in developing and promoting best practice in the assessment of scholarly research to sign DORA. Find out more: https://sfdora.org 

About Arcadia

Arcadia is a charitable foundation that works to protect nature, preserve cultural heritage and promote open access to knowledge. All of Arcadia’s awards are granted on the condition that any materials produced are made available for free online. Find out more: http://www.arcadiafund.org.uk  

Media contact:

Zen Faulkes, DORA Program Director
reformscape@sfdora.org

– 30 –

Stimulating engaged research through changes in the promotions process
https://sfdora.org/2023/12/19/stimulating-engaged-research-through-changes-in-the-promotions-process/ (Tue, 19 Dec 2023)

This is part of a series of curated insights for Reformscape, a new tool from Project TARA.

The Open University is an interesting example of how changes in assessment can contribute to a broader strategic goal. It also offers inspiration for how to create a participatory change process.

Between 2012 and 2015, the Open University took action to strengthen its knowledge exchange with society. One of the solutions it devised was to introduce a new knowledge exchange-focused career track for academic faculty. Assessment criteria for promotions were revised to include, for instance, contributions to business, community, policy, practice, or product/service development; leadership in knowledge exchange; and contributions from knowledge exchange to the university’s teaching and learning.

This change was informed by action research conducted with staff at all levels of the organization. Involving staff revealed that they had a relatively narrow understanding of ‘engaged research’ and the ‘communities’ with whom they could exchange knowledge. Therefore, the change process focused on formulating broader definitions to deepen future engagement and capture their breadth in promotions, in line with the strategic knowledge-exchange mission of the University.

The follow-up surveys and evaluations of the program were published in a peer-reviewed article to promote both internal and community-wide learning.

Curious to know more? Read Open University’s background material in which they documented their process and results.

A guide to good conversations
https://sfdora.org/2023/12/19/a-guide-to-good-conversations/ (Tue, 19 Dec 2023)

This is part of a series of curated insights for Reformscape, a new tool from Project TARA.

Utrecht University (UU) and Utrecht University Medical Centre (UMC) have been steadily developing policies and practices aimed at rewarding a variety of academic activities in assessments.

Their most recent tool that captured our attention is a set of conversational guidelines. This tool helps to translate high-level commitments to improving assessments into new practices on team and individual levels. It equips leaders and colleagues with better ways to reflect on their achievements and goals.

This tool speaks to the fact that individual academics are embedded in various teams and should be assessed in specific team contexts. The guidelines were designed to support more tailored ways of working and evaluating in different parts of the University.

The guidelines can be used to organize structured discussions on themes of team spirit and leadership; team impact; mapping collective activities, competences, and qualities. They also include questions and approaches useful for discussing individual development in annual reviews or for stimulating ad-hoc reflections and feedback.

Curious to know more? Check out the background material that details this and other initiatives undertaken by Utrecht University and Utrecht University Medical Center.

Two new tools to support responsible research assessment: Debiasing committee composition and building blocks for impact
https://sfdora.org/2023/03/02/introducing-two-new-tools-to-support-responsible-research-assessment-debiasing-committee-composition-and-building-blocks-for-impact/ (Thu, 02 Mar 2023)

Introduction to the Project TARA toolkit

Since 2013, DORA has advocated a number of principles for responsible research evaluation, including avoiding the use of journal-based metrics as surrogate measures of research quality, recognizing a broader range of scholarly outputs in promotion, tenure, and review practices, and treating responsible research assessment as a mechanism to help address structural inequalities in academia.

To help put these objectives into practice, DORA was awarded funding by Arcadia – a charitable foundation that works to protect nature, preserve cultural heritage, and promote open access to knowledge – in 2021 to support Tools to Advance Research Assessment (Project TARA). One of the outputs of Project TARA is a series of tools to support community advocacy for responsible research assessment policies and practices, created in collaboration with Ruth Schmidt, Associate Professor at the Institute of Design at the Illinois Institute of Technology and informed by internal and external community member input.

On October 25, 2022, we shared the first two tools in a community call that included researchers, publishers, and faculty from across the world. In addition to introducing the new tools, a secondary aim of the call was to gather community feedback on future tool development for 2023. The session was moderated by Schmidt and Haley Hazlett, DORA’s Acting Program Director. DORA’s Policy Associates, Queen Saikia and Sudeepa Nandi, assisted the session through chat moderation and technical support.

Figure: the Debiasing Committee Composition and Deliberative Processes and Building Blocks for Impact tools

Feedback from the participants

During the first half of the call, Schmidt introduced Debiasing Committee Composition and Deliberative Processes and Building Blocks for Impact. The creation of Debiasing Committee Composition and Deliberative Processes was prompted by the recognition that, despite best intentions, a reliance on old decision-making norms can still reinforce biases and existing power dynamics. Taking a more deliberate approach to the construction of committees can reduce the likelihood of biased outcomes. The Building Blocks for Impact tool was designed to stretch current mental models that focus primarily on traditional forms of scholarly impact (e.g., citations, grants). This tool introduces a framework of impact characteristics that reflect a wide range of different forms of scholarly contributions.

One of the key points of the community discussion was understanding who the tools were designed for. Schmidt highlighted that Debiasing Committee Composition and Deliberative Processes would be particularly useful for academic faculty and staff who participate in panels for hiring, promotion, tenure, or funding decisions. This tool outlines ideas for advocating transparency, basing decisions on portfolios, fostering a diversity of opinions that invites all viewpoints, and expanding possibilities beyond historical norms. Building Blocks for Impact is a model that outlines how a researcher’s contributions could be evaluated and considered impactful based on scale of influence and new audiences reached. For example, this model holds that research with societal implications should be considered valid and impactful. Therefore, this tool would be particularly useful for academic faculty and staff seeking to better demonstrate the value of the different work they do, and for leadership seeking to include different forms of impact in hiring, promotion, and tenure policies and practices.

Discussing future tool ideas

During the second half of the call, Schmidt outlined early thinking on what the next set of Project TARA tools will focus on and opened the floor for questions and feedback from participants. The first future tool concept was an “expansion” of the Building Blocks for Impact tool. Such an expansion could contain, for instance, concrete examples of “impact in action”, practical strategies for adoption of broader definitions of “impact”, and visual frameworks. The second future tool concept centered on the fact that different career stages, including transitions into or out of academia, require different forms of support. After hearing the general overview, participants raised questions and provided feedback about the future tool ideas. This feedback fell under the following themes:

Representation: Promotion of a diverse intellectual community is essential for economic, scientific and societal progress in any organization. To this end, there was a general suggestion that the tools could be adapted for a more representative audience by incorporating Equity, Diversity, Inclusion and Accessibility principles. 

Transparency: Some of the academics who joined the call highlighted the importance of transparency in hiring, retention and promotion policies. This tied into our second future tool concept to outline the broad range of career paths that researchers can follow. Sometimes decisions can be skewed due to unconscious bias, subjectivity, values or expertise of the evaluator. When evaluation practices are not transparent, these biases often go unnoticed or unchallenged. For more on this, see the Debiasing Committee Composition tool. 

Advocacy: One of the participants highlighted the need for assessment principles that encourage new and unique ideas (i.e., ideas that are high risk and high reward). This will be important to consider when developing the 2023 tool that outlines concrete examples of recognizing impact in different contexts.  

Inclusivity: Some participants highlighted the need for the academic evaluation system to be more inclusive, especially when it comes to showcasing “excellence” or success stories. Participants pointed out that academic “standards” should be more flexible and inclusive of the broad range of skills and experiences that researchers have to offer. It is also often the case that career paths are more nuanced and complex than a simple transition from PhD to postdoc to faculty. Career pathways away from academia, which one might consider “non-traditional”, also require fair judgement from evaluation panels. Relating to this, the feedback session ended with an interesting question from one of the participants: does a robust history of work ensure a strong future impact?

Next steps

Feedback from the community call will be incorporated during the development of the future tools. We encourage you to let us know if you use or adapt Debiasing Committee Composition and Deliberative Processes and Building Blocks for Impact in your own assessment practices at info@sfdora.org.

Queen Saikia is DORA’s Policy Associate.

Community Call: Introducing the 2022 Project TARA tools to support responsible research assessment
https://sfdora.org/2022/10/25/community-call-introducing-the-2022-project-tara-tools-to-support-responsible-research-assessment-2/ (Tue, 25 Oct 2022)

Introducing two new tools for debiasing committee composition and recognizing the many facets of “impact”

On October 25, 2022, DORA hosted a community call to introduce two new responsible research evaluation tools and to gather feedback on future tool development. The toolkit is part of Project TARA, which aims to identify, understand, and make visible the criteria and standards universities use to make hiring, promotion, and tenure decisions. The interactive call introduced and explored these new tools, which covered two important topics:

  • Debiasing Committee Composition and Deliberative Processes: It is increasingly recognized that more diverse decision-making panels make better decisions. This one-page brief can be used to learn how to debias your committees and decision-making processes.
  • Building Blocks for Impact: Capturing scholarly “impact” often relies on familiar suspects like h-index, JIF, and citations, despite evidence that these indicators are narrow, often misleading, and generally insufficient to capture the full richness of scholarly work. This one-page brief can be used to learn how to consider a wider breadth of contributions in assessing the value of academic achievements.

The tools were created by Ruth Schmidt, Associate Professor at the Institute of Design at Illinois Tech and member of the Project TARA team.

In a clip from the October 25, 2022 call, Schmidt introduces the 2022 Project TARA tools: “Debiasing Committee Composition and Deliberative Processes” and “Building Blocks for Impact”.

Participants also provided their thoughts and feedback on the next round of Project TARA tools, slated for release in 2023. Sign up for DORA’s email list to be notified when the blog summary of the community discussion portion of the call is published: https://sfdora.org/.

The event was moderated by Haley Hazlett, DORA’s Acting Program Director, and Ruth Schmidt. DORA Policy Associates Sudeepa Nandi and Queen Saikia provided chat moderation and support.

Project TARA is supported by Arcadia, a charitable foundation that works to protect nature, preserve cultural heritage, and promote open access to knowledge.

Dashboard development: identifying and categorizing good practice
https://sfdora.org/2022/01/14/dashboard-development-identifying-and-categorizing-good-practice/ (Fri, 14 Jan 2022)

Introduction to Project TARA

As more institutions begin to examine and refine their approach to academic assessment, there is growing awareness of the value of knowledge-sharing resources that support the development of new policies and practices. In light of this need, DORA is building an interactive online dashboard as part of Project TARA to monitor and track trends in responsible research assessment for faculty hiring and promotion at academic institutions. 

We held a community workshop on November 19, 2021, to identify responsible research assessment practices and categorize source material for the dashboard. This event expanded on our first community call in September, where we gathered early-stage input on the intent and desired use of the dashboard.

Overview of the dashboard and discussion

Institutions are at different stages of readiness for research assessment reform and have implemented new practices in a variety of academic disciplines, career stages, and evaluation processes. The dashboard aims to capture the depth and breadth of progress made toward responsible research assessment and provide counter-mapping to common proxy measures of success (e.g., Journal Impact Factor (JIF), H-index, and university rankings). To do this, dashboard users will be able to search, visualize, and track university policies for faculty hiring, promotion, and tenure.

Participants identified faculty evaluation policies at academic institutions as good practice, using as a guide the definition of responsible research assessment from the Research on Research Institute’s (RoRI) 2020 working paper. For example, the University of Minnesota’s Review Committee on Community-Engaged Scholarship assesses “the dossier of participating scholar/candidates who conduct community-engaged work, then produce a letter offering an evaluation of the quality and impact of the candidate’s community-engaged scholarship.” 

“Approaches to assessment which incentivize, reflect and reward the plural characteristics of high-quality research, in support of diverse and inclusive research cultures.”

Responsible research assessment definition from The changing role of funders in responsible research assessment: progress, obstacles & the way ahead, RoRI Working Paper No. 3, November 2020.

In addition to identifying good practice, participants were asked to explore and discuss how the practices fell into four broad categories: impact, research integrity, openness, and equity and inclusion. From this exercise, we heard that flexibility will need to be built into the dashboard to account for “cross-categorization” of policies and practices. For instance, the Academic Promotions Framework at the University of Bristol in the United Kingdom, which rewards the production of open research outputs like data, code, and preprints, was originally placed in the openness category. But that prompted the group to examine whether these were the best categories to use, given how closely related openness is to research integrity. Likewise, participants had some difficulty placing policies that recognize public impact within the four categories. While such work may fit in the impact category, it could also be applied across the other three. Indeed, many responsible research assessment policies could fall into multiple categories. In another example, the University of Zurich has assessment policies that include recognition of openness, leadership, and commitment to transparency in hiring or promotion. A flexible, broadly applicable system of categorization, such as meta-tagging, may prove beneficial because it allows a single policy to carry multiple tags (see the sketch below).
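
To illustrate the meta-tagging idea from the discussion, the sketch below models policy records that each carry a set of tags rather than one exclusive category, so a single policy can be retrieved under several themes at once. The record structure and example entries are hypothetical illustrations drawn from the examples above, not the dashboard’s actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class PolicyRecord:
    """One assessment policy, tagged with any number of categories."""
    institution: str
    title: str
    tags: set[str] = field(default_factory=set)

# Illustrative entries only, loosely based on examples named in the text.
records = [
    PolicyRecord("University of Bristol", "Academic Promotions Framework",
                 {"openness", "research integrity"}),
    PolicyRecord("University of Zurich", "Hiring and promotion criteria",
                 {"openness", "equity and inclusion"}),
]

def filter_by_tag(items: list[PolicyRecord], tag: str) -> list[PolicyRecord]:
    """Return every record carrying the given tag."""
    return [r for r in items if tag in r.tags]

# The Bristol framework surfaces under both queries, which a single
# exclusive category could not support.
print([r.title for r in filter_by_tag(records, "openness")])
print([r.title for r in filter_by_tag(records, "research integrity")])
```

Because each record is just a set of tags, adding a new theme later (say, “retention”) means tagging existing records rather than reshuffling them between mutually exclusive buckets.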

Another key facet of the workshop discussion was that many academic institutions adopt national or international responsible assessment initiatives. An example of this is the university-specific adoption of the nationally developed Norwegian Career Assessment Matrix (NOR-CAM) or the Netherlands Recognition and Rewards Programme. National or regional context is also important to keep in mind, given that organizations might be governmentally limited in how much they can alter their assessment policies. For example, the University of Catalonia has autonomy in the promotion and hiring of postdoctoral researchers, but is limited by external accreditation systems in its ability to change assessment practices for faculty.

Policies that focus on improving the ability of academic institutions to retain new faculty and staff (e.g., retention policies) were also seen as an important addition to the dashboard. It was pointed out that many of these policies focus on diversity and inclusion, given that many policies that make environments more supportive to minoritized faculty and staff also help with retention of those groups in academia. For example, Western Washington University in the United States created Best Practices: Recruiting & Retaining Faculty and Staff of Color, which offers suggestions and guidance such as making leadership opportunities available and keeping good data on why faculty of color leave universities to ensure that policy changes to entice them to stay are informed by data.

Figure: example from the exercise to identify how policies and practices fell into different categories

Participants also examined the distribution of the responsible academic assessment examples after categorization: Was there an over- or under-representation of examples in certain categories? Was there difficulty categorizing certain examples? Through this exercise, we hoped to gain insight into areas of high and low activity in the development of responsible research assessment policies and practices. For example, many of the policies identified by the group focused on faculty promotion, yet fewer were related to faculty hiring and retention. As the dashboard structure design evolves, we will work to highlight and visualize gaps like this: while such findings may point to a current lack of activity, they can also indicate opportunities to address oversights and galvanize new institutional activity.

Finally, we heard about the importance of focusing on the process of assessing researchers rather than products, which was not strongly represented in the examples brought by participants. Such policies, like the Ghent University evaluation framework in Belgium, could provide staff and faculty with the flexibility to be assessed based on their accomplishments in-context with their own goals.

Next steps

The discussion from the workshop, together with the source material identified, will refine how we visualize and classify material for the dashboard. Feedback on how well different policies “fit” into the four proposed categories will inform how policies are meta-tagged or classified, to make them easy for users to find.

The next phase of dashboard development is to determine data structure and visualization. Our goal is to begin web development in the spring of 2022. Updates on dashboard progress and further opportunities for community engagement can be found on the TARA webpage.

The dashboard is one of the key outputs of Tools to Advance Research Assessment (TARA), a DORA project to facilitate the development of new policies and practices for academic career assessment, sponsored by Arcadia – a charitable fund of Lisbet Rausing and Peter Baldwin.

Haley Hazlett is DORA’s Program Manager.
