Encouraging innovation in open scholarship while fostering trust: A responsible research assessment perspective https://sfdora.org/2024/08/01/encouraging-innovation-in-open-scholarship-while-fostering-trust-a-responsible-research-assessment-perspective/ Thu, 01 Aug 2024 15:39:51 +0000

The following post originally appeared on the Templeton World Charity Foundation blog. It is reposted here with their permission. 

Emerging policies to better recognize preprints and open scholarship

Research funding organizations play an important role in setting the tone for what is valued in research assessment through the projects they fund and the outputs they assign value to. Similarly, academic institutions signal what they value through how they assess researchers for hiring, promotion, and tenure. An increasing number of research funding organizations and academic institutions have codified open scholarship into their research assessment policies and practices. Examples include Wellcome, the Chan Zuckerberg Initiative, the Ministry of Business, Innovation & Employment of Aotearoa New Zealand, the University of Zurich, and the Open University.

This shift is accompanied by policies that recognize preprints as evidence of research activity (e.g., NIH, the Japan Science and Technology Agency, Wellcome, EMBO, and some UKRI Councils). Some funders, such as EMBO and many of the cOAlition S funders, now formally recognize peer-reviewed preprints at the same level as journal articles. A preprint is a scholarly manuscript that the authors upload to a public server but that has not (yet) been accepted by a journal; it is usually the version submitted to a journal if the authors decide to pursue journal publication. It can be accessed without charge and, depending on the preprint server, is screened and typically posted within a couple of days, making it quickly available to be read and commented on. Because preprints offer a means of sharing research results outside of journals, they have the potential to support responsible assessment by decoupling assumptions about the quality of research findings from journal prestige. Preprints also enable sharing of a range of outputs and results that may not be attractive to a journal (for example, research that is technically sound but limited in scope, or null/negative findings). And because preprints are usually free to post, they can help reduce the author-facing costs associated with traditional open access publication, although preprint servers are therefore typically reliant on ongoing grant funding to remain sustainable.

One of the most recent examples of a substantial policy change is the Bill & Melinda Gates Foundation’s March 2024 announcement of its new Open Access Policy, which takes effect in 2025. The policy will introduce two changes for grantees: they will have to share preprints of their research, and the Gates Foundation will stop paying article processing charges (APCs). As with many shifts toward open access, these changes were motivated by several factors, including the Gates Foundation’s desire to provide journal-agnostic avenues for research assessment and to empower grantee authors to share different versions of their work openly on preprint servers without the cost of APCs. Reducing the cost of publishing research and increasing readers’ access to research via preprint servers also supports more equitable access to research products.

The Gates Foundation used ten years of existing data to inform its decision to refine its Open Access Policy, has engaged actively in community dialogue, and has made it clear that this is a first step on a longer path to better evaluate research on its own merits and increase accessibility to research. Notably, the Gates Foundation took this step after considering the shortcomings of current open access models, whose reliance on APCs effectively limits global access to research outputs. Given that the Gates Foundation is “the wealthiest major research funder to specifically mandate the use of preprints,” this approach is groundbreaking in its emphasis on preprints and its shift away from spending on APCs. It has also placed a spotlight on these issues and catalyzed discourse around trust in preprints. Policy changes like this indicate a willingness among research funders to take steps toward change and move away from recognizably flawed processes. This is an important step, since flawed processes are often retained because fixing them or adopting new ones is perceived as requiring too much effort and carrying too much risk (a phenomenon known as status quo bias).

Overcoming the status quo and tackling new challenges

Overcoming the status quo bias is difficult, but not impossible. Indeed, a common concern about changing research assessment processes to include new ways of sharing knowledge is that it means taking a leap into the unknown. Because these new policies are on the leading edge of change, there are gaps in our knowledge about their effects on research culture and assessment. For example, will assessors penalize researchers who include preprints in their CVs, or will research culture shift what it values?

Another key question centers on how preprints will affect our concept of traditional manuscript peer review. Traditionally, journals select a panel of peer reviewers, ideally field experts, who review manuscript submissions voluntarily and without pay. These detailed reviews inform an editor’s decision on whether to publish a manuscript. Preprints, by contrast, are generally only lightly checked before being made public, after which anyone can read them, comment on them, and provide peer feedback. This light screening is a common concern, though it is important to note that issues with rigor and reproducibility also exist within current peer-reviewed publication systems. Preprint peer feedback holds the potential for positive change, opening up the opportunity for authors to receive a wide range of community input and making it easier to spot issues early.

One step toward fostering trust in preprints will be to create a shared understanding of what preprint “review” is. What qualifies as review in the context of a preprint was recently defined via expert consensus as “A specific type of preprint feedback that has: Discussion of the rigor and validity of the research. Reviewer competing interests declared and/or checked. Reviewer identity disclosed and/or verified, for example, by an editor or service coordinator, or ORCID login.” Additionally, a growing number of preprint review services are available, for example VeriXiv (created through a partnership between the Gates Foundation and F1000), Peer Community In, and Review Commons, all of which have created infrastructure and pipelines to verify preprints and facilitate structured, invited expert peer review of preprints after posting. These services provide journal-independent assessment of a preprint, typically using various forms of open peer review, which makes review more transparent, fosters accountability, and enables reviewers to be rewarded for their contributions to the field. However, some have raised concerns about whether greater transparency increases the risk of retaliation, particularly for early career researchers, although recent evidence suggests that more research is needed to determine whether such repercussions occur.

Questions like these are legitimate and highlight the value of the organizations actively seeking to answer them, like the Research on Research Institute, which studies the results of research policy reform with the same scholarly rigor that reform efforts are trying to foster in the academic ecosystem. Organizations like ASAPbio are working to address concerns about how quickly preprint servers can correct or retract preprints, and to support rigorous and transparent preprint peer review processes.

In the meantime, fear of unintended consequences is not reason enough to avoid trying to improve research incentives and the culture associated with them. The changes that research funders are implementing to recognize and incentivize open scholarship practices are on the leading edge of reform efforts, pushing research culture forward in new ways that aim to address existing burdens caused by APCs and journal prestige. As with all policies that aim to shift assumptions about what can and should be valued in research, gaps in knowledge will need to be filled through iteration, open dialogue with the groups that new policies will impact, and careful study of how new policies change research culture.

Responsible research assessment and open scholarship are interconnected

Responsible research assessment: An umbrella term for “approaches to assessment which incentivise, reflect and reward the plural characteristics of high-quality research, in support of diverse and inclusive research cultures.” –RoRI Working Paper No.3

As well as progress in the reform of research assessment, a further fundamental change in the research ecosystem over the past decade has been the emergence of open scholarship (also known as open science or open research)¹. The UNESCO 2021 Recommendation on Open Science outlined a consensus definition of open science that comprises open scientific knowledge (including open access to research publications), open dialogues with other knowledge systems, open engagement of societal actors, and open science infrastructures. It is an inclusive movement to “make multilingual scientific knowledge openly available, accessible and reusable for everyone, to increase scientific collaborations and sharing of information for the benefits of science and society, and to open the processes of scientific knowledge creation, evaluation and communication to societal actors beyond the traditional scientific community.” This definition captures the broad nature of open scholarship: it is both a movement to change how scholarly knowledge is shared, and to address global inequities in scholarly culture itself.

DORA (Declaration On Research Assessment) is a global non-profit initiative that actively works with the scholarly community to support responsible research assessment for hiring, promotion, tenure, and funding decisions. DORA is part of a global movement that aims to reform research assessment equitably, including by expanding the definition of what gets assessed and changing the way assessment takes place. Reducing emphasis on flawed proxy measures of quality such as the Impact Factor or h-index, broadening the type of work that is rewarded, and challenging assumptions about quality and excellence are critical facets of the movement towards responsible research assessment. However, these core concepts do not exist in a vacuum (see Venn diagram by Hatch, Barbour and Curry).


The concepts of (i) research assessment reform, (ii) open scholarship, and (iii) equality and inclusion cannot be treated separately. They interact strongly and in many complex ways – presented only in broad outline here – and are converging to create a research culture that is centred on people (practitioners and beneficiaries) and on values that embody the highest aspirations of a diverse world.

Responsible research assessment is intricately linked with open scholarship, and also with equity and inclusion initiatives. Biases and assumptions about research quality can determine who is assessed and how they are assessed, and decoupling research products from journal prestige is an important step to address these biases.

Greater transparency and accessibility enable the recognition of a broader range of scholarly outputs (including datasets, protocols, and software). In alignment with the aims of open scholarship, DORA seeks to address the “publish or perish” culture by recognizing and rewarding transparency, rigor, and reproducibility. Ultimately, enabling and rewarding rigor and transparency serve to foster trust both within academia and with the broader public.

The intersection between these movements is apparent in many policies and practices being adopted at academic institutions around the world. Reformscape, a database of research assessment reform at academic institutions, contains over twenty examples of how institutions are incorporating aspects of open scholarship into their hiring, promotion, and tenure practices. Many of DORA’s in-depth case studies of institutional research assessment reform include mention of the institution codifying open scholarship into their practices.

Fostering public trust through open scholarship requires a systems approach

A critical part of research culture is trust, and there are many ways to build it. Traditional peer-reviewed journals emphasize building trust by invoking expert opinion. Open scholarship emphasizes building trust through transparency: making all stages of the research process, from conception to data collection to analysis and review, visible to all.

The two approaches are not mutually exclusive. Peer-reviewed journals have created policies to promote openness, and several forms of open scholarship have sought ways to solicit expert opinion. However, peer review has been used as the main signal of trustworthiness for so long that it can be difficult to convince researchers that passing peer review is not necessarily a definitive signal that an article is trustworthy. Consequently, many have not been convinced of the value of sharing research ahead of peer review, and have been concerned that removing that vetting would open the gates to flawed research and increase the risk of misinformation.

In practice, a couple of studies (here and here) have suggested that the distinctions between peer-reviewed journal articles and preprints can be minimal. Preprints have gained increasing acceptance from researchers, who are posting more of them, and from reporters, who are writing more stories based on them (although more work needs to be done to ensure that disclaimers about the lack of peer review are always added).

Understanding which scholarly communications can be viewed as trustworthy was, is, and always will be a complex task for experts and non-experts alike. Experts are expected to develop this understanding through advanced training and experience. Non-experts might benefit from increased media literacy, a subject that is taught to fewer than half of US high school students.

Call to Action

Reforming research assessment requires new policies and practices that embrace diverse scholarly outputs, reduce the emphasis on journal prestige as an implicit indicator of research quality, and evaluate research based on its intrinsic value. As we expand the types of work that we recognize to include preprints and other “non-traditional” outputs, we can foster trust in these new outputs by 1) recognizing and rewarding transparency, rigor, and high-quality review, and 2) developing resources to foster and support responsible preprint review. For example, a growing number of bibliographic indexers are starting to index preprints (e.g., Europe PMC, PubMed Central), and there are several efforts to index and link reviews to preprints (e.g., Sciety, COAR Notify). There are also a number of efforts underway to develop a range of consistent trust signals and markers. Alongside these efforts lies the crucial task of consistently educating and communicating about innovations in publishing and open scholarship practices to cultivate public trust and literacy.

Change on this scale is not immediate, nor should it be. DORA has long advocated the strategy of iteratively fine-tuning policies over time using data and input from their target communities. As more and more research funders and institutions test new ways of rewarding researchers for their open scholarship practices, it is important to seize the opportunity for careful review and refinement, and to foster an open dialogue on what works and what doesn’t.

Zen Faulkes is DORA’s Program Director
Haley Hazlett is DORA’s Program Manager

Acknowledgements: The co-authors would like to thank DORA co-Chairs Ginny Barbour and Kelly Cobey, and DORA Vice-Chair Rebecca Lawrence, for their editorial input on the piece.


¹ The term “open scholarship” is used throughout this piece for consistency. Different organizations also use the terms “open research” and “open science” to describe broad policies that encourage openness, though all three terms generally share a holistic focus on fostering a culture of openness, transparency, and accessibility. DORA uses “open scholarship” to better encapsulate all scholarly disciplines.

Introduction to DORA: a short presentation at the Global Research Council’s virtual Responsible Research Assessment Conference https://sfdora.org/2020/12/01/introduction-to-dora-a-short-presentation-at-the-global-research-councils-virtual-responsible-research-assessment-conference/ Tue, 01 Dec 2020 20:55:55 +0000

DORA chair Prof. Stephen Curry gave a short introduction to DORA for the Global Research Council conference on Responsible Research Assessment, held online during the week of 23-27 November 2020. He briefly explains the origins of DORA, the meaning of the declaration, and how DORA developed into an active initiative campaigning for the worldwide reform of research assessment. In advance of the conference, Curry and Program Director Dr. Anna Hatch contributed to a working paper outlining the state of play in responsible research assessment, exploring what it means and describing existing initiatives in the space.

DORA seeks nominations for the Advisory Board from North and South America https://sfdora.org/2020/06/18/dora-seeks-nominations-for-the-advisory-board-from-north-and-south-america/ Thu, 18 Jun 2020 17:40:13 +0000

DORA seeks nominations and self-nominations from North and South America to fill two open positions on our international Advisory Board.

The role of the advisory board is strategic in nature: it complements the work of the steering committee and provides globally relevant strategic guidance for future DORA activities. Advisory board members serve a term of three years that can be renewed once. More information about the board and its operation can be found in DORA’s governance procedures.

To maintain the board’s geographic distribution, nominations at this time are limited to individuals from North America and South America. Nominations will be reviewed by the DORA steering committee. The ideal nominee will complement the experiences and strengths of the other advisory board members. Nominations and self-nominations from underrepresented and minoritized scholars and early career researchers are encouraged.

As much as possible, advisory board members reflect DORA’s stakeholders, including scholarly societies, publishers, funders, researchers, and universities, and come from diverse backgrounds, academic fields, and geographic locations.

To nominate someone or yourself to the advisory board, please complete DORA’s short nomination form by July 7, 2020. Please reach out to DORA’s program director Dr. Anna Hatch (info@sfdora.org) with any questions.

Rethinking Research Assessment: Ideas for Action https://sfdora.org/2020/05/19/rethinking-research-assessment-ideas-for-action/ Tue, 19 May 2020 15:11:20 +0000

We are pleased to announce a new briefing document from DORA and colleagues, “Rethinking Research Assessment: Ideas for Action,” which provides five design principles to help universities and research institutions improve their research assessment policies and practices.

The brief is a collaboration between DORA and Ruth Schmidt, associate professor at Illinois Institute of Technology, and is based on discussions held at a meeting co-sponsored by DORA and the Howard Hughes Medical Institute (HHMI) in October 2019: Driving Institutional Change for Research Assessment Reform. The meeting convened a diverse group of stakeholders invested in research assessment reform, including faculty, university administrators, librarians, funders, scientific professional society staff, culture change experts, and representatives from other non-profit initiatives.

The brief outlines five common myths about research evaluation to help universities better understand barriers to change and provides analogous examples to illustrate how these myths exist inside and outside of academia. It also offers five design principles to help institutions experiment with and develop better research assessment practices:

  1. Instilling standards and structure into research assessment processes
  2. Fostering a sense of personal accountability in faculty and staff
  3. Prioritizing equity and transparency of research assessment processes
  4. Taking a big picture or portfolio view toward researcher contributions
  5. Refining research assessment processes through iterative feedback

Over the years, DORA has rallied support for responsible research assessment practices by asking academic institutions to focus on the content of researchers’ contributions, rather than relying on proxy measures of success. DORA also calls on institutions to consider the value and impact of all outputs of academic work when assessing research. These are significant requests, and we hope that these design principles will be a useful tool for institutions to move forward with research assessment reform.

Download: DORA_IdeasForAction.pdf

DORA statement on hiring, promotion, and funding decisions during the COVID-19 pandemic https://sfdora.org/2020/04/06/dora-statement-on-hiring-promotion-and-funding-decisions-during-the-covid-19-pandemic/ Mon, 06 Apr 2020 14:40:18 +0000

The COVID-19 pandemic has upended daily life around the world, forcing individuals and organizations to adapt rapidly to unanticipated circumstances. The changes in universities and research institutions have been dramatic.

With limited notice, academics and researchers have been asked to convert in-person classes to remote learning formats. Research operations have largely been restricted to tasks that can be accomplished while working from home, which in many cases will reduce researcher productivity. Many academics and researchers (including research students) are caregivers who must rebalance work and family responsibilities because of school closures or the added pressure of looking after relatives who may be ill or at high risk from coronavirus infection.

This is not business as usual, and we need to adjust the expectations placed on faculty and scholars during the pandemic and in the wake of it. Postdocs preparing for the job market and other early career researchers (e.g., pre-tenure faculty) are at the most vulnerable career stages. In these unprecedented times, universities and research institutions should be flexible in their hiring, promotion, and tenure processes. In particular, they will need to account for the disruption caused by COVID-19.

DORA calls on all universities and research institutions to:

  • Redefine their expectations for productivity in the wake of the present pandemic.
  • Communicate clearly to academics and researchers how they will modify research evaluation procedures for hiring, promotion, and tenure.

We applaud those that have already taken action (e.g., see here). In a similar vein, DORA calls on research funders to reassure applicants that they will also take account of the productivity impact of the COVID-19 crisis.

Universities are adjusting review, promotion, and tenure expectations due to COVID-19 https://sfdora.org/2020/03/24/universities-are-adjusting-review-promotion-and-tenure-expectations-due-to-covid-19/ Tue, 24 Mar 2020 13:32:10 +0000

The emergence of COVID-19 has drastically upended the academic enterprise. Because of physical distancing, many non-tenured faculty members are facing additional, unexpected obstacles in their promotion and tenure trajectory. Transitioning classes to online learning environments will detract from research efforts, and winding down laboratory operations will result in a more direct reduction in research output. While trying to stay healthy themselves, many faculty members are also balancing job responsibilities with kids at home, adapting to telework, etc.

Universities are recognizing the strain that the COVID-19 pandemic is placing on non-tenured faculty. Some institutions are choosing to extend tenure and promotion timelines. For example, Creighton University announced a one-year extension of the tenure probationary period for pre-tenure faculty. Vanderbilt University posted a message on its website from Interim Chancellor and Provost Dr. Susan Wente assuring faculty that they will be granted a one-year extension:

“In light of these increased demands at work—and for many, at home—that could slow down or temporarily interrupt your professional career, the deans of all our schools and colleges are in agreement with the provost and interim chancellor that we should take the unusual measure of granting an automatic one-year extension to the tenure clock of all probationary faculty who are not currently under review for tenure.”

Other universities are readjusting their standards for promotion and tenure. The Dean of the College of Arts and Letters at Michigan State University, Dr. Christopher Long, promised to consider the challenges caused by COVID-19 in promotion and tenure decisions:

“I promise to take the challenges of the present moment into consideration during annual staff and faculty reviews and in the tenure, reappointment, and promotion process. When evaluating your colleagues, I will ask you to do the same. We have entered uncharted territory, and it would be unjust to expect business as usual.”

Researchers are using crowdsourcing to generate a list of the universities reevaluating tenure and promotion processes to account for the disruption caused by COVID-19. DORA will continue to support researchers by highlighting these examples of good practice on social media.

Identifying progress on the path to research assessment reform https://sfdora.org/2020/03/12/identifying-progress-on-the-path-to-research-assessment-reform/ Thu, 12 Mar 2020 15:44:38 +0000

As Alison Mudditt described in her Scholarly Kitchen post last month, the path to reforming research assessment has been met with significant challenges. We agree with her that culture change is often a slow process. However, as DORA demonstrates, it is possible to identify tangible progress on the path to large-scale research assessment reform.

Thanks to DORA, there is an ongoing conversation about rethinking research assessment and what we value in academia. By adding their names to the declaration, DORA signers are making public commitments and holding themselves accountable to their staff and the broader research community. In many cases, these institutions are taking real steps to implement DORA principles and drive research assessment reform.

For example, the DORA-inspired working group at the Universitat Oberta de Catalunya (UOC) met monthly for almost a year to develop a plan (still underway) for research assessment reform. The first step was to revise the criteria for hiring postdoctoral researchers by removing the Journal Impact Factor as an indicator of success and reclassifying “scientific output” as “main scientific achievements.”

In fact, working groups are a common first step for many institutions. The working group at Cardiff University in the UK developed an action plan and appointed a Responsible Research Assessment Officer to support its implementation efforts. In Finland, the University of Oulu established a working group to develop its policy for responsible researcher assessment and create a plan for implementation. So far, the university has identified a list of considerations, including equity, differences between disciplines, and how metrics can support qualitative assessment. In France, the CNRS is using a new system to evaluate individual researchers based on DORA principles.

Some academic departments are positioned to make their own changes. For example, the Cell Biology Department at UT Southwestern Medical Center reinvented the traditional faculty search process by eliminating the search committee and involving the entire faculty in hiring. Applicants, for their part, are directed to focus on the cover letter, which plays a large role in the initial review. Candidates selected for Skype interviews receive questions ahead of time, which helps identify thoughtful candidates in addition to those who process information quickly. And each candidate has a designated faculty advocate who can speak to their strengths during hiring deliberations.

For more widespread change, a coalition of Dutch universities and funders is working together to advance research assessment so that emphasis is placed on the content of a contribution rather than the reputation of the venue where it is published. The Wellcome Trust recently released draft guidance for the institutions it funds to implement DORA principles, and this will undoubtedly have an impact on processes on the ground. The meeting that DORA co-organized with the Howard Hughes Medical Institute (HHMI) in October 2019, Driving Institutional Change for Research Assessment Reform, was also a source of input for the guidance. Change is not limited to DORA signatories: Ghent University in Belgium and the QUEST Center for Transforming Biomedical Research at the Berlin Institute of Health have taken significant steps to improve research assessment practices.

Culture change may be slow, and although we have not reached our end goal, DORA is making meaningful progress toward research assessment reform. We have a group of innovative institutions providing models of change, which serve as a guide for others to follow. Together, these steps forward will bring us closer to systems change for research assessment reform.

Anna Hatch is the Program Director for DORA

2019 in review: DORA’s list of the top 10 advances in research assessment https://sfdora.org/2019/12/19/2019-in-review-doras-list-of-the-top-10-advances-in-research-assessment/ Thu, 19 Dec 2019 20:24:31 +0000

As 2019 winds down, the DORA steering committee and advisory board wanted to highlight the ways research assessment reform has advanced in the last year. From new data on assessment policies to the development of new tools, the scholarly community is taking action to improve research assessment in concrete ways. Below, we highlight our 10 favorites:

  1. New research from the Scholarly Communications Lab at Simon Fraser University and their collaborators has mapped out how widely the Journal Impact Factor is used in review, promotion, and tenure decisions in the United States and Canada. Studies like this one help us to understand the scale of the challenge facing DORA, but also serve as a stimulus for institutions to innovate and improve their assessment policies.
  2. One institution that has already taken steps to reform its research assessment practices is the Universitat Oberta de Catalunya, which has signed DORA and released an action plan and timeline to improve research assessment on campus. By the end of 2020, the university aims to propose indicators to measure the social impact of research. They also intend to expand the list of research outputs to assess the impact and transfer of research. The plan was developed by the university’s DORA working group and approved by the Research and Innovation Commission. While it was released at the end of last year, we wanted to recognize the university’s efforts throughout 2019 to implement widespread changes.
  3. In September, the Research on Research Institute (RORI) was launched in London to advance more strategic, open, diverse and inclusive research. The institute is a partnership among the Wellcome Trust, Digital Science, and the Universities of Sheffield and Leiden. An important aspect of their work will focus on measurement and evaluation, including broadening and diversifying the criteria and indicators used in the evaluation of research. More broadly, by providing a deeper understanding of the workings of the research landscape, RORI aims to improve decision-making, career advancement and the culture of research.
  4. The Hong Kong Principles for Assessing Researchers: Fostering Research Integrity were developed as part of the 6th World Conference on Research Integrity. Institutions and funders can adopt the principles to recognize scholars for activities that lead to trustworthy research. Similar to DORA, the Hong Kong principles emphasize the importance of considering the value of all contributions, outputs, and outcomes of scholarly work.
  5. In May, the revised implementation guidance for Plan S established research assessment reform—in accordance with DORA—as a core principle. Members of cOAlition S are committing to revise their policies by January 2021 to assess research on its own merits rather than the venue where it is published.
  6. DORA advisory board member Needhi Bhalla published a perspective summarizing several proven strategies to improve equity in faculty hiring. Some examples include evaluating and revising departmental review and promotion processes and developing a rubric to assess diversity statements at the beginning of a search. By adopting these practical measures, institutions also increase the transparency and consistency of faculty searches.
  7. Many organizations are rethinking how to structure job and grant applications to encourage robust but time-efficient review processes that are detached from journal impact factors. In October, the Royal Society released the Résumé for Researchers to recognize a wider range of scholarly outputs and accomplishments. The Dutch Research Council (NWO) is also piloting a narrative CV format for its Veni funding scheme and announced in December that it was going to do the same for the Vici round. Narratives permit researchers to contextualize their work and varied contributions in a way that numbers cannot.
  8. In October, DORA and the Howard Hughes Medical Institute convened a diverse group of stakeholders to consider how to improve policy and practice by exploring different approaches to culture and systems change. The background readings and participant commentaries provide a better understanding of the opportunities and challenges research institutions face with research assessment reform. Toolkits to help institutions adopt new policies and practices will be made available in 2020.
  9. The Latin American Council of Social Sciences (CLACSO) and the National Council of Science and Technology of Mexico (CONACYT) organized a Forum on Research Evaluation in November to kick off a two-year working group to identify alternative evaluation procedures for the social sciences in Latin America. Stay tuned for updates on their work.
  10. A number of cross-sector initiatives are aiming to develop more positive visions of research culture, which is intimately connected to and impacted by research assessment practices. The Wellcome Trust, for instance, is looking to redefine ‘excellence’ by including consideration of how research is conducted. And the Royal Society is following up its conference on research culture by sharing examples of how different organizations are working to eliminate bullying and harassment; to improve equity, diversity, and inclusion; and to foster greater collaboration.

A lot can happen in a year. It is difficult to limit ourselves to just 10 advances. While the following did not fit in the top 10, we think they deserve recognition too:

  • The number of DORA translations is still growing. It can now be read in 21 languages!
  • Released in May, the Helsinki Initiative on Multilingualism in Scholarly Communication promotes language diversity in research assessment.
  • Published in April, the feature article “How Journals and Publishers Can Help to Reform Research Assessment” in Science Editor includes a call to action with 10 specific actions for journals and publishers.
  • In addition, there was a call to action in May for the expansion of indicators to describe a scholarly journal’s qualities.
  • More organizations are supporting DORA than ever before. We now have 14 members contributing financially.
  • The SCOPE process developed by the INORMS Research Evaluation Working Group was announced in December and can be used to help institutions create new policies and practices.

Redefining Impact in the Humanities and Social Sciences https://sfdora.org/2019/05/24/redefining-impact-in-the-humanities-and-social-sciences/ Fri, 24 May 2019 13:49:30 +0000

On Wednesday April 3, 2019, we hosted a #sfDORA community interview with Janet Halliwell to learn more about the 2017 report from the Federation of the Humanities and Social Sciences in Canada, Approaches to Assessing Impacts in the Humanities and Social Sciences. We also wanted to hear about a new project she is working on to improve assessment in the Humanities and Social Sciences (HSS) by creating a standardized vocabulary of terms related to research assessment.

Engaging colleagues in conversations about impact and assessment can be challenging. According to Halliwell, many in the HSS community feel that focusing too heavily on impact undermines the value of curiosity-driven research. And that is not the only problem. Impact can take on a number of different meanings, which makes measuring it difficult. But by providing a framework for assessment, the report encourages bottom-up discussions in the community.

To make these discussions manageable, the report bins impacts into five categories:

  1. Scholarship
  2. Capacity
  3. Economy
  4. Society and Culture
  5. Practice and Policy

While scholarship and capacity are internal to academe, the other categories focus on societal impacts. Taken together, the broad categories form an overall impact landscape. However, projects will not always have an impact in every part of that landscape. And this is a key point, because it highlights the need for a certain degree of flexibility in how we approach assessment. Impacts vary; there is no one-size-fits-all approach. But by separating impacts into categories, evaluators can select the impacts most appropriate for the project and choose the best way to assess them.

Timing isn’t everything

The report uses case studies to demonstrate how some of its principles can be applied for the purposes of research assessment. One case study examines how long it can take to recognize HSS impacts: research produced by a professor of folklore studies is used as evidence in a court case many years after he made the discovery. The professor never learns what role his work played in the case. Impact takes time. Even a 5-10 year period may be too short to capture the return on investment of a research project.

Halliwell believes that while timing and research recognition can’t be reconciled, they can be managed. “The most you can ask for curiosity-driven research is to think about and address possible uses and impacts at the start.” Managing the expectations of funders and institutions is vital. Researchers should not be penalized if they do not achieve something they never set out to do.

Creating Clarity

Now Halliwell is involved with a project that takes research assessment reform one step further by developing a standardized vocabulary for evaluation. The project is a joint initiative of the Federation for the Humanities and Social Sciences in Canada and CASRAI, which is the group that devised the Contributor Roles Taxonomy (CRediT). Halliwell quoted the abstract of a recent paper from Brian Belcher, Royal Roads University, and Markus Palenberg, Institute for Development Strategy, to emphasize why clarity is needed.

“The terms “outcome” and “impact” are ubiquitous in evaluation discourse. However, there are many competing definitions that lack clarity and consistency and sometimes represent fundamentally different meanings. This leads to profound confusion, undermines efforts to improve learning and accountability, and represents a challenge for the evaluation profession.”

So far the group has three draft definitions to share:

  • Output—new knowledge, technical or institutional advances, or other direct products or services produced by research;
  • Outcome—a change in knowledge, attitude, skills, and/or relationships that manifests a change in behavior that would result in whole or in part from research;
  • Impact—a change in flow or a change in state resulting in whole or in part from a chain of events that may have many other actors and stakeholders involved.

When we apply these definitions, outputs, outcomes, and impacts can be arranged as a sequence of events. Outputs are the first step and sit within a researcher’s sphere of control. Outputs then influence outcomes, which can lead to impacts. Halliwell sees an attenuation of influence, control, and attribution moving through the chain. One approach they are taking now is to see if they can measure and optimize impact along a given path.

DORA 6 years out: A global community 14,000 strong https://sfdora.org/2019/05/16/dora-6-years-out-a-global-community-14000-strong/ Thu, 16 May 2019 14:26:11 +0000

DORA turns 6 years old this week. Or, as we like to say, this year DORA reached 14,000—that’s how many people have signed DORA, and they come from more than 100 countries! Each signature represents an individual committed to improving research assessment in their community, in their corner of the world. And 1,300 organizations in more than 75 countries, in signing DORA, have publicly committed to improving their practices in research evaluation and to encouraging positive change in research culture.

Organizations from more than 75 countries have signed DORA. Here, size represents the relative number of DORA signatories from a given country.

In honor of DORA’s 6th anniversary, we would like to highlight some of the progress being made in diverse research communities around the globe:

In Latin America, where 36% of our organizational signatories come from, several far-reaching scientific bodies have had an active role in reforming research assessment. In September, Redalyc, a platform of 1,308 scholarly journals in Latin America, the Caribbean, Spain and Portugal, mandated that the journals it indexes sign and adhere to DORA. Founder and CEO Eduardo Aguado-López and Executive Director Arianna Becerril-García clarified Redalyc’s position in a blog post: “For Redalyc it is important to base the value of a journal on its articles and contents and not on its impact based on citations; it is crucial that research results are assessed by its [sic] own merits and not by where they are published.”

After this public commitment and mandate emerged, we observed a surge in participation in DORA from organizations and individuals across Latin America. In January of this year, Redalyc partnered with Latindex and the Latin American Council of Social Sciences (CLACSO) to write a joint letter reiterating their support for DORA and encouraging others to join.

Taking action on their public commitment to research assessment reform, representatives from CLACSO met with Conacyt, the Mexican National Council of Science and Technology, to plan a seminar to develop “guidelines that are in agreement with our social science disciplines and with our Latin American and Caribbean reality… and not only work with indicators that come, in many cases, from other scientific disciplines and other geographical realities,” said Karina Batthyány, Executive Secretary of CLACSO (translated from Spanish). We are looking forward to hearing about the outcomes of their ongoing discussions, and to learning about the indicators they identify as best suited to social sciences research and research culture in Latin America.

In France, the National Agency for Research (ANR) publicly encourages consideration of the quality and importance of all research outputs, including data sets, patents, and mediation actions. To improve its own selection practices, the ANR has established a training program for chairpersons involved in the Agency’s main call for project proposals. The ANR also pledges to follow DORA in establishing its methods of impact analysis and to look to the Leiden Manifesto for guidance in selecting suitable quantitative indicators.

The Universitat Oberta de Catalunya (UOC) signed DORA thanks to the efforts of a task force committed to seeing change. Reforming policies and practice takes time, and UOC is demonstrating its commitment to DORA by first revising its call for postdoctoral fellows. The task force is now looking towards other institutional changes.

In Asia, the Indian government’s Department of Biotechnology, in collaboration with the Wellcome Trust, established a cooperative funding mechanism to promote excellence in research across the country by providing fellowships to the brightest scholars. The Wellcome Trust/DBT India Alliance, as it is called, ultimately hopes to foster the development of the next generation of scientific leaders in India. In their evaluation processes, they focus on the profiles of the people they fund because, as Shahid Jameel, CEO of the Alliance, puts it, “people are the most important thing in an institution. Hiring the right people and assessing them well should really be our priority.” Evaluation procedures should be designed to help selection committees identify the individuals with the skills, talents, and expertise best suited to the role for which they are being assessed.

In the United States, the American Society for Cell Biology is in the process of implementing selection procedures for awardees that adhere to the spirit of DORA, according to Sandra Schmid. She also points out that many scientific journals, such as Molecular Biology of the Cell, Science, PLoS, eLife, and all American Society for Microbiology journals, “are distancing themselves from their Journal Impact Factors, by no longer displaying these metrics on their websites.”

With our international community of signers, supporters, and campaigners, we are eager to support each of our many diverse communities in implementing similar changes. What do you envision for your country or your discipline? What has your institution done to reward scientists for diverse contributions to research? Reach out to us on Twitter at @DORAssessment – we’d love to hear from you!

Helen Sitar is a Science Policy Programme Officer at EMBO, and Anna Hatch is the DORA Community Manager.

