Guest posts Archives | DORA
https://sfdora.org/category/guest/
San Francisco Declaration on Research Assessment (DORA)

A publication’s “what” should count more than its “where”: why we should waive journal titles.
https://sfdora.org/2024/11/10/a-publications-what-should-count-more-than-its-where-why-we-should-waive-journal-titles/
Mon, 11 Nov 2024 04:48:25 +0000

Adrian Barnett, Queensland University of Technology

The winter months can get cold in Belfast, the largest city in Northern Ireland where the Titanic was designed, built and launched in the early 1900s.

Not seriously cold of course, never sub-zero depths of cold that were the undoing of the Titanic on its maiden voyage, but a ‘nippy’ cold… a cold that comes with biting winds and rain that means only the bravest of souls venture outside in the winter months without the insurance policy of a good coat.

It’s a type of cold that made Belfast a big outlier for winter deaths in our analysis of multiple cities across the world, with a death rate much higher than cities with a similar climate, such as Gothenburg.

So it was with some delight that my mum, a proud Belfast girl, recently told me she was getting a new boiler installed courtesy of the city’s council – a small win that means this winter should be warmer than the last one.

“Wow, how did that happen?” I asked, as she shared the good news in one of our regular calls.

“I don’t know,” she said, “but it’s nice to be getting something for a change”.

Thinking about how my parents were getting a free boiler, I wondered if our analysis of winter deaths had played a part.

I remember hearing that our results caused some concern in Northern Ireland and led to government reports examining the issue. I can’t know for certain, but it seems that these investigations led to policies to improve housing in Northern Ireland, and to a new boiler for my folks.

These are the happy moments that highlight how your research can make a positive difference. We did the work, published it, and it was read and then acted on, in a long chain of events that took many years.

How crucial was the journal? 

The publication in the Journal of Epidemiology and Community Health was a vital link in the chain, as having a peer reviewed and easily accessible version of our work was necessary for its uptake.

But would the paper have had the same impact had it been published in another journal? I think it would.

A “better” journal might have attracted more media attention, and our findings might have been picked up faster by policy makers. Conversely, a “lesser” journal might have slowed their uptake, although the paper would still be findable by anyone doing a thorough review.

Where versus What

The journal is important, but it is a means to an end rather than the end itself. Alas, for many researchers the journal is now the end game, with an enormous focus on “where” to publish rather than “what”, where the “where” means the highest journal impact factor.

This mania for impact factors is driven by fellowship and hiring committees who often rely on the journal impact factor as a heuristic of research quality. These committees use shortcuts because researchers are now publishing so much that there’s no time to read every candidate’s work. Hence the “where” stands in for the “what”.

But policy makers care about the “what”. For example, in our recent study of journal impact factors a researcher commented that policy makers “Don’t care about journal impact factor, they only want you to give them a half-page summary.” For real impact, researchers should publish where their target audience is most likely to read it.

Goodbye to some of that

I am now so disillusioned with the research world’s focus on journal prestige that I recently removed all the journal names from my CV. I want any interest in my work to be based on the paper’s title, not on any prestige conferred by the “top” journals.

I’ve been called a hypocrite for publishing this change in Nature, which is a key target for “where” papers. But where better to reach those who are focused on the “where”?

Removing journal titles from publication lists is a simple step that could be used in all forms of research assessment.

Funders and universities could ask for CVs without journal titles in their fellowship and job applications. A determined committee member could still find the impact factors, but it should eventually sink in that it is what people have achieved that matters.

Google Scholar could remove journal titles from users’ profiles. All papers on Scholar are hyperlinked, so readers don’t need to know the journal if they want to read it. Academia is clinging to an antiquated and fiddly system of writing out the full reference with the journal title, volume, issue, etc.

That warm feeling

I enjoyed explaining to my parents how my research potentially helped them get a new boiler, especially as it’s usually hard to explain what I do.

I didn’t get into research for personal benefit. Like most researchers, I started out idealistic about how data and evidence can improve the world. Unfortunately, many researchers have become sidetracked by self-serving competitions. They need to remember why they started their careers and forget the journal rankings they were happily ignorant of when those careers began.

Adrian Barnett is a statistician at Queensland University of Technology.

Narrative CVs: How do they change evaluation practices in peer review for grant funding?
https://sfdora.org/2024/11/05/narrative-cvs-how-do-they-change-evaluation-practices-in-peer-review-for-grant-funding/
Wed, 06 Nov 2024 04:05:30 +0000

Judit Varga & Wolfgang Kaltenbrunner, Research on Research Institute (RoRI); Centre for Science & Technology Studies, Leiden University.

Contact: w.kaltenbrunner@cwts.leidenuniv.nl

This blog post reports some preliminary findings from a project designed to investigate the evaluative use of narrative CVs. When funding organizations convene review panels to assess grant applications, reviewers need to agree on what constitutes “quality” and how to compare different applicants and their submissions. Historically, reviewers have tended to facilitate this process by making use of quantitative metrics, such as the number of publications or the prestige of the journals and institutions associated with an applicant. These indicators, theorized as “judgment devices” by researchers like Musselin (2009) and Hammarfelt & Rushforth (2017), help reduce the complexity of comparing candidates and their suitability to carry out projects by breaking the decision down into a simpler, quantitative comparison.

However, there is growing concern that relying too heavily on these traditional markers might be doing more harm than good. By focusing on numbers and proxies for academic prestige, reviewers may be losing sight of an applicant’s achievements and quality of work in a broader sense. Narrative CVs are designed to encourage reviewers to consider the achievements and competence of a candidate in suitable detail and in the context of their proposed projects. At the same time, very little is known about the practical effects and real-world use of narrative CVs by reviewers in funding panels.

To remedy this, researchers at the Research on Research Institute (RoRI) have co-designed a research project in collaboration with the Dutch Research Council (NWO), the Swiss National Science Foundation (SNSF) and the Volkswagen Foundation. The project is currently ongoing and draws mainly on participant observation in consecutive review panel meetings as well as interviews with reviewers. The quotes presented in this blogpost document discussions within peer review panel meetings at NWO and SNSF.

Multiplying forms of excellence

Our findings so far suggest that the introduction of narrative CVs can trigger debates about the nature of scientific excellence in review panels. We encountered multiple moments where the use of the narrative CV format prompted reviewers to gradually broaden the range of achievements they valued in applicants, partly depending on the outlook of the respective project proposals. In one representative situation, a reviewer in the SNSF sciences panel expressed their surprise at the fact that the applicant had foregrounded the collaborative nature of their work in the CV, instead of focusing on publications. The reviewer initially scored the applicant lower as a result:

“One surprising thing about achievements [the narrative aspects of the SNSF CV]: there are significant ones, from the postdoc there are publications, but somehow [they are] not described as an achievement, I was wondering why (…) Maybe the candidate thought it was better to emphasize other achievements, like the collaborative nature of work, but anyway this is why I gave [a lower score].”

Later, following a discussion among panelists about how to interpret the proposal and the submitted narrative CV, another reviewer began to explicitly characterize the applicant’s profile as a ‘team player’. A subtle but important shift appeared to have taken place in the evaluative reasoning: Rather than assessing the applicant against a singular default ideal of a scientist whose standing can be inferred from the quantity of high-impact publications, a reviewer introduced a frame of reference where collaborative qualities and the ability to facilitate joint work in a laboratory context were legitimate criteria. The question then was, is this the right profile for the proposed grant and the research project?

“In conclusion, a strong candidate, I was a bit too harsh, especially on the project […] the profile is a bit ambiguous to me, but I’m happy to raise [the points] […] I think the candidate is a team player in the lab, you can interpret it positively or negatively but that’s the profile.”

This situation is a particularly clear example of a dynamic that we recurrently observed throughout all of our case studies, namely a gradual pluralization of the notion of excellence over the course of the panel meetings.

Resistance

The above example illustrates evaluative learning in the form of rethinking publication-centric assessment criteria in light of narrative CVs and related guidelines. Yet on a number of occasions, some reviewers explicitly doubled down on those more ‘traditional’ criteria. For example, NWO instructed reviewers not to name the specific journals in which an applicant has published, so that they would not infer the quality of a publication from the perceived prestige of its venue. In line with this, in the first review round of the NWO social sciences panel, reviewers did not mention any journals by name. Yet in the second round, one reviewer evoked journal prestige twice when assessing (two different) applicants.

“As for the applicant, I can’t judge the quality of publications, but the applicant published in Nature Genetics, maybe someone can tell me if it’s good but “everything with Nature sounds very good to me” [laughs a bit], I was very impressed with the candidate.”

When discussing another applicant, the same reviewer again made a reference to the applicant’s publications in prestigious journals as a proxy for their quality:

“Quality candidate. 5 publications in Nature Scientific Reports and other prestigious journals. […]”

This comment sparked some confusion, as reviewers failed to locate the publication mentioned. After a while, the NWO program officer who helped chair the panel meeting cautioned that the perceived prestige of the publication venue should not be taken into account as a factor in the evaluation in the first place. Yet rather than giving up, the reviewer noted this comment with a disapproving gesture and continued the effort to locate the publication in question.

We propose that in order to make sense of such situations and devise practical strategies for handling them in future panel meetings, it is important to disentangle the different motivations reviewers might have for doubling down on publication-centric evaluation criteria, even when they are explicitly cautioned not to use them. Sometimes, they might do so simply because they feel it makes sense in the context of a given application, for example for projects aiming primarily for traditional academic impact. Yet on other occasions, resistance to the narrative format might be better understood as a response to what reviewers perceive as an unjustified intervention by funders and other reform-minded actors. After all, narrative CV formats can be seen not simply as a well-intentioned attempt to improve the fairness of evaluative decision-making in peer review, but also as a threat to the autonomy of reviewers and academic communities to define notions of quality.

Story-telling skills as a new bias?

An important concern for many observers appears to be the emphasis narrative CV formats place on writing skills and the ability or willingness to present oneself in the best possible light. These cultural competences and inclinations may be unequally distributed among different groups of applicants, for example to the disadvantage of applicants from working-class backgrounds, female applicants, or applicants from different cultural backgrounds. Yet discussions about the bias this may create typically focus solely on the applicant’s side, and they implicitly presuppose that a highly positive self-representation is always a good thing. We instead found that reviewers may react negatively when they feel that applicants exaggerate their achievements, and during panel meetings they flagged cases where they thought this had happened.

For example, in the social sciences panel of SNSF, a reviewer felt that an applicant had grossly overstated their achievements:

“[The Scientific_Chair reading the evaluation of a Reviewer]: [The Reviewer] had problems with the tone of the CV as well as the proposal, [they contained] self aggrandising statements about having invented a new field.”

In another situation, another reviewer explicitly admitted “to be turned off” by an applicant using similarly hyperbolic language in their narrative CV, noting that it was “not grounded in science.”

Conversely, a situation we observed in the natural sciences panel shows that reviewers do appreciate enthusiastic narration, but the fundamental requirement is that the narratives be credible:

“Reviewer: (…) also the description of academic career is credible and enthusiastic.”

In sum, whilst narrative CVs might require applicants to write more than traditional CVs, this does not mean that reviewers will appreciate academic self-aggrandizement or inflationary rhetoric. Instead, narrative elements appear to place the emphasis on a new form of credibility in the relation between biographical self-representation and a peer’s achievements, which we suggest requires continued study.

Conclusions

This blog post documents in equal measure the successes and challenges of narrative CV formats, and also the demand for more research on their practical use. It is clear even on the basis of our preliminary observations that narrative CVs do on many occasions stimulate productive reflections on the meaning of excellence in specific contexts, thus multiplying its meaning and perhaps contributing to challenging the terminology of excellence. We also feel that attention to nuance is crucial for understanding resistance to narrative CVs. Some forms of resistance might well provide input for the further development of narrative CV formats. Where resistance is more related to a (perceived) struggle over reviewer autonomy, a different type of response will be required – one that addresses questions of power relations between scientists, institutions, and funders in a more explicit way. Our finding that reviewers tend to react negatively to self-aggrandizing language in narrative CVs in turn cautions us that evaluation reform is a moving target that can only be studied as it unfolds.

As should have become clear, narrative CVs are not an easy fix for peer review. They instead prompt reviewers and the institutions that introduce them to ask afresh fundamental questions about the fairness and fit of quality criteria in peer review. While not ‘efficient’ in a practical sense, we feel that this disruption to established routines of evaluative problem-solving is a crucial benefit in its own right.

The academic community can benefit from narrative CVs, particularly if the questions and complexities they raise are embraced as opportunities for discussion. For example, the findings presented in this blog post signal opportunities to further discuss notions of excellence, values in academic culture and governance, training about narrative CVs for applicants, and CV design in light of the potential biases the new format may introduce. However, this process requires careful management, drawing on curious and innovative ideas for academic futures. In the absence of this, given the time-constrained nature of review meetings and academic life, it can be all too easy to glide over the opportunities and challenges afforded by narrative CVs.

References

Hammarfelt, B. & Rushforth, A.D. (2017). Indicators as judgment devices: An empirical study of citizen bibliometrics in research evaluation. Research Evaluation 26(3): 169–180.

Musselin, C. (2009). The Market for Academics. New York: Routledge.

DORA’s 10th Anniversary in Bilbao: the starting point of a new journey
https://sfdora.org/2023/07/17/doras-10th-anniversary-in-bilbao-the-starting-point-of-a-new-journey/
Mon, 17 Jul 2023 14:04:40 +0000


A DORAat10 Local Event Report

In May 2023, DORA celebrated its 10th Anniversary with two plenary sessions and a decentralized weeklong program of local events organized by community members from around the world. Event organizers were given the option to write brief reports on their events summarizing key takeaways and recommendations.

By SOMMa Open Science Working Group

  • Eva Méndez, representing CoARA, and Pilar Paneque, director of ANECA, participated in the event
  • The round table discussion “The reform of science evaluation and its impact on attracting and retaining talent” was led by several experts in the field


Representatives of the 65 entities that form part of the Alliance of Severo Ochoa Centres and María de Maeztu Units of Excellence (SOMMa) met in Bilbao at an event organised by the Basque Center for Applied Mathematics (BCAM) and the Catalan Institute of Nanoscience and Nanotechnology (ICN2) to analyse the latest trends in the evaluation of scientific research.

To celebrate the 10th anniversary of the San Francisco Declaration on Research Assessment (DORA), the SOMMa Alliance brought its community of open science professionals together with experts on new policies in research evaluation to discuss their impact on research activity and the promotion of research careers.

SOMMa actively participated in the process of drafting the first National Open Science Strategy (ENCA) for the period between 2023 and 2027, drawn up by the Ministry of Science and Innovation and the Ministry of Universities.

During the meeting, the SOMMa open science working group and its members assessed the possibility of SOMMa becoming a member of the Coalition for Advancing Research Assessment (CoARA).

Eva Méndez, representing the Coalition for Advancing Research Assessment (CoARA), presented the agreement, which provides a basis for promoting the reforms necessary to ensure that research assessment is based on qualitative criteria.

Pilar Paneque, director of the National Agency for Quality Assessment and Accreditation (ANECA) shared with the audience the role of quality agencies in the reform of research assessment.

This was followed by the round table “The reform of science evaluation and its impact on attracting and retaining talent” with the participation of Eva Méndez, representing CoARA; Ismael Ràfols from the Centre for Science and Technology Studies (CWTS), Leiden University; Pilar Rico, head of the Open Access Unit at FECYT; and Fernando Orejas, an expert in information technologies. During the round table, the Open Science group considered that the following commitments would apply to SOMMa:

  • Recognise the diversity of contributions to, and careers in, research in accordance with the needs and nature of the research
  • Base research assessment primarily on qualitative evaluation for which peer review is central, supported by responsible use of quantitative indicators
  • Abandon inappropriate uses in research assessment of journal- and publication-based metrics, in particular, inappropriate uses of Journal Impact Factor (JIF) and h-index
  • Commit resources to reform research assessment as is needed to achieve the organisational changes committed to
  • Raise awareness of research assessment reform and provide transparent communication, guidance, and training on assessment criteria and processes as well as their use
  • Exchange practices and experiences to enable mutual learning within and beyond the Coalition
  • Communicate progress made on adherence to the Principles and implementation of the Commitments

At the conclusion of the event, the working group recommended that SOMMa join CoARA. This will be the starting point of a new 5-year journey.

Panel “Potencialidades de los Repositorios de Acceso Abierto para la evaluación inclusiva, diversa y equitativa de la investigación en ALC”
https://sfdora.org/2023/07/12/panel-potencialidades-de-los-repositorios-de-acceso-abierto-para-la-evaluacion-inclusiva-diversa-y-equitativa-de-la-investigacion-en-alc/
Wed, 12 Jul 2023 13:00:00 +0000


A DORAat10 Local Event Report

In May 2023, DORA celebrated its 10th Anniversary with two plenary sessions and a decentralized weeklong program of local events organized by community members from around the world. Event organizers were given the option to write brief reports on their events summarizing key takeaways and recommendations.
Scroll down for the English translation of this report: Potentialities of Open Access Repositories for inclusive, diverse and equitable research assessment in LAC

By Ana Luna Gonzalez

El panel “Potencialidades de los Repositorios de Acceso Abierto para la evaluación inclusiva, diversa y equitativa de la investigación en ALC” se centró en las formas en que los repositorios de acceso abierto pueden contribuir con procesos de evaluación de la investigación más inclusivos, diversos y equitativos en Latinoamérica y el Caribe. 

En el marco del 10 Aniversario de DORA y como parte de la Reunión Anual del COAR 2023, organizada conjuntamente por el Consejo Nacional de Rectores (CONARE) y LA Referencia, el evento fue apoyado por el CLACSO-FOLEC. Con la participación de especialistas de distintas instituciones involucradas en la temática, se presentaron los avances y desafíos pendientes de los repositorios de acceso abierto y su potencial contribución con una evaluación más responsable en la región. La moderación estuvo a cargo de Kathleen Shearer, directora ejecutiva de COAR. 

En primer lugar, Andrea Marin Campos (Universidad de Costa Rica) disertó sobre el rol de los repositorios en la evaluación de la investigación y sus alcances. Realizó una historización de los repositorios en Costa Rica e identificó dos objetivos principales: el primero, relacionado al Acceso Abierto y los beneficios para el desarrollo de la ciencia y la tecnología en el país y el segundo, vinculado a robustecer la evaluación de la investigación en diversas escalas. A su vez, planteó distintos interrogantes: ¿cómo conocer los usos que hacen los diversos actores de la sociedad de la producción científica alojada en los repositorios? ¿Cómo evaluar la calidad de otros formatos de producción científica que no son artículos? ¿Cómo implementar la evaluación de diversos contenidos y formatos de producción científica académica?

Por su parte, Laura Rovelli (CLACSO FOLEC) destacó que desde CLACSO-FOLEC existe un consenso creciente acerca de la necesidad de incorporar nuevas prácticas de evaluación que incentiven el acceso abierto en revistas diamante y en repositorios, pues no excluyen autores por razones económicas, y permiten concentrar la evaluación de pares “más en la calidad de la investigación que en la revista donde se publica”, siguiendo uno de los principios de la pionera Declaración de DORA sobre la Evaluación de la Investigación, en esta semana en la que estamos celebrando su 10° aniversario.

Marin Dacos (Francia) planteó algunos desafíos pendientes para los repositorios. En primer lugar, señaló la necesidad de visibilizar la diversidad de formatos de la producción científica de manera que sea clara para diferentes actores sociales, no solo para la academia. Asimismo, llamó a fortalecer los estándares de calidad de la evaluación para editores y construirlos para los repositorios. Además, sugirió que hay que aprovechar el momento político en el que actores de fuerte peso se encuentran interesados en la implementación de la Ciencia Abierta. 

Por su parte, Rodolfo Barrere (RICYT/Observatorio CTS OEI) exploró las complejidades en torno a la construcción de indicadores de evaluación de la producción científica que sean más representativos de la producción existente en la región y presentó un conjunto de desafíos para poder fortalecer la producción existente en los repositorios. 

Por último, Lautaro Matas (COAR Notify, Executive and Technical Director, La Referencia) identificó algunos problemas existentes de los repositorios en relación con la cobertura, la duplicación y la calidad de los metadatos. Además, presentó el COAR Notify Project, en el que se asiste a los socios a adoptar un modelo común e interoperable de apoyo a las revisiones y aprobaciones de recursos distribuidos en repositorios, preprints y archivos.

Las grabaciones de las intervenciones se encuentran disponibles aquí, en español e inglés.

Potentialities of Open Access Repositories for inclusive, diverse and equitable research assessment in LAC

By Ana Luna Gonzalez

The panel “Potentialities of Open Access Repositories for inclusive, diverse and equitable research assessment in LAC” focused on the ways in which open access repositories can contribute to more inclusive, diverse and equitable research assessment processes in Latin America and the Caribbean (LAC).

In the framework of DORA’s 10th Anniversary and as part of the COAR 2023 Annual Meeting, jointly organised by the National Council of Rectors (CONARE) and LA Referencia, the event was supported by CLACSO-FOLEC. Specialists from different institutions involved in the subject presented the advances and pending challenges of open access repositories and their potential contribution to a more responsible research assessment in the region. Kathleen Shearer, executive director of COAR, moderated the event.

First, Andrea Marin Campos (University of Costa Rica) spoke about the role of repositories in research assessment and their scope. She gave an overview of repositories in Costa Rica and identified two main objectives: the first, related to Open Access and the benefits for the development of science and technology in the country, and the second, linked to strengthening research assessment at different scales. At the same time, she raised several questions: how do different actors in society use the scientific production hosted in repositories? How can the quality of scientific outputs that are not articles be evaluated? How can the evaluation of diverse contents and formats of academic scientific production be implemented?

Laura Rovelli (CLACSO-FOLEC) stressed that there is a growing consensus at CLACSO-FOLEC on the need to incorporate new evaluation practices that encourage open access in diamond journals and repositories, as they do not exclude authors for economic reasons and allow peer review to focus “more on the quality of the research than on the journal where it is published.” This follows one of the principles of the pioneering DORA Declaration on Research Assessment, in this week in which we are celebrating its 10th anniversary.

Marin Dacos (France) raised some pending challenges for repositories. Firstly, he pointed out the need to make the diversity of formats of scientific production visible so that they are clear for different social actors, not only for academia. He also called for strengthening the quality standards of evaluation for publishers and building them for repositories. In addition, he suggested taking advantage of the political moment in which powerful actors are interested in the implementation of Open Science.

Rodolfo Barrere (RICYT/Observatorio CTS OEI) explored the complexities surrounding the construction of scientific production evaluation indicators that are more representative of the existing production in the region and presented a set of challenges to strengthen the existing production in repositories.

Finally, Lautaro Matas (COAR Notify, Executive and Technical Director, La Referencia) identified some existing problems of repositories in relation to coverage, duplication and metadata quality. In addition, he presented the COAR Notify Project, which assists partners in adopting a common, interoperable model for supporting reviews and approvals of distributed resources in repositories, preprints and archives.

Recordings of the speeches are available here, in English and Spanish.

The post Panel “Potencialidades de los Repositorios de Acceso Abierto para la evaluación inclusiva, diversa y equitativa de la investigación en ALC” appeared first on DORA.

Evaluating what matters with DORA the explorer | https://sfdora.org/2023/07/10/evaluating-what-matters-with-dora-the-explorer/ | Mon, 10 Jul 2023

The post Evaluating what matters with DORA the explorer appeared first on DORA.


A DORAat10 Local Event Report

In May 2023, DORA celebrated its 10th Anniversary with two plenary sessions and a decentralized weeklong program of local events organized by community members from around the world. Event organizers were given the option to write brief reports on their events that summarize key takeaways and recommendations.

By Holly Limbert

This year, here at the University of Derby, we joined the global research community in celebrating the 10th birthday of the Declaration on Research Assessment (DORA). It was important for us to mark this event and to demonstrate our continued commitment to responsible research assessment and, more broadly, to Open Research. Our Repository and Open Access Librarian, Holly Limbert of the Library’s Research Liaison Team, invited Professor Stephen Curry, Chair of the DORA Steering Committee, and Professor Cameron Neylon, co-lead of the Curtin Open Knowledge Initiative, to deliver a talk that considered the what, why and how of research assessment in the academy, how far we have come as a community, and where we might be heading in the future. Given both Stephen’s and Cameron’s commitment to questioning current and established practices in research evaluation, assessment and opening access to knowledge, the university was incredibly fortunate to secure two leading experts and inspirational speakers!

In his talk, ‘Ten years of DORA – evaluating what matters!‘ Professor Curry discussed the origins of the declaration and the organisation behind it. Whilst he acknowledged that change is evident across the research landscape, he stressed that DORA is still very much required to drive forward reform in research evaluation and assessment. A particularly pertinent aspect of Stephen’s talk addressed the pressures which researchers face relating to career advancement, publishing expectations and real-world impact in society. One example came from a Ted Talk by Thomas Insel, a leading neuroscientist whose research focuses on mental health. In his talk, Insel makes explicit reference to the number of papers published and money spent during his years at the National Institute of Mental Health (NIMH) versus how much impact was felt by those suffering from mental health problems during this time (Insel, 2013). This speaks volumes about the importance of questioning established practices and processes in the academy and advocating for change! Stephen also advocated for the Narrative CV, which aims to highlight a wide range of skills and experiences, recognising the vast contributions that researchers make to the research ecosystem, not just written publications.

Professor Neylon’s talk, ‘The problem of evaluation: DORA the explorer at 10 and our paths into the future’, examined the role of evaluation in research across departments and geographies, and some of the major challenges we face collectively in how research evaluation is conceived, particularly in relation to why the academy evaluates research at all. One of the key takeaways centered on rethinking what “excellence” means in research and how it is interpreted and understood. It was fascinating to hear that, when presented with the question ‘What is excellence?’, researchers and institutions find it exceedingly difficult to provide a concrete definition. As co-lead of the Curtin Open Knowledge Initiative, which seeks to ‘…change the stories that universities tell about themselves, placing open knowledge at the heart of that narrative’ (COKI, 2023), Cameron and colleagues at Curtin University, Australia, are passionate about using open data sources to help universities become more open, transparent and accountable in their activities relating to scholarly communications and equality, diversity and inclusion practices.

After the talks, the floor was opened to questions and a lively discussion ensued. Questions were raised about priority steps institutions can take to address certain established and potentially harmful ‘norms’, and how these can be phased out, particularly when it comes to definitions and understandings of quality and prestige. The future research assessment exercise in the United Kingdom was also a topic of discussion, particularly around how metrics and understandings of quality might play a role in the future. There was also some debate regarding citations and readership, and what each may indicate in different contexts and disciplines. A follow-up question, which Stephen and Cameron were happy to respond to after the event, related to how research assessments can cater for the humanities, especially as much current practice is heavily focused on STEM subject areas.

Overall, the event was an enormous success with over 160 registrants from across the globe and over 70 attendees on the day! This demonstrates the will and need for researchers and those in research support roles to transform the ways in which we assess and evaluate research. We need to challenge the systems in place which put emphasis on certain incentives and rewards in the academy. Whilst DORA has done much to change the way we think, feel and act when it comes to the value we give to certain measures of quality and what matters, there is still work to be done! The only way to achieve an open, inclusive, and equal culture of research is to continue to collaborate and help to establish new norms which celebrate and recognise the varied and wide-ranging contributions made by researchers the world over.

What can open research values bring to research assessment reform? | https://sfdora.org/2023/07/07/what-can-open-research-values-bring-to-research-assessment-reform/ | Fri, 07 Jul 2023

The post What can open research values bring to research assessment reform? appeared first on DORA.


A DORAat10 Local Event Report

In May 2023, DORA celebrated its 10th Anniversary with two plenary sessions and a decentralized weeklong program of local events organized by community members from around the world. Event organizers were given the option to write brief reports on their events that summarize key takeaways and recommendations.

By Rebecca Hill, Simon Hettrick and David Moher

How do we assess research today and what has changed since the DORA declaration? And what can open research practices contribute to future research assessment systems? To explore these questions in more depth and to celebrate the 10th anniversaries of both DORA and the open research publisher F1000, F1000 hosted a webinar in May 2023, asking ‘What can open research values bring to research assessment reform?’

The panel discussion included David Moher (University of Ottawa/Hong Kong Principles); Simon Hettrick (Software Sustainability Institute/the Hidden Ref/University of Southampton); and Becky Hill (F1000).

What has changed since the DORA declaration?

David: For most researchers, career advancement is based on the currency of the day. Evidence indicates the global currency is usually the number of publications generated in a finite period. This reward scheme typically benefits certain disciplines and ‘public facing’ researchers. It excludes the many talented personnel working to bring the research to fruition. Along with the number of publications, assessors are often interested in the impact factor of the journal the research report is published in, as well as the total dollar figure associated with any awards. Clearly this tends to favor expensive research, often conducted in biomedicine. This reward ecosystem has not materially changed in 50 years.

Re-imagining incentives to focus on more inclusive, transparent, and open ways of working is important. The 2020 Hong Kong Principles are one such effort, combining many aspects of community and research integrity. The principles intend to guide organisations’ research assessment, promotion and tenure practices to focus on incentivizing and rewarding researchers who incorporate open scholarship practices into their research. This aligns with the 2022 US Office of Science and Technology Policy recommendation on data sharing and public access to research. Implementing these frameworks as part of a researcher’s assessment makes sense and is a move towards using evidence as part of the assessment process.

Alongside what DORA is doing, it will be important to watch the development of the Coalition for Advancing Research Assessment (CoARA), an initiative committed to reimagining research evaluation and how we incentivize and reward researchers for the future.

Becky: The last five years have seen the introduction of major policy imperatives aimed at improving how we do and deliver research. These are often underpinned by a drive to more open and collaborative ways of working. The publication of the 2021 UNESCO recommendation on open science was a major landmark for attitudes to research practices and behaviors, endorsed by 193 nations. It highlights ‘Open Scientific Knowledge’ as integral to an effective research system – publishers can play a pivotal role in realising this.

What can publishers do to support responsible research assessment?

Becky: As David notes, the focus on publications as the currency for researchers to demonstrate their value in research assessment systems is widely acknowledged as problematic. DORA sets out clear guidance for publishers to help improve how research is evaluated, which includes practicing more responsible use of research metrics, but also highlights the integral role that publishers can play in bringing greater transparency to the research dissemination process.

Publishers are enablers of open scientific knowledge through a combination of the services they can provide, be that providing open access to research outputs, facilitating data sharing, creating routes for the publication of a diversity of research outputs, or ensuring research integrity. Publishers can also give greater visibility to the myriad specific and valuable contributions to research output through, for example, adoption of the Contributor Role Taxonomy (CRediT), or through more transparent and open peer review, bringing the often unseen role of reviewers to the fore.

What are great examples of innovation to help shape research assessment reform?

Simon: Research relies on the efforts of many non-traditional academic roles, such as technicians, data stewards, and research software engineers, and the value of such roles is largely ‘hidden’ in traditional research assessment systems. The UK’s Research Excellence Framework (REF) is the national assessment framework for universities to present the impact of their research, the outcome of which determines the distribution of billions of pounds of funding. Despite a broad framework encouraging universities to present the impact of research across everything from musical compositions to software, universities place almost all their focus on published outputs. In the last REF, only 2.4% of outputs presented for assessment were not related to publications.

A focus on publications is problematic because research articles alone rarely describe all of the techniques, methods and technologies involved, nor are all the people who contributed to the research named in publications. The Hidden REF, a community-led initiative, was founded on the principle that if we do not recognise and provide incentives for everyone involved in the conduct of research, we will limit our ability to conduct research now and in the future. The Hidden REF started with a UK national competition in 2021, which recognised the vital work of a number of “Hidden Roles” without which much research would be impossible. Partially due to lobbying by the Hidden REF, the next UK research assessment exercise will now recognise all roles – not just traditional academic ones.

What needs to happen now?

To drive reform across the research process – in both research culture and the adoption of open practices – we need to incentivise, recognise, and reward the behaviors we want to see, whether that’s sharing data openly or celebrating ‘hidden roles’ in research.

Each stakeholder has a role to play to drive change and adoption, but collaboration and partnerships across the research system are essential to deliver the promise of open research and reform in research evaluation. The question remains as to who needs to drive forward this change and how to ensure all parts of the system are in alignment – but DORA has certainly laid important foundations.

Want to learn more about what open research practices can contribute to research assessment reform? Watch the recording of the webinar.

Why I signed DORA | https://sfdora.org/2023/07/05/why-i-signed-dora/ | Wed, 05 Jul 2023

The post Why I signed DORA appeared first on DORA.


A DORAat10 Local Event Report

In May 2023, DORA celebrated its 10th Anniversary with two plenary sessions and a decentralized weeklong program of local events organized by community members from around the world. Event organizers were given the option to write brief reports on their events that summarize key takeaways and recommendations.

By Liam Bullingham and Nicola Wylie

Recent discussion of DORA mostly concerns the 2,800 organisations (universities, funders or publishers) that have signed, not the 20,000 individuals. See positions like ‘Why have Elsevier committed to Leiden but not DORA?’, ‘We as a funder expect you to assess research following the DORA principles’, and ‘How should universities that sign be held to account?’

It’s understandable – organisations have much more agency to bring change and carry greater responsibility than individuals. But this means the community voice is lost, even though DORA was started by, and grew from, its individual signatories.

‘Why I Signed DORA’

We encouraged panellists and attendees to sideline their employer’s position and focus on their own morals, principles, or beliefs.

Our researcher panel drew from different disciplines and career stages. It comprised:

  • Dr Rob Farrow, Senior Research Fellow in the Institute of Educational Technology, Open University, UK

…and all from Edge Hill University, UK:

‘What does DORA mean to you?’

Rob questioned the binary nature of signing, and whether we sign up to everything in the Declaration or just have a particular focus in mind when we do.

Michel noted he hadn’t signed DORA until a few days ago, but was surprised to see how embedded its arguments already are in the wider community – many DORA principles are commonly held across psychologists in 2023. But signing helps us ‘put our money where our mouth is’.

Costas recently signed too, simply because he didn’t know individuals could do so. Like Michel, he already held many of these views. He criticised obsessions with ‘top’ journals, or with whether titles are indexed in Scopus. Poor or questionable research can be indexed too!

Rebecca works in policing research. For her, many key journals are community-run and focus on local agendas, so they can be overshadowed by big international journal titles or brands. DORA provides a framework to challenge this and move towards equal respect and recognition across disciplines. Signing DORA as an individual puts weight behind influencing wider change.

Linamaría considered the range of different outputs you can have as a researcher, and the need for these to be recognised alongside journal articles. This is sometimes called ‘bibliodiversity’, and variety in scientific outputs is highlighted in DORA. Space for this diversity is really important for her as an early career researcher (ECR), as she hopes to gain recognition for the wide range of research she is doing.

‘What do you think DORA has achieved?’

Rob mentioned we have a better REF because of DORA. He noted that the way REF assesses ‘quality’ in research outputs deliberately de-couples the venue of a journal article from any assumptions about its quality. Rebecca is reassured by the positive influence on research funders. Funders now consider a wide range of factors when assessing research quality, with less emphasis on publication lists. Instead, more inclusive methods such as Narrative CVs are being drawn on. This is something that Linamaría is pleased to see too – she believes it will empower ECRs.

Audience feedback

Edge Hill PhD researcher Elizabeth Devine ran polls for us during the session to help us understand where DORA stands in 2023. We share the poll results and comments below.

During this poll, two people echoed Costas’ remarks that they didn’t know they could personally sign. Another supports DORA personally without signing because their university hasn’t supported DORA. Someone else never thought to sign because they aren’t an academic. In response, Liam offered his opinion about why research supporters are a part of scholarly communications and should feel they can sign too.

In an attempt to identify areas for action, we also asked participants what could be done to influence organisations to follow the DORA principles more. Some themes emerging from these responses included:

  • funders can influence practices by including training or requirements
  • accountability to DORA responsibilities and awards/recognition
  • prestige attached to high impact journals
  • improve the agency of individual DORA signatories
  • using assessment methods in a wide-ranging, better-informed way over singular metrics
  • improve researcher training
  • better publisher practices and repository infrastructure

We continued by asking participants about opportunities and challenges in research assessment, with response themes emerging around:

  • Responsible metrics and subjectivity weighed against the need for ‘simple’ measures
  • Support from research leaders
  • Re-use of research as impact
  • Open research and bibliodiversity
  • Opportunities from generative artificial intelligence
  • Collaboration between organisations
  • Fairness for ECRs, different disciplines, research in languages other than English

Prompted by the audience, we discussed CoARA and noted that, unlike DORA, it does not accept individual signatories, which is a challenge as new initiatives flow from the DORA legacy.

A view from Lancaster

Unable to join us for the event, a senior researcher from Lancaster University offered:

“As a woman in science, even a well-cited one, I have been told by more than one senior (male) colleague that I publish some ‘rather good work’ in some ‘lesser’ journals. I have been asked to consider the impact factors of journals by more than one (male) Head of Department. And there remains a bias on Journal Impact Factor (JIF) in the internal evaluation of publications to be selected for REF, despite the feedback from panel members that they do not take JIF into account (and I believe them).

I truly do believe that we, as scientists, should interrogate the evidence – the published work – and make our own independent judgements on its quality or academic impact. And if we wish to make a broader societal impact, we must be supported to publish non-REFable book chapters and to publish in those journals with lower JIF, but which are more likely to be read by users of research. I would hope that Lancaster’s emphasis on engagement makes us well positioned to be leaders on issues such as those raised by DORA.”

The DORA Movement in Canada: Working Together to Advance Assessment of Research Excellence | https://sfdora.org/2023/07/03/the-dora-movement-in-canada-working-together-to-advance-assessment-of-research-excellence/ | Mon, 03 Jul 2023

The post The DORA Movement in Canada: Working Together to Advance Assessment of Research Excellence appeared first on DORA.


A DORAat10 Local Event Report

In May 2023, DORA celebrated its 10th Anniversary with two plenary sessions and a decentralized weeklong program of local events organized by community members from around the world. Event organizers were given the option to write brief reports on their events that summarize key takeaways and recommendations.

By Stephanie Warner, PhD, Manager, Knowledge Engagement, University of Calgary with support from representatives of NSERC, SSHRC, CIHR, Genome Canada and CFI

To remove barriers to research funding, hiring, tenure and promotion that many researchers in Canada continue to experience, sustained and collaborative effort is needed. The Declaration on Research Assessment aims for thoughtful inclusion of broader types of research outputs and societal impact in assessment of researchers for funding, tenure, promotion, merit and hiring.

The DORA movement in Canada is growing, with 47 organizations having signed the commitment to more effective and robust approaches to research assessment. The five major Government of Canada research funders (Social Sciences and Humanities Research Council/SSHRC, Canadian Institutes of Health Research/CIHR, Natural Sciences and Engineering Research Council/NSERC, Genome Canada and Canada Foundation for Innovation/CFI) all signed DORA in 2019. Although many institutions in Canada are adopting principles related to responsible research assessment and research impact, only seven Canadian postsecondary institutions have signed DORA.

The University of Calgary was the first university in Canada to sign DORA in 2021, and is committed to advancing conversations, practices and policies that reflect the diversity and priorities of our research community.

The Canadian Conference on Research Administration

The Canadian Association of Research Administrators annual conference brings together funders and postsecondary research administrators – the individuals who support researchers to navigate funding, ethics, legal contracts, equity, diversity, inclusion and accessibility (EDIA) in research, community engagement and research impact – to discuss updates, expectations, and experiences. This meeting is one of the few existing venues within the research ecosystem for deep conversations about research excellence and culture change.

DORA leaders from the University of Calgary, Université de Montréal, NSERC, SSHRC, CIHR, CFI and Genome Canada co-developed a half-day workshop to raise awareness of what DORA is and what it means for assessing research excellence, provide updates on the implementation of DORA in organizational and funding practices in Canada, and enable opportunities for discussion and idea generation among all participants.


The workshop

The session room was full on May 14, 2023, with 49 participants. Their interest and depth of engagement showed that the time is right for deeper engagement between higher education and funders. Through polling, we learned that around 80% of participants were already familiar with DORA, and 66% said their institutions were aware of DORA (only half of those institutions had signed DORA). When asked if their organization is currently working toward implementing DORA and responsible research assessment practices, 40% said yes while 52% were unsure.

The session opened with an overview of DORA from its Acting Director Haley Hazlett. Vincent Larivière, Université de Montréal, spoke on the evolution of research assessment and the responsible use of metrics, followed by lightning updates from the University of Calgary, CIHR, NSERC, SSHRC, and Genome Canada (view slides).

The second half of the workshop allowed attendees to rotate in groups through four discussion questions:

  1. How do you feel the recommendations of DORA will change the definition of research excellence?
  2. What support do you need from funding agencies to make this change?
  3. What support do you need from your institution to make this change?
  4. What are your suggestions for overcoming barriers?

While each table had areas of focus that emerged, there were also a number of overarching themes and take-aways.

Equity, Diversity, Inclusion and Accessibility

Overall, attendees pointed to DORA opening a broader set of possibilities under the umbrella of research excellence and understanding of different value systems such as Indigenous and community-first values. Emphasis on the differences in disciplinary norms and ways of working, as well as values emphasized in different communities, points to the relative ease of change in some fields. Examples of holistic and more qualitative impacts from these fields may help to highlight and visualize the positive impact of DORA-aligned assessment practices. Overall, attendees felt that DORA would lead to a more diverse pool of people deemed successful.

Who is responsible for leading the movement?

Everyone is looking for a leader – be it institutional leaders/executives, research councils, or other organizations – to set expectations firmly and clearly in a way that is easy for others to follow. We were surprised by how frequently attendees mentioned the lack of leadership-level commitment to DORA, and feel that this requires further discussion. For DORA to take shape in a complex, multi-dimensional system, change and commitment are required across all levels.

What can shift the ecosystem to a new paradigm?

Awareness, education and resources arose as key levers for change. Many funders are updating guidance for merit review and reviewer guidelines. To supplement this, participants suggested that funders provide short, easy-to-read resources with clear guidance for both applicants and reviewers (for example, DO NOT include Journal Impact Factor or h-index in applications; reviewers SHOULD NOT consider these if included). Proper training and education of review committees will be crucial to support integration of these policies into practice.

Institutions could dedicate more time and personnel to raising awareness and look to leadership to sign on to DORA. Obviously, any discussion of research assessment must also include the researchers themselves. Attendees planned to return and have those conversations at their own institutions.

Next steps

DORA encompasses more than we may think initially. In Canada, DORA can be a complementary approach to strategic goals, such as equity, diversity, inclusion and accessibility (EDIA), Open Research/Science, and doing Indigenous research in a good way. Together, we must continue the conversation, sharing the opportunities and “wins” that DORA-aligned research assessment affords and the challenges that emerge along the way.

The organizers of this session are committed to further unpacking what we heard, and working together on practical resources, outputs and engagement opportunities that will benefit the Canadian research community.

How adorable is DORA? | https://sfdora.org/2023/07/03/how-adorable-is-dora-2/ | Mon, 03 Jul 2023

The post How adorable is DORA? appeared first on DORA.


A DORAat10 Local Event Report

In May 2023, DORA celebrated its 10th Anniversary with two plenary sessions and a decentralized weeklong program of local events organized by community members from around the world. Event organizers were given the option to write brief reports on their events that summarize key takeaways and recommendations.

By Tilmann Kiessling

A hybrid panel discussion hosted by EMBO and EMBL on 12 May marked the anniversary of DORA, a worldwide initiative aiming to advance approaches to the assessment of scholarly research. The panel discussed issues with current methods of research assessment, as well as solutions and actions for improvement in which panel members have been involved.

Wolfgang Huber, co-Chair of the EMBL responsible research assessment working group, kicked off the discussion. “Research assessment is an integral core aspect of doing science as it helps decide on the recruitment of the next generation of scientists and the allocation of funding,” he said. “Publications are actually not the scholarship itself. They are more of an advertisement of scholarship, as the actual scholarship consists of the complete set of reagents or data analysis code, for instance in my field, that generates a paper. We should think about research outputs in a broader way than just the papers,” Huber said. He cited what is known as Goodhart’s law: whenever a measure becomes a target, it ceases to be a good measure. The real value, Huber said, would lie in a culture change.

Bernd Pulverer, Head of EMBO Scientific Publishing and DORA co-founder, talked about the development DORA underwent from its inception during the 2012 meeting of the American Society for Cell Biology. “At the meeting we quickly converged that the journal impact factor as a single metric is at the heart of some of these issues,” he said. Today, DORA has become a global initiative endorsed by more than 23,000 individuals, institutions, publishers, and funders: “DORA has turned into an advocacy group and is developing tools for more balanced research assessment.” Practices and policies at EMBO include:

  • applicants for fellowships and grants are not allowed to cite journal impact factors or other metrics in their applications;
  • reviewers are instructed not to use journal impact factors in the evaluation;
  • guidelines for reviewers and applicants are published on the EMBO website;
  • clear-cut conflict-of-interest policies are applied throughout selection committees;
  • no journal names appear in the files of candidates shortlisted for the EMBO Gold Medal;
  • reviewed preprints make it realistically possible to assess research outputs at a much earlier stage than journal publications.

“And we dropped impact factors for the promotion of the EMBO Press journals,” he said.

Guillermina Lopez-Bendito, from the Institute of Neuroscience in Alicante, Spain, and Chair of the EMBO Young Investigator Committee, emphasized the lack of standardized and comprehensive methods for evaluating the quality and impact of scientific work as a major obstacle to advancing research assessment. “Contributions to science go beyond research papers alone. We need to consider mentoring, outreach activities, peer review, and evaluation. We should incorporate assessments of whether researchers have translated their results and discoveries in ways other than publishing,” Lopez-Bendito said. Narratives are important in evaluations, as they give candidates an opportunity to explain the impact of their research. “DORA is refocusing the attention of reviewers and evaluators on what truly matters, which is the quality of the work.”

Karim Labib, from the University of Dundee and Chair of the EMBO Installation Grant Committee, addressed the assessment of research outside one’s own field. “A key challenge is how best to assess research in areas that one is not extremely familiar with, without relying solely on simple metrics,” he said. Labib supported the idea of interviewing all shortlisted candidates, as it provides equal opportunities for candidates. Labib also emphasized the role of the lead reviewer in interview panels. He advised that lead reviewers should wait until after the interview before sharing their views with the panel, so that the panel members can assess the candidate’s performance in a less biased manner. The panellists agreed with Labib’s view that another bottleneck in research assessment is time. “Scientists are generally interested in participating in research assessment, but lack of time is the primary constraint.”

Brenda Andrews, from the Donnelly Centre at the University of Toronto, Canada, and Vice Chair of the EMBL Scientific Advisory Committee, is actively involved in research assessment. As the founding editor of the open-access journal G3: Genes|Genomes|Genetics, her goal is to publish valuable research findings without considering the impact factor or subjective opinions about the importance of the work. “Senior colleagues bear a significant responsibility in leading by example and changing how we think about research assessment,” Andrews stated. However, she acknowledged that impact factors are still discussed in evaluations, although review committees have become increasingly aware of this issue in recent years. Andrews explained that there is now a clear emphasis on the description of the work and the progress made in setting up labs and training people, rather than solely focusing on the publication venue.

Cecilia Perez, postdoctoral researcher at EMBL, became interested in topics related to social justice in research assessment during her PhD studies. She shared her experiences when applying for scholarships and the biases present in assessment processes. “I would like to highlight the arbitrary nature of assessment processes and emphasize the need for fairness to achieve greater diversity and equality,” Perez said. She reflected on the challenges of ordering authors on papers, particularly in collaborative projects, which are becoming more common, agreeing that we need to move away from simplistic assessments based on authorship order and focus more on author contributions.

The panel discussion on the occasion of the DORA anniversary was co-organized by EMBO (Sandra Bendiscioli, Senior Policy Officer) and EMBL (Katherine Silkaitis, Strategy Officer) and chaired by Sandra Bendiscioli.

This is a cross-post (original post here) that has been lightly edited for length. The quotes from the panel discussion were edited.

The post How adorable is DORA? appeared first on DORA.

How research assessment reform can help research to do more! Reflections from the SDG Publishers Compact Fellows, Open Climate Campaign, and Open Pharma https://sfdora.org/2023/06/30/how-research-assessment-reform-can-help-research-to-do-more-reflections-from-the-sdg-publishers-compact-fellows-open-climate-campaign-and-open-pharma/ Fri, 30 Jun 2023 10:00:00 +0000 https://sfdora.org/?p=158232


A DORAat10 Local Event Report

In May 2023, DORA celebrated its 10th Anniversary with two plenary sessions and a decentralized weeklong program of local events organized by community members from around the world. Event organizers were given the option to write brief reports on their events that summarize key takeaways and recommendations.

By Gerald Beasley, Sally Wilson, Jo Wixon, Rebecca Kirk, Victoria Gardner (HESI SDG Publishers Compact Fellows); Shane Rydquist (Cactus); Monica Granados (Open Climate Campaign); Tim Koder and Joana Osório (Open Pharma)

Since its inception, the Declaration on Research Assessment (DORA) has made great progress in stimulating discourse and driving change around academic reward and incentive structures, moving away from a focus on the journal Impact Factor as a proxy for the quality or impact of research towards a broader view of the impact of all scholarly outputs. The Declaration has attracted a huge number of signatories over the past ten years and provides them with a forum to share ideas and best practice, and to work collaboratively to foster change.

In the same spirit, the HESI SDG Publishers Compact Fellows formed a collaboration and convened an event in partnership with the Open Climate Campaign and Open Pharma to provide advice and guidance to researchers on writing for non-academic audiences. Our focus was on why writing for non-academic audiences is vitally important to drive progress towards the UN’s Sustainable Development Goals, which aligns with DORA’s aim of improving the ways in which the outputs of scholarly research are evaluated. We agree on the need to move beyond the current academic focus on impact to a greater understanding and recognition of the impact that research can have on the broader world. We all believe that research can help us solve the challenges we are dealing with globally. The mindset change at the heart of DORA is essential as we collectively work to solve these challenges, whether good health and well-being, climate change, food security, inequality, or any of the others reflected in the Sustainable Development Goals and beyond.

Our event on May 17 was titled: How to write for non-academic audiences to achieve progress towards the UN Sustainable Development Goals (SDGs). We defined non-academic audiences as any reader who does not share the specific academic knowledge of the research output. This includes the general public, practitioners, patients, advocacy organisations, and policy or decision makers. During the session, our roster of speakers and facilitators provided institutional, industry, advocacy, and publishing perspectives on how to help research do more to reach non-academic readers and users, for example by writing plain-language summaries or succinct practitioner or policy action points. We covered why writing for these audiences is important and how it can help widen the reach and impact of research, and outlined practical tools and tips to support the writing process, including the Top Tips created by the HESI SDG Publishers Compact Fellows. Participants had the opportunity to try some writing of their own, based on a worked example, which gave them the chance to put the session’s tips and guidance into practice. Our audience was highly engaged throughout the event, sharing their thoughts and experiences with the group. The slides and a handout with links to related resources can be found on the SDG Publishers Compact Fellows website.

We didn’t just convene the event; we also learnt a lot from our audience members, who were generous and candid in sharing their thoughts and experiences, including the barriers to communicating with these audiences.

The convenors came away from the session with ideas for future events (including a possible debate on the role of AI in supporting the writing of these formats), and with a clear sense of the challenges around writing for non-academic audiences. One such challenge is ensuring that reward and incentive structures, alongside training and support, are in place to help researchers in this endeavour. Many attendees noted that while they were keen to communicate with wider audiences, current incentive structures do not encourage them to devote time and energy to writing these kinds of outputs. DORA is playing a vital role in addressing this, and there is much more that can be done.

We are keen to accelerate change and help research realise its real-world impact. We plan further collaboration across our groups and with DORA as we develop practical solutions to support broader forms of impact and engagement, including writing for non-academic audiences to drive progress towards the SDGs. Keen to find out more and to support our goals? Join the SDG Publishers Compact Fellows Group Virtual Community for news about activities and future events, and to take part in our work to support the aims of DORA and drive progress towards the SDGs!

The post How research assessment reform can help research to do more! Reflections from the SDG Publishers Compact Fellows, Open Climate Campaign, and Open Pharma appeared first on DORA.
