
Four ICTA-UAB researchers among the world's top climate scientists

22 Apr 2021

Four researchers at the ICTA-UAB are included in the Reuters list of the world’s top climate scientists. To make the list, Reuters created a system to identify and rank 1,000 climate academics according to how influential they are.

Photo: J. van den Bergh, J. Rieradevall, A. Rosell-Melé and X. Gabarrell (from top to bottom and from left to right)

The four ICTA-UAB researchers are Jeroen van den Bergh, Joan Rieradevall, Antoni Rosell-Melé and Xavier Gabarrell. They are the UAB scientists appearing on the list, which includes a total of 22 researchers from Spanish centres considered to be among the most influential in the world.

The Reuters series tells the stories of the scientists who are having the biggest impact on the climate-change debate – their lives, their work and their influence on other scientists, the public, activists and political leaders. To identify the 1,000 most influential scientists, Reuters created the Hot List, a combination of three rankings. These rankings are based on how many research papers scientists have published on topics related to climate change; how often those papers are cited by other scientists in related fields of study, such as biology, chemistry or physics; and how often those papers are referenced in the lay press, social media, policy papers and other outlets.

The data is provided through Dimensions, the academic research portal of the British-based technology company Digital Science. Its database contains hundreds of thousands of papers related to climate science published by many thousands of scholars, the vast majority published since 1988.

For the first ranking, researchers are selected based on the number of papers published under their names through December 2020, as indexed in the Dimensions system. They are screened for climate-related work by examining the papers’ titles or abstracts – brief descriptions of the research – for phrases closely connected to climate change, such as “climate change” itself, global warming, greenhouse gases and other related terms. These papers explicitly focus on climate change rather than mention it in passing. To be included in the count, a paper had to be cited by at least one other scientist at least once.

The second ranking is based on what Dimensions describes as a “Field Citation Ratio.” For each paper, a ratio is calculated “by dividing the number of citations a paper has received by the average number received by documents published in the same year and in the same Fields of Research category,” as defined by Dimensions. This ranking is meant to measure the influence of scientists’ work among their peers.
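As a hypothetical illustration (the function name and sample numbers below are invented, not from Dimensions), the Field Citation Ratio described above can be sketched as a simple division:

```python
# Hypothetical sketch of a Field Citation Ratio (FCR) calculation:
# the paper's citation count divided by the average citation count of
# papers published in the same year and Fields of Research category.
def field_citation_ratio(paper_citations, peer_citation_counts):
    field_average = sum(peer_citation_counts) / len(peer_citation_counts)
    return paper_citations / field_average

# A paper cited 30 times in a field averaging 10 citations per paper:
print(field_citation_ratio(30, [5, 10, 15, 10]))  # → 3.0
```

A ratio above 1.0 means the paper is cited more than the average for its year and field; the actual Dimensions calculation draws the peer set from its own database.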

The third ranking is based on Digital Science’s Altmetric Attention Score, a measure of a research paper’s public reach. Most papers receive a score based on references in a variety of publications, including the mainstream media, Wikipedia, public policy papers and social media sites such as Twitter and Facebook. The ranking is meant to measure the influence of scientists’ work in the lay world.

The final score for each scientist is based on the sum of each ranking – the lower the score, the greater the scholar’s overall influence, and thus the higher he or she ranks on the Hot List.
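The rank-summing step can be sketched as follows (the names and rank values are invented for illustration; only the "sum the three ranks, lower is better" rule comes from the article):

```python
# Hypothetical sketch of combining the three Hot List rankings.
# Each scientist holds a rank (1 = best) in each of the three metrics:
# publication count, field citation ratio, and Altmetric attention.
# The final score is the sum of the three ranks; a lower sum means
# greater overall influence and a higher place on the Hot List.
def hot_list_order(rankings):
    totals = {name: sum(ranks) for name, ranks in rankings.items()}
    return sorted(totals, key=totals.get)

example = {
    "Scientist A": (1, 4, 2),  # sum = 7
    "Scientist B": (3, 1, 1),  # sum = 5
    "Scientist C": (2, 5, 6),  # sum = 13
}
print(hot_list_order(example))  # → ['Scientist B', 'Scientist A', 'Scientist C']
```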

Some notes of caution. First, the Hot List does not claim to be a rank of the “best” or “most important” climate scientists in the world. It is a measure of influence.

Second, the Hot List has some limitations inherent in its methodology. For instance, the analysis targeted the titles and abstracts of papers, not the full texts, so Reuters may have missed some studies that do touch on climate change. The Altmetric score can also be skewed upward if one or a few of a scientist's papers have particularly high scores while the remaining papers have comparatively low scores.

Also, the Hot List favours productivity. The first of its three metrics ranks scientists based on the number of papers published. The other two metrics – for citation ratios and public reach – are designed to compensate for this possible bias, but they might not fully do so.
