Web of Science Group: Connected, seamless, open – the power of the Group

Peer review templates, expert examples and free training courses
(Wed, 06 Apr 2022)
Learning how to write a constructive peer review is an essential step in helping to safeguard the quality and integrity of published literature. Read on for resources that will get you on the right track, including peer review templates, example reports and the Web of Science™ Academy: our free, online course that teaches you the core competencies of peer review through practical experience (try it today).

How to write a peer review

Understanding the principles, forms and functions of peer review will enable you to write solid, actionable review reports. This understanding forms the basis of a comprehensive, well-structured review, helps you comment on the quality, rigor and significance of the research, and helps you identify potential breaches of normal ethical practice.

This may sound daunting but it doesn’t need to be. There are plenty of peer review templates, resources and experts out there to help you, including:

  • Peer review training courses and in-person workshops
  • Peer review templates (found in our Web of Science Academy)
  • Expert examples of peer review reports
  • Co-reviewing (sharing the task of peer reviewing with a senior researcher)
  • Other peer review resources, blogs, and guidelines

We’ll go through each one of these in turn below, but first: a quick word on why learning to peer review is so important.

 

Why learn to peer review?

Peer reviewers and editors are gatekeepers of the research literature used to document and communicate human discovery. Reviewers, therefore, need a sound understanding of their role and obligations to ensure the integrity of this process. This understanding also helps them maintain research quality and protect the public from flawed and misleading findings.

Learning to peer review is also an important step in improving your own professional development.

You’ll become a better writer and a more successful published author by learning to review. It gives you a critical vantage point and you’ll begin to understand what editors are looking for. It will also help you keep abreast of new research and best-practice methods in your field.

Peer review training courses and in-person workshops

We strongly encourage you to learn the core concepts of peer review by joining a course or workshop. You can attend in-person workshops to learn from and network with experienced reviewers and editors. As an example, Sense about Science offers peer review workshops every year. To learn more about what might be in store at one of these, researcher Laura Chatland shares her experience at one of the workshops in London.

There are also plenty of free, online courses available, including Web of Science Academy courses such as ‘Reviewing in the Sciences’, ‘Reviewing in the Humanities’ and ‘An introduction to peer review’.

The Web of Science Academy also supports co-reviewing with a mentor to teach peer review through practical experience. You learn by writing reviews of preprints, published papers, or even ‘real’ unpublished manuscripts with guidance from your mentor. You can work with one of our community mentors or your own PhD supervisor or postdoc advisor, or even a senior colleague in your department.

 

Go to the Web of Science Academy

 

Peer review templates

Peer review templates are helpful to use as you work your way through a manuscript. As part of our free Web of Science Academy courses, you’ll gain exclusive access to comprehensive guidelines and a peer review report template. It offers points to consider for all aspects of the manuscript, including the abstract, methods and results sections. It also teaches you how to structure your review. This will get you thinking about the overall strengths and impact of the paper at hand.

Beyond following a template, it’s worth asking your editor or checking the journal’s peer review management system to learn whether you’re required to follow a formal or specific peer review structure for that particular journal. If no such formal approach exists, try asking the editor for examples of other reviews performed for the journal. This will give you a solid understanding of what they expect from you.

 

Peer review examples

Understand what a constructive peer review looks like by learning from the experts.

Here’s a sample of pre- and post-publication peer reviews displayed on Web of Science publication records to help guide you through your first few reviews. Some of these are transparent peer reviews, which means the entire process is open and visible, from initial review and response through to revision and final publication decision. You may wish to scroll to the bottom of these pages to read the initial reviews first, then work your way up the page to the editors’ and authors’ responses.

F1000 has also put together a nice list of expert reviewer comments pertaining to the various aspects of a review report.

 

Co-reviewing

Co-reviewing (sharing peer review assignments with senior researchers) is one of the best ways to learn peer review. It gives researchers a hands-on, practical understanding of the process.

In an article in The Scientist, the team at Future of Research argues that co-reviewing can be a valuable learning experience for peer review, as long as it’s done properly and with transparency.

Transparency matters because co-reviewing has its downsides. Done informally, the practice can leave early-career researchers unaware of the core concepts of peer review, and without adequate recognition for their share of the review work it can be hard for them to later join an editor’s reviewer pool. (If you are asked to write a peer review on behalf of a senior colleague or researcher, get recognition for your efforts by asking your senior colleague to verify the collaborative co-review on your Web of Science researcher profile.)

The Web of Science Academy course ‘Co-reviewing with a mentor’ is uniquely practical in this sense. The course requires students to gain experience in peer review by practicing on real papers and working with a mentor to get feedback on how their peer review can be improved. Students submit their peer review report as their course assignment and after internal evaluation, they receive a course certificate, an Academy graduate badge on their Web of Science researcher profile, and they are put in front of top editors in their field through the Reviewer Locator at Clarivate.

 

Other peer review resources, blogs, and guidelines

Here are some external peer review resources found around the web:

Are we missing anything? Get in touch with us and we’ll add it to the list.

And finally, we’ve written a number of blogs about handy peer review tips. Check out some of our top picks:

Want to learn more? Become a master of peer review and connect with top journal editors as a graduate of the Web of Science Academy – your free online hub of courses designed by expert reviewers, editors and Nobel Prize winners. Find out more today.

Introducing the Journal Citation Indicator: A new, field-normalized measurement of journal citation impact
(Thu, 20 May 2021)
This is the second in a series of updates to provide information on the launch of the 2021 Journal Citation Reports release.

In a recent blog post we discussed refinements in this year’s forthcoming release of the Journal Citation Reports (JCR)™, describing the addition of new content and hinting at a new metric for measuring the citation impact of a journal’s recent publications.

I’m now pleased to fully introduce the Journal Citation Indicator. By normalizing for different fields of research and their widely varying rates of publication and citation, the Journal Citation Indicator provides a single journal-level metric that can be easily interpreted and compared across disciplines.

The Journal Citation Indicator will be calculated for all journals in the Web of Science Core Collection™ – including those that do not have a Journal Impact Factor (JIF)™ – and published in the 2021 JCR in June.

 

“The Journal Citation Indicator provides a single journal-level metric that can be easily interpreted and compared across disciplines.”

 

Beyond mere citation counts

Citations serve as an immediate, valid marker of research influence and significance, reflecting the judgments that researchers themselves make when acknowledging important work. Nevertheless, citations must be considered carefully and in context. For validity in assessing the impact of published research, citation analysis must control for such variables as subject field, document type and year of publication.

The new Journal Citation Indicator meets this requirement for journal evaluation, providing a single number that accounts for the specific characteristics of different fields and their publications. Although the calculations behind the Journal Citation Indicator are complex, requiring considerable computing power, the end result is simple: a single value that is easy to interpret and compare, complementing current journal metrics and further supporting responsible use.

In its calculation for a given journal, the Journal Citation Indicator harnesses another Clarivate measure: Category Normalized Citation Impact (CNCI), a metric found in the analytic and benchmarking tool InCites™. The value of the Journal Citation Indicator is the mean CNCI for all articles and reviews published in a journal in the preceding three years. (For example, for the 2020 Journal Citation Indicator value, the years under analysis are 2017, 2018 and 2019.)

As in the CNCI measurement, the Journal Citation Indicator calculation controls for different fields, document types (articles, reviews, etc.) and year of publication. The resulting number represents the relative citation impact of a given paper as the ratio of citations compared to a global baseline. A value of 1.0 represents world average, with values higher than 1.0 denoting higher-than-average citation impact (2.0 being twice the average) and lower than 1.0 indicating less than average.

In essence, the Journal Citation Indicator provides a field-normalized measure of citation impact where a value of 1.0 means that, across the journal, published papers received a number of citations equal to the average citation count in that subject category.
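
The calculation described above can be sketched in a few lines of code. This is an illustrative toy, not Clarivate’s implementation: the baseline values, field names and function names below are invented, and the production calculation draws on far more data and many refinements.

```python
from statistics import mean

# Hypothetical global baselines: the average citation count for papers of a
# given (field, document type, publication year), computed across the whole
# collection. All values here are invented for illustration.
BASELINES = {
    ("Oncology", "article", 2017): 12.4,
    ("Oncology", "article", 2018): 9.1,
    ("Oncology", "review", 2019): 15.0,
}

def cnci(citations, field, doc_type, year):
    """Category Normalized Citation Impact: observed citations divided by
    the expected count for papers of the same field, type and year."""
    return citations / BASELINES[(field, doc_type, year)]

def journal_citation_indicator(papers):
    """Mean CNCI over a journal's articles and reviews from the three
    preceding years (e.g. 2017-2019 for the 2020 indicator)."""
    return mean(cnci(c, f, t, y) for c, f, t, y in papers)

papers = [
    (25, "Oncology", "article", 2017),  # cited at twice the expected rate
    (9,  "Oncology", "article", 2018),  # roughly at the world average
    (15, "Oncology", "review", 2019),   # exactly at the world average
]
print(round(journal_citation_indicator(papers), 2))  # prints 1.34
```

Each paper’s CNCI is its citation count divided by the global average for papers of the same field, type and year, and the journal’s indicator is simply the mean of those ratios, so a journal whose papers are cited at exactly the world-average rate scores 1.0.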

 

Comparing the Journal Citation Indicator and the Journal Impact Factor

The Journal Citation Indicator is designed to complement the JIF – the original and longstanding metric for journal evaluation – and other metrics currently used in the research community. In addition to the use of normalization, there are several key differences between the Journal Citation Indicator and the JIF.

For example, the Journal Citation Indicator is calculated over three years of publications, in contrast with the two-year window employed for the JIF. The three-year window keeps the Journal Citation Indicator as current as possible while allowing more time for publications to accrue citations.

Also, the JIF calculation is based on citations made in the current year, while the Journal Citation Indicator counts citations from any time period following publication, up to the end of the current year.
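
The difference between the two citation windows can be made concrete with a small sketch. The helper functions and the raw (publication year, citation year) data layout below are our own simplification; neither metric is actually computed this simply.

```python
def jif_citation_count(citations, current_year):
    """Count citation events a JIF-style numerator would include:
    citations made *in* the current year to items published in the
    two preceding years."""
    return sum(
        1 for pub_year, cite_year in citations
        if cite_year == current_year
        and current_year - 2 <= pub_year <= current_year - 1
    )

def jci_citation_count(citations, current_year):
    """Count citation events a JCI-style calculation would include:
    citations made in any year after publication, up to the end of the
    current year, to items published in the three preceding years."""
    return sum(
        1 for pub_year, cite_year in citations
        if current_year - 3 <= pub_year <= current_year - 1
        and pub_year <= cite_year <= current_year
    )

# Each tuple: (publication year of the cited item, year the citation was made).
citations = [(2018, 2020), (2018, 2019), (2017, 2020), (2019, 2020), (2016, 2020)]
print(jif_citation_count(citations, 2020))  # counts (2018,2020) and (2019,2020): 2
print(jci_citation_count(citations, 2020))  # also counts (2018,2019) and (2017,2020): 4
```

For the 2020 metrics, a citation made in 2019 to a 2018 paper counts toward the Journal Citation Indicator but not toward the JIF, which only counts citations made in 2020 itself.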

The table below summarizes how the Journal Citation Indicator compares to the JIF in various measurements.

Table 1 – Comparison of Journal Citation Indicator to JIF

| Feature | Journal Impact Factor | Journal Citation Indicator |
| --- | --- | --- |
| All Web of Science Core Collection journals | N | Y |
| Field-normalized citation metric | N | Y |
| Fixed dataset | Y | Y |
| Counts citations from the entire Core Collection | Y | Y |
| Counts citations from the current year only | Y | N |
| Includes Early Access (EA) content from 2020 onward | Y | Y |
| Includes unlinked citations | Y | N |
| Fractional counting | N | N |

 

Required: Responsible, informed interpretation

Despite the increased uniformity and comparability afforded by the Journal Citation Indicator, as with any metric, its interpretation must be informed by judgment. Closely adjacent fields – e.g. those in the physical sciences – can be compared fairly readily. On the other hand, comparing journals in physical-science fields with, say, those in the arts and humanities would not be advisable, as publication output, citation dynamics and other elements differ sharply between those areas.

The Journal Citation Indicator will bring citation impact metrics to the full range of journals indexed in the Web of Science Core Collection, increasing the utility of the JCR as it expands its coverage to more than 21,000 scholarly publications. Providing this information for around 7,000 journals in the Emerging Sources Citation Index (ESCI) will increase exposure to journals from all disciplines, helping users to understand how they compare to more established sources of scholarly content. By incorporating field normalization into the calculation, the Journal Citation Indicator will also allow users to compare citation impact between disciplines more easily and fairly. When used responsibly it can support more nuanced research assessment.

The debut of the Journal Citation Indicator represents only the latest development in the long evolution of the JCR – a continuum that has recently seen the addition of Open Access data, Early Access content and more.

What’s more, the evolution continues: watch this space for details on further refinements in the new release that will transform the JCR user experience.

 

Read the full white paper for a detailed discussion of the Journal Citation Indicator, its calculation and its implications.

Find out more about the Journal Citation Reports here.

 

The road to Journal Citation Reports 2021: New content and a new metric
(Thu, 13 May 2021)
This is the first in a series of updates to provide information on the launch of the 2021 Journal Citation Reports release.

Nearly 50 years into its run as the world’s most authoritative source for transparent, publisher-neutral data and statistics, Journal Citation Reports (JCR)™ from Clarivate continues to evolve, adapting to the changing landscape of scholarly publishing and evaluative metrics.

 

Unifying content in the JCR

The month of June will see the new 2021 JCR release, featuring the latest refinements and additions to the journal intelligence platform’s existing store of resources.

The first of these enhancements will expand the JCR’s coverage of journal literature to reflect the full breadth of research covered in all the journals in the Web of Science Core Collection™. Although our JCR metrics already include citations recorded in journals covered in the Arts & Humanities Citation Index (AHCI)™ and the Emerging Sources Citation Index (ESCI)™, those two indexes and their journal content have not been fully covered in JCR – until now.

The journals covered in AHCI and ESCI have met the same rigorous quality criteria for coverage, applied by our expert in-house Web of Science editors, as the publications covered in the Science Citation Index™ and the Social Sciences Citation Index™. Therefore, AHCI and ESCI – and their content from trustworthy, Web of Science-selected journals – merit complete coverage in the JCR.

 

In addition to rounding out the JCR’s journal coverage, the inclusion of all material from AHCI and ESCI represents a unification of content and policies across the Web of Science, InCites Benchmarking & Analytics™ and JCR – putting everything on a common path.

 

AHCI and ESCI journals will not be awarded a Journal Impact Factor

Along with the news about the addition of AHCI and ESCI content to JCR, we must report that journals from these indexes will not receive a Journal Impact Factor (JIF)™ in the JCR.

The reason is that the JIF is calculated only for the most impactful or significant journals within the sciences and social sciences – that is, those that have met our selection criteria for both quality and impact and are indexed in Science Citation Index Expanded (SCIE)™ and/or Social Sciences Citation Index (SSCI)™. Specifically, the four impact criteria (comparative citation analysis, author citation analysis, editorial board citation analysis and content significance) are designed to select the most influential journals in their respective fields, using journal-level citation activity as the primary indicator of impact.

In terms of a potential JIF for AHCI journals, the criteria above do not precisely apply, because citation behavior and dynamics in the arts and humanities are distinctly different from other main research fields. As the AHCI product page points out, “Compared to the clinical, natural and social sciences, the arts & humanities may differ significantly regarding the type of content that is considered to be of scholarly importance, norms for reviewing content, and citation behavior.”

Therefore, although the Web of Science editors apply the same impact criteria to all our journal collections, the selection process places less emphasis on journal-level citation activity in the arts and humanities. This is why AHCI journals have never received a JIF.

As for the journals covered in ESCI: Although they have demonstrated the high levels of editorial rigor and publishing best practice required to pass our 24 quality criteria, these journals do not meet our four impact criteria. Thus, we do not calculate a JIF for ESCI journals.

As part of our collection curation process, we monitor all ESCI journals and those that develop sufficiently high levels of journal-level citation activity are re-evaluated for inclusion in SCIE, SSCI and/or AHCI.

 

Introducing the Web of Science Journal Citation Indicator

Along with its wider coverage, the next JCR release will unveil a new metric, adding still more depth, insight and context to the JCR’s range of measures – well beyond a single JIF score. The “Journal Citation Indicator” will be the subject of our next blog, where we will include a comprehensive description of this new metric, along with its underlying methodology.

Meanwhile, another new development: the expanded coverage in the 2021 JCR release will introduce Early Access articles, reflecting the earliest availability of new research as it appears in the “version of record” prior to official publication. We will examine this new feature in detail later in this blog series.

In all, these refinements to the new JCR exemplify the constant, ongoing work at Clarivate to develop and curate our data tools, collections and responsible metrics.

 

Find out more about Journal Citation Reports, publisher-neutral journal intelligence trusted by publishers, institutions and researchers.

The changing research landscape of the Middle East, North Africa and Turkey [Report]
(Thu, 08 Apr 2021)
The period of exceptional growth and impact that we examine in our most recent Global Research Report is unsurprising given the MENAT region’s history of deep commitment to knowledge and learning.

 

I am very proud to have supported the creation of a new Global Research Report from the Institute for Scientific Information (ISI)™ which explores the seismic shift of the research landscape across 19 countries in the Middle East, North Africa and Turkey (MENAT), spread from Morocco in the west to Iran in the east. It presents a global success story and demonstrates how MENAT research is growing in volume and impact, driven by increased participation in international research networks.

 

Collaboration, innovation and impact across the MENAT region

Our view of research in the Middle East, North Africa and Turkey is enhanced by new developments led by the Egyptian Knowledge Bank (EKB), the Scientific and Technological Research Council of Turkey (TUBITAK) and the Islamic World Science Citation Center (ISC) in Iran.

Funded by the Egyptian government and launched in 2020, the Arabic Citation Index (ARCI)™ provides access to bibliographic information and citations from more than 400 expertly curated Arabic journals. By bridging the gap between local scientific output and global impact, the ARCI offers substantial benefits.

These important developments, from MENAT countries leading in the regional research renaissance, confirm the value of national indexes as an important regional supplement to the international citation indexes such as the Web of Science™.

 

“It presents a global success story and demonstrates how MENAT research is growing in volume and impact, driven by increased participation in international research networks.”

 

To put the region’s research contribution into global context, this study also features a special analysis of MENAT research output mapped against the United Nations Sustainable Development Goals (SDGs). (Read more from the ISI about global research activity supporting the SDGs here.)

 

Researcher mobility: strengths and opportunities

The increasingly international scope of the MENAT regional research base is seen in its researcher mobility and collaborations. This report analyzes researcher mobility both within the region and globally, finding that there is a significant outward flow of talent, with North America and Europe being the most popular destinations.

[Figure: MENAT researcher mobility flows. Source: The Web of Science]

 

It draws attention to the opportunity for more local collaboration within the region, where domestic mobility is relatively low and purely domestic papers currently account for about 5% of total output.

The findings highlight how collaboration within the region as well as with the rest of the world will:

  • enhance the quality of scientific research,
  • accelerate access to new markets, and
  • allow the financial costs of research to be shared more effectively, meeting the economic and societal challenges the region faces.

 

Download report

 

Note: This report follows on from a 2011 analysis published by Thomson Reuters: “Global Research Report: Middle East – Exploring the Changing Landscape of Arabian, Persian and Turkish Research.” If you would like to receive a copy of the 2011 report, please email ISI@clarivate.com.

 

The Arabic Citation Index: Transforming local research into global impact
(Thu, 04 Mar 2021)
Exploring the value and power behind the new citation index for the Arabic region from Clarivate

 

Clarivate is building the Arabic Citation Index (ARCI)™ – the world’s first local-language citation index for the Arabic world. The index will provide access to bibliographic information and citations for scholarly articles from a continuously expanding collection of more than 400 expertly curated Arabic journals, with an interface in both English and Arabic. It is now available to researchers and organizations across the globe.

Hosted on the Web of Science™ and funded by the Egyptian Government, the new index will make Arabic scholarly content more accessible, connecting it to more than 1.7 billion cited research references (dating back to 1864) and the highest quality, peer-reviewed scholarly content from across the globe. The indexing of Arabic publications will provide local scientific communities with improved routes to collaborating with national, regional and international research efforts – extending the Arabic academic footprint further than ever before.

 

How journals are selected for the Arabic Citation Index

The ARCI is now accepting journal submissions from editors of Arabic-language journals and has already seen strong submissions from Egypt (28%) and Algeria (24%), with Iraq (12%), Jordan (8%) and Saudi Arabia (7%) also well represented. We are committed to representing a broad range of quality research from the region and actively encourage interested journals to register here to be considered.

The regional journals covered in ARCI are selected by a newly established neutral editorial board. This board has representation from Arab League member countries, providing regional insights and subject knowledge.

The guiding principles of journal selection for the Arabic Citation Index are based on traditional scholarly publishing standards and the research norms of the region. All titles considered for the ARCI must have an ISSN and will receive a preliminary review or triage to confirm format, ISSNs and accessibility of the content.

Next, the journal is reviewed by the editorial board to confirm that it is scholarly, with:

  • a clear journal scope or mission,
  • article abstracts,
  • cited references,
  • a clear relationship between the journal’s scope and its content,
  • article language of a quality consistent with scholarly communications, and
  • an editorial board that reflects the field of the journal.

This selection process will ensure the quality of the ARCI, contributing to the overall value of the index within the Arabic region and the broader world.

 

Bridging the gap between local scientific output and global impact

The ARCI is the fifth regional citation index developed by the Web of Science, alongside the Chinese Science Citation Database™, SciELO Citation Index™, Russian Science Citation Index™ and the KCI-Korean Journal Database™. With our decades of rich assets and experience combined with our unique data-driven and human-led approach to curation, we can create the trusted research data and unparalleled insights the research community needs to deliver scientific discovery, increased innovation and economic success.

The ARCI will enable us to evaluate the quality and research output of Arabic researchers, universities and research organizations. The indexing of Arabic publications will provide local scientific communities with the ability to contribute not only to national and regional research efforts but also to international research, helping to extend the Arabic academic footprint further and enabling the research community to solve the world’s most pressing and complex challenges.

 

Interested in being part of the Arabic Citation Index?

Submit your journal here

Get in touch with our team for information on how to subscribe.

The Web of Science Academy is here, an online hub for research integrity training
(Tue, 02 Mar 2021)
Research integrity in publishing is an important topic that is rarely taught to researchers by their institutions or publishers. The new Web of Science™ Academy offers researchers around the world an opportunity to improve their skills and knowledge, making them better researchers, peer reviewers and journal editorial board members. Courses will help researchers to:

  • gain skills and confidence,
  • improve the quality of their research outputs, and
  • navigate the academic publishing landscape while maintaining high integrity standards.

 

Context leads to better referencing and better metrics

The first course available in the Web of Science Academy, “Good citation behavior,” covers how to reference, when to reference and where in your manuscript you should reference other work. It also explores what citation manipulation is and how to prevent it. We can all play a part in responsible referencing by making sure there is relevant context for using a reference. The goal of this course is to support good referencing practices and thus good citation behavior, which in turn will lead to more robust citation metrics.

At Clarivate, we are committed to supporting the research community’s efforts to practice research more responsibly, and to promoting more responsible use of metrics in research evaluation. Current citation metrics are often criticized for not putting a researcher’s impact into context [1]. For example, how does a researcher’s citation count compare to those of other researchers in the same field? And if a citation count increases over time, that trend should be taken into consideration during researcher assessment. One way we are addressing this is through the newly released Author Impact Beamplots on Web of Science author records; more on that here.

 

Why research integrity skills are important

Research is built upon already published research. It is therefore paramount that high quality and ethical standards are upheld; otherwise, new research might be flawed, and journal and research metrics might not represent true impact.

Research integrity standards bring trust to the scientific process. We all have a shared responsibility to uphold the integrity of the scholarly record for the new generations of researchers yet to come. One way we at Clarivate are supporting research integrity is by offering free training through the Web of Science Academy.

For more information on what Clarivate and our Institute for Scientific Information (ISI)™ are doing within the research integrity space, watch this recording from our inaugural research integrity event. Held in November 2020, it features Australia’s chief scientist and other Australian stakeholders within government, institutions, funders and publishers. For further insight, please also see our recent research integrity report, Research Integrity: Understanding our shared responsibility for a sustainable scholarly ecosystem, summarized and linked to in this post.

 

Practical information

In keeping with our mission to promote more responsible use of metrics in research evaluation, the Web of Science Academy is free and publicly available. Simply register and log in to begin learning. After completing each course, you can print or download a certificate. We will release new courses throughout the year, including peer review training previously available from the Publons Academy. Integrating Publons Academy courses with the newly launched Web of Science Academy will provide a better user experience for our learners and mentors.

Course offerings will be relevant for early career as well as senior researchers and editors from all research fields. We welcome librarians and research administrators to promote the Web of Science Academy to their students and faculty. We will share information later this year about more in-depth training offerings for institutions that wish to improve capacity building of their researcher faculty community.

 

Register for the Web of Science Academy

 

[1] Profiles, not Metrics (2019). J. Adams, M. McVeigh, D. Pendlebury, M. Szomszor. Web of Science report.


]]>
https://clarivate.com/blog/the-web-of-science-academy-is-here-an-online-hub-for-research-integrity-training/feed/ 0
The Web of Science Author Impact Beamplots: A new tool for responsible research evaluation https://clarivate.com/blog/the-web-of-science-author-impact-beamplots-a-new-tool-for-responsible-research-evaluation/ https://clarivate.com/blog/the-web-of-science-author-impact-beamplots-a-new-tool-for-responsible-research-evaluation/#respond Mon, 01 Mar 2021 08:58:45 +0000 https://clarivate.com/webofsciencegroup/?p=72111 How the addition of beamplots to the Web of Science will provide researchers and evaluators with contextual insights around individual researcher performance   The Web of Science™ Author Impact Beamplots are a new visualization tool that showcase the range of a researcher’s publication and citation impact in a single data exhibit. It is well-aligned with […]

The post The Web of Science Author Impact Beamplots: A new tool for responsible research evaluation appeared first on Web of Science Group.

]]>
How the addition of beamplots to the Web of Science will provide researchers and evaluators with contextual insights around individual researcher performance

 

The Web of Science™ Author Impact Beamplots are a new visualization tool that showcases the range of a researcher’s publication and citation impact in a single data exhibit. The approach is well aligned with wider community efforts to reform research assessment and encourage the responsible use of metrics: it uses a field-normalized citation metric, does not unduly penalize researchers with gaps in their publication record, and does not disadvantage those who work in fields with distinctly different publication activity.

 

More than a metric?

Publication and citation metrics have become more common in determining academic appointments, promotions and funding, and many researchers are rightly concerned about approaches that reduce their work to a single-number performance score. Continued dependence on simple and inadequate metrics has led to indicator impoverishment and a lack of awareness of best practices.

 

A beamplot shows the volume and citation impact of an individual’s publication portfolio through time.

 

In contrast to the h-index, which tends to favor senior researchers who work in the physical sciences, a beamplot shows the volume and citation impact of an individual’s publication portfolio through time. Each paper’s citation count is normalized (i.e., benchmarked against other similar publications from the same discipline) and expressed as a percentile. Nor is a beamplot necessarily biased against individuals who have taken a career break or published less at any given time.
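The percentile idea above can be sketched in a few lines. The sketch below is illustrative only: the field baseline and the exact percentile convention are invented assumptions, and the real Web of Science normalization also accounts for publication year and document type.

```python
# Sketch: per-paper citation percentiles of the kind a beamplot displays.
# `baselines` maps a field to the citation counts of comparable papers;
# these are invented illustrative numbers, not Web of Science data.

def percentile(count, baseline):
    """Share of baseline papers cited no more often than `count` (0-100)."""
    at_or_below = sum(c <= count for c in baseline)
    return 100.0 * at_or_below / len(baseline)

def beamplot_points(papers, baselines):
    """Map (year, citations, field) records to (year, percentile) points."""
    return [(year, percentile(cites, baselines[field]))
            for year, cites, field in papers]

papers = [(2018, 12, "chemistry"), (2019, 3, "chemistry"), (2020, 30, "chemistry")]
baselines = {"chemistry": [0, 1, 2, 3, 5, 8, 12, 20, 30, 50]}
print(beamplot_points(papers, baselines))
# [(2018, 70.0), (2019, 40.0), (2020, 90.0)]
```

Plotting each point against its year, with yearly and career medians overlaid, gives the beamplot shape: a distribution of percentiles over time rather than a single score.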

 

Importantly, beamplots reveal the data behind composite scores such as the h-index, show the underlying data on a paper-by-paper basis and provide a picture of performance over time. Seeing the data in this way puts a researcher’s publications into a context suitable for comparison and unpacks the citation performance of their publication portfolio.

All of these elements help to address many criticisms of the h-index and support our stance on using profiles rather than metrics. But perhaps more importantly, we believe that this tool will encourage those who use researcher metrics to consider what is actually behind the metric and to engage more actively with the data.

 

Promoting responsible research evaluation

The data visualized in beamplots steer us away from reduction to a single-point metric and force us to consider why the citation performance is the way it is.

Beamplots are particularly good at surfacing variation in the data that should be investigated and compared against other quantitative and qualitative indicators. In this way, they are a useful narrative tool that can refute or corroborate other evaluation criteria.

It’s crucial to remember that although publication and citation data are useful indicators of research activity and impact, they must be considered alongside the many other contributions that academics make and placed in proper context for each individual. This might include where they were working at the time, the nature of any collaborative projects and the type of research involved.

It is clear to us that, if used responsibly, a beamplot will help remove the current dependence on existing single-point metrics, eliminate indicator impoverishment and raise awareness of responsible research evaluation practices. We encourage people who use researcher metrics to consider what actually makes a metric and to engage more actively with the data in order to provide new opportunities to conduct research evaluation in a responsible way.

 

Our whitepaper from the Institute for Scientific Information (ISI)™ explores this visualization in depth and gives guidance on using it in the right context to promote responsible research evaluation.

 

Download whitepaper


]]>
https://clarivate.com/blog/the-web-of-science-author-impact-beamplots-a-new-tool-for-responsible-research-evaluation/feed/ 0
Order, order! New report highlights importance of evolution in data categorization https://clarivate.com/blog/order-order-new-report-highlights-importance-of-evolution-in-data-categorization/ https://clarivate.com/blog/order-order-new-report-highlights-importance-of-evolution-in-data-categorization/#respond Wed, 10 Feb 2021 11:05:22 +0000 https://clarivate.com/webofsciencegroup/?p=71416 It is the function of science to discover the existence of a general reign of order. Dmitri Mendeleev A new report Data Categorization: understanding choices and outcomes from the Institute for Scientific Information (ISI)™ at Clarivate highlights the evolving and dynamic nature of data categorization. It addresses the way we recognize natural divisions of knowledge […]

The post Order, order! New report highlights importance of evolution in data categorization appeared first on Web of Science Group.

]]>

It is the function of science to discover the existence of a general reign of order.

Dmitri Mendeleev

A new report from the Institute for Scientific Information (ISI)™ at Clarivate, Data Categorization: understanding choices and outcomes, highlights the evolving and dynamic nature of data categorization. It addresses the way we recognize natural divisions of knowledge and research, and how we categorize publications for discovery, analysis, management and policy.

Being aware of the characteristics and limitations of the ways we categorize research publications is important to research management because it influences the way we think about established and innovative research topics, the way we analyze research activity and performance, and even the way we set up organizations to do research.

Our report introduces a new and highly flexible approach to data aggregation based on trusted research data in the Web of Science™ citation network, developed in collaboration with the leading academic scientometrics team at the Centre for Science and Technology Studies (CWTS) at Leiden University in the Netherlands.

This innovative approach, demonstrated in InCites™ Citation Topics, more accurately represents microclusters, or specialties, provides more uniform content and improves citation normalization. It also allows novel groups to emerge that were not possible with journal-based schemes.
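As a toy illustration of bottom-up grouping on a citation network, the sketch below clusters papers that are linked, directly or transitively, by citations. The production Citation Topics clustering developed with CWTS is modularity-based and far more sophisticated; simple connectivity here is only a stand-in for that algorithm.

```python
from collections import defaultdict, deque

def citation_clusters(citations):
    """Group papers into clusters of (transitively) citation-linked items.

    citations: iterable of (citing, cited) pairs. Returns a list of sets.
    A crude stand-in for the density-based clustering behind Citation
    Topics, which weighs how strongly papers cite one another rather
    than mere connectivity.
    """
    graph = defaultdict(set)
    for citing, cited in citations:          # treat links as undirected
        graph[citing].add(cited)
        graph[cited].add(citing)

    seen, clusters = set(), []
    for node in graph:
        if node in seen:
            continue
        queue, component = deque([node]), set()
        while queue:                          # breadth-first walk
            paper = queue.popleft()
            if paper in component:
                continue
            component.add(paper)
            queue.extend(graph[paper] - component)
        seen |= component
        clusters.append(component)
    return clusters

print(citation_clusters([("A", "B"), ("B", "C"), ("X", "Y")]))
# [{'A', 'B', 'C'}, {'X', 'Y'}]
```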

There are clear strengths and weaknesses in the variety of classification systems currently available, and our aim in introducing Citation Topics is to promote good practice in data management: improving knowledge, competency and confidence, and ensuring the responsible use of research metrics.

Ludo Waltman, Deputy Director at CWTS, Leiden University said: “Bottom-up citation-based classifications play a prominent role in many of the scientometric analyses that we carry out at CWTS. It is great to see that InCites users will now also be able to benefit from these powerful classifications.”

 

Download report


]]>
https://clarivate.com/blog/order-order-new-report-highlights-importance-of-evolution-in-data-categorization/feed/ 0
Adding Early Access content to Journal Citation Reports: choosing a prospective model https://clarivate.com/blog/adding-early-access-content-to-journal-citation-reports-choosing-a-prospective-model/ https://clarivate.com/blog/adding-early-access-content-to-journal-citation-reports-choosing-a-prospective-model/#respond Thu, 28 Jan 2021 14:52:16 +0000 https://clarivate.com/webofsciencegroup/?p=70650 Continuing our discussion of Early Access (EA) content and its planned appearance in the Journal Citation Reports (JCR)™, here we explain why we have chosen a phased, prospective approach to introduce EA content in 2021. For the past three years, Clarivate has been expanding the number of publishers and journals that have their early access […]

The post Adding Early Access content to Journal Citation Reports: choosing a prospective model appeared first on Web of Science Group.

]]>
Continuing our discussion of Early Access (EA) content and its planned appearance in the Journal Citation Reports (JCR)™, here we explain why we have chosen a phased, prospective approach to introduce EA content in 2021.

For the past three years, Clarivate has been expanding the number of publishers and journals that have their early access (EA) content indexed in the Web of Science™[i]. As of the end of 2020, EA content from more than 6,000 journals was included, some reaching back to materials published in an EA format in 2017. As we continued to accumulate content, we began to investigate how best to use EA content in the JCR.

Using a dataset of almost 5.3 million source items (517,000 indexed as EA) and 20.6 million citations (nearly 3.6 million either referenced in or linked to EA items), we modeled the outcomes of various methods of including EA content as part of journal performance. Two models are discussed in more detail in our discussion paper: a retroactive model and a prospective model.

Two distinct dates are associated with EA content – an ‘EA date’ marking the first availability of the Version of Record, and a ‘publication date’ tied to a volume-issue-page assignment. Items where the EA date and the publication date fall in different calendar years present a challenge to the calculation of JCR metrics: should these articles be considered in the count of items published in their EA year, or the count of items published in their publication year?

 

Evaluating retroactive or prospective approaches

The retroactive model would apply the EA date backward to all content we received as EA from 2017 onward. This would affect the 2020 Journal Impact Factor™ (JIF™) denominator for journals that were providing EA content for indexing in 2018 or 2019, potentially decreasing their JIF value and rank in category as a result of their early participation in our pilot Early Access project. There would be no effect on the JIF denominator for journals that were not providing EA content prior to 2020.

The prospective model would set 2020 as the first year for which EA content would be considered according to its EA date rather than its publication date and would continue to incorporate new content using the EA date. In contrast to the retroactive model, content that was indexed as EA before 2020 would not change the year it is counted in the JCR from its publication date to its EA date.

In both models, content published as EA in 2020 would contribute cited references to the 2020 JIF numerator, even if the item does not have a volume-year assignment until 2021 or later. This expands the number of items and citations contributing to the 2020 JIF numerator of most journals in the JCR regardless of whether they themselves publish EA content that is already indexed in the Web of Science.
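The counting rule that separates the two models can be summarized in a short sketch. The item representation and the `cutoff` parameter are illustrative assumptions, not a Clarivate schema:

```python
# Which calendar year an item is counted in for JCR purposes, under each
# model. `ea_year` is None for content never indexed as Early Access.

def counting_year(ea_year, pub_year, model, cutoff=2020):
    if ea_year is None:
        return pub_year
    if model == "retroactive":        # EA date applied back to 2017 content
        return ea_year
    if model == "prospective":        # EA date honored only from the cutoff onward
        return ea_year if ea_year >= cutoff else pub_year
    raise ValueError(f"unknown model: {model}")

# An article first available as EA in 2019 but assigned to a 2020 issue:
print(counting_year(2019, 2020, "retroactive"))   # 2019: enters the 2020 JIF denominator
print(counting_year(2019, 2020, "prospective"))   # 2020: counted by publication year
```

Under the prospective model, pre-2020 EA content keeps its publication year, so no journal’s earlier denominator shifts simply because it joined the EA pilot early.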

 

Choosing a forward-looking model

We have chosen to implement the prospective model. The retroactive model would create two populations of journals that are differentially affected – based only on when Clarivate began accepting their EA content, not on any change in the citation or publication dynamics of the journal itself.

Imposing a counting disadvantage on a subset of journals while providing a citation benefit to all is not an acceptable option. The prospective model will allow us to continue expanding the number of journals included in EA indexing as we move through 2021 and beyond. Using the EA date for all new content on-going will allow the JCR data to capture the rapid incorporation of published content and the inclusion of more current citations.

For a full overview you can download our discussion paper here.

 

[i] We define EA content as the public availability of a Version of Record in advance of the assignment of that article to a volume, issue and page. See: https://clarivate.com/webofsciencegroup/article/whats-next-for-jcr-defining-early-access/



]]>
https://clarivate.com/blog/adding-early-access-content-to-journal-citation-reports-choosing-a-prospective-model/feed/ 0
Research on the go: introducing Web of Science My Research Assistant https://clarivate.com/blog/research-on-the-go-introducing-web-of-science-my-research-assistant/ https://clarivate.com/blog/research-on-the-go-introducing-web-of-science-my-research-assistant/#respond Mon, 25 Jan 2021 10:15:02 +0000 https://clarivate.com/webofsciencegroup/?p=69091 At a time where so many researchers are working remotely, Web of Science™ My Research Assistant gives easy access to publication records of the trusted research data in the Web of Science citation index.  The events of the past year have changed the way we work, research and collaborate. As more researchers work remotely or […]

The post Research on the go: introducing Web of Science My Research Assistant appeared first on Web of Science Group.

]]>
At a time when so many researchers are working remotely, Web of Science™ My Research Assistant gives easy access to the publication records of trusted research data in the Web of Science citation index.

The events of the past year have changed the way we work, research and collaborate. As more researchers work remotely or on the go, they needn’t sacrifice ease of access.

The new Web of Science My Research Assistant is a mobile app that allows researchers to search, save and share publication records from the Web of Science and the Master Journal List on their Apple or Android mobile devices.

Whether they’re on the move or rely on wireless networks to carry out their work, researchers can use the app to tap into an unrivaled breadth of world-class literature records linked to journals rigorously selected against quality and impact criteria.

 

The content researchers trust, wherever they are

My Research Assistant is a native app optimized for use on mobile devices. It allows researchers to harness the power of the Web of Science to quickly search and save research publication records, create a curated feed of the research they care about most and easily share links to records with colleagues and scientific collaborators around the world, straight from the app.

Publication records can be shared using the device’s native share function or saved to a reading list for later viewing. Users can also navigate directly to the full-text article using the DOI within the document record.

 

Introducing feeds

A new feature unique to My Research Assistant, feeds are saved searches that automatically reload with the newest content each time the app is opened, allowing users to stay on top of the latest research. Researchers can save search criteria around multiple projects to keep their research organized. Feeds are separate from the ‘Saved Searches’ on the Web of Science desktop.

 

Empowering research collaboration with global access

To best support the global research community, the new app is available to anyone across the globe who wants to keep up to date with any scientific research field. Within the free version, users can search by topic and create up to three saved-search feeds, viewing up to 25 of the most recent search results from the past five years in the Web of Science Core Collection.

 

Those with an institutional subscription are entitled to unlimited search functionality, feeds and search results, so they can dive into the citation network and find cited, citing and related articles, and conduct deeper searches by author, research area, journal and funder.

We know that increased complexity and collaboration within the research community brings new challenges and a clear need for more advanced resources and tools. This is why we are working hard to improve the researcher experience by delivering a more personalized service and using the latest technology to spur and facilitate connections across the science community.

Stay tuned for additional features and improvements to My Research Assistant throughout 2021 and beyond.

 

Download the app via the Apple App Store or Google Play Store, or learn more.

 

This report and any statements included herein may contain forward-looking statements regarding Clarivate. Forward-looking statements provide current expectations or forecasts of future events and may include statements regarding outcomes, anticipated capabilities and other future expectations. These statements involve risks and uncertainties including factors outside of the control of Clarivate that may cause actual outcomes to differ materially. Clarivate undertakes no obligation to update or revise the statements made herein, whether as a result of new information, future events or otherwise.


]]>
https://clarivate.com/blog/research-on-the-go-introducing-web-of-science-my-research-assistant/feed/ 0