Measuring Research Impact: Demonstrating Author Impact

This guide provides an introduction to the various metrics used to measure research (author, article, journal) impact.

Demonstrating Author Impact

This section of the guide discusses various ways of demonstrating author impact:

  • Author Metrics: Citation counts, h-index
  • Article-Level Metrics: Usage counts and altmetrics
  • Citation Benchmarking, i.e. article performance against citation baseline (Web of Science, Scopus)
  • Field-weighted Citation Impact (Scopus)

Citation Counts

Citation counts measure the impact of a particular publication or author by counting the number of times it has been cited in other works. This analysis of an author's work is one of the components used to evaluate the quality of that individual's scholarly output and the impact he or she is having on a particular discipline. Although such counting sounds relatively straightforward, it is complicated by the fact that no single citation analysis source covers all publications and their cited references.

There are a number of ways to measure this:

  • Citation count -- The total number of times an author's work has been cited
  • Average citation rate -- the ratio of total citations to the number of works authored
  • The h-index -- A researcher's h-index is determined by listing their publications in descending order of times cited and counting down the list to the last paper whose citation count is at least as large as its position in the list. Unlike the average number of citations, which can be skewed by a single highly cited article or by new articles that have not yet been cited, the h-index is believed to provide a measurement that avoids over-emphasizing these extreme cases.
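The citation count, average citation rate, and h-index described above can be sketched in a few lines of Python (the citation numbers are purely illustrative):

```python
def h_index(citations):
    """Return the h-index: the largest h such that h of the
    author's papers have each been cited at least h times."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:  # this paper's citations cover its rank
            h = rank
        else:
            break
    return h

# Example: an author with five papers
papers = [10, 8, 5, 4, 3]

total_citations = sum(papers)              # citation count: 30
average_rate = total_citations / len(papers)  # average citation rate: 6.0
print(h_index(papers))                     # h-index: 4
```

Note how the single highly cited paper (10 citations) raises the average but not the h-index, which is what makes the h-index more robust to extreme cases.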

Citation analysis as a measure of research quality should be used cautiously, for the following reasons:

  • Citation rates and practices vary widely between disciplines. Citation analysis of scholars in one field should not be compared to that of scholars in another.
  • Where a scholar publishes can greatly affect the analysis if the tools used to count citations do not index the publications where the work is cited. This is particularly true for researchers who publish in international journals, smaller regional or local publications, or in books and other non-journal publications.
  • Citation rates can be influenced by other practices such as self-citation.

The three key sources for citation information are Web of Science, Scopus, and Google Scholar.

Citation Benchmarking

Scopus and Web of Science offer tools for article benchmarking. Citation benchmarking indicates how the citations received by the document being viewed compare with the average for similar documents.

Scopus citation benchmarking takes into account the date of publication, the document type, and the disciplines associated with the item. It compares documents published within an 18-month window and is computed separately for each discipline associated with the document.

Citation benchmarking example in Scopus

In Web of Science, obtaining the baseline information takes a few more steps. In the top navigation, look for Essential Science Indicators, then navigate to Field Baselines and Percentiles. Baselines are annualized expected citation rates for papers in a research field. Percentiles indicate how many citations it takes for a paper to be in the top 1%, 10%, 20%, or 50% of papers in that field that year. For example, a 2017 materials science article that received 18 citations would place among the top 10% of papers published in that field that year.
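The percentile lookup described above amounts to comparing a paper's citation count against the field's thresholds for that year. A minimal sketch in Python, using made-up threshold values (not actual Essential Science Indicators data; only the 18-citation example comes from the text above):

```python
# Hypothetical 2017 thresholds for one field: top percentile band ->
# minimum citations required. Illustrative numbers only.
thresholds = {1: 60, 10: 18, 20: 11, 50: 4}

def top_percentile(citations, thresholds):
    """Return the best (smallest) top-percentile band the citation
    count qualifies for, or None if it falls below every threshold."""
    for pct in sorted(thresholds):  # check 1%, then 10%, 20%, 50%
        if citations >= thresholds[pct]:
            return pct
    return None

print(top_percentile(18, thresholds))  # 10 -> top 10% of the field
```

In practice you would read the real thresholds for your field and year off the Field Baselines and Percentiles table rather than hard-coding them.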

Web of Science baseline percentile example

More information about Web of Science tools for author benchmarking can be found in the Authors / Researchers: What Is Your Impact? guide by Clarivate Analytics. Please note that the University Libraries does not currently subscribe to the InCites section of Web of Science.

Field-Weighted Citation Impact (FWCI) in Scopus

The FWCI score in Scopus indicates how the article's citation count compares to that of similar articles in the same field and timeframe.

A score of 1.00 is the "global average" and means the article is cited at an average level for its field. Articles with an FWCI greater than 1.00 are performing better than the global average; an FWCI below 1.00 suggests the article may be underperforming.
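The FWCI is, at heart, a ratio: the citations an article actually received divided by the citations expected for similar documents (same field, publication year, and document type). Scopus computes the expected value from its own data; the sketch below only illustrates the ratio itself, with made-up numbers:

```python
def fwci(actual_citations, expected_citations):
    """Field-Weighted Citation Impact: actual citations divided by
    the average citations expected for similar documents (same
    field, publication year, and document type)."""
    if expected_citations <= 0:
        raise ValueError("expected citations must be positive")
    return actual_citations / expected_citations

# Hypothetical article: 30 citations where similar articles average 12
print(fwci(30, 12))  # 2.5 -> cited 2.5 times the field average
```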

Important To Know

Because the FWCI comes from Scopus, only documents indexed in the database (1996 to the present) will have an FWCI.
Because the FWCI is field-normalized, it should in theory be a better indicator of performance than a raw citation count.

How to Find the FWCI

Locate the article in Scopus and go to its full record (Document Details). The FWCI is displayed in the Metrics area:

Scopus Field Weighted Citation Impact display

Article-Level Metrics

Along with citation data, many databases now display usage data for articles. The count reflects the number of times a user of that database has accessed the article's full text or saved the article.

For example, Web of Science displays usage for "Last 180 Days" and "Since 2013":

Web of Science Usage Counts

Scopus displays a variety of metrics in addition to citation data: usage in EBSCO databases, captures on sites such as CiteULike or Mendeley, and Wikipedia and social media references. These additional metrics are typically referred to as altmetrics, or alternative metrics. Besides Scopus, PlumX Metrics are available on these platforms: EBSCOhost databases, EBSCO Discovery Service, ScienceDirect, and Engineering Village.

Scopus PlumX Metrics example

Web of Science, Scopus, and Google Scholar: Which to Use?

Q: When looking for citation counts or h-index, is it better to use Web of Science, Scopus, or Google Scholar?

A: Since each indexes different content, it is a good idea to search all three, export the results into RefWorks or another citation manager, and remove all duplicates. You may also want to consult discipline-specific databases that offer citation data.

Researcher Profiles

Researchers should take steps to ensure that their online presence reflects their work and scholarly contribution by creating or reviewing their profiles in Scopus, Web of Science, and Google Scholar.