I cannot stress enough how much I DISLIKE the focus research universities place on journal impact factors and ISI citation counts. Both of these are really the work of one organization, Thomson Reuters. The Science Citation (and Social Science, etc.) indexes offered a great service to researchers before web-based and open access publishing hit the planet in a big way. Rather than being one of many resources for identifying related bodies of work, however, ISI citations and journal impact factors are now used to make and break people’s careers, as might be the case at Northeastern. Although these metrics are only two sources of data about the impact of research on the local and broader community, they are being treated as the arbiters of whether or not research is valuable. This is extremely problematic for several reasons (and for anyone who wants a refresher on what an impact factor actually measures, I have sketched the formula just after this list):
1. Not all good journals are indexed by Thomson Reuters. This means that a good publication is deemed “not-so-good” simply by virtue of not having the Thomson Reuters special seal of approval. This is funny, since people have been pointing out for years that Google Scholar offers a more accurate, holistic view of scholarly impact.
2. Disciplines are not equally indexed by Thomson Reuters. Larsen and von Ins (2010) noted that the Science Citation and related indexes simply do not provide the kind of coverage that open access indexes like Google Scholar can offer. The traditional sciences have more coverage than the social sciences, and disciplines that use alternative publication venues are highly underrepresented. For example, this means that computer science, which has great traction with conference proceedings AND is ahead of the curve on open access publishing, use of Creative Commons, and innovative strategies for getting research out, looks “not valuable” when Thomson Reuters citation counts and journal impact factors are the de facto metric.
3. Most people who publish, including some pretty important journal editors, see journal impact factors as a poor way to assess research value. Or, as the DATA PUB blog would say, impact factors are a broken system. All sorts of web-based alternatives exist, but somehow aren’t being valued by the administrators, granting agencies, and other people who make decisions based on “research impact”.
4. Most frightening of all, organizations like the Association of American Universities (AAU) are setting a precedent that gives Thomson Reuters power over the kinds of research universities are willing to invest in. AAU is considered to be pretty elite; their website explains that “AAU member universities are on the leading edge of innovation, scholarship, and solutions that contribute to the nation’s economy, security, and well-being. The 60 AAU universities in the United States award more than one-half of all U.S. doctoral degrees and 55 percent of those in the sciences and engineering.” Getting into or falling out of the AAU is a BIG DEAL. Look at what happened when the University of Nebraska was kicked out, and at the jealousy other schools felt when Georgia Tech was let in.
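Here is the sketch I promised above. The two-year impact factor that Thomson Reuters publishes in its Journal Citation Reports is nothing more than a simple ratio:

IF(y) = (citations received in year y to items the journal published in years y-1 and y-2) / (number of “citable items” the journal published in years y-1 and y-2)

Note that the denominator counts only what Thomson Reuters deems “citable” (mostly research articles and reviews), which is one well-documented reason the number is easy to game and a poor proxy for the value of any individual paper.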
How does AAU decide if an institution is elite enough to be a member? They have a very nice membership policy document, published in November 2012, that you can download from their website. AAU puts universities through two stages of analysis. The first, more quantitative stage looks at four metrics; to quote AAU directly:
1. Competitively funded federal research support.
2. Membership in the National Academies (NAS, NAE, IOM).
3. Faculty awards, fellowships, and memberships.
4. Citations: Thomson Reuters InCites™.
Phase 2 metrics are more complicated, but let’s be clear: ONE of only FOUR criteria used to initially decide if universities are elite enough to be in the AAU is based on…Thomson Reuters’ metrics. Let’s think about this logically:
1. Universities want to be in the AAU, much like college football teams want to play in the Rose Bowl.
2. To be in the AAU, universities have to get lots of federal grants; employ people who are in the National Academies; employ people who receive awards, fellowships, and elite memberships (apparently, there is a list of such things that count); and must be affiliated with publications that are indexed by Thomson Reuters.
3. Universities that want to get into (or stay in) the AAU must increase their metrics. Faculty at research institutions already seek and receive federal funding, National Academy membership, and awards. Faculty also publish, but not necessarily in so-called “ISI journals”.
4. Thomson Reuters indexes only a fraction of all the articles published each year.
5. Universities seeking to increase AAU ranking may be tempted to treat ISI publications as more valuable than publications in venues not indexed by Thomson Reuters.
All of which means that research universities could fall into a trap of allowing Thomson Reuters to indirectly set research agendas! How on earth did we reach a point where a third-party company has such power to control the types of research that are valued, funded, and supported by our academic institutions?
I wonder how things would change if Thomson Reuters dropped the evaluation process for journals and simply started indexing everything? Oh, wait – Google Scholar already does that.
I should note that I publish in both ISI and non-ISI journals. Since my work is interdisciplinary, my personal decision on where to publish reflects which communities I want to reach, and I often have to make a judgement call based on where the work will have the most impact. This is not impact as reflected by some outside metric, but impact as I think it should be viewed: Who needs to see my research? What other scholars could be impacted by my research? Which community will have the greatest impact on related future work? In essence, I need to figure out with whom I want to have a scholarly conversation, and publish accordingly. I would be ignoring many valuable colleagues and groups if I limited my publication to ISI journals, so I have always simply refused to allow my publishing decisions to be dictated by an elitist metric.