Iain Craig
Associate Director, Market Research & Analysis, Wiley

Summer is Impact Factor (IF) season. Publishers, societies, and journal editors the world over anxiously await the outcome of the latest version of the Thomson Reuters Journal Citation Reports (JCR). Will their journal IFs have increased or decreased? How will they compare with the competition? Has a new journal made it into the JCR for the first time? Many faculty are equally interested in the JCR rankings – universities and funders use this data to help inform decision-making about tenure, promotion, and appointments, as well as research funding.

Wiley journals have once again performed well – over three quarters of our journals (1,192 titles) now have Impact Factors, following the announcement of the 2012 JCR on June 17. Of these, 25 achieved a top category rank across 31 JCR categories, compared with 21 journals ranked top in 22 categories the previous year. Similarly, the number of Wiley titles indexed in the top 10 of their JCR categories increased by 27 to 264. And, of publishers with more than 100 journals indexed in the 2012 JCR, we had the largest proportion of titles (just under 60%) increasing in Impact Factor.

All cause for celebration, for sure. But this year there is an added dimension to the IF results. The recently published San Francisco Declaration on Research Assessment (DORA) challenges the use of the IF and other journal metrics in decision-making. DORA grew out of discussions at the 2012 annual meeting of the American Society for Cell Biology with a group of editors and publishers of scholarly journals, and it focuses on three main themes:

    • the need to eliminate the use of journal-based metrics, such as Journal Impact Factors, in funding, appointment, and promotion considerations;

    • the need to assess research on its own merits rather than on the basis of the journal in which the research is published; and

    • the need to capitalize on the opportunities provided by online publication (such as relaxing unnecessary limits on the number of words, figures, and references in articles, and exploring new indicators of significance and impact).
Interested parties are invited to become signatories to the Declaration, and over 300 organizations and more than 8,100 individuals have already done so.

Others, while generally supportive, feel that some modifications are needed. Dr Peter Goelitz (Editor-in-Chief of Angewandte Chemie, which is published by Wiley-VCH on behalf of the German Chemical Society) supports the first two themes, but is not planning to sign the Declaration because “the recommendation to ‘explore’ new indicators goes in the wrong direction. Article-level metrics are already with us and will be subject to ‘engineering’, just like the journal Impact Factor. I would favor a ‘Declaration’ that is outspoken in pushing back on ‘metricizing’ scientific journals and articles.” Dr Goelitz also believes that “The third theme is a ‘blurred’ one as it mixes several issues. Length of articles, proper referencing, and new indicators of significance and impact are totally different things; the length of articles is a non-issue these days, and one could argue for better and more succinct writing instead of recommending relaxing limits on length. Proper referencing has little to do with the (restricted) number of references but a lot to do with true scholarship and fairness… It is not the number of references (a ‘metric’) that is an issue.”

Wiley as an organization will not be signing up to the Declaration either, in part because it contains specific statements about intellectual property in the form of reference lists (For Publishers, item 9), where it would be inappropriate for us to agree given that we do not own all of the content we publish, and about data (For organizations that supply metrics, item 12), where we would be recommending that other commercial entities manage their IP in a certain way.

However, individual editors and society partners (and their editors) are, of course, free to sign up to DORA, and many of our societies and journal editors have already done so. If you wish to join them, simply complete and submit this online form.

In a scholarly world that is increasingly looking for ways to measure output, the IF and other journal metrics can be a valuable – if imperfect – tool, but only if used to measure the right things in the right way. Other tools, including altmetrics, are being developed to help us gain a more rounded view of a journal or article’s impact. Wiley, like many other publishers, is experimenting with these, including participating in a six-month trial of Altmetric. In addition, NISO recently announced that it will be developing standards for the collection and use of altmetrics.

It is essential that anyone using any kind of metrics to make important decisions – whether as a publisher or a society, a funder or a recruiter – really understands their shortcomings as well as their strengths.