Sara Bowman
Project Manager, Center for Open Science

There has been a lively discussion over the last few years about strategies for improving research and publishing practices in science. Beyond the research community itself, transparency and reproducibility are now priorities at many of the major funders in the U.S. and elsewhere, including the NIH, the NSF, and the White House.

Nearly one year ago, a diverse group of about 50 researchers, journal editors, funders, and society leaders gathered at the Center for Open Science (COS) in Charlottesville, VA to continue the discussion about transparency and reproducibility and to leave with concrete, actionable guidelines. Over two days, the group crafted what are now known as the Transparency and Openness Promotion (TOP) Guidelines. The TOP Guidelines provide templates of policies and procedures that journals can adopt to encourage greater transparency and reproducibility in the research they publish. With a few tweaks, funders can adopt the TOP Guidelines as policies for the research they sponsor. To date, over 500 journals and 50 organizations have become TOP Guidelines signatories.

This week, Wiley joined as the 51st organizational signatory to the Guidelines. Wiley has become an organizational signatory because so many of its journals and partner societies are taking an interest in improving transparency in general, and in the TOP Guidelines in particular. Before this announcement, 33 journals published by Wiley had already become TOP signatories (you can view, sort, and search the list by journal title, publisher, society, and subject area at cos.io/top).

Both journal and organizational signatories to TOP are expressing their support for the principles of openness, transparency, and reproducibility, with journal signatories committing to conduct a review for potential adoption within a year.

What do the guidelines say?

The guidelines are domain-agnostic, so they can be adopted across disciplines. They’re also modular, covering 8 different components of the research process:

• Data Citation
• Data Transparency
• Analytic Methods (Code) Transparency
• Research Materials Transparency
• Design and Analysis Transparency
• Preregistration of Studies
• Preregistration of Analysis Plans
• Replication

Each component includes 3 different levels, giving journals the opportunity to adopt part or all of the standards and to select the level of stringency that is most appropriate for them. This simultaneously provides flexibility and offers the benefits of standards across domains. A table summarizing the guidelines is below - Levels 1-3 are shown along with Level 0, which is not part of the guidelines but serves as a baseline for comparison. You can read the guidelines in full detail on the TOP website.

Who’s adopting? And at what levels?

The vast majority of the signatories are still in the process of reviewing the Guidelines. While it’s too early to know what levels of TOP the signatories may adopt, we do plan to gather this information as decisions are made. The TOP Committee hopes to facilitate a community of editors who can share their experiences with implementation and offer advice on best practices.

There are some great examples of the guidelines already in practice - journals that have implemented these practices before the TOP Guidelines came to fruition. These journals can be models for others trying to figure out how to implement the Guidelines. There are more journals with policies like these out there - I’m just highlighting a few:

Citation Standards
American Sociological Review and Political Analysis have implemented data citation standards for their publications.

Data, Analytic Methods (Code), and Materials Transparency
The American Journal of Political Science has author guidelines that already match Level 3 of TOP. AJPS requires authors to submit all data, code, and materials to a trusted third-party repository prior to publication, and the analyses are independently reproduced by analysts at the University of North Carolina’s Odum Institute for Research in Social Science. It’s a really interesting process, and you can read more about it in their blog post.

Preregistration of Studies
There are a handful of journals encouraging the preregistration of research studies by offering Badges to Acknowledge Open Practices to their authors - Level 2 of the Guidelines. These journals are Psychological Science, European Journal of Personality, Journal of Social Psychology, and Language Learning.

Level 3 of the Guidelines prescribes the registered report format as a submission format for replication studies. Registered reports have their study methodology and analysis plans peer reviewed before the research outcomes are known. Acceptance for publication is based on the importance of the research question and the soundness of the study plan. If the plans pass peer review, the report is granted “in-principle acceptance,” meaning that as long as the researchers follow the plan (or document and justify why they do not), the study will be published regardless of its outcome. This gets at the issue of publication bias. Perspectives on Psychological Science is facilitating and publishing Registered Replication Reports - what’s outlined in Level 3 of the guidelines, with a twist of crowd-sourcing the data collection. Perspectives publishes registered reports solely in this way, but it’s worth noting that there are nearly 20 journals publishing registered reports for novel work and replications alike.

How can you get involved?

We welcome community involvement! This is just version 1.0 - we expect TOP to evolve. We encourage feedback about what does or doesn’t work in implementation to improve the Guidelines.

If you’re an editor, add your journal to the list of signatories. If you’re an author, ask the editors of journals in which you publish to become signatories. If you’re part of a society or organization that’s in a position to become a signatory - do so! And encourage others to do so, too.

The Transparency and Openness Promotion Committee meeting was organized by COS, the Berkeley Initiative for Transparency in the Social Sciences (BITSS) and Science Magazine, and was funded by the Laura and John Arnold Foundation.

Image Credit: COS | Openness, Integrity, and Reproducibility

Open recognition and reward

Posted Jan 29, 2016
Phil Wright
Senior Marketing Manager, Author Marketing, Wiley

As we come to the end of Open Access Week, this post takes a detailed look at the theme of open recognition and reward through 4 initiatives.

Author disambiguation – enabling recognition and future reward

Author disambiguation is key to author recognition. Whether your funding organization wants to review your past work before making an award, your institution needs a list of your work before your promotion review, or potential collaborators are trying to determine whether you’re the right person for their project, how do you openly, quickly and easily differentiate your research activities from everyone else’s? How can you get the recognition and reward that you deserve – without endless form filling? Many believe that the solution is ORCiD – a voluntary and open unique identifier that connects you to your research activities – ensuring your work is unambiguously attributed and discoverable by all.
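As a brief technical aside, an ORCID iD is a 16-character identifier whose final character is a checksum computed with the ISO 7064 MOD 11-2 algorithm, which lets systems catch mistyped iDs automatically. Here is a minimal sketch of that calculation (the function name is my own, for illustration):

```python
def orcid_check_digit(base_digits: str) -> str:
    """Compute the final check character of an ORCID iD from its first
    15 digits, using the ISO 7064 MOD 11-2 algorithm."""
    total = 0
    for ch in base_digits:
        total = (total + int(ch)) * 2
    remainder = total % 11
    result = (12 - remainder) % 11
    # A result of 10 is written as the letter "X"
    return "X" if result == 10 else str(result)

# ORCID's well-known fictitious test author "Josiah Carberry" holds the
# iD 0000-0002-1825-0097; its first 15 digits yield check character "7".
print(orcid_check_digit("000000021825009"))  # prints 7
```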

As a member organization, Wiley is a big supporter of ORCiD and our online submission sites already encourage authors to register for an ORCID iD and then associate it with their account. But how does this optional unique identifier help with research evaluation and author recognition?

From August 2015, the Wellcome Trust requires grant applicants to include their ORCID iD when they sign up with its grant application system. The reason behind this is to help researchers answer the questions that secure their next grant: ‘…everyone is asking similar questions of their grant holders: what have you produced?; how openly accessible is the work you’ve published?; what shape is your career trajectory taking, and how have we changed that?’

In June 2015, Italy announced that it was implementing ORCiD nationally, with 70 universities and four research centers initially participating. ‘The project’s goal is to ensure that at least 80% of Italian researchers have an ORCID iD, with links to their research output back to 2006, by the end of 2016.’ As institutions and funders continue to look for ways to track, quantify and measure their academic output, many are hopeful that ORCiD is the foundation needed to make this possible. ‘…the incoming national assessment of research (VQR 2011-2014) will constitute an occasion not to be missed to provide all Italian professors and researchers with the ORCID identifier and to link to it the publications submitted for evaluation.’

Self-promotional tools – maximizing the recognition and reward opportunity

Research surveys, including our own, have identified that authors are active in promoting their research as they attempt to stand out from their peers and ‘get noticed’ in their respective fields. Furthermore, 89%* of these ‘self-promoters’ indicated that they were concerned with measuring the impact of their work. We therefore focus our author promotional resources on helping you get the recognition you deserve.

At the forefront of our author self-promotional strategy is our partnership with Kudos – a service that helps you measure, monitor and maximize the visibility and impact of your published articles. As of October 2015, there were more than 70,000 users, including 22,000 Wiley authors, using Kudos. You can find out more by watching our video. But what does Kudos actually achieve? “The Kudos service helps authors explain (and then share) their work in plain language, enabling people in their immediate field to skim the work more quickly, and those in adjacent fields to understand its relevance to their own work.” Charlie Rapple, Co-Founder, Kudos. This increased discoverability makes it more likely that your work will be found and accessed, and that you get the recognition and reward you deserve.

Altmetric – supporting immediate recognition and reward

With the rise of new discovery channels such as social media, blogs, videos and news outlets, there is a growing desire to incorporate new article-level metrics (alongside ‘traditional’ metrics), such as the Altmetric service, to understand the more immediate and broader impact of a research article. In 2014, Wiley rolled out the Altmetric service to all journals on Wiley Online Library. The service allows you to freely track online activity and discussions about your individual scholarly papers across social media sources, including Twitter, Facebook and blogs, the mainstream media, and online reference managers such as Mendeley and CiteULike. The Altmetric ‘score’ is updated daily and is open for everyone to see, follow and understand. In the past 12 months over 150,000 Wiley articles have received a mention#.

As part of assessing academic impact, there is growing evidence that institutions are monitoring article-level metrics in addition to traditional metrics such as citations. “In 2015, Altmetric data were included alongside usage and citation data, as part of the Usage and Impact reports that Wiley create and provide to all 53 Cochrane Review Groups every August. Cochrane Review Groups now consider Altmetric data when deciding which Cochrane Reviews to prioritize for updating. The data are also used by the groups to encourage authors to update their Cochrane reviews.” Gavin Stewart, Associate Editor, Cochrane.

This is further typified through the growing services offered to institutions such as the Altmetric for Institutions service. There is much you can do yourself to increase the online discussions around your work. Increasing your Altmetric score therefore represents a new and different way of achieving the recognition and reward that your work deserves.

Peer Review – creating recognition and reward

In this final section we turn our focus to Reviewers. Researchers spend a substantial amount of time reading and reviewing, but often feel under acknowledged for this important contribution to the community. Many journals offer some form of reward for their reviewers – for example, the ACES journals and their sister titles from ChemPubSoc Europe (CPSE) recently announced a pilot program to reward the top 5% of their reviewers with the opportunity to publish their next ACES or CPSE journal article open access, free of charge. Good feedback for reviewers is also important. In recent surveys reviewers have told us how highly they value feedback on the usefulness of their review and editorial outcomes – essentially, acknowledging that their time spent reviewing has been time well spent.

However, academic recognition and credit for reviewing activity are two sides of the same coin. At Wiley we are currently running a pilot with Publons to openly give researchers credit for their peer review activity. “Peer review is the foundation for safeguarding the quality and integrity of scientific and scholarly research. Wiley’s objective is to develop a program of reviewer services in order to engage reviewers and recognize their contribution.” Miriam Maus, Vice President - Editorial Management, Wiley.

Publons has recently partnered with ORCiD to enable reviewers to link their ORCID iDs to a Publons account, meaning that if you have a Publons profile you can easily connect your reviewer activities to your ORCID iD. We are also exploring other opportunities to uniquely attribute review activity to individuals via ORCiD.

As we look ahead, Wiley is committed to developing a program of services that engage and support both authors and reviewers, and help them gain the recognition and reward that they deserve. Open recognition and reward is here to stay, and we plan to be at the very heart of it.

*Based on 949 completed responses from a survey to Wiley authors from 2014. 610 indicated they had promoted their work in the previous 12 months, with the majority wanting more help/support to do this. 89% (of the 610) indicated that they were concerned with measuring the impact of their work.

#Generic term for anything that Altmetric finds valuable enough to be captured, e.g. Tweets, Facebook posts, Mendeley saves etc.

Image Credit/Source: Blend Images/Getty Images

Vikki Renwick
Assistant Marketing Manager, Author Marketing, Wiley

To coincide with Open Access Week 2015, we hosted a webinar to give authors tips on how to successfully publish open access: from finding the best journal, to complying with mandates and promoting their published paper.

Natasha White and Liz Ferguson, both from Wiley, were joined by Alice Meadows from ORCiD and Charlie Rapple from Kudos for a 60-minute journey through the world of open access. Authors from 15 different countries tuned in, asking some great questions about ORCiD, Kudos and the publishing process in general.

Many of the questions asked about ORCiD came from early-career researchers. A few highlights include:

- You can register for an ORCiD before you’ve published your first paper.
- You can associate anything that has a DOI with your ORCiD, as well as manually add books and book chapters.
- ORCiD is not intended to be a profiling tool – you can choose how much of your work is made public.
- Your ORCiD can be linked to your social profiles.
- If you change institutions your ORCiD can be taken with you.
- Trusted users can be added to your account to update your profile on your behalf.

Questions around the Kudos platform recapped some of the key ways in which Kudos can help promote your work:

- Kudos uses metrics from usage data, citation data and Altmetrics, and maps them against your promotional activity to identify the most effective channel for promoting your article.
- Kudos can highlight the most efficient way to increase the reach of your work.
- You can use Kudos for everything you’ve published that has a DOI.
- Institutions are increasingly using Altmetrics to monitor the reach of published work.

During the webinar we asked viewers to vote in some popup polls – here’s what we learned:

80% of viewers had not published open access.
22% of viewers who published open access did so under both green and gold open access.
64% of viewers have an ORCiD.
66% of viewers had promoted their article via social media or academic networks. The remaining viewers had not promoted their article.

If you missed this webinar or one of our previous ones, you can watch them all on demand via the Wiley Author Services Channel.

Image Credit/Source: bingdian/Getty Images

Kelly Neubeiser
Author Marketing, Wiley


The theme of this year’s Open Access Week is “Open for Collaboration” – a reminder to authors, funders, publishers and the whole of the research community how important collaboration is to advancing the open access movement. New partnerships and innovative ventures between all stakeholders – from societies to publishers – encourage the reach and growth of quality scholarly research.

The production of open access content begins with you, the researcher. As the open access publishing landscape continues to evolve, valuable resources for helping you learn about the process have never been more important. Cue Think. Check. Submit., a new cross-industry initiative designed to support researchers on the road to publication.


The mission of Think. Check. Submit. is to help researchers identify high-quality journals which would be appropriate for submitting their research. Think. Check. Submit. is an easy-to-use online checklist built on indicators that allow you to thoroughly investigate individual journals, ensuring that whichever publication you choose is the right fit for you and your work. The campaign is led by representatives from various organizations, including ALPSP, DOAJ, ISSN and SPARC.


Picture this: you’ve finished your first research article and you’d like to get it published. Which journal should you pick? Think. Check. Submit. aims to eliminate the stress of this all-too-familiar scenario. Early-career researchers will benefit greatly from this new service, as will researchers with English as a second language and those who may not have access to, or be aware of, the full extent of existing scholarly content. Ultimately, though, Think. Check. Submit. is meant to help all researchers searching for a trusted place to publish their work.


As the output of scholarly content grows, so does the risk of deception and malpractice. Think. Check. Submit. seeks to combat both issues. According to The STM Report, the number of academic journals grows by 3.5% each year – nearly 1,000 journals were launched in 2014 alone and new ones are launched daily. Since March 2014, DOAJ has processed 6,000 applications, of which 2,700 have been rejected, 1,800 are in process, and 1,500 have been accepted. That makes choosing the best journal for your work a difficult task, especially for those less experienced with the process. Now, more than ever, it is essential that you have the right tools and services to help you make informed decisions.

Beyond Think. Check. Submit., authors can turn to freely available resources such as the new Directory of Open Access Scholarly Resources (ROAD) and Quality Open Access Market (QOAM). Developed through partnerships between scholarly organizations and the International ISSN Centre, ROAD offers descriptions of over 12,000 open access journals, conference proceedings, academic repositories and more. Like Think. Check. Submit., QOAM, a service based on academic crowdsourcing, also helps authors in the market for the ideal journal for their work.

For more information, visit http://thinkchecksubmit.org/ or follow the project on Twitter.


There's no question that sitting down to begin the process of writing an article for publication can be daunting. Below are 5 tips to help you improve the overall quality of your paper, thereby improving your chances of getting published.

[Infographic: 5 tips for writing better science papers]
