Anne-Marie Green
Communication Manager, Wiley

We recently spoke with Frances Pinter, founder of Knowledge Unlatched, a non-profit enabling sustainable Open Access book publishing. She is also the CEO of Manchester University Press.


Source: Frances Pinter

Q. What is your current role with Knowledge Unlatched?
A. I am the founder and currently the executive director.


Q. Why did you decide to start Knowledge Unlatched – what problem were you trying to solve?
I was trying to find a way around a number of problems. First and foremost, I've watched the decline in the sale of monographs over the decades, especially in the Humanities and Social Sciences. I think this is a disaster because so much of the foundational knowledge generated in these subjects is first expressed and properly understood through a form of publication that is longer than a journal article. I also wanted to find a way of avoiding a new, but equally damaging, knowledge divide between those who have access to content digitally and those who do not.


Q. How would you like to see Knowledge Unlatched grow and evolve in future?
KU was set up to demonstrate three things. They are:
1. That publishers are willing to make high quality, front-list books available on an OA license in return for the payment of a single, fixed title fee by a global community of libraries;
2. Libraries from around the world could work together to share this fee; and
3. Doing so would provide a financially-viable alternative to traditional content acquisition models for both publishers and libraries.
This was successfully accomplished through the pilot project. How it develops and grows is in the hands of the publishing and library communities.


There are practical questions around how one scales up a model like KU. We decided during the pilot that it didn't make sense to set up a totally separate infrastructure ourselves, so we are talking to intermediaries who already have contacts amongst both publishers, to source the books, and libraries, to present them for unlatching.


Q. OA publishing models for books are still few and far between – do you see this changing and, if so, how and when?
There aren't too many models out there. I know HEFCE in the UK will be issuing a report in the autumn being prepared by Geoffrey Crossick. He is looking at how to encourage OA for monographs, and the general opinion at this stage is that there is not a single model that fits all needs. It is likely that a pool of money will be available for experimentation. Once that happens, people will likely come forth with a number of ideas on how to arrive at sustainable OA publishing. With the Europe 2020 OA mandate covering 80 billion euros of research, there will certainly be more brainpower addressing the issue. The approach in the US has also triggered a number of ideas - not all of which are practical. The danger is that some disciplines will move forward more quickly than others. This could have unintended consequences for those disciplines that are perceived as closed and laggardly in their efforts to open up their work - to their peers and to the public.


Q. You've spent your entire career in publishing. What has been the period of the most dramatic change?
Clearly the most dramatic change has been the move to digital. This has opened up so many exciting possibilities. However, we run the risk of trying to replicate print in digital and that will slow down the developments that are now possible. One day there won't need to be an area called 'digital humanities'. Digital will just be part of the toolkit.


For me, personally, the opportunity to work with George Soros in the mid-to-late nineties in the former communist bloc was the most exciting time of my professional life. Opening up access to content that had been previously banned, facilitating translations and changes in school curricula with new publications, and helping to develop the new private, independent publishing sector was thrilling. It was amazing to be part of those heady days of transition.


Q. You started the first woman-owned publishing company in the UK at the age of 23. Certainly, the landscape has changed, but do you feel publishing needs more female leaders today?
Well, there certainly aren't enough women at the top. And, sadly, in the UK we recently lost two top women leaders at Random House and HarperCollins. However, there has been tremendous progress and I'm a glass-half-full rather than half-empty type of person.


Q. As an expert on intellectual property rights, do you feel reforms in these areas are keeping pace with change?
I wouldn't say I was an expert in copyright. That would need a lifetime of study! I have taken an interest in how the Creative Commons license builds on copyright. I remember being asked about seven years ago whether I was a 'copyright terrorist'. This illustrated to me a misunderstanding of the whole idea of reserving some rights rather than all rights. I respect copyright, but it really is difficult to adjust it to meet the needs of the digital age. We have a cascade structure, with the main international treaties requiring that any change, apart from small adjustments nationally, needs to be agreed upon by the international community. This takes time and there is no way around that.


Q. What big changes do you think we will see in scholarly communications over the next five to ten years?
Publishing forms will change. Scholarly communications will become more informal with the rise of social media. This will make official versions of record more important, so I see a healthy role for scholarly publishers. But it will require different skill sets and I doubt the publishing company of tomorrow will look much like the publishing companies of today. None of this is particularly new. Large companies have had the foresight to plan and start converting themselves into new types of organizations. What will be interesting to see going forward is whether or not we can still maintain a diverse publishing ecosystem, or whether economies of scale and the need to trim costs will lead to the commodification of content, with value coming instead from what is done with it. I should probably mention here that I am also the CEO of Manchester University Press and I am very keen that smaller presses survive and thrive. I think they can.


Thank you, Frances.

Scott Lachut
Director of Research and Strategy, PSFK

Below is the final contribution in our series of posts from Scott Lachut of PSFK, the keynote speaker at Wiley's Executive Seminar this year. Search #WileyES14 to find all the posts on the seminar.


Every day on the internet, people create 500 million tweets, share 2.5 billion pieces of content on Facebook and upload 144,000 hours of video to YouTube. This all happens outside of the 24-hour news cycle dominated by traditional media outlets like television and print, which serves to highlight the mass of information that confronts today’s researchers. It’s no wonder they’re hungry for the next generation of tools to help them filter their results to find the most relevant sources, discover fresh materials, catalog their findings and collaborate with colleagues anywhere in the world in order to drive their work forward.


At the beginning of a project, researchers often tend to cast a wide net and explore a range of information sources before narrowing their focus. Short of skimming every article to determine its relevance, they have few options for streamlining this critical step. Circa is a news platform that aims to get people to the salient points of a story much more quickly, allowing them to dig further into those they deem worthy. As CEO Matt Galligan explains, “We want to make news easy to consume, like Cliff Notes. We don’t summarize. We take stories and break them into core elements — facts, statistics, videos and images — and add context to certain points.”



While academics are intimately familiar with the idea of article abstracts, these summaries don’t always touch on all of the nuances of a published piece of research and still require a fair amount of mental effort to sort through. What if there was a way to automate this process? In the tech space, ‘context’ is being billed as the new form of search - algorithms that take into consideration data like user preferences, location, time and other factors to serve up information that is highly targeted to the individual. MindMeld is one of an emerging class of applications that listens in on conference calls to deliver related articles, video and images, which can be swiped to the center of an iPhone or iPad screen to instantly share with anyone else on the line to spark new discussions.


Source: PSFK

Regardless of how well a piece of technology gets to know its user, there are some detractors that are wary of filter bubbles - the tendency for algorithms to show people information that often avoids different or dissenting points of view. These contrary opinions can be an essential tool when trying to reach a deeper understanding of a subject. To avoid this potential pitfall, researchers might instead choose to set up their own series of notifications and filters. IFTTT (If This Then That) is a digital programming tool that helps people “Put the internet to work for you,” in the words of the company. The service offers a simple interface that helps people connect two or more web services or Wi-Fi enabled products together to perform a set of actions. Users can choose from a menu of pre-programmed ‘recipes’ or create their own, automating processes like receiving alerts when relevant information is shared through social media channels or instantly adding relevant articles to Evernote.





No matter how digitally savvy a researcher is, there is invariably going to be analog time spent taking notes and gathering data. To keep pace with a culture that needs information to be instantly available, searchable and shareable, academics will need tools to bridge the offline-to-online divide. The Livescribe 3 is a Wi-Fi enabled pen that works in conjunction with a companion app to sync handwritten notes to the cloud. Though still in the early days, the software offers some intelligent functionality, helping streamline the process of archiving information and locating it after the fact. If the company were ever to pair the pen with a contextual assistant in the future, the technology would become even more compelling.


Source: Penn State University

As important as it is for research teams to have shared access to research and other documents from any location to ensure that every member is on the same page, it’s sometimes necessary to get people in the same room to truly advance the goals of a project. To help foster this collaborative process amongst its scientists, Penn State has installed a 7’ x 13’ visualization wall and an array of interactive tools at its Millennium Science Complex. The school hopes the ability to view and discuss data in a more social setting will contribute to new insights and discoveries.


In the end, it’s these game-changing breakthroughs that researchers are after, changing people’s perceptions about the world or establishing novel solutions to longstanding problems. By providing the proper set of tools to help them get to that point, institutions and organizations are working to advance the goals of these digital scholars. To support this radical thinking, these same entities can’t be afraid of breaking free from established processes and experimenting with entirely new methodologies. The results will be exciting to see.

UKSG 2014: Librarians tell all

Posted Jun 24, 2014
Quinn Janjic
Assistant Marketing Manager, Customer and Channel Marketing, Wiley

This past April, the EMEA Channel and Customer Marketing team completed its largest ever face-to-face librarian survey at the 2014 United Kingdom Serials Group (UKSG) Conference in the historic spa town of Harrogate, North Yorkshire. Over 2.5 days, the team surveyed over 98 librarians, hoping to discover more about their current challenges, how they would go about improving their libraries, and actions that publishers could take to make Open Access easier to manage. Their answers provided valuable, firsthand insight into the issues facing our customers and the possible solutions that lie ahead.


UKSG Wiley Librarians Survey

Scott Lachut
Director of Research and Strategy, PSFK

Wrapping up this week's coverage of Wiley's Executive Seminar is another post by keynote speaker Scott Lachut of PSFK. Next week we'll feature one more post from Scott to round out this series.


The digital revolution has upended the media and publishing industries in recent years with its ability to give more content producers more platforms on which to be heard, and audiences a near constant stream of (mostly free) news and opinions to consume. Yet despite this shift, the process of writing and reading has remained largely unchanged. Sure, authors and journalists have adopted a new set of online creation tools, but once a story has been published, it quickly becomes just another static piece of content that gets buried beneath an avalanche of what’s new and next. Although readers can share their favorite articles on social channels alongside their own point of view, they’re largely consuming these bits of content inside a vacuum devoid of context or meaningful dialogue.


Though the shelf life of a well-researched academic paper may be slightly longer than something that’s a part of the standard news cycle, it’s important that publishers embrace new social tools and platforms to rescue their authors’ works from simply becoming another historical footnote by helping their content become more dynamic and expansive.


One of social’s biggest impacts has been its rewriting of the rules defining expertise. Prior to the introduction of social channels, when a reader agreed or disagreed with an author’s assertions they had few options for discussion. Now they can instantly connect with an audience of peers (and sometimes the writer themselves) to speak about the merits or shortcomings of a particular piece of work. The democratization afforded by these tools has taken publishing from a top-down model into a two-way conversation, giving rise to powerful new voices, which otherwise wouldn’t be heard. Not only does this ongoing commentary add value to an article once it’s been published, but it also calls into question the very notion of a finished piece of work. To put it more simply, when readers start making meaningful contributions to the telling of a story, should it continually be updated to reflect those relevant facts or is their mere mention enough?


Consider the way some breaking news reporting has adapted alongside the realities of Twitter. As live events are pushed out to the public, many media outlets are moving to incorporate images, video and reactions being shared by individuals in the affected locale. This more collaborative form of storytelling recognizes the value of sharing multiple points of view, which can sometimes even call previous suppositions into doubt. While this model is only viable for certain kinds of articles, it’s important to consider how similar innovations can be applied to the current process of academic publishing.


One platform looking to make the process of research and writing more iterative is Hi. The community-powered site offers a “full stack” of creation tools that are intended to bring writers and their audiences closer together throughout the process, whether that is by helping with the development of a story idea or providing feedback after an initial draft. Hi’s founder Craig Mod explains the benefits of this model in a company blog post, “The biggest advantage is that it’s easy to weave community into each stage of the writing process. This creates a unique intimacy with an audience. It also makes building an audience feel accessible. In fact, writing on Hi feels less like using a set of tools and more like having an increasingly deepening, extended conversation.”


While not all authors may be ready to turn their writing process over to the wisdom of the crowd, most should welcome the value of healthy debate once their work goes live. Active discussion not only amplifies the reach of a published article, but also has the potential to surface relevant insights that can further the goals of research and/or place that research within the broader context of a field of study. In the same way that readers have been able to scribble notes in the margins since the dawn of printed text, new commenting tools are bringing the power of annotations to the digital world. One such service is Livefyre’s Sidenotes application, which allows readers to directly engage with specific elements of an article, whether a quote, paragraph or image, by leaving comments and sparking conversation streams.


Popular sites like Quartz and Medium have already adopted similar tools. In describing the thinking behind the decision to add this feature, Quartz senior editor Zach Seward explains, “We have a really smart audience, a lot of our readers have a lot to add, and we certainly believe in that mantra that collectively they know more than we do. So we wanted to try to design a system that would take advantage of that and give space for their thoughtful contributions while avoiding the traditional pitfalls of comments.”


While the fantasy of a genius toiling away in solitude before making a monumental discovery is still possible, it’s more likely that they will have had some help along the way. That assistance may come from a single reader comment that refined a key insight, or a sustained conversation that enabled the work to reach a much larger audience. These external catalysts are especially important in today’s saturated publishing landscape where the margin between good and great thinking is razor thin, and only the most innovative ideas gain traction. Given these realities, it’s vital that publishers provide their authors and audiences access to an emerging set of social tools, ensuring that the best research and hypotheses can be magnified through the free exchange of assenting and dissenting points of view and related context. These articles, then, can become focal points of a much larger conversation, helping advance the goals of the authors and their audiences.

Terri Teleen
Associate Publisher, Wiley

Below, the latest in our series of posts covering this year's Wiley Executive Seminar, Terri Teleen, who chaired "The Next Generation of Researchers" panel, summarizes what we learned is important to young researchers. Share your views in the comments or by Tweeting @WileyExchanges with hashtag #WileyES14.


Source: Goodshoot / Thinkstock
Source: Goodshoot / Thinkstock

What do graduate students, young researchers and junior faculty members want, need, and expect – from their communities of learning and practice, and from their experiences of researching and publishing? We assembled a panel at our 2014 Wiley Executive Seminar to tackle this complicated set of interrelated questions. Our guests were:

  • K. Bo Foreman, PhD, Associate Professor, Department of Physical Therapy, University of Utah
  • Clare Kim, doctoral candidate, program in History, Anthropology, and Science, Technology and Society (HASTS) at the Massachusetts Institute of Technology
  • Mary-Hunter (Mae) McDonnell, PhD, JD, Assistant Professor of Strategy, McDonough School of Business, Georgetown University
  • R. Patrick (Pat) Weitzel, PhD, postdoctoral fellow, National Heart, Lung, and Blood Institute, National Institutes of Health


Although the panelists represented a wide range of backgrounds and experiences, several key themes emerged quickly in the discussion.


First, for societies and associations, our panel offered a reassuring message: the next generation values many of the benefits societies and research communities have historically provided. Young researchers want to feel that they are a part of a community. They want opportunities to network and to develop relationships with people who share research interests – face-to-face whenever possible. If given the choice and opportunity, our young researchers said they always prefer to meet in person. There was consensus among the panel that virtual conferences have not, thus far, lived up to the hype.


However, the next generation also faces some new challenges, including changes in the way faculty appointments are being made. Our panelists reported that people increasingly enter, or continue in, graduate school understanding that they will need to cast a broad career net, focusing on a range of opportunities beyond the academy. They pointed to an emerging requirement for mentoring programs and career services – things which societies and associations are well positioned to provide but which have not historically been at the core of their services.


Quite by accident, our panelists all hailed from interdisciplinary backgrounds. This struck me as a theme in itself. Is allegiance to a single discipline nearing obsolescence in this fast-paced, increasingly global, interconnected environment? Clare Kim thinks it is. With that in mind, she said she would love to see societies forge alliances with one another, so that she and her peers can have access to wider networks without having to choose one community over another. Clare also pointed to a pressing need for more research funding, particularly in the social sciences and humanities. There were many in the audience who clearly felt her pain.


On the publishing side, our panelists suggested that speed of publication is extremely important to them, as is ease of the review process. An audience member asked if they feel speed has an inverse relationship to quality, and the panelists confirmed that in their view, it does not. The panel also suggested that they expect publishing norms, formats, and technologies to keep pace with their evolving expectations. For example, Bo Foreman noted the increasing importance of making video content permanent, rather than treating it as ephemeral.


Our panel agreed unanimously that their research lives happen almost entirely online – save the occasional set of handwritten notes on a printed PDF. Pat Weitzel suggested that like many of his peers, he does not peruse journal Tables of Contents in any format. He relies on aggregation services and keyword alerts to keep on top of the latest developments. Mae McDonnell joked that she does not own enough print material to fill the shelves in the office she earned recently with her first faculty appointment.


Our Executive Seminar panel was just one point of engagement between Wiley and the Next Generation of Researchers. We recognize that the pace of change is ever increasing and believe that to grow and prosper, publishers, societies and editors must remain in conversation with those who will shape and control our organizations in the years to come.


We are grateful to our excellent panelists. Bo, Clare, Mae, and Pat: we appreciated your time, and your interest in engaging with us on these important questions.

Chris Graf
New Business Director, Professional Innovations, Wiley

Below is the second in our series of posts covering last week's Wiley Executive Seminar, "The Next Big Thing." Don't forget to share your thoughts by tweeting @WileyExchanges using hashtag #WileyES14.

What’s the next big thing for research authors? Delivering innovation for authors and society members requires calm assessment of the trends that matter, careful selection of the innovations that justify introduction, and then support for authors and society members to use them. The 2014 Wiley Executive Seminar (#wileyes14) highlighted some important trends for authors, and shared news about innovations at Wiley. Here are some of my highlights, with a focus on authors.

Scott Lachut of PSFK (@scottlachut) presented his insights into digital trends for researchers and authors, through a lens provided by some of the innovations he is watching, like:

  • IFTTT.com, which monitors events in the digital world and then initiates the actions that you specify
  • hi.co, a new social publishing platform
  • premise.com, which turns big data into insight by tracking ‘macroeconomic and human development trends in real time’
  • thingful.net, which collects new kinds of big data by sharing to the web data from any enabled device

But which digital innovations most deserve our attention? Allen Moore (@allenjmoore) shared part of the answer: The innovations that matter most put authors front and center. Moore is co-editor in chief of the open access journal Ecology and Evolution. He reminded us that author attitudes are changing and, lest we forget, that what authors think will remain central in every successful society publishing strategy. Moore builds his editorial philosophy around authors as partners:


“Authors are our partners, our friends, and we should respect them and their efforts. Most importantly, we need to make this clear to them.”


If Moore had a message for publishing innovators it might be this: focus on authors, and at the same time make it clear to those authors that they have our undivided attention. Kudos is one innovation, supported by Wiley and more than 20 other publishers, that does this. Melinda Kenneway (@MelindaKenneway) shared how Kudos was born of authors’ need to do more to promote their research work:


“84% of researchers think more could be done to increase the impact and visibility of their work”


Expertise Portals are another. David Tharp of Knode (@davidtharp) explained how Wiley-Knode Expertise Portals derive new kinds of value for communities of authors automatically, from their research outputs. Tharp suggested that Expertise Portals might feed membership strategy particularly for life and biomedical societies by, for example, identifying new members, spotting rising stars among their members, or facilitating creation of new collaborations between members.


Digital innovations like these need parallel innovations in policy and operations. Victoria Forlini, from the American Geophysical Union, discussed data publishing and the AGU’s data policy. The policy says ‘all data necessary to understand, evaluate, replicate, and build upon the reported research must be made available and accessible whenever possible.’ Forlini described this as “the perfect storm for authors”. Even so, she reported that AGU has high compliance, at 95%, with its data publishing requirements:


“We’re working to make our authors’ lives run smoothly”


I take a second message for publishing innovators from Forlini’s presentation: Process and policy innovations go hand-in-hand with digital wizardry.


The third message for publishing innovators that I took from the seminar is this: Speed up. Innovation is perspiration, as well as inspiration. It requires an innovation strategy to combine expertise from professional communities and societies with a go-the-distance innovations team. Steve Miron, in closing remarks, vocalized how Wiley recognizes this need for speed:


“We’re responding to a clear imperative: Pick up the pace.”

Scott Lachut
Director of Research and Strategy, PSFK

Wiley's Executive Seminar takes place today, and over the next week we will be featuring issues, trends, and key learning points from the seminar. This year's theme is "The Next Big Thing" and, below, seminar speaker Scott Lachut starts us off discussing human engagement with big data.


What if the scientific method were no longer necessary? A bold statement, but one that Wired’s Chris Anderson posits in his 2008 article, extolling the virtues of “Big Data.” He writes, “Petabytes allow us to say: ‘Correlation is enough.’ We can stop looking for models. We can analyze the data without hypotheses about what it might show. We can throw the numbers into the biggest computing clusters the world has ever seen and let statistical algorithms find patterns where science cannot.” Simply put, take your Excel spreadsheets, throw them in an algorithmic blender, chill your insights and serve.


In the five plus years since Anderson made this claim, the process of pure science remains largely unchanged, though we’re beginning to see early signs of the data revolution beginning to influence the research process. What’s more, its impacts extend beyond the hard sciences into the social sciences and humanities as well, creating new opportunities for academics of every stripe to look for answers at a scale never before possible. Consider that Google alone has digitized more than 30 million texts as of April 2013, creating a vast trove of knowledge accessible by anyone on the planet. And this is just the tip of the proverbial data iceberg, which grows with every Tweet, like, selfie, step logged and internet-enabled device.


On top of this stream of information and updates that are permeating nearly every facet of society and daily life, we’re seeing a host of new initiatives that recognize the value in making data and research archives available to the wider community in the hopes of unleashing new breakthroughs. The Health Data Exploration project from the California Institute for Telecommunications and Information Technology is looking to bring together the disparate silos of data being generated by personal health devices and startups in the space with more traditional information gathered by the medical community to create a comprehensive resource for research and study. Similarly, UK startup Thingful is working to create a search index for the Internet of Things, providing access to datasets and information streams of the world’s connected objects. Beyond the data, the site’s pages are centered on Twitter profiles, building a community of people willing to discuss why and how they are using the devices that they add to the index.


Yet despite the sheer volume of raw information available, the vast majority of it - less than 1% has been analyzed, according to the Guardian - has yet to yield any significant meaning. Up until recently, researchers have lacked a sophisticated and intuitive enough set of tools to allow anyone, regardless of skill level, to search for answers within this data. For researchers lacking a computer science degree, nuanced data reporting usually begins and ends with a few rows and columns on a spreadsheet, and maybe a formula or two, but even these somewhat antiquated systems can be given new life with the right set of tools. Kinetica is an iPad application that lets users ‘physically’ manipulate data from a spreadsheet, remixing the points into highly visual charts and graphs. Developed by the Human-Computer Interaction Institute at Carnegie Mellon University, the platform was designed to help people uncover new patterns and connections within their data.


Source: Recorded Future
Source: Recorded Future

While it’s important to consider how to mine insights from fresh research, what are researchers to do with the legacy of information that already exists? Science has classically been adept at handling numbers and stats, but it is only beginning to ascertain how to unearth meaning from other types of content, particularly written text. However, as advanced algorithms become better at understanding human language and sentiment, there is an entirely new set of possibilities for what can be accomplished. Researchers need only ask the right questions. To that end Baylor College of Medicine partnered with IBM to develop software that could read back through 60,000 research papers focused on p53, a protein that is present in most cancers, to look for new discoveries that could be applied to novel drug development. While still in early stage trials, the software has proven to be successful at making connections between disparate pieces of research, identifying isolated facts that many human researchers might otherwise miss.


As the speed of information increases, there is an emerging school of thought that says that knowledge must also follow suit to enable researchers to stay on top of important trends as they happen, and one day even predict future outcomes. In order to achieve this, organizations are combining powerful analytics platforms with visual reporting tools to provide detailed snapshots of real-time data. In the case of Recorded Future, the company’s algorithms scour the web to find data associated with particular events or corporations, and plot that information along a graphic timeline. This highly curated point of view arms their clients with the intelligence to anticipate change and react accordingly.


Looping back to Anderson’s original premise on the ascendancy of Big Data, it’s becoming clear that these tools are a valuable addition to existing research methodologies, particularly where large amounts of information are in play, but not a replacement for them. To supersede the human element would be to equate rational analysis with nuanced thinking and quantifiable results with understanding – and for those we’ll continue to rely on academics, scientists, and researchers.

The Open Access Revolution

Posted Jun 11, 2014
Rachel Burley
Vice President & Director, Open Access, Wiley

The post below is an excerpt of an article first published in issue three of the Horizon 2020 Projects Portal.


fish jumping into bigger bowl
Source: RomoloTavani / Thinkstock

Between 2009 and 2012, the number of peer-reviewed academic journals increased by about 10%. Authors now have a choice of over 28,000 journals worldwide [1] in which to publish the outcomes of their research. The rising volume of research literature has been a theme of scholarly publishing for many decades, but Open Access (OA) – the most talked-about (and controversial) development in research publishing in recent years – has led to an unprecedented explosion in author choice. Calling for the free availability of research articles via the Internet, OA is an opportunity for authors to amplify the impact of their work. In this increasingly article-based economy the power of transaction lies with the author, but these new publishing models – coupled with increased pressure from funders to track the outcomes of the research they fund – mean that authors are now navigating an increasingly complex publishing environment.

At Wiley we have been working closely with our authors and exploring new services and products to meet their changing needs. We understand that authors want to publish their work quickly, they want their research to be known, and they need support around complying with the ever-increasing array of (sometimes conflicting) publishing mandates. Our goal is to develop the best possible publishing experience for authors, which will in turn help funders achieve their goals.

Open Access

First, a refresher: there are two forms of Open Access – gold and green. In gold OA, articles are made freely available by the publisher immediately upon publication (typically under a liberal license, such as the Creative Commons Attribution license, CC BY), in return for payment of an Article Publication Charge (APC) that is usually covered by the author’s funding source. A gold OA article may appear in a completely OA journal or in a hybrid journal mixing OA and subscription content.

We recognize the need to support our authors with a range of options, including immediate OA publication of their articles in most Wiley journals through OnlineOpen, our hybrid option. In addition, we have a growing portfolio of fully OA journals - Wiley Open Access.

The appeal to authors of immediate, free access to their work in a prestigious journal is compelling and payment of APCs is not necessarily an obstacle, particularly in better funded disciplines, such as life sciences.

Wiley offers a choice of Creative Commons licensing arrangements to authors in our OA publishing program, taking into account the specific requirements at an article, journal, and discipline level as well as addressing the various needs of authors, funders, and our society and other publishing partners.

In green OA, the author’s accepted manuscript (AAM – a peer-reviewed but otherwise unfinalized version of an article) is deposited in an institutional repository (IR) or a subject archive such as PubMed Central, and made freely available, typically after an embargo (usually six to 36 months) following the article’s formal publication in an academic journal. At Wiley, we recognize that authors need the flexibility of a green OA option in order to comply with funder mandates, and have adjusted our self-archiving policies to make it easier for them to do so.

Embracing Change

From the 30,000-foot perspective, OA is part of a much larger trend toward Open Science, an exciting global movement that encourages collaboration between researchers through the open sharing of data, supported by evolving technology and funder participation. With 32 fully OA journals and 1,250 journals offering OnlineOpen, many of which are published on behalf of scholarly and professional associations which are increasingly interested in OA, we are forging ahead in this new open environment. We are also finding new ways to meet the changing needs of authors, providing practical tools to help them disseminate their research and taking a fresh look at how we engage with them throughout the publishing process.

English Language Editing

For the growing proportion of authors who are not native English speakers, Wiley offers an English Language Editing Service (ELES), developed through a program of market research and implemented through a partnership with American Journal Experts. ELES offers a suite of services including English language editing, translation to and from Chinese, Spanish, and Portuguese, manuscript formatting, and customized figure preparation. The service comes with a clear guarantee of quality and rapid turnaround times to meet exacting author requirements. Authors can also opt to receive a certificate to demonstrate to journal editors that their article is ready for submission and that the English language has been reviewed and verified.

Author education

New and less-experienced authors need to learn the publishing process. Our global author workshop program guides them through all aspects of research writing and publication, from “How the peer review process works” to “How to publish a paper” to “Ethics Guidelines.” These sessions are customized for each individual market and author audience and are often hosted in collaboration with our partner societies. We ran more than 180 such workshops during 2013, both live and as virtual events or webinars. Our program in emerging markets such as China, Brazil, and Turkey is especially robust, but we also run workshops in mature markets like the US and UK, since young and early career researchers do not always have access to this sort of training at their institutions.

We are also creating brief videos, for example to help authors understand how to comply with the recently implemented Research Councils UK (RCUK) mandate.


In April 2013, we launched the Wiley Author Licensing Service (WALS) to simplify the licensing process for journals and their authors, timed alongside the release of the RCUK mandate. Seventy percent of journals published by Wiley now use WALS, and within its first year just over 100,000 licenses had been signed through the system.

Our Wiley Author Services (AS) application incorporates production tracking, author resources, and other features for our journal authors. AS enables authors to track their accepted article online right through the production process to publication. Once an article is published, authors receive free access and are encouraged to nominate up to 10 colleagues who also receive access, to drive readership and citations.

The AS website includes many valuable resources for authors, including information on publication ethics, copyright, English-language editing, tips on optimizing articles for search engines, and funder requirements, as well as links to journal-specific author guidelines, manuscript submission sites, and e-alert sign-up. We are currently developing new functionality (such as single sign on), features (such as transfer of articles), and content enrichment (through semantic tagging and application of taxonomies) to create a more streamlined and valuable experience for authors.

Helping authors make more impact

Through our technology partners, we are offering some exciting new tools to help researchers collaborate and authors achieve greater impact for their articles.

Article level metrics (ALMs) have become an important tool to establish a more complete picture of the impact of individual papers, as distinct from the publication in which they appear. Following a successful pilot phase, Wiley is partnering with Altmetric, a service that tracks social media sites for mentions of scholarly articles. Altmetric creates and displays a score for each article measuring the quality and quantity of attention that the particular article has received.

Seventy-seven percent of users during the pilot phase felt that altmetrics added value to the article, and most of them used the information to gauge the overall popularity of articles, discover researchers interested in the same area of work, or understand a paper’s influence.

Another new partnership is with Kudos, a service that helps researchers and their institutions harness their expertise and networks to measure, monitor, and maximize the visibility and impact of their published articles.

We have launched a one year trial, working with select Wiley authors to drive social media activity, and to rework abstracts and headings for better use in social media. The results will inform our future services to authors.


The OA movement represents a challenge to the traditional journal subscription model, but within a complex, shifting landscape it is also opening up opportunities for new relationships between publishers and authors and new ways for publishers to add value for researchers, funders, and libraries. “Funders and authors want to make more impact,” says Bob Campbell (Senior Publisher at Wiley), “and we are now organized to help them achieve that in this new world. It will be a fascinating time. Embrace the changes.”


1. The STM Report 2012 (Ware and Mabe, 2012)

John Webster
Senior Marketing Manager, Wiley

An earlier post on this topic ‘Search Engine Optimization and your journal article: do you want the bad news first?’ focused on tips to make your journal article optimized for search engines. Generally relevant for promoting both journal articles and books, today’s post looks beyond the journal article structure to ‘off-page’ SEO strategies.

SEO, SEO, SEO. It seems like all we hear about these days is the importance of Search Engine Optimization. But there is a good reason for that. Don’t believe me? Google it! Looking at Google search alone, the numbers are both impressive and scary at the same time:

  • Daily, Google crawls 20 billion sites and processes 100 billion searches
  • Yearly, Google makes some 525 changes to its search algorithms (the computer programs and formulas that transform search queries into meaningful results), each of which weighs more than 200 variables


So what does that all mean? Google is fast, efficient, imperative, and complex as heck.


Unfortunately it is nearly impossible for those of us who don’t live, breathe, and sleep SEO to keep up with its ever-changing landscape. However, there are some fundamental things we can do to yield return results in line with what people are actually searching for (hint: this is the core of what Google is trying to accomplish with their constant algorithm changes).


Take heart; I’m not talking about 301 redirects, site architecture, or unethical black-hat SEO. Below are a few fundamental tips we can all utilize: create valuable additional content (outside of your published works) and get people to link to you – your name, your brand, and your work (note this does not address Google Scholar, which is a topic unto itself!):


√ Content, content, content

Developing content sounds pretty easy, particularly in the academic and scientific communities where producing content is our bread and butter. However, there are additional ways you can get your name (and by extension your published content) out there in the online universe:

  1. Write a blog. Search engines love blogs. With the rise of social media platforms like Facebook and Twitter, many people question whether blogs still have a place, and the answer is a definitive YES! Sometimes people want a greater level of engagement on a certain topic, and blogs are a great way to accomplish this while also positioning yourself as a thought leader in your space.
  2. Get on Google+. It probably doesn’t come as a great shock that using Google’s social media platform will also help in other areas of Google’s offerings. In this case, it means that your individual posts can show up as Google search results, which is very important in increasing your SEO rankings. The best way to make Google+ work for you is to find relevant communities and contribute to those conversations. And best of all, if you are looking for content to post, you can repurpose your blog content on your Google+ page. It’s two for the price of one!


√ Link Building

Google weighs a link to your website just as a researcher would value a citation or a book-buyer might consider a positive review. The more inbound links you have, the more Google will value and highlight your content.

  1. Integrate .edu’s. Links from .edu’s are like gold. Google loves links from institutional websites, yet it amazes me how infrequently this is taken advantage of. If you write or publish content, make sure that you link to it from your profile page on your institution’s website. It is also critical to reach out to other members of your community to get them to link to your content as well. A word of caution: Google doesn’t like reciprocal links (for the most part), so be careful of a “you link to mine and I’ll link to yours” approach.
  2. Wikipedia. Whether we like it or not, one of the first places many people look for substantive information is Wikipedia. Try finding a Wikipedia page on a topic related to your article and add content, with your article link as a reference.
Anne-Marie Green
Communication Manager, Wiley

Elaine Alligood is Chief of Library Service for the Boston VA Healthcare System's three campuses and the 2013 recipient of the Vagainas Librarian of the Year Award.


Elaine Alligood
Source: Elaine Alligood

Q. Can you please explain your current role and how it came about?
A. It’s an ironic tale for sure! In March of 2011 I worked at VA Boston in a national VA program: the VA Technology Assessment Program. We did systematic reviews of the evidence for the Chief Patient Care Services Officer in the VA Central Office. I’d been there for 16 years. Looking back, I was in total change mode: I was divorcing my husband of 21 years, and significant work changes and challenges followed! VA New England libraries were stale: librarians around the region were retiring without replacements or succession planning. Declining morale and a lack of professional mojo among the Library Services usurped energy and creativity – nothing new was going on except fear and whining about closing libraries. Frankly, it was simply horrific! So, a colleague and I started chatting up the food chain with examples of 21st-century librarian practices, making the case that we needed to stem the tide of library closings AND medical errors while increasing and improving accurate and timely information access. We aggressively countered the misperception that all anyone needed was a paraprofessional to do document delivery. I disseminated tons of evidence from the literature to make the case (doing all those systematic reviews helped). Thus, in March 2011, my pal Joan and I were asked to be on the Libraries Re-Design Task Force! In May 2011, my work group was informed we were closing in August! Luckily, at that moment, I was on the recruitment team to hire the new Boston VA librarian. On the day I found out that my job was ending, I called my colleague, the chair of the recruitment team, and left him an amusing message letting him know I had the perfect candidate for the job! The rest is history!


Q. What are some of the challenges and rewards of your position?
A. My greatest challenges lie in promoting my services AND my capabilities across the three Boston hospitals. My experience is that most clinicians just don’t know how much we 21st-century librarians can do today. Especially challenging are the clinicians who didn’t go to medical school with a laptop under their arms. We’ve got a huge array of databases and digital content that users need to integrate into their work. It’s just like Naisbitt’s Megatrends predicted in 1982: high tech requires high touch! And that’s me – the high-touch Informationista! I want to get our users to fully experience the WIIFM (What’s In It For Me) that eContent offers for working faster and smarter.


We’re a huge hospital and not everyone knows I exist. Boston used to have 4-6 librarians, now there’s just me (thankfully, we’re recruiting right now for another librarian).


Surprisingly, there is currently no open library space, so I’ve developed several shameless self-promotion tactics to reach users anyway. I introduce myself to anyone in a white coat, in scrubs, or not obviously a patient, and hand them an eye-catching flyer. My mission is to ensure everyone knows who I am and what I can do, along with all the terrific resources available to improve their clinical and research practice. The reward is that they use them, ask more questions, call me more often, invite me to present to residents, and ask me to teach evidence-based research. I’m at the point now where I’m overwhelmed with work, but in a good way (sort of). I’m supporting three research projects, working directly with clinicians at their desktops, and with our Harvard psychiatry residents and nurse residents. I couldn’t be happier that help is on the way.


Q. What are the major differences between this role and librarian appointments you’ve held in the past?
A. My work is much more hands-on, intellectually, than ever before! These days, information needs don’t occur in the physical, dusty, creaky, library spaces—they occur at the point of care, at the point of need, on rounds, in the ER, and in the lab. With our electronic content, users can access resources anywhere they are, anywhere they have internet. So my job is to empower users, teach them to use/access these resources, integrate them into their work, connect them to alerts and assist them when they need it. My goal is to be part of the clinical team, rounding with them and hanging out in the emergency room a few days each week. In a quirky sort of way, knowledge transfer is about adjacency as much as anything else. The best part of all is that I don’t have any real purchasing, cataloging or HR tasks. So my time is really focused on improving users’ skills and connecting them to knowledge.


Q. Why is it so crucial for librarians to play a part in diagnosis and patient safety?
A. One in 20 patients in the ER dies from a missed diagnosis; medical errors are the third-largest cause of death. Stakes are high and time is short in the ER. We can help mitigate diagnostic and medical errors by working in the ER – adjacency generates familiarity. Familiarity enables the white coats to ask more questions, ask for help, and admit uncertainty. We can raise awareness of cognitive biases, train users to step back, use a diagnostic checklist tool such as ISABEL, and more. America may run on Dunkin’, but health care runs on information! Who better than an Informationista and librarians to help? We know what they need, we know what they don’t know and, above all, we know where to find it!


Q. You believe librarians need to move out of the library “place” to where the information needs are. Does this thinking extend beyond healthcare?
A. My philosophy is that everyone can use a good librarian AND a little therapy! I’m a trustee of my local Belmont, Massachusetts public library – (just can’t get enough of libraries, I suppose). I feel strongly—the evidence is growing—that as our society grows more knowledge-work driven we must effectively navigate information and knowledge resources as part of our work, our home lives, and our educations. Librarians need to flip the model a bit. Instead of sending kids TO the library to learn knowledge skills, why not work with kids directly, virtually, in the physical classroom or one on one. I think we need to mitigate the digital divide by providing access tools such as laptops, netbooks, and tablets to our users who can’t obtain them, and train them to the skill levels they’ll require.


Q. What is one piece of advice you would give to a new librarian?
A. Hone your tech skills! Always do your best, and don’t be shy about taking risks, writing a blog, creating a customized search engine, or enthusiastically promoting yourself to possible employers. Find mentors who are doing things you’d like to do. If you’re not working, think about a volunteer gig at the library/organization where you’d like to be. An act of grace can lead to opportunities.


Q. How do you see research libraries evolving in the future?
A. Great question! It seems to me, we’ll have ever more virtual access to the oldest of incunabula and loads of archival and historical content mashed into all sorts of multi-media knowledge portals, connecting history texts with images, video, interactive timelines, and more than I can list. I could be very wrong, but I think expanding the access to historical collections can only improve and expand scholarship. Old and dusty perhaps, but insights await children AND scholars.


Q. Your Twitter handle @informationista is a nod to coffee, and you mention in a tweet that it must be black and sweet. Is strong coffee key to being a successful librarian?
A. Absolutely! Strong coffee is a metaphor for strength of will, character, and a whole lot of patience. Plus, there’s evidence that coffee drinkers have lower rates of Alzheimer's Disease!


Thanks Elaine!

    Fiona Murphy
Fiona Murphy
Publisher, Life Sciences, Wiley
Mark Thorley
Source: Mark Thorley

Q: First of all Mark, thanks for taking the time to talk to us. Can you tell us what your role at RCUK involves?
A: I’m part of the Natural Environment Research Council (NERC), one of the seven Research Councils that make up Research Councils UK (RCUK). My remit includes scholarly communications – of which the two main branches are Open Access (OA) policy and research data. As Chair of the RCUK Research Outputs Network, I help coordinate the collective work RCUK does in this area. My day job, if you like, is to oversee NERC’s network of environmental data centers and NERC’s Data Policy. I’m also responsible for the NERC Library Service (which supports NERC’s own research staff in our research centers), and I work closely with my innovation and knowledge exchange colleagues in getting NERC’s data and information out there and available for use. I have recently added research outputs collection to my portfolio of activities.


Q: Your twitter handle says ‘My life is Open Access and Open Data’ – what do you mean by that?
A: I’m keen to remove as many barriers as possible to the outputs of the research we fund – both publications and data – to support research as well as other exploitation uses, both commercial and non-commercial.


Thinking back to five years ago, there’s been a sea change in the whole landscape: Open Data, the Royal Society report (Science as an Open Enterprise), the data activities the Russell Group are looking at, the establishment of the Research Sector Transparency Board: we simply weren’t having the conversations we’re having now. I believe one of the reasons for this is that data is such a cross-cutting issue – it’s almost been too difficult to conceptualize and take action on, and it’s organizationally challenging. Other drivers to action have been the Engineering & Physical Sciences Research Council (EPSRC) Policy Framework on Research Data that was implemented in 2011 and the Higher Education Institutions’ (HEIs) discussions around how research institutions take responsibility for research data in the UK.


Q: The RCUK’s mandate on Open Access has now been in effect for nearly a year – what have been the biggest challenges so far?
A: When we first put our mandate in place, some considered that the consequences could be apocalyptic – for both researchers and publishers. But, I’m happy to observe, we didn’t seem to have broken the system. I remind everyone – including myself – that this is a journey, not a single event. I’m pleased to see institutions starting to implement our policy, and we are starting to see change – for example, the work that JISC Publications and some publishers are doing on the total cost of ownership.


There are still some ongoing issues; green embargo periods are still fluctuating, for instance, and the whole landscape can be confusing for the researchers themselves. We also need to provide some more clarification around multi-funder and multi-author situations. And we’ve felt some frustration that some HEIs have been acting in an ‘anti-Gold’ OA manner in the way they advise their researchers. Clearly, there is still some more work to be done to develop a common understanding of what OA means. It’s not just ‘free to read’ for instance. We’re advocating the Gold OA route as it provides immediate, unrestricted access to the version of record.


I can understand that both publishers and libraries are looking to ‘protect’ their places within the scholarly communications landscape, but the rise of the internet means that anyone can post (in effect, ‘publish’) anything. It’s the ultimate disruptive technology for this industry, one that potentially enables the democratization of knowledge, so it’s the responsibility of all the actors involved to ensure that quality, peer-reviewed research is at least as accessible as the unsubstantiated material that will also be discoverable. To me, trusted intermediaries, such as learned societies, are key to this process, taking the open, peer-reviewed and published outputs and interpreting them for wider society. Traditional ‘closed access’ just does not align with the modern world. Research funders and institutions both have a responsibility to deliver access to knowledge – it is part of their remit.


This is not a singular challenge. Instead, as expected, we’re in a long-term transition – like a super-tanker changing direction. It’s a learning process for all of us.


Q: How optimistic are you about the future for Gold OA both in the UK and globally?
A: I’ve been very encouraged by the statistics showing that Gold OA is growing – both in the UK and globally. Publishers have responded by increasing the range and number of journals with ‘Gold’ options, and it’s still early days really. I’m also pleased to see the progress being made in the humanities, with the Open Library of Humanities getting support from the Andrew W. Mellon Foundation. Far from diverging from the STM ‘open’ trajectory, it seems the humanities are also considering how they can transition in this direction.


I’m also pleased to see the rise of the small university presses within this landscape. For example, UCL is relaunching its press, which is taking publishing back to its roots as a service for researchers that’s able to be very responsive and tailored in what it delivers.


But, challenges do remain. Sustainability and scalability are key. There are real costs to providing access to quality, peer-reviewed papers, and these need to be covered, regardless of the business model, without bankrupting the system.


We need to be engaging in viability discussions at all levels. In a world where subscription increases outstrip inflation, the Green option isn’t sustainable in the long term. I know some publishers argue that subscription prices have to increase at this rate because of the growth in the number of research papers being written. This implies to me that the subscription model could be reaching its limits of scalability. In addition, in a world where too many research papers are published for any one researcher to keep on top of the ones in their domain, we need to use text and data mining tools to extract ‘knowledge’ from these papers for us. That is best supported in an Open environment.


Gold OA offers the most transparent and scalable pricing model and, with the right license, unrestricted use of text and data mining tools. As the Finch Group pointed out, the Article Processing Charge is, crucially, scalable and transparent, which is critical in building trust among customers, allowing for flexibility among researchers, and enabling them to adapt the cost of their publishing activities to their research practices.


Various publishers have attempted to address the double-dipping issue, and of course we’re pleased to see this, but ultimately the hybrid-gold model perpetuates the institutions’ sense of frustration with a system that fails to acknowledge the contribution its own researchers have made to what is being purchased (and potentially withheld). In the UK, for instance, we are seeing Research Libraries UK (RLUK) and JISC Collections, supported by the Department for Business, Innovation and Skills (BIS), working with publishers to develop offsetting models that would take this contribution into account (for example, the Royal Society of Chemistry’s Gold for Gold initiative and the UK trial announced by the Institute of Physics). The Government, through BIS, wants to ensure a sustainable future for the scientific publishing industry – certainly in the UK, but with international cooperation – but it will be incumbent on the publishers themselves to opt in to this transition.


In the future, I believe publishers need to see themselves not just as gatekeepers to knowledge, but as enablers who add value by building services around the content. There is a world of difference between controlling access to content (the ‘closed’ model) and enabling access to content (the ‘open’ model). Services might include translations, machine reading, text and data-mining, automated exploration tools, premium submission systems, freemium business models. The key challenge is to build long-term, sustainable publishing business models in an open domain. And this not only applies to commercial publishers, but also learned societies and other public interest / not for profit organizations. In any case, as I’ve already mentioned, I’m very keen that we should be working together to achieve these goals.


Q: What about Open Data – how important is that for RCUK, for Europe and for G8?
A: Just about all I’ve already said applies equally to data. Geoffrey Boulton (lead author of the Royal Society’s report ‘Science as an Open Enterprise’) makes a number of key points about the need for transparent science that can be proved or refuted, for instance. If anything, Open Data could well be a bigger challenge for research institutions than Open Access.


Q: How do you think scholarly communication, funding and the reward systems will look in five years’ time?
A: I hope we’ll have a more 21st-century set of tools, and a well-developed Open Access system by then. I would expect journals to still exist, but perhaps more as a brand – one that represents a certain caliber of content and quality but pulls material from a number of sources (a sort of secondary publishing market). I would be disappointed if we were still mimicking a print business model in an e-world, as the signs are there that we’re finally moving away from that.


I think the metrics will also have undergone considerable change. Altmetrics now seem to be moving into this space, but other measures of quality may also emerge. I think that peer review practices among particular communities may well diverge from the traditional model. And, identifiers such as ORCID and GRANTID will become increasingly important as well.


Thanks again Mark.
