The path of scientific research is a surprisingly long one. After securing research funding, a researcher faces multiple years of experimentation before reaching a discovery worthy of publication. However, a manuscript is far from the last stop on the journey to scientific success. The publishing scientist must submit his or her work to a journal. After submission, he or she must wait for the opinions of scientific editors and fellow scientist reviewers to determine whether the manuscript has the impact and rigor to merit publication. The answer might be yes, but more commonly the scientist must endure many rounds of additional experiments, revisions, waiting, and resubmissions before this process of “peer review” deems a manuscript worthy of print. Finally, after all this, only those who have paid subscription fees to the journal may view the findings of a typical published study, even if taxpayers or donors funded the research.
Scientists widely acknowledge that evaluating scientific research is incredibly difficult. In the life sciences, the database PubMed currently hosts over 23 million research articles, each serving as a form of currency between the scientists who produce the work and the funding or hiring committees who consume and evaluate it. Scientists struggle to prove the merit and impact of their work both to peers within their field and to the broader groups of scientists that make up funding and hiring committees.
Even after the effort to get a manuscript into a respected journal, there is evidence that the revered establishment of peer review can falter. In 2011 the prestigious journal Science published a paper on a new species of bacteria that purportedly use the normally toxic element arsenic in place of the universally required phosphate in their cellular biochemistry. The field has failed to corroborate the findings, and scientists now largely consider the paper incorrect. In early 2014, several research groups failed to reproduce a groundbreaking paper on stem cells published in Nature; the lead author was later found guilty of scientific misconduct by her institution. Occasionally papers published with much fanfare fail to hold up, even after peer review in the most highly respected academic venues.
Moreover, evaluating research is more important than ever, as competition for research jobs and funding continues to escalate. No individual hiring committee has the capacity to read 300 submitted applications, but a quick browse through the journal names on each applicant’s publication list takes far less time. The concept of “journal impact factor” rests on the assumption that a paper published in a prestigious journal matches the quality of the journal’s reputation. Numerically, the impact factor is the number of citations a journal’s recent articles receive in a given year divided by the number of articles the journal published. This metric is imperfect, however: a few select papers can generate the majority of citations for a given journal, so the impact factor may not reflect the impact of an individual story. Moreover, equating the quality of an article with the journal that publishes it incentivizes scientists to focus their research on whatever will get them into top journals. In the case of the arsenic bacteria or stem cell stories of the past few years, this is not always a good thing. Researchers must battle these questions in their ongoing quests to discover and communicate their scientific findings.
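The skew described above can be seen with a little arithmetic. In the sketch below, the citation counts are invented purely for illustration: two blockbuster papers drag the journal-wide average (the impact factor) far above what a typical article in the journal actually earns.

```python
# Hypothetical illustration: how a handful of highly cited papers can
# dominate a journal's impact factor. All citation counts are invented.

def impact_factor(citations_per_paper):
    """Citations received divided by papers published (the basic formula)."""
    return sum(citations_per_paper) / len(citations_per_paper)

# 20 papers: two "blockbusters" and eighteen modestly cited articles
citations = [400, 250] + [5] * 18

mean_if = impact_factor(citations)                   # journal-wide average
median_citations = sorted(citations)[len(citations) // 2]

print(f"impact factor (mean): {mean_if:.1f}")        # 37.0
print(f"median paper citations: {median_citations}")  # 5
```

The mean of 37 says little about the typical paper, which earned only 5 citations — which is why a journal-level average is a poor proxy for the impact of any individual article.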
Design: Cindy Wang; Faces: adapted from www.webdesignhot.com
In the fall of 2012, three of the most respected scientific funding agencies decided that furthering research meant not only helping scientists conduct experiments, but also helping them publish their results. The Howard Hughes Medical Institute, the Max Planck Society, and the UK Wellcome Trust launched eLife, a new journal for the life sciences. eLife aims to streamline several stages of scientific publishing: how authors submit papers, how reviewers evaluate quality and significance, and even how fellow scientists read a story once it is published online. eLife is entirely open access, meaning that anyone can read the published findings freely. This contrasts with traditional journals, which usually require some form of payment to read most articles. University libraries struggle to budget for subscription access to a wide array of journals for their students and faculty. Often, budget limitations force libraries to pick and choose which journals they subscribe to, and publishers maintain a tight grip over their captive audience. Should an average member of the public want to peruse the research their tax dollars paid for, they must pay a $25–50 fee per article.
Scientists from Berkeley play important roles at every level of the journal’s infrastructure. Berkeley professors serve on its esteemed editorial boards, including Editor-in-Chief Randy Schekman, a recent Nobel laureate and professor in the Department of Molecular and Cell Biology. University research groups have published over twenty articles in eLife since its debut just over a year ago. Such participation in the new journal’s beginnings has stimulated discussion about peer review and open access, as eLife and its editors insist that scientists must actively strive to improve the current state of scientific publishing.
Better peer review
eLife has implemented a number of changes to make submission a more efficient process. First, authors submit an initial manuscript to the senior editor in their field. Within three days, the senior editor decides whether the paper merits peer review. If the answer is yes, the senior editor requests a full submission and hands the paper to a reviewing editor, who facilitates an online discussion with two other reviewers about the merits of the manuscript. In the case of a positive decision, the reviewing editor sums up the discussion in a single letter that clearly lists the next steps for the authors. This contrasts with other journals, which send papers out to three independent reviewers who do not interact during the review process; submitting authors then receive three sets of comments that can run the gamut from redundant to conflicting. At eLife, by contrast, the reviewers know one another’s identities and discuss the strengths and weaknesses of a paper together. The average paper goes through only a single round of review, and the median time from submission to acceptance is 42 days. Such short timelines suggest that the months or years an unlucky article may spend “in review” at many traditional journals are unnecessary.
eLife further encourages transparency in peer review by allowing authors to publish decision letters and rebuttals along with the article itself and by encouraging reviewers to sign their names to their reviews. Authors are embracing the option to publish reviewer comments with names attached. As Polina Lishko, an assistant professor of molecular and cell biology, notes, “You can decide to publish [the comments]. That actually promotes reviewers to be really fair.”
Once the peer review process is complete, eLife also distinguishes itself from other journals by the format of its published papers. As an electronic-only journal, it has no page limit, allowing authors more space to clearly construct the interpretation and discussion of their findings. The journal also encourages authors to move data that would normally languish in supplemental files into the main text. Such an attitude toward article length contrasts sharply with the highest-profile journals, which reach the broadest audience but impose the shortest article limits. A breakthrough discovery published in a high-profile journal can be compressed so drastically to meet a four-page limit that the resulting text is nearly impenetrable. Should a reader attempt to actually comprehend what is communicated, they are forced into lengthy supplemental text that has received little, if any, editorial polishing and may be just as incoherent as its companion article. Lishko agrees that by allowing more figures and text, eLife granted their manuscript “improved access of a general audience to scientific publication.”
eLife’s fast turnaround from submission to publication gave assistant professor Polina Lishko an edge over her competitors.
However, to some readers, eLife could improve on the length of its articles. When one research group discussed an eLife paper at a monthly journal club, the presenter emailed the article with the message, “this paper, published in eLife, has eleven figures and is 26 pages long.” Ideally, research communication balances detail and efficiency within a cohesive narrative. Four figures and no room for discussion may be too short, but what is too long?
So far, Berkeley researchers’ experiences publishing in eLife have been largely positive. Lishko said she submitted her paper to eLife instead of Cell because the faster turnaround made a big difference. “We had competitors who were working in this direction and we got afraid that if we submitted to Cell, it would take much longer.”
eLife also encourages authors to talk about their work before it is published, which strongly appealed to Lishko and the paper’s first author. “We felt liberated to present absolutely new and unpublished data at the scientific meeting,” she asserted. Traditional journals with longer publishing timelines often hold authors back from discussing unpublished data in public settings: the fear that a competitor will see unpublished work at a conference and quickly push out similar findings leads authors to keep data under wraps until publication. eLife’s quick turnaround, coupled with its encouragement to discuss data under review without fear of rejection should a competitor publish similar work first, gives authors more confidence to share findings ahead of publication. “That is how scientific discussion is supposed to take place,” says Lishko. As a scientist in a rapidly evolving and competitive field, Lishko strongly believes that the secrecy required in the race to publish first hurts the quality of scientific discourse: “What promotes unhealthy relationships shouldn’t be part of the scientific process. We should care about the truth and what is real, what is actually happening, not playing the competition game. What matters is not who the first is, but who is right.”
Manchuta Dangkulwanich, a graduate student in the College of Chemistry, said her group decided to submit to eLife because of the quality of work others had already published there in her field of single-molecule biophysics. “[eLife] is a new journal so we weren’t quite sure,” she admitted, but “we saw a couple papers and they were good and exciting work,” so they decided to submit. She was impressed at how the journal lived up to its advertised efficiency: “I think in the future, if I have more papers, I would consider sending them to eLife because the process was so smooth.” Dangkulwanich was also part of a research team that previously published a paper in Cell, and she said the turnaround time at eLife definitely beat the older journal’s. “It’s faster for sure.”
Papers published through eLife are completely open access; no subscription is required to view content. This decision, made by the Howard Hughes Medical Institute, the Max Planck Society, and the UK Wellcome Trust, follows in the footsteps of open access pioneers such as the Public Library of Science (PLoS). However, financial support from the three funding agencies covers the cost of running the journal, so authors pay no publication fees. This support actually makes publishing in eLife less expensive than in subscription journals: although subscription journals do not charge outright to publish articles, they levy additional fees for features such as color figures, even for color figures in supplemental text that is never printed in the hard-copy journal. PLoS and other open access journals instead charge submitting authors a publication fee in place of reader subscriptions. Because eLife exists exclusively online, it has no print costs, and its payment model cuts out the middleman: the funding agencies finance publishing directly, instead of giving money to researchers who then pay traditional publishers.
This model of course depends upon the funding agencies agreeing to fund an entire journal. The support of such respected funding agencies sends a powerful message. Open access advocate, PLoS founder, and UC Berkeley molecular and cell biology Professor Michael Eisen applauds the three funding agencies for making a statement by funding eLife to be completely free to both authors and readers. “I love the idea that with eLife, they said, ‘We are just going to fund publishing. Why should we go through this crazy game of giving our researchers money and then having them give it back to us to publish? It is stupid, let’s just fund publishing.’”
In addition to controlling publishing costs, many researchers on campus believe in the importance of free access to scientific research. Molecular and cell biology professor John Kuriyan, who sits on eLife’s senior editorial board, asks why research funded through public institutions and philanthropies should be hidden behind a paywall: “Why is it that most of my own papers are not accessible to me if I step outside the university network?” Beyond constrained literature access outside the university, Kuriyan points out that scientists in other countries also struggle against paywalls: “I am from India, and if you are a researcher in India access to the scientific literature is a major barrier.” Even closer to home, many researchers who leave universities for other endeavors lose access as well: “If you start a small biotech company here in south San Francisco, the scientific literature is closed to you unless you pay exorbitant fees or use a university account that you have access to somehow,” notes Kuriyan.
Historically, the prestige of a journal has been measured by its impact factor: the number of citations a journal receives divided by the number of articles it publishes. A high impact factor can thus be reached by publishing only a small number of cutting-edge or trendy articles. eLife adamantly insists that journal impact factor should no longer be the metric that determines an article’s worth, but for now impact factor remains the currency a researcher must use to compete for jobs and funding. As such, forgoing submission to prestigious journals like Cell, Science, or Nature in favor of a newer journal involves a certain degree of career risk. Schekman acknowledges that a paradigm shift may take time. “We are doing what we can to change that mindset. It is not going to happen overnight. People have been embracing what we do organically, but not logarithmically.”
With increasing numbers of scientific articles and citations, eLife is gaining positive reviews within the United States. Also promising is that some of eLife’s very first authors have successfully landed positions with papers published in the journal. For example, Claudio Ciferri, a former postdoc in the Department of Molecular and Cell Biology, published his paper in one of the first issues of eLife and still found a desirable job in industry on the strength of his work and letters from an eLife editor and his postdoctoral advisor.
However, eLife’s visibility still lags far behind competing journals, and some authors may not be as fortunate. Postdoc Margaret Stratton of the molecular and cell biology department said she and her co-authors decided to submit to eLife instead of Science because the new journal’s mission strongly resonated with their own beliefs. Stratton says, “We all agree that there is a problem in scientific publishing right now and the way people are judged on the science that they do.” Nonetheless, she worries about her competitiveness as she applies for academic positions. “Unfortunately, a Science paper still holds the bar,” she admits. Stratton hopes that her work will pique enough interest to stand out to hiring committees across the United States. “Hopefully [prospective employers] will judge my paper on the science that was done and not on where it was published. But we'll have to wait and see.”
Postdoctoral fellow Margaret Stratton submitted her research to eLife because she supported the journal’s vision of fair publishing.
For those applying for academic positions outside the United States, finding a job might be even more difficult. Yusong Guo, a postdoc in molecular and cell biology, says that because eLife is not an established journal, one institute in his native China automatically cut his application: “In China there is one institute that said eLife does not have an impact factor and so they rejected me.” In spite of this, Guo still received invitations for interviews both in the United States and in China, and he remains enthusiastic about eLife and its mission: “It was really a great experience to submit my paper to eLife. It’s very fast. I think the revision process is terrific,” he beamed. He also acknowledged that time might be the biggest aid to future publishing postdocs looking for jobs. “It is a new journal and it still needs some time to be recognized in the field. I think it will be a very top journal in the future. But it takes some time for people to recognize that journal.” For postdocs like Stratton and Guo, however, the question remains whether the scientists currently sitting on hiring committees internationally will notice and appreciate papers from this younger journal with different ideas.
Postdoctoral fellow Yusong Guo worries that his eLife publication lacks the prestige of other journals, hampering his job prospects.
On the other side of the coin, established scientists urge young researchers to take the risk. Schekman himself acknowledges that this is a difficult task. “The trouble is that the younger generation is not at fault here. We have created a monster.” He acknowledges that both established scientists like himself and early career postdocs looking for academic positions and funding have to take action. “Someone has to take a stand and create an alternative that is a better model that people can accept, or not. That is a big challenge. We are dealing with an entrenched bureaucracy and a mindset that is very difficult to change.” Some younger scientists wholeheartedly agree with this notion but are hoping for evidence of change before they go on the job market. “I just wish Berkeley would hire someone with an eLife paper,” remarked one postdoc.
Such dilemmas lead to central questions regarding scientific publishing: what is the role of a publishing scientist? Is it fair to ask early career researchers to risk their futures if funding and hiring committees still look for papers in “luxury journals”? The number of scientific papers published grows exponentially every year; how are scientists to preliminarily filter papers if not by journal title? eLife wants to make scientific communication and evaluation better. How?
A fascinating project of eLife’s is its online article viewer, eLife Lens. Born from the ideas of Berkeley bioengineering graduate student Ivan Grubisic, Lens is an online reading platform that makes viewing a scientific article easier. Grubisic says he got the idea shortly after the journal’s debut reception on campus. “I just got frustrated looking at the literature,” he admitted. This frustration made him reflect on how the PDFs he pulls from the Internet make good hard copies but are difficult to read electronically. “We store the PDFs [online] but we are going to read it on a screen,” Grubisic observed. After talking with Schekman about his ideas, Grubisic developed a prototype viewer and ultimately began collaborating with Michael Aufreiter from Austria and eLife staff to create Lens.
Lens uses the web browser to present a scientific article in a more useful way. An article viewed in Lens shows the text on the left-hand side; the trick is that Grubisic and coworkers linked in-text mentions of features like figures, citations, and author affiliations to an adjacent display panel on the right. This lets the reader view a figure side by side with the text that discusses it instead of flipping through the pages of a PDF, and check citations seamlessly without losing their place.
Lens is a useful online tool, but importantly, it also highlights the role of open access in the creation of new utilities. To the average reader, open access means that an article is available without a subscription fee. But in their scientific article about Lens, the developers note that “The open access scientific community has standardized much of their content into a formally annotated [standardized] format, making it easier to gain access to a large library of articles.” This standardization allowed the developers to change the look and feel of an article and to create applications like Lens. Such freedom could encourage innovation and experimentation with new ways to access, share, and understand scientific content.
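As a rough illustration of what that standardization buys tool builders, here is a minimal sketch in Python. The XML fragment is hypothetical, loosely modeled on the kind of tagged article format the developers describe; a real open access article file is far richer. Because figures carry machine-readable ids and cross-references point at them, a viewer can pair each in-text mention with its figure, which is essentially what Lens does in the browser.

```python
# Minimal sketch of what a standardized, tagged article format enables.
# The XML below is a simplified, hypothetical fragment invented for
# illustration; real open access article files are far richer.
import xml.etree.ElementTree as ET

article_xml = """
<article>
  <front><article-title>An example study</article-title></front>
  <body>
    <p>As shown in <xref ref-type="fig" rid="fig1">Figure 1</xref>, ...</p>
    <fig id="fig1"><caption>Growth over time.</caption></fig>
  </body>
</article>
"""

root = ET.fromstring(article_xml)

# Collect every figure by its id, and every in-text reference to a figure
figures = {fig.get("id"): fig.findtext("caption") for fig in root.iter("fig")}
mentions = [xref.get("rid") for xref in root.iter("xref")
            if xref.get("ref-type") == "fig"]

# A Lens-like viewer can now show each mention next to its figure
for rid in mentions:
    print(rid, "->", figures[rid])   # fig1 -> Growth over time.
```

Nothing here is specific to one publisher's layout; that is the point. When every journal tags content the same way, one parser serves an entire library of articles.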
Access to this data could also help resolve the open question of how to accurately and concisely measure the impact of an individual article. Impact factor has generally been the only metric at our disposal because, for a long time, citation counts were the only data available. If downloads, comments, ratings, and other signals could be integrated, however, it might be possible to develop more sophisticated metrics than a simple citation count. Of course, this is a very difficult problem to solve, and some scientists are rightly skeptical about the efficacy of the pursuit of better article metrics.
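To make the idea concrete, one could imagine a composite article-level score along these lines. The signal names and weights below are entirely hypothetical, invented for illustration; choosing defensible weights is precisely the unsolved problem the skeptics point to.

```python
# Entirely hypothetical sketch of a composite article-level metric.
# The signals and weights are invented for illustration only; picking
# defensible weights is the hard, unsolved part of the problem.

def article_score(citations, downloads, comments, weights=(1.0, 0.01, 0.1)):
    """Weighted sum of several usage signals instead of citations alone."""
    w_cite, w_dl, w_comment = weights
    return w_cite * citations + w_dl * downloads + w_comment * comments

# Two imaginary articles: one heavily cited, one heavily read and discussed
print(article_score(citations=50, downloads=1_000, comments=2))    # 60.2
print(article_score(citations=10, downloads=20_000, comments=40))  # 214.0
```

Under this toy weighting, the widely read and discussed article outscores the more-cited one, showing how such a metric would surface impact that a raw citation count misses.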
data source: Berkeley Science Review (126 survey participants)
However, Kuriyan points out that the tech industry has solved comparable sorting problems. A colleague introduced Kuriyan to the social media platform Twitter and dared the Berkeley professor to create an account. “I discovered something that nothing had prepared me for,” Kuriyan noted. “If you start a Twitter account, and you cultivate it for a few days by choosing whom to follow and whom to reject, you very quickly have a news feed that is highly focused, germane, and brings up things from the corners of the earth that are of interest to you.” The truth is that innovation requires starting material. In this case that material is not only the scientific information in an academic article, but also the source code that allows one to connect comments or citations to specific figures, track an article post-publication, or pursue other ideas. This is not the solution to the problem of scientific evaluation, but it is an important start. Eisen comments, “We don’t even have any idea what you can do with this stuff because we are so limited in what we allow and what we let people do. That is just the core problem.”
Will eLife create positive change for the nature of scientific publishing? Only time will tell. Its peer review structure and online article viewer indeed increase the efficiency of scientific communication. With the support of three premier funding agencies, the journal strives to send a powerful message of speed, transparency, and innovation to the biomedical research community worldwide. The future of scientific communication does not merely rest on the researchers that discover new scientific insights or the journals that publish them. Funding and hiring committees must embrace a paradigm shift in what determines research value if true change is to take place in the life sciences. In this sense, the mere fact that three funding agencies founded eLife with the intent to impact publishing might be its most striking statement.
Publishing affects life scientists at every level, and the struggles to submit, publish, and evaluate science must evolve into proactive change. eLife’s debut has given life scientists a platform to discuss what changes could or should occur in publishing. Schekman remains optimistic: “[eLife is] working on people at all levels. Things can change.”
Featured image credit: Cindy Wang
This article is part of the Spring 2014 issue.