Jennifer Lin – 2018 August 12
Our newest dedicated content type—peer review—has received a warm welcome from our members since its rollout last November. We are pleased to formally integrate peer reviews into the scholarly record: giving the scholars who participated credit for their work, ensuring that readers and systems can dependably get from the reviews to the article (and vice versa), and making sure that links to these works persist over time.
Many of our members already make the peer review history of an article available to researchers in various ways. That extra effort to post review materials alongside the article will now go further once the reviews are registered with us and linked to the journal article. These members treat publishing peer reviews as a standard part of their publishing operation: the scholarly contributions of their editors and referees are validated, stewarded, and published in the same manner as the articles themselves. To fully realize this, they are ensuring that these publications are discoverable, citable, and part of the formal scholarly record, available to the thousands of systems that draw on Crossref metadata.
Article metadata + peer review metadata = a fuller picture of the evolution of knowledge
As of August 12, 2018, three publishers have registered 12,438 peer reviews in the dedicated content type (and schema) we rolled out last November: PeerJ (prefix 10.7287) with 12,015 at the time of writing, Stichting SciPost (10.21468) with 297 works, and ScienceOpen (10.14293) with 126 reviews of papers on its post-publication platform.
Some of the peer review metadata we collect overlaps with what we accept for articles: for example, an ORCID iD to identify the reviewer, editor, and/or author, and license information. Other metadata is unique to this content type. Review metadata is quite distinct from article metadata and is important to collect, not only because a review is a discrete publication in its own right, but also because it provides richer context for the results shared in the associated article. Reviews are authored by different people than the paper’s contributors (author responses/rebuttals excepted), and they need not carry the same license.
To date, however, none of this ORCID and license metadata has been registered for reviews. (From the publishers we’ve talked to, this is largely due to limitations in their technology systems.) And as with other content types, we link scholarly materials together in the metadata and fill in the research nexus graph through relations.
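Registered peer reviews are retrievable from the Crossref REST API by filtering the `/works` endpoint on the peer-review work type, optionally narrowed to a member prefix. The sketch below only constructs such a query URL (the endpoint and the `type`/`prefix` filters follow the public REST API; the prefix value shown is PeerJ’s, as cited above):

```python
# Sketch: building Crossref REST API queries for registered peer reviews.
# The /works endpoint and the type:peer-review and prefix filters are part
# of the public REST API; the prefix below is PeerJ's (10.7287).

from urllib.parse import urlencode

API_BASE = "https://api.crossref.org/works"

def peer_review_query(prefix=None, rows=20):
    """Return a /works query URL filtered to the peer-review work type,
    optionally restricted to a single DOI prefix (e.g. 10.7287)."""
    filters = ["type:peer-review"]
    if prefix:
        filters.append(f"prefix:{prefix}")
    return f"{API_BASE}?{urlencode({'filter': ','.join(filters), 'rows': rows})}"

print(peer_review_query("10.7287", rows=5))
# -> https://api.crossref.org/works?filter=type%3Apeer-review%2Cprefix%3A10.7287&rows=5
```

Fetching that URL returns a JSON `message` with the matching review records and their metadata.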
There’s no better way to understand peer review metadata than to look at real examples from our members:
Review-specific metadata is also critical to capturing the shape of the scholarly discussion. It includes:
PeerJ, SciPost, and ScienceOpen have registered this whole set where applicable (review rounds do not apply to post-publication reviews), with the exception of the recommendation.
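To make these review-specific fields concrete, here is an illustrative fragment of how they might appear in a deposit. The snippet is hand-written and simplified, not a complete or validated Crossref submission; the attribute names (`stage`, `type`, `recommendation`, `revision-round`) follow our reading of the peer review schema, and the values are fictitious:

```python
# Illustrative fragment of review-specific deposit metadata (not a
# complete, valid Crossref submission). Attribute names follow the peer
# review schema; values are fictitious.
import xml.etree.ElementTree as ET

FRAGMENT = """
<peer_review stage="pre-publication"
             type="referee-report"
             recommendation="major-revision"
             revision-round="1">
  <titles>
    <title>Referee report of: Dog: A Methodology for the Development of Simulated Annealing</title>
  </titles>
</peer_review>
"""

review = ET.fromstring(FRAGMENT.strip())
print(review.get("stage"), review.get("revision-round"))
```

Together, these attributes record where in the editorial process a review sits and what it concluded, which article metadata alone cannot express.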
Published peer reviews uniquely highlight how research ideas evolve over time, spotlighting research as a collective effort involving multiple individuals. The more metadata, the bolder the story. We have created a set of reference metadata (fictitious) to illustrate this. Josiah Carberry submits a manuscript to the Journal of Psychoceramics entitled “Dog: A Methodology for the Development of Simulated Annealing.” It undergoes two rounds of review with two referees in each round. The article (https://doi.org/10.5555/12345681) is published and registered on May 6, 2012, along with the history of peer review materials on the same day:
Revision round 1
Published reviews can show peer feedback in progress: scholarly discussion unfolding as expert ideas build upon one another. Many of us have traditionally treated the article’s publication as the climactic event, but the story in fact doesn’t end there. Pre-publication becomes post-publication. Throughout this time, research is validated and sprouts into new ideas.
Peer review platform Publons is working on registering reviews authored on its platform with us. Once that happens, the PeerJ article “Transformative optimisation of agricultural land use to meet future food demands” by Lian Pin Koh, Thomas Koellner, and Jaboury Ghazoul (https://doi.org/10.7717/peerj.188), with three scholarly discussions published over the course of peer review, would also be accompanied by a fourth, not-yet-registered discussion that occurred after publication, from Gene A. Bunin (https://publons.com/publon/3374/).
In my investigation of the review publications registered so far, two examples cropped up that highlight the richness of the research process: they show a set of research results not only evolving through scholarly discussion, but also being folded into new research outputs.
1) A PeerJ article, “Software citation principles” (https://doi.org/10.7717/peerj-cs.86), has had a very rich life: https://api.crossref.org/works/10.7717/peerj-cs.86. It was originally submitted as a preprint and underwent multiple iterations of improvement (https://doi.org/10.7287/peerj.preprints.2169, https://doi.org/10.7287/peerj.preprints.2169v1, https://doi.org/10.7287/peerj.preprints.2169v2, etc.). It then underwent peer review, and three referee reports are published alongside the final publication:
We glimpse a view of time unfolding here:
NB: in the review metadata, all the dates provided reference September 19, 2016, the date the reviews were published with the accompanying research article. To make the metadata truly useful, we recommend that publishers of pre-publication review materials provide the date each review was received rather than the date it was published.
The reviews were then cited in three versions of the F1000Research article, “A multi-disciplinary perspective on emergent and future innovations in peer review” (https://doi.org/10.12688/f1000research.12037.1, https://doi.org/10.12688/f1000research.12037.2, and https://doi.org/10.12688/f1000research.12037.3). These three all link up on the Crossref metadata map. The visualization below is only an entrypoint into this picture of research dissemination and the spread of ideas.
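These review–article links are what the relations mentioned earlier carry. In a REST API works record they surface under a `relation` field; the sketch below walks that structure on a hand-built sample record (the sample DOI and record shape are illustrative, not fetched live, though the `is-review-of` relation type is one the API reports):

```python
# Sketch: reading review-article links out of a Crossref works record.
# The sample record is hand-built for illustration; in the live API a
# similar structure appears under message["relation"].

sample_review = {
    "DOI": "10.7287/peerj-cs.86/reviews/1",   # hypothetical review DOI
    "relation": {
        "is-review-of": [
            {"id": "10.7717/peerj-cs.86", "id-type": "doi"}
        ]
    },
}

def reviewed_dois(record):
    """Return the DOIs of the works this record is a review of."""
    links = record.get("relation", {}).get("is-review-of", [])
    return [link["id"] for link in links if link.get("id-type") == "doi"]

print(reviewed_dois(sample_review))  # ['10.7717/peerj-cs.86']
```

Following these relations in both directions is what lets systems traverse from a review to its article and back again.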
2) András Láng served as a reviewer for a paper by Danilo Garcia and Fernando R. González Moraga published as “The Dark Cube: dark character profiles and OCEAN” (https://doi.org/10.7717/peerj.3845). As of the blog release date, this paper has been cited by two sources:
Source: https://doi.org/10.7717/peerj.3845, CC-BY 4.0
What this view of the paper does not reveal is that Láng’s review (https://doi.org/10.7287/peerj.3845v0.1/reviews/2) provided such insight to the original researchers that the first author (Garcia) incorporated the discussion into his subsequent work. This is documented in the citation list of that new publication, “Encyclopedia of Personality and Individual Differences” (https://doi.org/10.1007/978-3-319-28099-8_2302-1). It is a wonderful illustration of the ways in which peer reviews can operate like other publications, and it is far from unique. Until now, though, we had no formal, programmatic way to capture such links, as we do now that these materials are registered properly as reviews.
In the same spirit of ever-evolving knowledge, we also continue to update our schemas based upon community feedback. Are references important? Tell us! What new peer review metadata is important to answer your questions or help you do what you need? Members, if you are interested in registering your peer review content with us, please get in touch.