Jennifer Lin – 2018 October 15
Over 100 million unique scholarly works are distributed into systems across the research enterprise 24/7 via our APIs, at a rate of around 633 million queries a month. Crossref is broadcasting descriptions of these works (metadata) to all corners of the digital universe.
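As a minimal sketch of what retrieving one of those metadata records looks like: the snippet below builds a request URL for the public Crossref REST API (`api.crossref.org/works`) and parses a trimmed, hypothetical response envelope. The DOI uses Crossref's 10.5555 test prefix and the title is invented; real use would add HTTP fetching, error handling, and polite contact headers.

```python
import json
from urllib.parse import quote

CROSSREF_API = "https://api.crossref.org/works"

def work_url(doi: str) -> str:
    """Build the REST API URL for a single registered work."""
    return f"{CROSSREF_API}/{quote(doi)}"

# A trimmed, hypothetical example of the JSON envelope the API returns;
# the DOI is on Crossref's 10.5555 test prefix and the title is invented.
sample = json.loads("""
{
  "status": "ok",
  "message": {
    "DOI": "10.5555/example",
    "type": "journal-article",
    "title": ["An Example Work"]
  }
}
""")

record = sample["message"]
print(work_url(record["DOI"]))  # https://api.crossref.org/works/10.5555/example
print(record["type"], "-", record["title"][0])
```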
“Pre-prints” are sometimes neither Pre nor Print (cf. https://doi.org/10.12688/f1000research.11408.1), but many do go on to be published in journals. While researchers may have different motivations for posting a preprint, such as establishing a record of priority or seeking rapid feedback, the primary motivation appears to be timely sharing of results prior to journal publication.
Jennifer Lin – 2018 August 12
Jennifer Lin – 2018 May 31
The Crossref graph of the research enterprise is growing at an impressive rate of 2.5 million records a month: scholarly communications of all stripes and sizes. Preprints are one of the fastest-growing content types. While preprints may not be new, the growth may well be: roughly 30% over the past two years (compared with article growth of 2-3% for the same period). We began supporting preprints in November 2016 at the behest of our members. When members register them, we ensure that links to these publications persist over time, that they are connected to the full history of the shared research results, and that the citation record is clear and up-to-date.
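That connection to the full history of the work shows up in the metadata itself: a registered preprint can point at its published journal version via a relation. The sketch below walks a trimmed, hypothetical record shaped like what the Crossref REST API returns for posted content; the DOIs are on the 10.5555 test prefix, and the exact relation shape should be checked against the live API.

```python
# A trimmed, hypothetical preprint record shaped like the Crossref REST API's
# output; the "relation" block links a preprint to its journal version.
preprint_record = {
    "DOI": "10.5555/preprint.1",   # hypothetical DOI on the test prefix
    "type": "posted-content",
    "subtype": "preprint",
    "relation": {
        "is-preprint-of": [
            {"id-type": "doi", "id": "10.5555/article.1", "asserted-by": "subject"}
        ]
    },
}

def journal_versions(record: dict) -> list:
    """Return DOIs of the published versions a preprint points at."""
    links = record.get("relation", {}).get("is-preprint-of", [])
    return [link["id"] for link in links if link.get("id-type") == "doi"]

print(journal_versions(preprint_record))  # -> ['10.5555/article.1']
```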
Jennifer Lin – 2018 March 26
Jennifer Lin – 2017 November 14
Researchers are adopting new tools that create consistency and shareability in their experimental methods. Increasingly, these are viewed as key components in driving reproducibility and replicability. They provide transparency in reporting key methodological and analytical information. They are also used for sharing the artifacts which make up a processing trail for the results: data, material, analytical code, and related software on which the conclusions of the paper rely. Where expert feedback is also shared, such reviews further enrich this record. We capture these ideas by building on the notion introduced in our “article nexus” blog post with a new variation: “the research nexus.”
Jennifer Lin – 2017 October 24
An estimated 13-20 million researcher-hours were spent doing peer reviews in 2015. What valuable work! Let’s get more mileage out of these labors and make these expert discussions citable, persistent, and linked up to the scholarly record. As we previously shared during Peer Review Week, Crossref is launching a new content type to support the registration of peer reviews. Today, we are excited to announce that we’re open for deposits.
A number of our members have asked if they can register their peer reviews with us. They believe that discussions around scholarly works should have DOIs and be citable to provide further context and provenance for researchers reading the article. To that end, we can announce some pertinent news as we enter Peer Review Week 2017: Crossref infrastructure is soon to be extended to manage DOIs for peer reviews. Launching next month will be support for this new content type, with schema specifically dedicated to the reviews and discussions of scholarly content.
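To make the idea concrete, here is a hypothetical, trimmed record for a registered peer review, shaped like the JSON the Crossref REST API returns for works. The `"is-review-of"` relation name and the record fields are illustrative assumptions, not a definitive rendering of the new schema, and the DOIs use the 10.5555 test prefix.

```python
from typing import Optional

# Hypothetical, trimmed record for a registered peer review. The relation
# name and field layout are illustrative assumptions; DOIs are test-prefix.
review_record = {
    "DOI": "10.5555/review.1",
    "type": "peer-review",
    "relation": {
        "is-review-of": [{"id-type": "doi", "id": "10.5555/article.1"}]
    },
}

def reviewed_work(record: dict) -> Optional[str]:
    """Return the DOI of the work this review discusses, if asserted."""
    links = record.get("relation", {}).get("is-review-of", [])
    return links[0]["id"] if links else None

print(reviewed_work(review_record))  # 10.5555/article.1
```

This is the provenance link the members are asking for: a reader who lands on the review can resolve their way back to the article it discusses, and vice versa.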
Jennifer Lin – 2017 March 02
Very carefully, one at a time? However you wish.
Last year, we introduced linking publication metadata to associated data and software when registering publisher content with Crossref (see “Linking Publications to Data and Software”). This blog post follows the “whats” and “whys” with the all-important “how(s)” of depositing data and software citations. We have made the process simple and straightforward: publishers deposit data and software links by adding them directly into the standard metadata deposit, via relation types and/or references. This is part of the existing content registration process and requires no new workflows.
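For the relation-type route, the deposit carries a small XML fragment asserting that the work references a dataset or software package. The sketch below generates such a fragment; the element and attribute names follow our reading of the Crossref relations schema (`relations.xsd`), so treat it as an assumption to verify against the schema documentation before depositing, and note that the dataset DOI here is a test-prefix placeholder.

```python
import xml.etree.ElementTree as ET

# Namespace of the Crossref relations schema (verify against current docs).
REL_NS = "http://www.crossref.org/relations.xsd"

def data_citation_fragment(data_doi: str, description: str) -> str:
    """Build a <program> fragment asserting that the work being
    registered references a dataset, via an inter_work_relation."""
    program = ET.Element(f"{{{REL_NS}}}program")
    item = ET.SubElement(program, f"{{{REL_NS}}}related_item")
    desc = ET.SubElement(item, f"{{{REL_NS}}}description")
    desc.text = description
    rel = ET.SubElement(
        item,
        f"{{{REL_NS}}}inter_work_relation",
        {"relationship-type": "references", "identifier-type": "doi"},
    )
    rel.text = data_doi
    return ET.tostring(program, encoding="unicode")

# Hypothetical dataset DOI on the 10.5555 test prefix.
print(data_citation_fragment("10.5555/dataset.1", "Underlying data"))
```

The fragment slots into the standard metadata deposit alongside the usual bibliographic elements, which is why no new workflow is needed.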