Blog

Request for feedback: Conference ID implementation

We’ve all been subject to floods of conference invitations, and it can be difficult to sort the relevant from the irrelevant or (even worse) sketchy conferences competing for our attention. In 2017, DataCite and Crossref started a working group to investigate creating identifiers for conferences and projects. Identifiers describe and disambiguate, and applying identifiers to conference events will help build clear, durable connections between scholarly events and scholarly literature. Chaired by Aliaksandr Birukou, the Executive Editor for Computer Science at Springer Nature, the group has met regularly over the past two years, collaborating to create use cases and define metadata to identify and describe conference series and events.

Building better metadata with schema releases

This month we have officially released a new version of our input metadata schema. As well as walking through the latest additions, I’ll describe how we’re starting to develop a new, streamlined, and open approach to schema development using GitLab, along with some of the ideas under discussion going forward.

A Lustrum over the weekend

Jennifer Lin – 2018 March 26

In Content Types, Schema

The ancient Romans performed a purification rite (“lustration”) after taking a census every five years. The term [“lustrum”](https://en.wikipedia.org/wiki/Lustrum) designated not only the animal sacrifice (“suovetaurilia”) but also the five-year period itself. At Crossref, we’re not exactly in the business of sacrificial rituals. But over the weekend I thought it would be fun to dive into the metadata and look at very high-level changes over such a period of time.
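The post doesn’t spell out how the numbers behind that look-back were pulled, but as a minimal sketch, year-by-year registration counts can be fetched from Crossref’s public REST API; the year range and filters below are illustrative assumptions, not the post’s actual method:

```python
import requests

# Count works registered with Crossref in each year of a five-year span
# ("lustrum"), using the public REST API. rows=0 asks for the total count
# only, without returning the records themselves.
API = "https://api.crossref.org/works"

for year in range(2013, 2018):  # an illustrative lustrum, not the post's exact range
    filt = f"from-created-date:{year}-01-01,until-created-date:{year}-12-31"
    resp = requests.get(API, params={"filter": filt, "rows": 0}, timeout=30)
    total = resp.json()["message"]["total-results"]
    print(year, total)
```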

The research nexus - better research through better metadata

Jennifer Lin – 2017 November 14

In Content Types, Schema

Researchers are adopting new tools that create consistency and shareability in their experimental methods. Increasingly, these are viewed as key components in driving reproducibility and replicability. They provide transparency in reporting key methodological and analytical information. They are also used for sharing the artifacts that make up a processing trail for the results: data, materials, analytical code, and related software on which the conclusions of the paper rely. Where expert feedback was also shared, such reviews further enrich this record. We capture these ideas and build on the “article nexus” notion from an earlier blog post with a new variation: “the research nexus.”

Peer reviews are open for registering at Crossref

About 13-20 billion researcher-hours were spent in 2015 doing peer reviews. What valuable work! Yet most of these expert discussions never become part of the scholarly record. Let’s get more mileage out of these labors and make them citable, persistent, and linked up to the literature. As we previously shared during Peer Review Week, Crossref is launching a new content type to support the registration of peer reviews, bringing us one step closer to changing that. Today, we are excited to announce that we’re open for deposits.
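As a rough illustration of what registration enables (the query below is an assumption for this post, not part of the announcement), deposited reviews become retrievable through Crossref’s public REST API via the peer-review work type:

```python
import requests

# Fetch a few registered peer-review records from Crossref's public REST API.
# The filter value "type:peer-review" selects the peer review content type.
API = "https://api.crossref.org/works"

resp = requests.get(API, params={"filter": "type:peer-review", "rows": 3}, timeout=30)
message = resp.json()["message"]

print("peer reviews registered so far:", message["total-results"])
for item in message["items"]:
    title = (item.get("title") or ["(untitled)"])[0]  # "title" is a list when present
    print(item["DOI"], title)
```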
