Myth busting in Mumbai

In December, Crossref’s Head of Metadata, Patricia Feeney, and I headed to Mumbai for our first-ever LIVE local event in India, held in collaboration with Editage.

Crossref membership in India has grown rapidly in recent years, with a fifth of its 500 members joining in 2017 alone. Around 40% of these new members are smaller organizations that joined through one of the eight sponsors we currently have in the country.

With such a large increase in membership numbers, it seemed timely to visit and meet both our new and longer-standing members face-to-face. Our LIVE local events provide a great opportunity for us to learn what challenges members of our community face, so we can understand how best to meet their needs. They also give us a chance to explain in detail how to benefit from the services we offer, as well as keep members informed about any future developments. A special thanks goes to Editage for all their help in organizing, promoting, and running this event with us.


The Mumbai event was held at the Sahara Star hotel and attended by participants from a range of organizations, with varying levels of knowledge about Crossref. Patricia talked about how to register your content and the importance of providing us with accurate and comprehensive metadata. She also introduced our new Metadata Manager tool, which many participants were excited to hear more about. I gave an overview of Crossref services, with a specific focus on Crossmark and Similarity Check. The afternoon was run by Editage and featured a session on ‘Helping journals and publishers to get closer to authors’, followed by a lively debate on research integrity. The debate brought up a number of interesting talking points, including how to attract more students into a career in research, issues around malpractice and plagiarism, and how to improve India’s research culture.

The Q&A part of the day highlighted a number of myths about Crossref that I thought would be worth detailing here, as other members may benefit from these explanations.

Myth #1: Crossref is a mark of publisher and content quality

We have a membership application process where we ask for different types of information and make it clear what the Crossref member obligations are. Crossref doesn’t assess the quality of its members’ content or verify members’ publication processes and procedures. It’s not our role or part of our mission to do these things.
It’s important to remember that the presence of a Crossref DOI says nothing about the quality of the content, nor does it indicate that the content is peer-reviewed or authoritative.

Myth #2: Crossref archives content

We store the metadata our members provide about a piece of content, not the content itself. Our metadata is openly available across our APIs and search interfaces. The same is not necessarily true of the full text: a DOI will take you to a landing page for a piece of content, but access to the full text will depend on the content owner’s publishing model.
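To illustrate the distinction, here is a minimal sketch of looking up a DOI’s metadata record through Crossref’s public REST API (api.crossref.org). The request returns the registered metadata, not the content itself; the DOI used in the commented example is illustrative only.

```python
# Minimal sketch: fetching a Crossref metadata record via the public REST API.
# The API returns metadata only -- full-text access depends on the publisher.
import json
import urllib.parse
import urllib.request

API_BASE = "https://api.crossref.org/works/"

def metadata_url(doi: str) -> str:
    """Build the REST API URL for a DOI's metadata record."""
    # Percent-encode the DOI but keep the slash that separates prefix/suffix.
    return API_BASE + urllib.parse.quote(doi, safe="/")

def fetch_metadata(doi: str) -> dict:
    """Retrieve the metadata record (requires network access)."""
    with urllib.request.urlopen(metadata_url(doi)) as resp:
        return json.load(resp)["message"]

# Example usage (illustrative DOI; run with network access):
# record = fetch_metadata("10.5555/12345678")
# print(record["title"], record["DOI"])
```

The same records are also available through Crossref’s search interfaces; the API is simply the programmatic route to them.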

Myth #3: Crossref provides impact factors

On publisher websites, you’ll sometimes see the number of times a paper has been cited in Crossref, Google Scholar, Web of Science, etc. The Crossref citation information is made available to publishers through our Cited-by service, but it is not an impact factor. Cited-by counts are based on the subset of Crossref’s members participating in that service, so they’ll probably differ from other sources. Crossref Cited-by counts are meant to complement other services rather than replace them.

Myth #4: Crossref charges to make updates or corrections to the metadata associated with a DOI

Not true - while you have to pay for your initial registration, any subsequent updates, corrections, or additions you make to the metadata of a content item are free of charge (apart from Crossmark metadata). If you’re a member, we actively encourage you to update your metadata to ensure that your records are as comprehensive and accurate as possible. This helps the scholarly community find and use the content you publish.

Myth #5: Crossref charges for failed deposits

Only successful deposits are counted and billed. If your metadata deposit fails, you will receive an error message so you can identify the problem and re-submit. If you’re not sure what has gone wrong, you can contact our support team.

Myth #6: You need to have separate prefixes to register different content types

You can register all your content types under one prefix (and you don’t need to tell us if you start to do so).

Myth #7: DOI resolutions equal the number of DOIs you have registered

No. When someone clicks on a DOI link for an article, we count that as one DOI resolution. This is different from the number of unique DOIs you have registered with us. We’ll send you a resolution report once a month with your total number of resolutions, the DOIs that have been clicked most frequently, and any resolution failures. Failures can be an indication that you need to update your metadata with us for that particular article, to ensure your DOI is directing readers to the correct webpage.

Myth #8: Crossref owns the plagiarism software used in Similarity Check

The Similarity Check service is provided in collaboration with Turnitin who run the iThenticate text-comparison tool. The iThenticate database is the largest comparison database of full-text academic content in the world. Similarity Check participants enjoy cost-effective use of iThenticate because they contribute their own published content into Turnitin’s database. Turnitin also provides our members with access to additional features in iThenticate, such as enhanced text-matches within the document viewer and access to a dedicated Similarity Check support team in order to discuss any technical or billing queries.

It’s great to have the opportunity to do some myth-busting! You’re bound to have more questions, so we’ll be running more LIVE locals in 2019, as well as virtual events. To keep updated, follow us @CrossrefOrg, or subscribe to our newsletter.


Last Updated: 2019 January 22 by Vanessa Fairhurst