Crossref’s “DOI Event Tracker Pilot”: 11 million+ DOIs and 64 million+ events. You can play with it at: http://goo.gl/OxImJa
Tracking DOI Events
So have you been wondering what we’ve been doing since we posted about the experiments we were conducting using PLOS’s open source ALM code? A lot, it turns out. About a week after our post, we were contacted by a group of our members from OASPA who expressed an interest in working with the system. Apparently they were all about to conduct similar experiments using the ALM code, and they thought that it might be more efficient and interesting if they did so together using our installation. Yippee. Publishers working together. That’s what we’re all about.
So today (January 20th, 2015) the DOI HTTP resolver at dx.doi.org started to fail intermittently around the world. The doi.org domain is managed by CNRI on behalf of the International DOI Foundation. This means that the problem affected all DOI registration agencies including Crossref, DataCite, mEDRA etc. This also means that more popularly known end-user services like FigShare and Zenodo were affected. The problem has been fixed, but the fix will take some time to propagate throughout the DNS system. You can monitor the progress here:
Now for the embarrassing stuff…
At Crossref we mint DOIs for publications and send them out into the world, but we like to hear how they’re getting on out there. Obviously, DOIs are used heavily within the formal scholarly literature and for citations, but they’re increasingly being used outside of formal publications in places we didn’t expect. With our DOI Event Tracking / ALM pilot project we’re collecting information about how DOIs are mentioned on the open web to try and build a picture about new methods of citation.
Do you want to see if a Crossref DOI (typically assigned to publications) refers to DataCite DOIs (typically assigned to data)? Here you go:
Conversely, do you want to see if a DataCite DOI refers to Crossref DOIs? Voilà:
Background “How can we effectively integrate data into the scholarly record?” This is the question that has, for the past few years, generated an unprecedented amount of handwringing on the part of researchers, librarians, funders and publishers.
Remember when I said that the Wikipedia was the 8th largest referrer of DOI links to published research? This is despite only a fraction of eligible references in the free encyclopaedia using DOIs.
We aim to fix that. Crossref and Wikimedia are launching a new initiative to better integrate scholarly literature in the world’s largest public knowledge space, Wikipedia.
This work will help promote standard links to scholarly references within Wikipedia, which persist over time by ensuring consistent use of DOIs and other citation identifiers in Wikipedia references.
Summary You can use a new Crossref API to query all sorts of interesting things about who funded the research behind the content Crossref members publish.
Background Back in May 2013 we launched Crossref’s FundRef service. It can be summarized like this:
Crossref keeps and manages a canonical list of Funder Names (ephemeral) and associated identifiers (persistent). We encourage our members (or anybody, really; the list is available under a CC0 waiver) to use this list for collecting information on who funded the research behind the content that our members publish.
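As a rough sketch of how you might query this registry programmatically, here is a small Python example against the Crossref REST API's `/funders` endpoint. The endpoint exists, but the exact response fields used below (`message`, `items`, `id`, `name`) are based on our reading of the API and should be checked against the current documentation:

```python
import json
import urllib.parse
import urllib.request

API_BASE = "https://api.crossref.org"  # Crossref REST API

def funder_search_url(query, rows=5):
    """Build a funder-search URL against the Crossref REST API."""
    params = urllib.parse.urlencode({"query": query, "rows": rows})
    return f"{API_BASE}/funders?{params}"

def search_funders(query, rows=5):
    """Fetch matching funders; each item carries a persistent funder ID."""
    with urllib.request.urlopen(funder_search_url(query, rows)) as resp:
        message = json.load(resp)["message"]
    return [(f["id"], f["name"]) for f in message["items"]]

if __name__ == "__main__":
    # Look up funders whose names match "Wellcome"
    for funder_id, name in search_funders("Wellcome"):
        print(funder_id, name)
```

The key idea is the split between the ephemeral name and the persistent identifier: you search by name, but you store the ID.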
Crossref Labs loves to be the last to jump on an internet trend, so what better than to combine the Doge meme with altmetrics?
Note: The API calls below have been superseded by the development of the Event Data project. See the latest API documentation for equivalent functionality.
Want to know how many times a Crossref DOI is cited by the Wikipedia?
http://det.labs.crossref.org/works/doi/10.1371/journal.pone.0086859

Or how many times one has been mentioned in Europe PubMed Central?
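A query like the one above can be scripted. The sketch below fetches the pilot's record for a DOI and pulls out a per-source event count; note that the response layout assumed here (a `sources` list with `name` and `events_count` fields) is our guess at the pilot's Lagotto-style JSON, not a documented contract, and the service itself has since been superseded by Event Data:

```python
import json
import urllib.request

DET_BASE = "http://det.labs.crossref.org"  # DOI Event Tracker pilot

def det_work_url(doi):
    """Build the DOI Event Tracker pilot URL for a work's events."""
    return f"{DET_BASE}/works/doi/{doi}"

def source_mentions(doi, source_name):
    """Fetch the DET record for a DOI and return one source's count.

    The 'sources'/'events_count' shape is an assumption about the
    pilot's JSON, so verify it against a live response before relying
    on it.
    """
    with urllib.request.urlopen(det_work_url(doi)) as resp:
        record = json.load(resp)
    for source in record.get("sources", []):
        if source.get("name") == source_name:
            return source.get("events_count", 0)
    return 0
```

For example, `source_mentions("10.1371/journal.pone.0086859", "wikipedia")` would answer the Wikipedia question above, under those assumptions.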
The South Park movie, “Bigger, Longer & Uncut”, has a DOI:
So does the pornographic movie, “Young Sex Crazed Nurses”:
And the following DOI points to a fake article on a “Google-Based Alien Detector”:
And the following DOI refers to an infamous fake article on literary theory:
This scholarly article discusses the entirely fictitious Australian “Drop Bear”:
The following two DOIs point to the same article: the first DOI points to the final author version, and the second DOI points to the final published version:
You can now easily search for publications and add them to your ORCID profile in the new beta of Crossref Metadata Search (CRMDS). The user interface is pretty self-explanatory, but if you want to read about it before trying it, here is a summary of how it works.
When you go to CRMDS, you will see that there is now a small ORCID sign-in button on the top right-hand side of the screen.
We have just released a bunch of new functionality for Crossref Metadata Search. The tool now supports the following features:
- A completely new UI
- Faceted searches
- Copying of search results as formatted citations using CSL
- COinS, so that you can easily import results into Zotero and other document management tools
- An API, so that you can integrate Crossref Metadata Search into your own applications, plugins, etc.
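To give a flavour of the API, here is a minimal Python sketch. The `/dois` endpoint on search.crossref.org, its `q` and `rows` parameters, and the `doi`/`fullCitation` result fields are all assumptions about the CRMDS API for illustration; check the API documentation for the actual interface:

```python
import json
import urllib.parse
import urllib.request

SEARCH_BASE = "http://search.crossref.org"  # Crossref Metadata Search

def search_url(query, rows=10):
    """Build a metadata-search URL.

    The /dois endpoint and its parameters are assumptions about the
    CRMDS API made for this sketch.
    """
    params = urllib.parse.urlencode({"q": query, "rows": rows})
    return f"{SEARCH_BASE}/dois?{params}"

def search(query, rows=10):
    """Run a search and return (DOI, formatted citation) pairs."""
    with urllib.request.urlopen(search_url(query, rows)) as resp:
        results = json.load(resp)
    return [(r.get("doi"), r.get("fullCitation")) for r in results]
```

This is the same kind of call a plugin or reference manager would make to embed Crossref Metadata Search in its own workflow.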