Adherence to the FAIR Principles - that scientific outputs should be Findable, Accessible, Interoperable, and Reusable - is rapidly becoming a requirement of funding agencies and journals. This manuscript describes the first objective approach to quantitatively evaluating compliance with the FAIR Data Principles.
The idea of making all research outputs - data, workflows, papers, and other publications - Findable, Accessible, Interoperable, and Reusable (FAIR) is becoming widely accepted among science funders and the scholarly publishing community. In the absence of any way to measure "FAIRness", however, a wide range of interpretations of FAIR has emerged, often with the goal of minimizing change to current practices. In response, a group of interested stakeholders - journal editors, data archivists, FAIR experts, and software developers, among them CBGP researcher Mark Wilkinson - assembled with the goal of creating a set of definitive metrics for objectively and quantitatively measuring the FAIRness of a data resource.

The publication of this first set of metrics [https://doi.org/10.1038/sdata.2018.118] was recently reviewed by John Hancock, an F1000 Faculty Member representing ELIXIR Europe, who rated it "Exceptional", the highest score possible for an F1000 review. The article is currently tracking in the top 5% of all research outputs followed by Altmetric, and is the #1-ranked article of similar age in Nature Publishing Group's journal Scientific Data.