As part of the EOSC Task Force on FAIR Metrics and Data Quality, the FAIR Metrics subgroup works to examine the uptake and application of metrics of FAIRness and the use and utility of FAIRness evaluations. A range of FAIR assessment tools has been designed to measure compliance with established FAIR Metrics by evaluating one or more Digital Objects (DOs, including datasets and repositories). Unfortunately, assessments of the same DO by different tools often yield widely different results because of independent interpretations of the Metrics, differing metadata publishing paradigms, and even differing views of the intent of FAIR itself. In response to this status quo, the FAIR Metrics subgroup (represented by the authors of this report) brought together developers of several FAIR evaluation tools and enabling services (listed in the Acknowledgements) for a series of hands-on hackathon events to identify a common approach to metadata provision that could be implemented by all data publishers, such as databases, repositories, and data catalogue managers. This led to the identification of a (meta)data publishing process based on well-established Web standards already in everyday, though not uniform, use within data publishing communities. A specification document describing the approach and a series of "Apples-to-Apples" (A2A) benchmarks to evaluate compliance with this (meta)data publishing approach were created during these hackathon events. The authors of FAIR evaluation tools also began writing code to ensure their independent tools would behave identically when encountering these A2A benchmark environments, thus helping to ensure that data publishers following this paradigm will be evaluated in a harmonized manner by all assessment tools; additional considerations for assessment tool harmonization are discussed later.
This report explains the rationale for these workshop and hackathon events, outlines the work done and its outcomes, describes the current status, and then discusses desirable next steps. The authors propose that the EOSC Association, the EOSC Task Force on Long-term Data Preservation, and other groups and projects in Europe and worldwide consider this approach, as it provides all stakeholders with an accurate means of achieving FAIRness of research data.
FAIR Principles, EOSC, HorizonEU, EOSC Association, Advancing OpenScience in Europe, OpenScience, FAIR Metrics, FAIR Assessment, EOSC Task force on FAIR Metrics and Data Quality