Report on FAIR Signposting and its Uptake by the Community: EOSC Task Force on FAIR Metrics and Data Quality

Mark Wilkinson, Susanna-Assunta Sansone, Marjan Grootveld, Richard Dennis, David Hecker, Robert Huber, Stian Soiland-Reyes, Herbert Van de Sompel, Andreas Czerniak, Milo Thurston, Allyson Lister, Alban Gaignard

Research output: Book/Report › Commissioned report


The FAIR Metrics subgroup of the EOSC Task Force on FAIR Metrics and Data Quality is dedicated to scrutinizing the adoption and impact of FAIRness metrics and the practicality of FAIRness evaluations. Previous reports have highlighted the inconsistency in results when the same digital object is evaluated by different tools, attributing these disparities to varied interpretations of the Metrics, differing metadata publishing practices, and divergent views of the fundamental objectives of the FAIR principles. The authors of this report, representing the FAIR Metrics subgroup, have facilitated six workshops and hackathon-style gatherings, convening diverse FAIR assessment stakeholders, including tool developers, standards and repository experts, and interoperability specialists.

The initial workshops, as outlined in an earlier report endorsed by EOSC, pinpointed a metadata publishing design pattern known as “FAIR Signposting.” This approach offers a transparent, standards-compliant, and straightforward mechanism for guiding automated agents through metadata spaces to locate three essential FAIR elements: the globally unique identifier (GUID), the data records, and the corresponding metadata records. Furthermore, these sessions led to the development of a paradigm for creating reference environments. These environments serve as benchmarks for evaluating the compliance of metadata harvesters with FAIR Signposting criteria and for standardizing the metadata harvesting process of FAIR assessment tools.
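FAIR Signposting conveys these three elements to automated agents as typed links, typically in an HTTP `Link` header (per RFC 8288 Web Linking), using relation types such as `cite-as` (the GUID), `describedby` (metadata records), and `item` (data records). The following is a minimal sketch of how a harvester might interpret such a header once retrieved; the URLs are hypothetical examples, not taken from the report.

```python
# Minimal sketch: reading FAIR Signposting relations from an HTTP Link
# header a harvester has already retrieved. The relation types cite-as,
# describedby, and item follow the Signposting pattern described above;
# all URLs below are hypothetical.

def parse_link_header(header):
    """Parse an HTTP Link header value into (target_url, attributes) pairs."""
    links = []
    for part in header.split(","):
        segments = part.strip().split(";")
        url = segments[0].strip().strip("<>")
        attrs = {}
        for seg in segments[1:]:
            key, _, value = seg.strip().partition("=")
            attrs[key.strip()] = value.strip().strip('"')
        links.append((url, attrs))
    return links

header = (
    '<https://doi.org/10.1234/example>; rel="cite-as", '
    '<https://example.org/record/1.jsonld>; rel="describedby"; type="application/ld+json", '
    '<https://example.org/record/1/data.csv>; rel="item"; type="text/csv"'
)

for url, attrs in parse_link_header(header):
    print(attrs.get("rel"), "->", url)
```

This simple string-splitting parser is sufficient for illustration; a production harvester would use a full RFC 8288 parser and would also handle Signposting links expressed in HTML `<link>` elements.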

This report provides an updated overview of the recent progress achieved by the FAIR Metrics subgroup, encapsulating the advances from the latest two hackathon events. We outline the progress towards developing an integrated suite of FAIR assessment tests, which are now becoming standardized across various FAIR assessment instruments. Additionally, we present preliminary insights into the community's adoption of these practices and report on the ongoing effort to establish a FAIR testing governance body, along with strategies to ensure its enduring viability.

We, the authors, advocate for the EOSC Association and the EOSC Task Force on Long-term Data Preservation to endorse FAIR Signposting, along with the outcomes of our endeavours, as formal recommendations for EOSC-linked resources and EOSC-funded initiatives. We aim to promote an EOSC infrastructure where FAIR-enabling services can be assessed with clarity and uniformity. This will also ensure that tools designed for the EOSC will function with a standardized set of expectations applicable to all stakeholders, providers and users alike.
Original language: Undefined
Commissioning body: EOSC Association
Number of pages: 22
Publication status: Published - 11 Jan 2024


  • EOSC
  • FAIR
  • Task Force on FAIR Metrics and Data Quality
  • EOSC Association
  • FAIR Metrics
  • FAIR Principles
  • FAIR Assessment
  • Open Science
  • Advancing open science in Europe

Related project (Research): Goble, C., Soiland-Reyes, S. & Juty, N.
