"#","URI","Aspect","Existing Service","Existing Tool","Proposed Service","TRL","Description","CPP mapping","Mapping requirements","Status" 1,1,"""Adoption of open API standards or protocols (REST, OAI-PMH)""","Swagger API Hub","Swagger UI","Not required, is solved by existing tools and services",,"To manage, expose, and validate the repository's use of open API standards and data exchange protocols, ensuring machine-readable discoverability and persistent access to documentation.",,,"Rejected" 3,2,"Reference implementations and API guidelines",,"awesome-docs","EOSC Documentation guidance service",,"As it would be very time consuming and difficult to automatically check whether an API is following certain guidelines. The best a service can offer is guidance for which tools to use.",,"INTEROPERABILITY-TECHNICAL-REQ-031","Rejected" 4,3,"Conformance testing tools","Swagger Validator","Swagger Parser, Schemathesis","EOSC API Specs & test result service",,"Service which refers to the latest batch of the latest validation of the API specs and tests which confirm the correct working of the endpoints. It would be feasible to create a service which is able to combine the results of tools and make the information available via a http endpoint. Giving a list of tools would result in an exhaustive list of all possible combinations. Thereby it would be better if there will be a service in which they can define which tool, a persistent link to the latest test results and if the test was successful yes or no.",,"INTEROPERABILITY-TECHNICAL-REQ-032, INTEROPERABILITY-TECHNICAL-REQ-033, INTEROPERABILITY-TECHNICAL-REQ-034","Rejected" 7,4,"Support for both General-purpose and Domain-specific Formats",,,"EOSC Format policy service",,"Services which publish format policies and preferred, acceptable and rejected formats and also which data types are accepted for the archive. As the supported data types and file formats can be published in a machine-readable format (e.g. JSON or RDF), it is feasible to expose this information through a standard API or static endpoint for external discovery and reuse. While it is challenging to automatically determine whether a repository fully supports community-specific or domain-preferred formats, a service could provide guidance on best practices and offer references to curated format registries such as PRONOM or FAIRsharing.",,"INTEROPERABILITY-TECHNICAL-REQ-035, DISCIPLINE-SPECIFIC-REQ-006, DISCIPLINE-SPECIFIC-REQ-007, DISCIPLINE-SPECIFIC-REQ-017","Approved" 8,5,"Metadata Alignment across Heterogeneous Formats","FAIRsharing",,"EOSC Registry for supported schemas and crosswalks by the archive",,"A lightweight add-on service that complements metadata archives by publishing and maintaining a curated list of supported metadata schemas, data formats, and crosswalks. The registry provides a lookup function so users can quickly see which schemas an archive supports, what crosswalks (mappings) exist between schemas and how metadata standards vary across disciplines. The service must be easy to integrate with archives and registries, enabling them to expose their schema and crosswalk support in a consistent, discoverable way. 
By offering this shared lookup layer, the registry lowers barriers for data providers, helps researchers choose the right repository, and facilitates interoperability across scientific communities.",,"INTEROPERABILITY-TECHNICAL-REQ-036, INTEROPERABILITY-TECHNICAL-REQ-037, DISCIPLINE-SPECIFIC-REQ-011","Added to deliverable" 9,6,"Integration Tools for Bridging Formats",,"OpenRefine, RO-Crate tools","Old: EOSC EDEN Transformation Service New: EOSC EDEN Data Transformation Services Registry",,"A registry service which publishes endpoints of service providers capable of transforming data and metadata to other standards, together with a tool list and workflow roles for metadata and data format interoperability",,"INTEROPERABILITY-TECHNICAL-REQ-038, INTEROPERABILITY-TECHNICAL-REQ-039, DISCIPLINE-SPECIFIC-REQ-006, DISCIPLINE-SPECIFIC-REQ-008, DISCIPLINE-SPECIFIC-REQ-009","Approved" 11,7,"Support for Different Packaging Standards",,"BagIt, RO-Crate tools, Archivematica","EOSC EDEN Packaging interoperability registry",,"A registry for accepted information packaging formats, internally used information packaging formats of archives, conversion tools used, which community-specific information packages are used by the archive, and the discovery mechanisms.",,"INTEROPERABILITY-TECHNICAL-REQ-040, INTEROPERABILITY-TECHNICAL-REQ-041, INTEROPERABILITY-TECHNICAL-REQ-042, INTEROPERABILITY-TECHNICAL-REQ-043, DISCIPLINE-SPECIFIC-REQ-007","Approved" 14,8,"Top-level and fine-grained search mechanisms",,"OpenAIRE IIS","EOSC Metadata query service (OpenAIRE)",,"Service which gives consumers the ability to query OpenAIRE, which contains content from different domains",,"INTEROPERABILITY-TECHNICAL-REQ-044, DISCIPLINE-SPECIFIC-REQ-016","Approved" 15,9,"Federated and cross-domain query support",,"OAI-PMH tools","EOSC Metadata query service (OAI-PMH)",,"Service which gives consumers the ability to query OAI-PMH harvested data",,"INTEROPERABILITY-TECHNICAL-REQ-044, DISCIPLINE-SPECIFIC-REQ-016","Rejected" 16,10,"Metadata indexing at multiple levels","OpenAIRE EXPLORE",,"EOSC EDEN Federated indexing and discovery service (Contribution to OpenAIRE Explore)",,"This does not introduce a new platform for discovery, but rather extends and further integrates with OpenAIRE Explore, which already provides advanced metadata graph-based search and discovery layers. This is to be done by adding more sources from the EOSC EDEN network, enhancing schema crosswalks, and ensuring discipline coverage. Through EOSC EDEN contributions in these areas, the outcome will be an enriched OpenAIRE Explore index that offers more complete coverage of research outputs and data sources relevant for international collaboration.
",,"INTEROPERABILITY-TECHNICAL-REQ-045, INTEROPERABILITY-TECHNICAL-REQ-046, DISCIPLINE-SPECIFIC-REQ-024, DISCIPLINE-SPECIFIC-REQ-034","Approved" 17,11,"Discovery APIs supporting varying granularity",,,"EOSC API fields Discovery service",,"Discovery API which exposes common fields between EOSC services with GraphQL support.",,"INTEROPERABILITY-TECHNICAL-REQ-047, INTEROPERABILITY-TECHNICAL-REQ-048, DISCIPLINE-SPECIFIC-REQ-016, DISCIPLINE-SPECIFIC-REQ-024","Approved" 18,12,"Cross-community PID harmonisation",,"PID Graph","EOSC EDEN identifier service",,"EOSC EDEN-wide PID resolution service, which functions as a registry for PID supported types, federated resolution service and connects to API for linking PIDs across domains","CPP-005 - Identifier management","INTEROPERABILITY-TECHNICAL-REQ-049, INTEROPERABILITY-TECHNICAL-REQ-050, DISCIPLINE-SPECIFIC-REQ-014, DISCIPLINE-SPECIFIC-REQ-015","Rejected" 19,13,"Machine-resolvable PIDs","DataCite REST API, ORCID",,"Not required",,,,,"Rejected" 21,14,"Metadata-linked identifiers","schema.org, doi.org content negations",,"Not required",,,,,"Rejected" 23,15,"Clear EOSC PID policy and lifecycle tracking","DataCite REST API, ORCID",,"EOSC PID Lifecycle Registry",,"A central EOSC service to store repository PID policies, track versioning and lifecyle states, provide machine-readable lifecyle metadata, and log changes","CPP-005 - Identifier management","INTEROPERABILITY-TECHNICAL-REQ-051, INTEROPERABILITY-TECHNICAL-REQ-052, DISCIPLINE-SPECIFIC-REQ-014, DISCIPLINE-SPECIFIC-REQ-015, DISCIPLINE-SPECIFIC-REQ-027","Approved" 25,16,"Standards Support","OpenAIRE AAI",,"Not required already solved by other projects",,,,,"Rejected" 27,17,"Support for domain-specific metadata profiles","FAIRsharing, MSCR",,"Not required already solved by other projects",,"Registry of meatdata standards across disciplines, provides profile documentation, links to schemas and crosswalk information",,,"Rejected" 28,18,"Crosswalks between metadata schemas","FAIRsharing, MSCR","OpenRefine","Not required already solved by other projects",,"Catalogues metadata standards and crosswalks; includes machine-readable mapping data and links to human-readable documentation.",,,"Rejected" 31,19,"Multilingual metadata support",,,"EOSC Multilingual Metadata Hub",,"A central EOSC service for storing, translating, and serving multilingual metadata with machine-readable language tags, multilingual search capabilities, and APIs for repositories to consume translations.",,"INTEROPERABILITY-SEMANTIC-REQ-053, INTEROPERABILITY-SEMANTIC-REQ-054, DISCIPLINE-SPECIFIC-REQ-003, DISCIPLINE-SPECIFIC-REQ-011, DISCIPLINE-SPECIFIC-REQ-036","In review" 32,20,"Schema.org or FAIR-compliant metadata exposure","schema.org, FAIRsharing","DCAT-AP validator, RDF validator","Not required, can be achieved with FAIRsharing",,,,,"Rejected" 35,21,"Reference repositories for semantic artefacts","FAIRsharing, LOV (Linked Open Vocabularies)","Skosmos","EOSC Semantic Artefact Registry",,"An EOSC-managed registry aggregating domain-recognised semantic artefacts, ensuring they have PIDs, are linked to human-readable documentation, and are resolvable to machine-readable formats.",,"INTEROPERABILITY-SEMANTIC-REQ-055, INTEROPERABILITY-SEMANTIC-REQ-056, INTEROPERABILITY-SEMANTIC-REQ-057, DISCIPLINE-SPECIFIC-REQ-012, DISCIPLINE-SPECIFIC-REQ-033, DISCIPLINE-SPECIFIC-REQ-039","Approved" 38,22,"Use of community-endorsed controlled vocabularies","FAIRsharing, LOV (Linked Open Vocabularies)","Skosmos","Not required, solved by existing services 
",,,,,"Rejected" 41,23,"Machine-readable vocabulary linking",,"LodView","EOSC Machine-Readable Vocabulary Service",,"Central EOSC service to register, validate, and publish controlled vocabularies and ontologies in machine-readable formats with resolvable URIs and queryable endpoints.",,"INTEROPERABILITY-SEMANTIC-REQ-059, INTEROPERABILITY-SEMANTIC-REQ-058, DISCIPLINE-SPECIFIC-REQ-012","Approved" 42,24,"Vocabulary harmonisation and mapping services","FAIRsharing, Europeana Vocabulary Mappings",,"Not required, solved by existing services and tools",,,,,"Rejected" 44,25,"Support for multilingual vocabularies",,,"EOSC Multilingual Vocabulary Registry Service",,"Central EOSC service registry for authoritative controlled vocabularies across domains, ensuring multilingual accessibility, semantic interoperability, and machine-actionable distribution.",,"INTEROPERABILITY-SEMANTIC-REQ-054, INTEROPERABILITY-SEMANTIC-REQ-060, DISCIPLINE-SPECIFIC-REQ-003","Rejected" 45,26,"Use of PIDs for semantic terms","LOV (Linked Open Vocabularies), identifiers.org",,"Not required",,,,,"Rejected" 47,27,"Support for semantic identifier dereferencing","LOV (Linked Open Vocabularies), Wikidata, identifiers.org","LodView, Skosmos","Not required",,,,,"Rejected" 52,28,"Alignment with recognised identifier systems","ORCID, ROR (Research Organization Registry), DataCite REST API","PID Graph","Not required",,,,,"Rejected" 56,29,"Support for embedded semantic annotations","schema.org, Wikidata","OpenRefine","EOSC Embedded Semantic Annotation Service",,"EOSC-managed service enabling repositories to embed standardised semantic annotations (JSON-LD, RDFa, Microdata) in metadata records, publishing linked contexts and vocabularies for machine-actionable interpretation",,"INTEROPERABILITY-SEMANTIC-REQ-057, INTEROPERABILITY-SEMANTIC-REQ-061","Approved" 59,30,"External annotation services or tools","Hypothes.is, Annotorious, Recogito",,"Not required",,,,,"Rejected" 62,31,"SLA semantics","EOSC Service Catalogue, LOV (Linked Open Vocabularies), FAIRsharing",,"Not required",,,,,"Rejected" 66,33,"Policy includes machine-readable terms","FAIRsharing",,"Not required",,,,,"Rejected" 70,37,"Role Metadata in Deposit and Curation","Dataverse","CRediT","Not required",,,,"DISCIPLINE-SPECIFIC-REQ-030","Rejected" 72,38,"Accessibility and Support Documentation",,"Read the Docs, Sphinx Documentation Generator","Not required",,,,,"Rejected" 75,40,"License alignment and combinability","SPDX License List & Tool, CC-API, Creative Commons License Chooser",,"EOSC License Compatibility & Guidance Service",,"An EOSC-managed service that lists repository licenses, provides a machine-readable compatibility matrix, and offers interactive tools to help researchers select combinable licenses for aggregated or remixed datasets.","CPP-020 - Rights management","INTEROPERABILITY-LEGAL-REQ-062, INTEROPERABILITY-LEGAL-REQ-063, DISCIPLINE-SPECIFIC-REQ-002, DISCIPLINE-SPECIFIC-REQ-004, DISCIPLINE-SPECIFIC-REQ-013, DISCIPLINE-SPECIFIC-REQ-035, DISCIPLINE-SPECIFIC-REQ-038","Approved" 77,41,"Use of standardized open licenses","CC-API, Creative Commons License Chooser",,"EOSC License Compatibility & Guidance Service",,"An EOSC-managed service that lists repository licenses, provides a machine-readable compatibility matrix, and offers interactive tools to help researchers select combinable licenses for aggregated or remixed datasets.","CPP-020 - Rights management","INTEROPERABILITY-LEGAL-REQ-062, INTEROPERABILITY-LEGAL-REQ-063, DISCIPLINE-SPECIFIC-REQ-002,
DISCIPLINE-SPECIFIC-REQ-004, DISCIPLINE-SPECIFIC-REQ-013, DISCIPLINE-SPECIFIC-REQ-035, DISCIPLINE-SPECIFIC-REQ-038","Approved" 80,44,"Embargos, redaction, and generalisation mechanisms","Dataverse","OpenRefine, Amnesia","EOSC EDEN Data Privacy & Disclosure Control Service",,"A service that supports researchers in managing data embargos, redaction, and generalisation before publication. It integrates with repositories like Dataverse and tools such as OpenRefine and Amnesia to enable policy-driven anonymisation, embargo scheduling, and sensitive data removal. This ensures compliance with legal, ethical, and disciplinary requirements while facilitating safe and transparent data sharing.",,"DISCIPLINE-SPECIFIC-REQ-030","In review" 84,46,"Alignment with EU and national data policies",,,"EOSC EDEN Policy Compliance & Alignment Service",,"A service that ensures repositories and data services declare and demonstrate alignment with EU-level and national data policies. It provides automated checks for GDPR and other regulatory requirements, records compliance declarations, and offers guidance on applicable legal frameworks. By integrating policy registries and producing machine-readable compliance statements, the service supports trust, interoperability, and transparency across the EOSC ecosystem.",,,"In review" 92,54,"Alignment","DANS Depositor Guidance Service",,"Repository Guidance Service",,"A service to assist depositors, based on their context, to select the most appropriate repository.",,,"WIP" 93,55,"Self-contained",,,"Completeness Advisor: Dependency and Completeness Advisory Service based on AI, requires human validation",,"It will be very difficult to develop generic services to verify the completeness of a deposit, as verification relies on human curation expertise. Candidate for AI-based curation assistance.",,"DISCIPLINE-SPECIFIC-REQ-029","WIP" 94,55,"Documentation",,,"Documentation Advisor: It may be feasible to provide AI-based assistance through training on typical documentation for similar datasets.",,"Automated and generic services to verify completeness of documentation will be very difficult to specify. Possible candidate for AI-supported assistance to curators.",,,"WIP" 95,56,"Publication Consent","Dataverse, Invenio/Zenodo, Dryad","Open Journal Systems","A custom service could be developed based on Open Journal Systems code.",,"Some open publication systems (OJS), focused on publications and preprints, include author consent and licence confirmation workflows. Some repository platforms (listed) provide author consent verification metadata.",,,"WIP" 117,57,"Technical metadata extraction",,"MediaInfo, ExifTool, FFmpeg, File Scraper, ImageMagick","EOSC Metadata Extraction Service",,"The service for extracting metadata from files produces technical metadata in a standardized interoperable format. Technical metadata is used for identifying significant properties of files, which can be used to plan and assess file format transformations and file repair actions. As a file's properties differ between file types, technical metadata standards tend to focus on specific types of files. For example, NISOIMG (MIX) is applicable for image files, while audioMD is meant for audio files and streams. Technical metadata is extracted using several tools, as one tool cannot cover the extraction of all properties for a diverse set of file formats. The metadata extraction service must include tools that can extract the relevant technical metadata for all file formats supported by the TDA.
The metadata extraction service harmonizes the output from the used tools and maps it to one or more technical metadata standards. Finally, the extracted metadata is serialized using the selected standard(s).","CPP-009 - Metadata Extraction","QUALITY-TECHNICAL-REQ-016","Added to deliverable" 118,58,"Significant properties definition",,,"EOSC Significant Properties Definition Service",,"Significant properties document the aspects of digital files that carry important information used to render, understand and interpret the digital content in the file. The significant properties are typically based on the type of file, e.g. image files, audio streams. The significant properties also depend on the content itself; for example, a digital photograph may weigh the technical aspects of an image file differently from a scanned textual document that is saved as an image. The service for defining significant properties in a TDA is responsible for defining a base set of significant properties that need to be documented for each file format supported by the TDA. The service creates a machine-actionable template for each file format, defining the properties that must be documented (extracted as technical metadata) and allowing different scores to be assigned to the importance of each aspect. This aids the TDA both in planning technical metadata extraction for the files and in providing a machine-actionable document that can be used in workflows for planning and evaluating file format transformation paths and processes.","CPP-022 - Significant properties definition","QUALITY-TECHNICAL-REQ-016","Approved" 119,59,"File format identification",,"Jhove, Fido, veraPDF, Jpylyzer, DROID, ExifTool, FFmpeg, file, File Scraper, v.Nu, xmllint (libxml2)","EOSC File Format Identification Service",,"File format identification is a service with which the TDA can identify the file formats that are ingested and preserved. The file format identification service must identify file formats to an appropriate level of granularity, often including specific file format versions and even profiles. The granularity depends on the technical structure of the file format, how the versions differ from each other and how the significant properties can be extracted and rendered. File formats must be identified so that a TDA can know the format of its preserved content and assign proper preservation strategies to the files. Without any file format identification, the files are just bytes on a hard drive that cannot be migrated or emulated. File format identification is also needed for reporting to users that ingest the content, and to verify that the file format actually is what the user claims it to be. File format identification uses tools that read the content of the files and determine the file format and version based on the file's technical structure and, for example, its magic number. Relying on file extensions, for example, is usually not good enough, as extensions are neither reliable nor sufficiently granular. The file format service must include tools to identify all the file formats that are supported by the TDA. Very few tools can accurately identify all formats (e.g. Linux's file command, ExifTool or DROID), so the file format identification service must typically integrate several different tools to identify files accurately. The file format identification service must report the identified file formats and versions for each file in the repository.
The format and version must be reported in a standardized fashion, for example using MIME types for file formats. Using a file format registry such as PRONOM, which links the identified formats to a registry record, is highly recommended, as format registries hold valuable data about the file format and their use fosters a high degree of interoperability. The file format identification service must report the identified file formats, versions, profiles, and links to file format registries in a machine-actionable, standardized manner so that its output is usable by TDAs.","CPP-008 - File Format Identification","QUALITY-TECHNICAL-REQ-006, QUALITY-TECHNICAL-REQ-007, QUALITY-TECHNICAL-REQ-008, QUALITY-TECHNICAL-REQ-009, QUALITY-TECHNICAL-REQ-010, QUALITY-TECHNICAL-REQ-011, QUALITY-TECHNICAL-REQ-012, QUALITY-TECHNICAL-REQ-013, QUALITY-TECHNICAL-REQ-014, QUALITY-TECHNICAL-REQ-027, QUALITY-TECHNICAL-REQ-030, QUALITY-TECHNICAL-REQ-042","Approved" 120,60,"File format validation","Validate your files","Jhove, Fido, veraPDF, Jpylyzer, DROID, ExifTool, FFmpeg, file, File Scraper, v.Nu, xmllint (libxml2)","EOSC File Format Validation Service",,"The TDA performs a technical validation of all files in acceptable formats in the ingest phase. The validation ensures that the file is not defective and that its format is as reported. Accepting files with technical errors may put their digital preservation at risk. File format validation has multiple purposes: it detects invalid files, it detects wrongly identified files, and it ensures that files can be accessed with contemporary software. Only a valid, well-formed file without technical errors can be migrated with established migration processes, and only then can it be assured that its contents can be reliably preserved with contemporary preservation methods. If the file is broken or the format is unknown, file format migration might not be possible with a known migration process. This may cause a dead end and, eventually, the file cannot be opened at all in the future. When software support is gone, even valid and well-formed files might not be usable or openable without proper file format handling during preservation. An error in a file may be semantic or technical. A semantic error is an error in the content, which is not recognised in file format validation; in this case, the file is technically valid, but it contains wrong or insufficient content for the community's needs. A technical error, in contrast, means the file is technically invalid, which may also affect the semantic content directly or indirectly. In the indirect case, the content is present in the file in a proper way, but the file includes technical errors which prevent interpreting it properly. On the other hand, in some cases an invalid file may work properly with supporting software. Even in this case, since the file is technically invalid, it may cause problems in the future for using or migrating the file. It is crucial to detect and (to the extent possible) fix all validation errors sooner rather than later. By the time the file needs to be migrated, software support might not be as wide as it is today. Fixing errors early also ensures that automatic mass migrations are possible in the future. However, validation is not always perfect, because of faults and shortcomings in the validation tools. When different tools are used for validating different file formats, the methods of validation also vary.
It may be that for some formats the syntax is analysed very carefully, while for other formats it is mainly ensured that the file can be processed with a specific program without errors.","CPP-010 - File Format Validation","QUALITY-TECHNICAL-REQ-017, QUALITY-TECHNICAL-REQ-018, QUALITY-TECHNICAL-REQ-019, QUALITY-TECHNICAL-REQ-020, QUALITY-TECHNICAL-REQ-021, QUALITY-TECHNICAL-REQ-030, QUALITY-TECHNICAL-REQ-042","Approved" 121,61,"File format validation",,,"EOSC File repair service",,"File repair in digital preservation refers to the process of restoring corrupted, damaged, or partially inaccessible digital files to a usable state while maintaining their authenticity and original characteristics. It is a reactive preservation strategy employed when digital objects have already suffered some form of degradation or corruption. In the digital preservation context, file repair differs from general data recovery because it must balance two sometimes competing priorities: making files accessible again while preserving their evidential value and original properties. This means repair processes must be carefully documented and should avoid introducing changes that could compromise the file's integrity as an authentic digital object. File corruption in digital preservation can occur through various mechanisms. Storage media degradation causes bit rot, where individual bits flip from their original values over time. Software and hardware failures during read/write operations can introduce errors. Format obsolescence may render files unreadable even when the underlying data remains intact. Migration processes themselves can sometimes introduce corruption if not properly executed. The repair process typically involves identifying the specific type and extent of damage, then selecting appropriate remediation techniques. For minor corruption, this might involve using error correction codes embedded within the file format or reconstructing damaged headers using format specifications. More severe damage may require reconstructing missing data segments using redundant information stored elsewhere in the file or from backup copies. Different file formats present unique repair challenges and opportunities. Some formats include built-in error correction mechanisms, while others are more fragile. Text files may be partially recoverable even with significant corruption, whereas executable files typically require complete integrity to function properly. High-level description (based on CPP-027): the initial analysis step, which determines precisely which erroneous format structure is responsible for the failure, using validation tools and metadata extractors, and evaluates the risk that these structures cause problems for current and future users; the intervention on the file itself; the control step, which measures the impact and checks that no significant properties were affected in the process; the documentation step, which records the details of the process; and the handling of false positives and identification of gaps in the tools producing the false positive warnings/errors.","CPP-027 - File repair","QUALITY-TECHNICAL-REQ-004","Approved" 122,62,"Format normalization",,"ExifTool, FFmpeg, ImageMagick, LibreOffice","EOSC File format normalization service",,"If a content file to be preserved is not in a recommended or acceptable format, the file must be converted into a recommended file format that can be preserved directly before the content is transferred to the TDA. This conversion is known as normalization.
The file must be converted directly into a recommended file format, as otherwise the TDA would have to carry out another conversion from an acceptable to a recommended format. Each file format conversion – migration or normalization – poses a risk to data preservability, which is why the number of conversions should be minimised. In normalization, the original file format can in principle be anything. The TDA cannot prepare for normalization in the same way as for migrations, for example by planning ready-made migration paths or developing ready-made preservation plan models. For example, the original file format could be a proprietary or undocumented format, or it may present other difficulties that do not exist when migrating a well-known file format. This may require special know-how and information relating to suitable migration tools. As the TDA cannot prepare for normalizations as well as it would for migrations, it may be necessary to also retain the original file after normalization, for example to ensure content authenticity. The original files can be transferred to the TDA for bit-level preservation with the normalized files. If sufficiently high-quality normalization is not technically possible or would require too many resources, and bit-level storage is an acceptable solution, the content can be submitted to the TDA without normalization. High-level description (based on CPP-026): compare the detected format with the list of preferred formats; if there is a match, we are done. Otherwise, try to find a normalisation plan in the policy and execute the normalisation plan. If present, run the validation plan to determine if the normalised file provides a rendition similar to the original file. If no validation plan is available, it is up to the TDA's policy to decide if manual inspection is needed.","CPP-026 - File Normalisation","QUALITY-TECHNICAL-REQ-031, QUALITY-TECHNICAL-REQ-027, QUALITY-TECHNICAL-REQ-049","Approved" 123,63,"Format migration",,"ExifTool, FFmpeg, ImageMagick, LibreOffice","EOSC File format migration service",,"In migration, the data content or preservation information of an archival information package (AIP) is modified. File format migrations are a key logical preservation action. They allow designated communities to use the content with up-to-date software and tools. Migration may target a preserved file and/or its metadata. Migration always creates a new version of the target file in a new file format. Rather than a migration, simply updating/correcting descriptive metadata is an update event where the previous record is replaced with a new one. After file migration, the old version can either be deleted or saved to the new AIP alongside the new file format. Identifying and anticipating migration needs will increase in importance in years to come. One of the requirements for managing this is format libraries (incl. PRONOM), on the basis of which the migration needs of a file format can be analysed. Keeping up with international technology development is key. To identify a need for migration, it should also be noted that a file format version may become obsolete faster than the actual file format, and there may be a need for a migration for other reasons, such as a change in the designated community, even before the file format to be migrated becomes technically outdated.
A migration need can be identified by monitoring of file formats and migration tools as well as possible tool development, or by monitoring content use and the designated community as well as applications available to the organization. The need for migration does not always stem only from technical reasons. It is important to actively monitor the operating environment and its changes. The designated community may change or move on to use another file format, making it difficult or even impossible for the (new) designated community to use the original format, even if the original file format were still fully relevant for another designated community. This may, for example, be due to replacement of software or expiry of software licenses. When defining migration paths, different options must be assessed regarding the following: target format of the file to be migrated (sometimes target formats in plural), needs of the designated community of the content, quality requirements of migration (what types of changes to the content and layout are allowed), available migration tools, costs (incl. coding) and other factors that are relevant from the designated community’s perspective. Several possible migration paths may exist for a specific source format. When planning preservation, it is important to assess the available options from different perspectives. This helps to prioritise migration pathways but cannot be the decisive factor. For example, it may sometimes make sense to select the migration path that ranked second best in the overall assessment if there are clear grounds for this, including excessively high costs of the highest-ranking path. The grounds for selecting a migration path must always be documented, making it possible to go back to them in the future if necessary. If the original version of the content is preserved, the migration can be repeated with other tools if significant problems are detected in an individual document. Preserving the original document also makes it possible to base the next migration on the original content when the file format generated in the migration becomes outdated if the required migration tools are either already in place or sufficiently easy to develop. Information on the migration event (date, actors, changes resulting from the migration) should be saved together with the AIPs as digital preservation metadata. It is particularly important to record changes to content and layout as comprehensively as possible. The purpose of migration planning is to ascertain the outcome of the migration as far as possible, taking into consideration the available tools and resources as well as costs, content properties and existing preservation plans. A migration cannot be approved without knowledge of its consequences. At this stage, it should (at the latest) be ensured that the content preservation plan remains current and update it if necessary. A cost estimate is also prepared for the migration at this stage. The factors influencing it include the volume of content to be migrated, the required storage capacity, and the need for personnel work input. Typically, the TDA stores content that is significant in terms of both number of objects and data volume, which is why migrations and their quality assurance must be automated as far as possible. It is important that a test migration is performed on a content sample picked in the planning phase. The actual migration can only be initiated once the results of this test have been approved. 
After the planning and testing phases, the actual migration can be initiated by the TDA following the preservation and migration plans. In connection with the migrations it carries out, the TDA makes the necessary changes to the administrative metadata. The metadata of a new object generated in the migration can be created in two ways. The simplest model is generating them in the TDA based on the metadata of the original object. Among other things, the object identifier and technical metadata (file format, file size) must be modified. The metadata can also be created once the migrated content and its technical metadata have been transferred to it from the TDA. The new descriptive metadata are sent to the TDA as an SIP that updates the AIP generated in the TDA. The second operating model is also followed if the format of the descriptive metadata changes. Even if the migration had been planned carefully and a test migration had been approved, it may turn out during the actual migration process that the outcome is not what was desired. In this case, the migration process must be interrupted as soon as possible. To detect potential problems, the migration process must be actively monitored. Migration is always an action/process that targets a pre-agreed and limited number of digital objects. Finally, it is important that the outcome and experiences are publicly documented for the benefit of other organizations, possibly also internationally (for example in the PAR system). A migration always creates a new version of the object on which the migration is performed. According to the definition of migration, this version replaces the one on which the migration was based. However, it may be necessary – or even essential – to also retain the old version(s), for example to ensure the authenticity of the content. Still, retaining old versions takes up storage capacity. High-level description (based on CPP-014): perform the selected migration path; apply the characterisation processes against the target file or Representation, optionally by running the ingest process and all its subprocesses (if necessary); control the properties of the target file or Representation against the expected outcome of the process, manually or in an automated way, in particular: properties to be changed by the migration process have the expected value; significant properties are maintained; target files or Representations are valid.","CPP-014 - File Migration","QUALITY-TECHNICAL-REQ-023, QUALITY-TECHNICAL-REQ-024, QUALITY-TECHNICAL-REQ-025, QUALITY-TECHNICAL-REQ-026, DISCIPLINE-SPECIFIC-REQ-041, QUALITY-TECHNICAL-REQ-027, QUALITY-TECHNICAL-REQ-049","Approved" 124,64,"File format appraisal",,,"Technical Data Quality: EOSC File format appraisal service",,"File format appraisal is a critical component of a TDA that involves the systematic evaluation and assessment of different file formats to determine their suitability for long-term preservation. This process helps a TDA make informed decisions about which formats to accept, maintain, or migrate to ensure continued access to digital content over time. File format appraisal encompasses the analysis of technical characteristics, preservation risks, and long-term viability of digital file formats. The primary goal is to identify formats that can be sustainably preserved and accessed in the future, while flagging those that may present significant preservation challenges or risks of obsolescence.
Digital preservation professionals typically assess file formats against several critical factors: Technical Characteristics include format complexity, proprietary versus open standards, compression methods, and embedded dependencies. Formats with well-documented specifications and minimal dependencies generally score higher in preservation assessments. Sustainability Factors encompass the format's adoption rate, community support, vendor independence, and likelihood of continued development. Widely adopted formats with active communities and multiple supporting tools tend to be more preservation-friendly. Preservation Risks involve potential for format obsolescence, hardware or software dependencies, encryption or access restrictions, and data integrity concerns. Formats tied to specific proprietary software or hardware present higher preservation risks. Functional Requirements address whether the format adequately captures the essential characteristics of the content, supports necessary metadata, and maintains authenticity and integrity over time. File format appraisal faces several ongoing challenges. The rapid pace of technological change means new formats emerge frequently while others become obsolete. Proprietary formats may lack sufficient documentation for thorough assessment, and complex formats with multiple dependencies can be difficult to evaluate comprehensively. File format appraisal directly informs preservation strategies and policies. A TDA should use appraisal results to develop format policies, plan migration activities, and allocate preservation resources effectively. Regular re-appraisal helps track changes in format sustainability and adjust preservation approaches accordingly.","CPP-019 - Data quality assessment, CPP-018 - Community watch",,"Approved" ,65,,,,"Validation Service (for checksums/malicious content)",,"Service which is responsible for validating: checksums of files in an Information Package; the structure of an Information Package; and prevention of malicious content (viruses).","CPP-002 - Checksum Validation, CPP-007 - Virus Scanning","QUALITY-TECHNICAL-REQ-001, QUALITY-TECHNICAL-REQ-002, QUALITY-TECHNICAL-REQ-004","Rejected" ,66,,,,"EOSC Package (Metadata) Transformation Service",,"A service which is responsible for repackaging to a local format. Example: differences between information package formatting in different services could be handled via a transformation service (relating strongly to metadata).","CPP-006 - Batch Export, CPP-029 - Ingest","QUALITY-TECHNICAL-REQ-011, QUALITY-TECHNICAL-REQ-012, QUALITY-TECHNICAL-REQ-013, QUALITY-TECHNICAL-REQ-014, QUALITY-TECHNICAL-REQ-015","Approved" 100,67,"Well-defined semantics for each component of an AIP",,,"AIP Semantic Packaging Information Service",,"There needs to be a mechanism that allows a TDA to describe each component of its AIPs in terms of the semantics that apply, for example which schemas, vocabularies and ontologies are being used.
This includes the semantics of the data in the AIP, the semantics of associated information such as provenance, fixity, rights and quality, and the contents of the AIP as defined by the semantics of the OAIS Information Model.","CPP-006 - Batch Export","INTEROPERABILITY-SEMANTIC-REQ-023, INTEROPERABILITY-SEMANTIC-REQ-031, INTEROPERABILITY-SEMANTIC-REQ-032","Approved" ,68,"Evaluation of AIPs against the semantics of the OAIS Information Model",,,"AIP OAIS Semantic Validation Service",,"The Packaging Information for an AIP is evaluated against the components of the OAIS Information Model to determine the extent to which some or all of the required components of OAIS are present in the AIP. This allows the level of OAIS conformance to be assessed according to the semantics of the OAIS Information Model.","CPP-019 - Data quality assessment","INTEROPERABILITY-SEMANTIC-REQ-033","Approved" 101,69,"The technical details of each component of an AIP are well defined at all levels.",,,"AIP Technical Packaging Information Service",,"The service allows a TDA to describe the contents of an AIP by providing OAIS Packaging Information that describes the technical details of the AIP. This allows external entities to understand both the contents of the AIP and what standards and specifications are being used for the serialisation and exchange of the AIP.","CPP-006 - Batch Export, CPP-021 - Versioning","INTEROPERABILITY-TECHNICAL-REQ-027, INTEROPERABILITY-TECHNICAL-REQ-029, INTEROPERABILITY-TECHNICAL-REQ-022, INTEROPERABILITY-TECHNICAL-REQ-024, INTEROPERABILITY-TECHNICAL-REQ-025, INTEROPERABILITY-TECHNICAL-REQ-026, INTEROPERABILITY-TECHNICAL-REQ-053, INTEROPERABILITY-TECHNICAL-REQ-028","Approved" 102,70,"Evaluation of AIP contents against the components of the OAIS Information Model",,,"AIP OAIS Technical Validation Service",,"The OAIS Packaging Information for an AIP is tested to validate the AIP contents at a technical level. This includes validation of the Packaging Information against a schema in order to check that it is syntactically valid, i.e. well-formed, and testing of the contents of the AIP to confirm they are present and correct as listed in the Packaging Information.","CPP-019 - Data quality assessment","INTEROPERABILITY-TECHNICAL-REQ-021, INTEROPERABILITY-TECHNICAL-REQ-030, INTEROPERABILITY-TECHNICAL-REQ-054","Approved" 103,71,"Exchange of AIPs between organisations",,,"AIP Exchange Service",,"It should be possible to request and transfer an AIP from one location to another, e.g. between TDAs. The service that supports AIP exchange should support incremental and full transfers (e.g. transfer of a whole AIP in one shot for small AIPs, or transfer in batches of files for large AIPs).","CPP-021 - Versioning, CPP-006 - Batch Export","INTEROPERABILITY-TECHNICAL-REQ-055, INTEROPERABILITY-TECHNICAL-REQ-056","Approved" 104,72,"Machine-readable rights specifications",,,"AIP Rights Validation Service",,"The TDA can check the rights statements in an AIP to confirm that it has a valid legal basis for holding the AIP, performing preservation actions on it, and/or transferring it to another TDA or using third-party preservation services.","CPP-020 - Rights management","INTEROPERABILITY-LEGAL-REQ-033","Approved" 115,79,"Metadata Standards Alignment","MSCR, CESSDA Metadata Validator (CMV)",,,,,,,"Duplicate" 154,80,"Storage Management","S3 Glacier compatible interfaces",,"Online + offline file storage service",,"It should be possible to efficiently transfer (large) files to disk or tape.
In line with the OAIS concept of interoperable archives, archives may want to share storage options, depending on resources, for managing scale. This service supports that interoperability. This service acts as a storage backend interface/proxy to be used with repository/archive software that links the system to multiple storage backends. It supports, for example, the National Repository use case where storage is shared between different parties.",,,"Approved" 155,81,"Retrieval and Access",,,"Online + offline file retrieval and access service",,"The repository should be able to provide the file storage status (in UI and API) and to retrieve files that are stored on tape to disk (for usage). In line with the OAIS concept of interoperable archives, archives may want to share storage options, depending on resources, for managing scale. This service supports that interoperability.","CPP-025 - Access, CPP-029 - Ingest","QUALITY-TECHNICAL-REQ-022","Approved" 156,82,"Legal clearance",,,"Legal Assessment Service",,"Service which is able to perform a legal assessment based on the digital object and its metadata. The service shall identify and categorize potential legal issues related to the digital object and (optionally) propose measures the TDA may perform to mitigate potential legal risks.","CPP-012 - Preservation Risk Mitigation, CPP-020 - Rights management, CPP-023 - Risk definition and extraction, CPP-029 - Ingest","INTEROPERABILITY-LEGAL-REQ-003, INTEROPERABILITY-LEGAL-REQ-016","Duplicate" 157,83,"Rights statement standardisation",,,"Rights Management Service",,"Service to identify and indicate rights and obligations related to a digital object or object collections in a standardized way.","CPP-020 - Rights management","INTEROPERABILITY-LEGAL-REQ-003","Duplicate" 158,84,"Ethics clearance",,,"Ethical Review Service",,"A service which allows ethical reviews to be performed based on given data and metadata in an automated or guided way.","CPP-012 - Preservation Risk Mitigation, CPP-020 - Rights management, CPP-023 - Risk definition and extraction",,"In review" 159,85,"Policy standardisation",,,"Policy Classification Service",,"A service which parses and classifies policy documents based on their written content and transforms the raw policy text into standards-compliant, structured documents.","CPP-020 - Rights management",,"In review" 160,86,"Policy (type) provision",,,"Policy Type Vocabulary",,"A vocabulary containing a comprehensive selection of the most frequently used policy type terms, e.g. defined as subclasses of dc:Policy, which can be used by the Policy Classification Service.","CPP-020 - Rights management",,"In review" 161,87,"Guidance on FAIR compliance",,,"FAIR Compliance Guidance API","eden:trl.1","Within OSTrails (an EU-HE project), specifications on FAIR Guidance are being developed.
These specifications might be implemented.",,"DISCIPLINE-SPECIFIC-REQ-018, QUALITY-METADATA-REQ-001, QUALITY-METADATA-REQ-005","In review" 162,88,"Metadata Standards Alignment","CESSDA Metadata Validator (CMV), MSCR, Swagger API Hub","OpenRefine",,"eden:trl.8","Validates DDI metadata records used by CESSDA's tools and services against a given DDI Profile.",,,"In review" 166,90,"FAIR assessment tool","F-UJI, FAIR-Checker, ""FAIR EVA (Evaluator, Validator & Advisor)"", FAIR Enough, FOOPS!, FAIR Evaluation Services",,,,,"CPP-005 - Identifier management, CPP-019 - Data quality assessment, CPP-025 - Access, CPP-024 - Discovery","QUALITY-METADATA-REQ-001, QUALITY-METADATA-REQ-004, QUALITY-METADATA-REQ-005","Approved" 169,91,"Repository affordances discovery",,,"FAIRiCat profile validator",,"A validator to check if a repository exposes its affordances according to the FAIRiCat specifications","CPP-009 - Metadata Extraction",,"Approved" 170,92,"Metadata discovery",,,"FAIR Signposting profile validator",,"A validator to check if the objects within a repository are described by the FAIR Signposting profile","CPP-009 - Metadata Extraction, CPP-024 - Discovery","QUALITY-METADATA-REQ-002","Approved" 171,93,"FAIR assessment aggregated score",,,"Facade API to calculate aggregated FAIR assessment scores on collection and repository level",,"A service which allows for aggregation of FAIR assessment scores within a given repository or collection","CPP-019 - Data quality assessment","QUALITY-METADATA-REQ-001, QUALITY-METADATA-REQ-004","Approved" 172,94,"License guidance","License Selector",,"License Guidance Service",,"A service which processes provided information and returns relevant licensing requirements and recommendations along with applicable guidance.","CPP-020 - Rights management",,"Duplicate" 173,95,"Licence provision","SPDX License List & Tool",,"License Registry",,"A License Registry for data is a searchable catalog of licenses that helps users quickly find and understand available licensing options for datasets.","CPP-020 - Rights management",,"In review" 174,96,"Licence stewardship",,,"License Compliance Monitor",,"A License Compliance Monitor is a service that regularly reviews the licenses in use and notifies users when a license is outdated, expired, or no longer valid.","CPP-020 - Rights management",,"In review" 175,97,"Contract and consent management",,,"Contract & Consent Registry",,"A Contract and Consent Management Service stores and organizes user agreements, linking them to individual digital objects as needed, such as preservation agreements or other contractual obligations.","CPP-013 - Object Management Reporting, CPP-020 - Rights management, CPP-019 - Data quality assessment",,"In review" 176,98,"File format validation","Validate your files","File Scraper","",,,"CPP-010 - File Format Validation",,"Duplicate" 177,99,"Format transformation tools registry",,,"File format tools registry",,,,,"Rejected" 178,100,"File format validation error database",,,"EOSC File format validation error database service",,"A regularly maintained database of commonly encountered file format validation errors and their solutions.
Note: this is already a partly existing service (even though it is not mentioned in existing services): https://github.com/Digital-Preservation-Finland/knowledge-base https://digital-preservation-finland.github.io/knowledge-base/","CPP-010 - File Format Validation, CPP-027 - File repair",,"Approved" 180,,"Multilingual metadata support",,"Dataverse Support for External Vocabulary Services",,,"Dataverse Support for External Vocabulary Services in combination with SKOSMOS can provide multilingual search.","CPP-009 - Metadata Extraction, CPP-016 - Metadata Ingest and Management","QUALITY-METADATA-REQ-006","Approved" 215,,"Federated Identity Support","""Authentication, Authorisation, and Identity Services""",,,6,"Applications for authentication, authorisation, and identification","CPP-003 - Integrity Checking",,"Rejected" 181,,"The efficient reuse of metadata from maDMP (extracting to create SIP), suggested attribute: authoritative metadata","Machine-Actionable DMP Service",,,6,"An emerging specification for applications that implement ma-DMP","CPP-016 - Metadata Ingest and Management",,"Rejected" 182,,"Metadata Provenance","""Provenance-Recording Services. Applicable standards: PROV, RO-Crate""",,,6,"Services for the querying, registration and recording of provenance information, including links between research objects.","CPP-009 - Metadata Extraction",,"Rejected" 183,,"Metadata Provenance","Scholix Exchange Protocol",,,6,"Cross-infrastructure exchange protocol; interoperable with OpenAIRE, DataCite, and Europe PMC","CPP-009 - Metadata Extraction",,"Rejected" 184,,"The efficient reuse of FAIR metrics information from DMP information systems, suggested attribute: FAIR metrics","DMP Online",,,7,"ma-DMP implementation is underway in a beta version.","CPP-009 - Metadata Extraction",,"Rejected" 185,,"Technical metadata extraction","Pelias Facade",,,7,"A facade application that integrates multiple geocoding APIs, including GeoNames and Nominatim.","CPP-009 - Metadata Extraction",,"Rejected" 186,,"License standardisation","SPDX Licence Listing Service",,,7,"Code is available and can be implemented by anyone, based on a public repository on GitHub.","CPP-020 - Rights management",,"Rejected" 187,,"Publication Consent","Open Journal Systems",,,9,"Includes forms and workflow for determining author and rights holder consent, but designed for papers and preprints","CPP-009 - Metadata Extraction",,"Rejected" 188,,"Federated Identity Support","Google AAI Service",,,9,"Google provides a widely used AAI service","CPP-003 - Integrity Checking",,"Rejected" 189,,"Federated Identity Support","ORCID AAI Service",,,9,"ORCID AAI services can be used in many research-related systems and services","CPP-003 - Integrity Checking",,"Rejected" 190,,"Federated Identity Support","EduGain AAI Service",,,9,"eduGAIN can be used by many academic networks globally for AAI","CPP-003 - Integrity Checking",,"Rejected" 191,,"Federated Identity Support","KeyCloak AAI Service",,,9,"A federated AAI management application that creates a facade for multiple providers.","CPP-003 - Integrity Checking",,"Rejected" 192,,"The efficient reuse of metadata from maDMP (extracting to create SIP), suggested attribute: authoritative metadata","Argos Machine Actionable DMP service",,,9,"Implements ma-DMP via an API.","CPP-009 - Metadata Extraction",,"Rejected" 193,,"The efficient reuse of metadata from maDMP (extracting to create SIP), suggested attribute: authoritative metadata","Data Stewardship Wizard",,,9,"Full ma-DMP support implemented","CPP-009 - 
Metadata Extraction",,"Rejected" 194,,"Metadata Provenance","OpenAIRE Research Graph",,,9,"Provides links between objects based on SKG-IF.","CPP-009 - Metadata Extraction",,"Rejected" 195,,"Metadata Provenance","DataCite PID Graph",,,9,"Datacite implementation of the Scholix protocol.","CPP-009 - Metadata Extraction",,"Rejected" 196,,"Metadata Provenance","RO-Hub",,,9,"Operational service for registering and accessing RO-Crates via API","CPP-009 - Metadata Extraction",,"Rejected" 197,,"Fraud","Malicious Content Detection and Mitigation",,,9,"Services that detect and optionally remove malicious content","CPP-003 - Integrity Checking",,"Rejected" 198,,"Quality assessment","Compute a Checksum",,,9,"Services that compute a checksum for a binary object, optionally accepting a cryptographic method and a hash length as parameters","CPP-003 - Integrity Checking",,"Rejected" 199,,"Quality assessment","Named Entity Recognition Service",,,9,"A configurable service that can identify and suggest terms from one or more controlled vocabularies for a specific text body.","CPP-009 - Metadata Extraction",,"Rejected" 200,,"Quality assessment","OntoPortal Named Entity Recognition Services",,,9,"Services for multiple domains (BioPortal, AgriPortal, …) to assist with NER and linking data to ontologies.","CPP-009 - Metadata Extraction",,"Rejected" 201,,"Quality assessment","EDAM Ontology Annotator",,,9,"Tags text with bioinformatics operations/data types.","CPP-009 - Metadata Extraction",,"Rejected" 202,,"Quality assessment","SKOSMOS Annotator",,,9,"Used in FAO AGROVOC & multilingual terminologies. APIs available for some instances.","CPP-009 - Metadata Extraction",,"Rejected" 203,,"Quality assessment","Geonames Geocoding Service, Nominatim API - OpenStreetMaps Geocoding Service",,,9,"API services for obtaining geocoding information for place names and features","CPP-009 - Metadata Extraction",,"Rejected" 204,,"Quality assessment","Nominatim API - OpenStreetMaps Geocoding Service",,,9,"API services for obtaining geocoding information for place names and features","CPP-009 - Metadata Extraction",,"Duplicate" 205,,"Fraud","Detect Illegal Content",,,9,"API services exist for the detection of illegal and inappropriate content.","CPP-003 - Integrity Checking",,"Rejected" 206,,"Fraud","Suspected AI-Generated Content",,,9,"Services that compute a probability that content was generated using AI.","CPP-003 - Integrity Checking",,"Rejected" 207,,"Fraud","sapling.io",,,9,"A service API for detecting probability of AI-generated content in text","CPP-003 - Integrity Checking",,"Rejected" 208,,"Fraud","CopyLeaks",,,9,"A service for detecting AI-generated and plagiarised text.","CPP-003 - Integrity Checking",,"Rejected" 209,,"Legal - GDPR Compliance","Personal Identifiable Information (PII) Detection",,,9,"Services that detect a variety of personal information in text - bank account details, phone numbers, addresses, and more.","CPP-023 - Risk definition and extraction",,"Rejected" 210,,"Disclosure Risk","Detects Non-personal Sensitive Data",,,9,"A configurable service that identifies a range of sensitive data categories in text submissions.","CPP-023 - Risk definition and extraction",,"Rejected" 211,,"License alignment and combinability","DALICC Licence Listing Service",,,9,"API that provides a list of licences registered by DALICC.","CPP-020 - Rights management",,"Rejected" 212,,"License alignment and combinability","Licence Guidance Service - Zenodo",,,9,"UI-based guidance on licence selection, developed by Zenodo, 
FigShare, and F1000.","CPP-020 - Rights management",,"Rejected" 213,,"License alignment and combinability","Licence Guidance Service - Software",,,9,"UI-based guidance developed by ChooseALicense.com for software","CPP-020 - Rights management",,"Rejected" 214,,"License alignment and combinability","Licence Guidance Service - OpenAIRE",,,9,"UI-based guidance developed by OpenAIRE for data.","CPP-020 - Rights management",,"Rejected" 215,,"License alignment and combinability","Licence Guidance Service - DALICC",,,9,"API that allows faceted search of suitable licences matching criteria","CPP-020 - Rights management",,"Rejected" 216,,"License alignment and combinability","Licence Detail Service",,,9,"API that provides detailed machine-actionable provisions for a given licence","CPP-020 - Rights management",,"Rejected" 217,,"License alignment and combinability","Licence Conflict Service",,,9,"A service that identifies incompatibilities given multiple input licences for combined resources.","CPP-020 - Rights management",,"Rejected" 218,,"Standardised Data Deposit Process",,,"Depositor Guidance Service",1,"A service to assist depositors, based on their context, to select the most appropriate repository.","CPP-029 - Ingest",,"Approved" 219,,"Quality assessment",,,"Dependency Verification Advisory Service",1,"A service that determines whether a dataset should include additional dependencies. Requires human verification of results.","CPP-003 - Integrity Checking",,"Approved" 220,,"Quality assessment",,,"Documentation Verification Advisory Service",1,"A service that determines whether a dataset has adequate documentation with a view to reuse. Requires human verification of results.","CPP-003 - Integrity Checking",,"Approved" 221,,"Publication Consent",,,"Publication Consent Verification Service",1,"A service that determines if the authors and/or rights holders consent to submission and publication in a specific repository","CPP-029 - Ingest",,"Approved" 222,,"Quality assessment",,,"Statistical Authenticity Verification",1,"Verifies a slate of authenticity checks for a submission based on statistical values","CPP-029 - Ingest, CPP-003 - Integrity Checking",,"Approved" 223,,"Fraud",,,"Suspected Plagiarism Service",1,"A service that computes a probability of content being plagiarised","CPP-003 - Integrity Checking",,"Approved" 224,,"Fraud",,,"Suspected Re-publication Detection",1,"Services that compute the similarity of a given submission to content already published","CPP-003 - Integrity Checking",,"Approved" 225,,"Standardised Data Deposit Process",,,"DANS Depositor Guidance Service",3,"A service offered by DANS to assist depositors, based on their context, to select the most appropriate repository.","CPP-029 - Ingest",,"Approved" 226,,"License alignment and combinability",,,"Licence Listing Service",3,"A facade API service that provides a URI-referenced list of licences.","CPP-020 - Rights management",,"Approved" 227,,"License alignment and combinability",,,"Licence Guidance Service",3,"A system that provides licence guidance based on parameters","CPP-020 - Rights management",,"Added to deliverable" 228,,"License alignment and combinability",,,"Licence Detail Service",3,"A system that provides detailed, machine-readable provisions for a licence, possibly a facade to multiple operational systems","CPP-020 - Rights management",,"Approved" 229,,"Legal - GDPR Compliance",,,"Official, Node-Based Privacy Statement Service and Guidance",1,"Official, Node-Based Privacy Statement Service and Guidance","CPP-020
- Rights management",,"Approved" 230,,"Structured and parseable metadata",,,"EOSC Information Package Structure Validator",,"Service which is responsible for validating the structure of an Information Package. Examples: files and folders have a consistent, acceptable format; associated metadata is in a correct, acceptable structure.","CPP-029 - Ingest","QUALITY-TECHNICAL-REQ-021","Approved"
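
To make the machine-actionable reporting described for the proposed EOSC File Format Identification Service entry above more concrete, the sketch below illustrates, purely as an assumption of one possible approach, how the output of several identification tools could be combined and exposed as a JSON report with a MIME type per file. The tool selection (the Linux file command and the optional python-magic library), the field names, and the JSON layout are illustrative choices, not part of the table or of any agreed specification.

```python
"""Minimal sketch (assumptions only): combine several file format
identification tools and emit a machine-actionable JSON report per file."""
import json
import subprocess
import sys
from pathlib import Path
from typing import Optional


def identify_with_file_command(path: Path) -> Optional[str]:
    # `file --mime-type -b` prints only the MIME type, e.g. "image/png".
    try:
        result = subprocess.run(
            ["file", "--mime-type", "-b", str(path)],
            capture_output=True, text=True, check=True,
        )
        return result.stdout.strip() or None
    except (OSError, subprocess.CalledProcessError):
        return None


def identify_with_python_magic(path: Path) -> Optional[str]:
    # python-magic wraps libmagic and offers a second, independent opinion.
    try:
        import magic
        return magic.from_file(str(path), mime=True)
    except Exception:
        return None  # library not installed or file unreadable


def identify(path: Path) -> dict:
    # Run every available tool, keep the per-tool results, and flag
    # disagreements so a curator or a later workflow step can resolve them.
    results = {
        "file-command": identify_with_file_command(path),
        "python-magic": identify_with_python_magic(path),
    }
    candidates = {mime for mime in results.values() if mime}
    agreed = len(candidates) == 1
    return {
        "path": str(path),
        "mime_type": candidates.pop() if agreed else None,
        "tool_results": results,
        "tools_agree": agreed,
    }


if __name__ == "__main__":
    # Usage: python identify.py file1 file2 ...
    print(json.dumps([identify(Path(p)) for p in sys.argv[1:]], indent=2))
```

In a fuller implementation, the per-file record would also carry the format version, profile, and a link to a format registry record such as a PRONOM identifier, as the table entry requires; those fields are omitted here because they depend on tools (e.g. DROID or Fido) and registries not exercised in this sketch.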