1. Introduction

The Open Geospatial Consortium (OGC) is releasing this Call for Participation (CFP) to solicit proposals for OGC Testbed-20. The Testbed-20 initiative will explore four tasks: Integrity, Provenance, and Trust (IPT); GEOINT Imagery Media for Intelligence, Surveillance, and Reconnaissance (ISR) (GIMI); Advancements of GeoDataCubes; and High-Performance Computing Optimized Formats.


1.1. Background

The Open Geospatial Consortium (OGC) is a worldwide community of over 550 experts from industry, government, research, and academia who collaborate to make geospatial (location) information and services more accessible and usable. OGC Testbeds are annual research and development initiatives investigating geospatial technology from various perspectives. They consider the OGC Baseline while exploring selected aspects with broad teams from industry, government, and academia. These initiatives aim to advance the Findable, Accessible, Interoperable, and Reusable (FAIR) principles and OGC’s open Standards capabilities. Testbeds bring together requirements and ideas from a group of sponsors, leveraging symbiotic effects and making the overall initiative more appealing to participants and sponsoring organizations.

OGC’s Standards development process is member-driven and creates royalty-free, publicly available, and open geospatial standards. OGC is always at the forefront, actively analyzing and anticipating emerging tech trends. OGC’s Innovation and Collaborative Solution Program is an agile, collaborative Research and Development (R&D) lab that builds and tests innovative prototype solutions to members' use-cases. The global OGC Community engages in various activities related to location-based technologies, including developing consensus-based open Standards and best practices, collaborating on problem-solving in agile innovation initiatives, participating in member meetings, events, and workshops, and more. OGC’s standards development process moves at the pace of innovation, with constant input from technology forecasting, practical prototyping, real-world testing, and community engagement.

1.2. OGC COSI Program Initiative

This initiative is being conducted under the OGC Collaborative Solutions and Innovation (COSI) Program, which aims to solve the biggest challenges in location. OGC members come together to solve problems, produce prototypes, develop demonstrations, provide best practices, and advance the future of standards. Since 1999, more than 200 funded initiatives have been executed, from small interoperability experiments run by an OGC working group to multi-million dollar testbeds with more than four hundred OGC-member participants. OGC COSI initiatives promote rapid prototyping, testing, and validation of technologies, such as location standards or architectures, which not only encourages rapid technology development but also determines the technology maturity of potential solutions and increases the technology adoption in the marketplace.

1.3. Benefits of Participation

This initiative provides an outstanding opportunity to engage with the latest research on geospatial system design, concept development, and rapid prototyping with government organizations (Sponsors) across the globe. The initiative provides a business opportunity for stakeholders to mutually define, refine, and evolve service interfaces and protocols in the context of hands-on experience and feedback. The outcomes are expected to shape the future of geospatial software development and data publication. The Sponsors are supporting this vision with cost-sharing funds to partially offset the costs associated with the development, engineering, and demonstration of these outcomes, offering selected Participants a unique opportunity to recoup a portion of their initiative expenses. OGC COSI Program Participants benefit from: access to funded research and development; reduced development costs, risks, and lead time for new products or solutions; close relationships with potential customers; a first-to-market competitive advantage on the latest geospatial innovations; influence on the development of global standards; partnership opportunities within our community of experts; and broader market reach via the recognition that OGC standards bring. Exceptional demonstrators may be selected for ongoing support and public showcasing beyond the Testbed period (see Miscellaneous).

1.4. Master Schedule

The following table details the major Initiative milestones and events. Dates are subject to change.

Table 1. Master schedule

Milestone   Date               Event
M01         10 May 2024        Release of CFP.
M02         22 May 2024        Questions from CFP Bidders for the Q&A Webinar due. (Submit Questions here.)
M03         24 May 2024        Bidders Q&A Webinar, 10:00am-11:00am EST.
M04         10 June 2024       CFP Proposal Submission Deadline (11:59pm EST).
M05         14 June 2024       All Testbed Participation Agreements signed.
M06         17-20 June 2024    OGC Member Meeting in Montreal, Quebec, Canada (optional).
M07         21 June 2024       Kickoff Workshop (hybrid event in Montreal at the Mont Royal Center).
M08         29 July 2024       Initial Engineering Reports (IERs) due.
M09         15 October 2024    Technology Integration Experiments (TIE) component implementations completed and tested; preliminary Draft Engineering Reports (DERs) completed and ready for internal review.
M10         1 November 2024    Ad hoc TIE demonstrations and Demo Assets posted to Portal; near-final DERs ready for review; WG review requested.
M11         15 November 2024   Final DERs (incorporating internal and WG feedback) posted to pending documents to meet the three-week rule before the Technical Committee (TC) electronic vote for publication.
M12         6 December 2024    Deadline for the final DER presentation in the relevant WG for the publication electronic vote.
M13         9 December 2024    Deadline for the TC electronic vote on publishing the final DER.
M14         30 December 2024   Participants' final summary reports due.
M15         Jan 2025           Outreach presentations at an online demonstration event.
M16         Feb 2025           In-person workshop for the GDC task usability test (during the week before the Member Meeting).

2. Technical Architecture

This section provides the technical architecture and identifies all requirements and corresponding work items. It references the OGC standards baseline, i.e. the complete set of member-approved Abstract Specifications, Standards including Profiles and Extensions, and Community Practices where necessary.

Please note that some documents referenced below may not have been released to the public yet. These reports require a login to the OGC portal. If you do not have a login, please contact OGC using the Message textbox in the OGC COSI Program Contact Form.

The Testbed deliverables are organized into the following tasks:

2.1. Integrity, Provenance, and Trust (IPT)

In the world of Resilient Data Services, one of the major challenges is maintaining the availability, Integrity, Provenance, and Trust (IPT) of data across different endpoints, particularly in distributed systems where data services might not always be available or may need to change their access points depending on environmental conditions. It is crucial to ensure that a re-available endpoint is trustworthy and that data integrity and provenance are maintained throughout its lifecycle to establish a resilient architecture.


This becomes even more important as data services must be adaptable to climatic impacts, and therefore require strong integration, interoperability, and continuous monitoring. Current methodologies need to incorporate agile reference architecture (ARA) models that follow FAIR principles (findable, accessible, interoperable, and reusable) to effectively manage and use data across diverse and fluctuating networks.

2.1.1. Problem Statement and Research Questions

What do Resilient Data Services need with respect to a reference architecture? If data services are not permanently available, or have to vary their access points depending on the environment, clients are faced with the question of whether a re-available endpoint is trustworthy. For both clients and services, the elements of integrity, provenance, and trust (IPT) play a key role when linked together. If clients are able to trace the integrity of the data, i.e., its accuracy, completeness, and consistency throughout its entire life cycle, then the essential conditions for trust are met.

Provenance can be used to ensure data integrity by recording all changes made to the data and who made them. This information can be used to verify that the data has not been tampered with or altered in any way. In addition, provenance can be used to track the lineage of data, which is important for ensuring the data is accurate and reliable. Data integrity and provenance are critical for building trust in data because they provide a mechanism for verifying the accuracy and reliability of data.

In modern distributed systems, building blocks are used to ensure the FAIR principles, i.e., to make data findable, accessible, interoperable, and reusable. The prerequisite is that these building blocks are well-aligned and that the same methodology was used for their definition. The OGC has invested significant resources to develop building blocks for data description, data methodology, and data semantics. The OGC Building Block register provides the current set of established and emerging building blocks, and a prototype of a future version, following a similar methodology, is in development.

A key challenge, to be addressed in Testbed-20, is the development of building blocks to support Integrity, Provenance and Trust (IPT). These new building blocks shall be aligned with the current set of building blocks and be suitable for the OGC Building Block register.

It is expected that IPT will hinge on object identification and be guided by FAIR principles. Importantly, the FAIR principles are considered in reverse order: reuse requires a business decision, which is influenced by an evaluation step. Evaluation typically relies on the cost and capacity implications of the level of interoperability and the ease of access. For arbitrary data within or even beyond a specific domain, all four of these aspects have their unique challenges, with the level of complexity decreasing from R (reusability) to F (findability).

In the IPT scenario described, clients connect to data services at the reuse level. The reuse of application-specific and derived data is complex, and generally requires advanced knowledge of infrastructure concepts. These focus on semantic descriptions of data, including structure, derivation, data quality, and usage. To establish solid IPT building blocks, the following aspects need to be addressed:

  • Reusable

    • Identifier resolution for data object referencing

    • Aggregation of data updates

    • Aggregation-agnostic findability of content

    • Publication of controlled vocabularies

  • Interoperable

    • Specification of data/services interoperability requirements

    • Soft vs hard-typed data models

    • Schema mapping (structural transformation)

    • Semantic mapping

  • Accessible

    • Data and information sharing

    • Determine and filter by content or property characteristics, or by orthogonal aspects such as data quality

  • Findable

    • Findability via context

    • Findability of data access services

The set of FAIR implications shows that IPT building blocks need to be embedded into an agile reference architecture (ARA) model. IPT can only be established if the IPT building blocks are harmonized with FAIR building blocks that ensure semantic interoperability. Semantic interoperability is a significant challenge within distributed systems, so the semantic interoperability layer needs principles that break it down into manageable elements. For IPT, profiles are the most likely way to describe which aspects of which resources are similar. Testbed-20 needs to explore how functional levels of interoperability can be defined to provide a scalable framework for data query and analysis in the context of IPT.

Provenance, the driver behind integrity and ultimately trust, is implemented through three layers: features, metadata, and processing. Data in the form of coverages or vectors, as well as collections thereof, needs to be described and queryable so that both the origin and the history of the data can be understood across all manipulation steps. It is expected that the corresponding provenance graph will require some form of emergent metadata, i.e., metadata that is continuously enriched across the data manipulation chain.
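
To make the emergent-metadata idea concrete, the following minimal sketch (in Python, using only the standard library) shows one way a provenance chain could carry integrity information: each record binds a digest of the data to a digest of the preceding record, so tampering anywhere upstream invalidates every later link. The record fields are illustrative and only loosely inspired by W3C PROV terms; they are not an existing OGC building block.

    import hashlib
    import json
    from datetime import datetime, timezone

    def sha256(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    def digest_record(record: dict) -> str:
        # Canonical serialization so the digest is reproducible.
        return sha256(json.dumps(record, sort_keys=True).encode())

    def make_record(data: bytes, activity: str, agent: str, parent: dict | None) -> dict:
        """Create one provenance record for a data manipulation step."""
        return {
            "generatedAtTime": datetime.now(timezone.utc).isoformat(),
            "activity": activity,                    # e.g. "ingest", "reprojection"
            "agent": agent,                          # who or what performed the step
            "dataDigest": sha256(data),              # integrity anchor for the payload
            "parentDigest": digest_record(parent) if parent else None,
        }

    def verify_chain(records: list) -> bool:
        """Recompute parent digests to confirm the chain is intact."""
        return all(cur["parentDigest"] == digest_record(prev)
                   for prev, cur in zip(records, records[1:]))

    # Usage with placeholder payloads for two processing steps:
    r1 = make_record(b"original coverage bytes", "ingest", "sensor-A", None)
    r2 = make_record(b"reprojected coverage bytes", "reprojection", "processor-B", r1)
    assert verify_chain([r1, r2])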

When considering Resilient Data Services, i.e. the delivery of data and services that are tolerant of climate impact, the data services must consider:

  1. Addressing the challenges of resilience, integration, and interoperability.

  2. Universal access, for discovery and assurance.

  3. Architectural options for distribution, discovery, and access to contextual information to support heterogeneous information sources.

  4. Enabling a continuous integration and continuous testing (CICT) approach.

  5. Network characteristics, incorporating intelligent monitoring to ensure Quality of Service.

2.1.2. Aim

  • To explore a reference architecture that supports Resilient Data Services: the delivery of data and services that are tolerant of climate impacts.

2.1.3. Previous Work

This effort is closely related to the activities reported in the OGC Testbed-19 Agile Reference Architecture Engineering Report in particular, and in the draft OGC Testbed-19 GeoDataCubes Engineering Report. The Testbed-19 ARA Engineering Report (ER) outlined novel concepts for establishing a federated agile infrastructure of collaborative trusted systems (FACTS) capable of acting autonomously to ensure fit-for-purpose cooperation across the entire system. A key objective was not to create a new data product; instead, a collaborative object leveraging FACTS was offered, allowing the data product to be obtained via well-defined interfaces and functions provided by the collaborative object.

Another resource for this task is the OGC Building Blocks register, which provides an overview of a series of building blocks managed by the OGC community through a variety of processes. These include formal standards publication processes by the Standards Working Groups, agreements with other standards bodies (e.g., ISO), community-hosted examples of re-use (profiles and extensions of OGC standards), and informal “incubator” processes where more than one project needs a solution and the appropriate SWG scope is yet to be determined.

A further resource is the OGC RAINBOW, a web-accessible source of information about things (“Concepts”) the OGC defines or that communities ask the OGC to host on their behalf. It applies FAIR principles to the key concepts that underpin interoperability in systems using OGC specifications.

2.1.4. Work Items and Deliverables

The following graphic illustrates all work items and deliverables of this task.


The IPT building blocks shall be demonstrated in an End-to-End (E2E) Resilient Data Services scenario. The scenario should be related to climate change.

Components

D100 IPT Server: E2E Resilient Data Service implementation, which incorporates Integrity, Provenance and Trust (IPT) as building blocks. Multiple instances of this deliverable may be funded.

Engineering Report

D001 Integrity, Provenance and Trust Engineering Report: An Engineering Report that captures all results and lessons learned. It will also address building blocks usage in an Agile Reference Architecture.

2.2. GEOINT Imagery Media for ISR (GIMI)

The Next Generation ISR Imagery Standards (NGIIS) initiative is a forward-looking effort aimed at fundamentally transforming the standards for Intelligence, Surveillance, and Reconnaissance (ISR) imagery. Part of this initiative is the development of the GEOINT Imagery Media for ISR (GIMI), encapsulated within NGA Standard 0076. Pronounced "gimmie," GIMI is designed to revolutionize how still and motion imagery is managed, integrated, and utilized across various defense and intelligence platforms.


The GIMI Standard integrates advanced media standards such as the ISO/IEC 14496-12 ISO Base Media File Format (ISOBMFF) and the ISO/IEC 23008-12 High Efficiency Image File Format (HEIF). This integration will enable a unified container for both still and motion imagery, significantly enhancing interoperability and access across platforms. Key features of GIMI include robust security metadata, precise geopositioning, and a globally-unique content identifier system. These capabilities are essential for modern ISR operations and are poised to allow GIMI to replace older systems like the National Imagery Transmission Format (NITF), setting the stage for it to potentially become the new standard within NATO operations.
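
For readers unfamiliar with the container, the following sketch (Python, standard library only) walks the top-level box structure of an ISOBMFF/HEIF file: each box starts with a 32-bit big-endian size and a four-character type such as 'ftyp', 'meta', or 'mdat'. This reflects the base file format only; the file name is hypothetical and GIMI-specific boxes are not modeled.

    import struct

    def iter_boxes(buf: bytes, offset: int = 0, end: int | None = None):
        """Yield (type, payload_offset, payload_size) for ISOBMFF boxes in buf."""
        end = len(buf) if end is None else end
        while offset + 8 <= end:
            size, btype = struct.unpack_from(">I4s", buf, offset)
            header = 8
            if size == 1:    # size == 1: a 64-bit "largesize" follows the type field
                size = struct.unpack_from(">Q", buf, offset + 8)[0]
                header = 16
            elif size == 0:  # size == 0: box extends to the end of the file
                size = end - offset
            yield btype.decode("ascii"), offset + header, size - header
            offset += size

    # Usage (hypothetical file name):
    with open("example_gimi.heif", "rb") as f:
        data = f.read()
    for btype, off, length in iter_boxes(data):
        print(btype, off, length)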

In October 2023, an Open Geospatial Consortium (OGC) Code Sprint was held in the UK, with GIMI as one of the event’s main focuses. The event showcased the draft GIMI Profile (2023 Open Standards Code Sprint Summary Engineering Report), which proved to be a dependable container for GEOINT data within the OGC ecosystem. This confirmation established GIMI as a robust option for future applications.

Looking ahead, Testbed-20 is set to further the development and implementation of the GIMI standards, with a focus on cloud-optimized capabilities. The Testbed-20 activities will cover multiple aspects of the GIMI development process, from standardization and tool development to software support and documentation of gained knowledge.

2.2.1. Problem Statement and Research Questions

The primary goal of the Testbed-20 GIMI task is to develop, implement, and validate content that will form the basis of a future GIMI Standard, addressing specific standardization issues within the context of payload optimization and metadata management. Also, the development of the Testbed-20 GIMI Specification necessitates a reliable method to evaluate the performance and quality impacts of various design choices. This evaluation is critical to determine the most efficient design options for the GIMI implementation, especially when comparing it to Cloud Optimized GeoTIFF (COG). Therefore, part of the GIMI task is to develop those benchmarks and propose their incorporation into the OGC Compliance and Interoperability Test Environment (CITE).

The Engineering Report from the aforementioned sprint (https://docs.ogc.org/per/23-059.html) identified several recommended future actions to advance GIMI as an industry standard. They include:

  • A: Future Work Items on OGC Encoding Standards

    • A.1: Tiled complex, signed, and floating-point data in multiband raster data.

    • A.2: Performance benchmarking between various imagery formats, for a variety of use-cases.

    • A.3: Extraction of the conceptual model of GMLJP2 to make it applicable to any ISOBMFF profile.

  • B: Future Work Items on GIMI

    • B.1: Extension of open source-based libraries used by several other products for creation and modification of GIMI files.

    • B.2: Prototype streaming of large images from client applications to servers for storage through OGC API-Processes.

    • B.3: Prototype a JavaScript Object Notation (JSON) encoded alternative to Key-Length-Value (KLV) metadata for use in GIMI files.

These future work actions will be addressed, in whole or in part, through Testbed-20, which will advance GIMI Standards and implementations, with an emphasis on cloud-optimized capabilities, through the following activities:

  1. Develop a Testbed-20 GIMI Specification to capture recommended content for future GIMI standards (Future work A.1, A.3, B.2, B.3) (implemented in D120 and captured in D010)

  2. Develop performance and quality benchmarks for OGC COG and GIMI (Future work A.2) (implemented in D124 using D120/D122 and captured in D011)

  3. Generate a feature and capability comparison between OGC COG and GIMI (based on D120/D122/D124 experiences and captured in D013)

  4. Expand the number and scope of open-source libraries that support GIMI (Future work B.1), which is why this Call for Participation in particular invites open-source implementations for GIMI work items.

  5. Provide implementors with guidance on when and how to use COG and GIMI (based on D120/D122/D124 experiences and captured in D012)

Key research questions may include:

  • Payload Optimization

    1. How can GIMI files be optimally organized to support efficient extraction of payload content, similar to the Cloud Optimized GeoTIFF (COG) model?

    2. What are the potential benefits and drawbacks of various tiling, interleaving, blocking, and padding schemes in the organization of GIMI files?

    3. How can image pyramids and overviews be configured within GIMI files to support effective pan/zoom navigation across large datasets?

    4. What are the best practices for configuring GIMI files to download all necessary structural metadata with a minimal number of HTTP requests?

  • Metadata

    1. What core metadata should be defined for GIMI to support common discovery and location needs, and how can this metadata be future-proofed to adapt to both current and foreseeable future missions?

    2. How can GIMI metadata accommodate imaging sensors that are not overhead, such as hand-held or vehicle-mounted sensors, or those with a highly oblique field of view?

    3. In what ways can GIMI support the use and exploitation of non-imagery data (e.g., hyperspectral data or Sensor Independent Complex Data) that are not primarily intended for human viewing but for software analytics?

    4. What opportunities exist for GIMI to develop a metadata model that not only complements but extends beyond existing OGC imagery metadata models (like GMLJP2, GeoTIFF, etc.)?

  • COG and GIMI Benchmark

    1. What systems, software, and capabilities are necessary to set up a repeatable benchmark environment for testing cloud-optimized GEOINT imagery solutions?

    2. How can performance tests be structured to accurately measure and compare the execution times, error rates, and cloud service usage between GIMI and COG files?

    3. What tiling and interleaving schemes provide optimal performance for accessing imagery content in cloud environments? How do these configurations affect the performance and functionality of COG and GIMI implementations?

    4. How can these performance results contribute to the development of GIMI standards and help guide future implementors in choosing the most effective imaging solutions?

  • Implementation Guide / Best Practices

    1. What are the key lessons learned from implementing and testing GIMI and GeoTIFF formats, and how can they be effectively communicated to assist future implementors?

    2. How do cloud-optimized GIMI files perform in comparison to COG and other competing formats in terms of interaction, functionality, and efficiency?

    3. On what criteria should these formats be compared to provide a fair assessment in terms of capabilities, extensibility, and cost?

    4. What considerations are needed for client applications in terms of compute platforms, availability, and the choice between open source and commercial tools?

2.2.2. Aim

  • To develop a baseline Testbed-20 Specification for GIMI that outlines recommended content for future GIMI standards, addressing issues like prototype streaming of large images.

  • To establish benchmarks for coverage data between OGC COG and GIMI and to incorporate them into the OGC Compliance and Interoperability Test Environment.

  • To generate a comparison between OGC COG and GIMI to guide implementors on when and how best to utilize competing standards.

2.2.3. Previous Work

The Open Geospatial Consortium supported a Code Sprint during the week of 23 October 2023 in the UK to demonstrate the viability of the draft GIMI Profile of ISOBMFF as an alternative container for GEOINT data within the OGC ecosystem. The Engineering Report (https://docs.ogc.org/per/23-059.html) produced through this Sprint identified several recommended future actions to advance GIMI as an industry standard. The Engineering Report concludes that, with minor changes to popular open-source base libraries, a wide range of software can quickly make use of GIMI capabilities.


The resources listed in Appendix A.5 are also relevant to this task.

2.2.4. Work Items and Deliverables

The following graphic illustrates all work items and deliverables of this task.


In detail, the following work items will be addressed:

Components

D120 GIMI Implementations: Implementations of the future GIMI standard with support for the payload optimization and core metadata features described above. Ideally, both work items are used for the benchmarking activities in D124; at a minimum, a vanilla GIMI instance shall be made available for benchmarking. Open-source implementations are encouraged. Participants are expected to contribute to Engineering Reports D010, D012, and D013. Multiple instances of this deliverable may be funded.

D122 COG implementations and benchmarking: Implementations of COG files to be used for benchmarking in D124. Participants are expected to contribute their expertise to all research questions identified above in the section “COG and GIMI Benchmark”. Participants are required to contribute to the Coverage Selection Engineering Report (D012) and the Benchmarking Engineering Report (D011). Multiple instances of this deliverable may be funded.

D124 COG/GIMI benchmarking environment: A cloud-based benchmarking environment with an appropriate COG/GIMI client instance that supports the performance tests described in the sections above. The participant is required to execute all benchmarking tests using self-developed GIMIs as well as GIMI and COG instances provided by other participants. Ideally, the participant includes other competitive formats in the performance evaluation in addition to GIMI and COG. The participant serves as lead editor for the Benchmarking Engineering Report, which captures all benchmarking setups, experiments, and results.

Engineering Reports

D010 GIMI Specification Engineering Report: Engineering Report that can be used as a baseline for a future GIMI standard by the appropriate SWG. The report shall use the template for OGC Standards and follow the corresponding requirements. The report shall document all aspects listed above.

D013 GIMI Lessons Learned and Best Practices Engineering Report: This Engineering Report captures all major results from this task (without repeating any detail provided in the other reports).

D012 Coverage Selection Engineering Report: This Engineering Report captures all lessons learned and Best Practice recommendations that help implementors select the right coverage serialization method. The report takes results from D120/D122 into account, discusses the advantages and disadvantages of the various solutions, and ideally provides further comparison with other competitive formats and approaches. The report shall reflect on typical use-cases.

2.3. Advancements on GeoDataCubes

The community has invested significant resources into Geospatial Data Cubes (GDC), highlighting a commitment to creating infrastructure conducive to the organized storage and use of multidimensional geospatial data. While progress has been made, the state of the art falls short of fully interoperable GDCs capable of meeting specific organizational requirements. Establishing reference implementations to ensure GDCs are both interoperable and exploitable, particularly in Earth Observation (EO) contexts, remains a priority.


Born out of the collective efforts of the OGC Testbed-19 (draft Engineering Report is available here) initiative and GeoDataCube SWG, the GDC Draft API strives to unify the disparate threads of geospatial data cube technology. By integrating elements from the openEO API, OGC API-Processes, and SpatioTemporal Asset Catalog (STAC), the GDC draft API presents a holistic approach to accessing, managing, and processing Earth Observation data.

The draft GDC API standard addresses the critical need for Analysis Ready Data (ARD) by offering robust solutions to handle heterogeneous, multi-dimensional geospatial datasets seamlessly. This capability is vital for organizations and researchers dealing with large volumes of Earth observation and environmental data, enabling them to extract meaningful insights efficiently. As GeoDataCube continues to evolve, it aims to set the standard for geospatial data cube technologies, pushing the boundaries of what can be achieved with geospatial analytics.

This Testbed-20 task aims to support the GDC draft API by developing a language-independent approach to using GDCs in workflows (e.g., OGC API-Processes, openEO, CWL). This approach should allow workflows and processes to be interchangeable regardless of the language used to create them. This task also encompasses the design and execution of a structured usability test for a use-case that implements the draft GDC API Standard in a realistic operational setting. This test should involve a community of end-users, assembled by the User Testing participant, and the results should be provided in a report with recommendations for improvements. Participants can choose the specific method for conducting the usability tests (e.g., workshops or paper-based exercises). Additionally, a GDC API demonstrator will be developed and documented to exchange information on data sources (i.e., metadata) and provenance information (i.e., processing steps, algorithms, specifications) for a GeoDataCube in both open science and non-disclosed workflows.

2.3.1. Problem Statement

The primary objective of this task is to develop a language-independent extension for the GeoDataCube (GDC) draft API, enabling the creation of workflows that remain consistent across different workflow description languages. This task involves participants from different platforms integrating this extension using technologies like openEO, CWL, and OGC API-Processes. The extension aims to foster interoperability between various implementations and streamline the workflow creation process, drawing on existing frameworks established in the current GDC API.
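
As a concrete point of reference, the sketch below submits one processing step as an OGC API - Processes execute request (a POST to /processes/{id}/execution with an "inputs" document). The endpoint, process identifier, and input names are hypothetical; the aim of the extension is that the same logical step could equally be expressed as an openEO process graph or a CWL step.

    import json
    import urllib.request

    # Hypothetical GDC endpoint and process; the request body shape follows
    # the OGC API - Processes execute request (an "inputs" object).
    endpoint = "https://example.org/ogcapi/processes/ndvi/execution"
    execute_request = {
        "inputs": {
            "collection": "sentinel-2-l2a",
            "bbox": [5.9, 47.2, 10.5, 55.1],
            "datetime": "2024-06-01/2024-06-30",
        },
    }

    req = urllib.request.Request(
        endpoint,
        data=json.dumps(execute_request).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.status, resp.read()[:200])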

Key research questions may include:

  1. How effectively does the new API extension facilitate interoperability across different workflow languages and systems?

  2. What are the challenges and limitations encountered in adapting existing platforms to this new, language-independent framework?

  3. How do different implementations of the GDC API extension compare in terms of functionality and ease of integration?

  4. How well does the GDC API integrate into existing workflows within scientific and research communities?

  5. What are the specific usability challenges encountered by users when implementing the GDC API in practical scenarios?

  6. Based on user feedback, what improvements can be made to enhance the functionality and user-friendliness of the GDC API?

  7. How can the GDC API demonstrator effectively manage and convey provenance and metadata information across different types of workflows?

  8. What are the challenges in maintaining the confidentiality of sensitive information while ensuring the usability of the demonstrator in diverse operational environments?

  9. How do users perceive the utility and functionality of the demonstrator in real-world applications, and what improvements are necessary to meet the diverse needs of the scientific community?

  10. How can the development and implementation of a GDC API demonstrator facilitate the exchange of information on metadata and provenance for a given GeoDataCube, while explicitly considering its applicability in both open science workflows and workflows where metadata and provenance details are either partially known or cannot be fully disclosed?

The GeoDataCube use-cases in Testbed-20 (Appendix B and Appendix C) highlight two important aspects of datacube interoperability between meteorological data and other raster data. For ECMWF, a successful GeoDataCube standard must adequately express the ability to handle these different types of datacubes. This means being able to handle the following (a minimal sketch follows this list):

  • A variety of different dimensions (not just x,y,z,t, but also ensemble numbers, multiple axes of time). These axes fall into the category of countable, measurable and metadata axes. More details can be found in ECMWF’s use-case section in the Testbed-19 CFP document.

  • Non-cartesian grids, because meteorological data is typically served on non-lat-lon grids.
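
A minimal sketch of such a datacube, built with xarray and random placeholder values, illustrates the extra axes: an ensemble dimension ("number") and two time axes (forecast reference time and lead-time "step"). Dimension names follow common meteorological conventions; sizes and values are illustrative only, and the non-cartesian grid question is not addressed here.

    import numpy as np
    import xarray as xr

    cube = xr.DataArray(
        np.random.rand(5, 2, 4, 19, 36),  # placeholder values
        dims=("number", "time", "step", "latitude", "longitude"),
        coords={
            "number": np.arange(5),                                   # ensemble members
            "time": np.array(["2024-06-01T00", "2024-06-01T12"],
                             dtype="datetime64[ns]"),                 # forecast reference times
            "step": np.array([0, 6, 12, 18], dtype="timedelta64[h]"), # lead times
            "latitude": np.linspace(90, -90, 19),
            "longitude": np.linspace(0, 350, 36),
        },
        name="2t",
    )
    # Selecting one member, reference time, and lead time yields a single
    # 2D field, the granularity at which such cubes are typically served.
    field = cube.sel(number=3,
                     time=np.datetime64("2024-06-01T00"),
                     step=np.timedelta64(12, "h"))
    print(field.shape)  # (19, 36)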

2.3.2. Aim

  • To create interoperability between openEO, CWL and/or OGC API Processes, participants will collaborate on implementing a language-independent approach to the use of GeoDataCubes in workflows.

  • To design and carry out a structured usability test for a use-case implementing GDC draft API Standard in a realistic operational setting, and provide a report with recommendations for improvements and opportunities for its implementation.

  • To develop a GDC API demonstrator that enables exchanges of information on data sources (i.e. metadata) and provenance (i.e. processing steps, algorithms, specifications) for a given GeoDataCube, and to consider its use in open science workflows and workflows where metadata and provenance cannot be fully disclosed.

2.3.3. Previous Work

The GeoDataCube Engineering Report (23-047) of OGC Testbed-19 explains how a unified draft GDC API (OGC Testbed-19 Draft API - Geodatacubes specification (23-048)) was developed by integrating existing solutions, based on crosswalk comparisons. The Testbed-19 Call for Participation provides further insights into requirements and use-cases. The openEO API specification was used as the foundation for defining the draft OGC GDC API standard, as it is largely compliant with the OGC API-Common Standard. During the Testbed-19 period, more building blocks from the OGC API family were incorporated into the draft GDC API document, including parts of OGC API-Common, OGC API-Coverages, and OGC API-Processes.

The OGC Testbed-17: Geo Data Cube API Engineering Report (21-027) is accessible from the OGC public engineering reports webpage, and it documents the results and recommendations of the Geo Data Cube API task in 2021. The ER defines a draft specification for an interoperable GDC API that uses OGC API building blocks, explains the implementation of the draft API, and explores various aspects including data retrieval and discovery, cloud computing, and Machine Learning. Implementations of the draft GDC API have been demonstrated with use-cases that involve the integration of terrestrial and marine elevation data and forestry information for Canadian wetlands.

2.3.4. Work Items and Deliverables

The following graphic outlines all activities, work items, and deliverables in this task.


The following list identifies all deliverables in this task. The detailed requirements are stated above. All participants are required to participate in all technical discussions and contribute to the development of the Engineering Report(s).

Components

D140 OGC GDC API Profile: Development of language-independent extensions to the GDC draft API. Multiple instances of this deliverable may be funded.

D144 GDC Provenance Demo: GDC API demonstrator that enables exchanges of information on data sources (i.e. metadata) and provenance (i.e. processing steps, algorithms, specifications) for a given GeoDataCube. Multiple instances of this deliverable may be funded.

Engineering Reports

D020 GDC API Profile Engineering Report: An Engineering Report comparing the D140/D141 implementations to assess the potential of the extension to create interoperability between implementations using openEO, CWL and/or OGC API Processes. This report may include recommendations for updates to the API documentation.

D021 GDC Usability Testing Report: Structured usability tests will be conducted both in person (location and time TBD) and virtually for an implementation of the draft GDC API Standard in a realistic operational setting. Review Appendix B and Appendix C for the use-cases that must be incorporated. The tests should involve a community of end-users, assembled and led by the D021 participant, and the results should be provided in an Engineering Report with recommendations for improvements. This Engineering Report includes evaluations of full or partial support of each implemented part of the GDC Standard, a series of recommendations for improvements to the draft GDC API, and opportunities for its implementation.

The initial phases of usability testing will be conducted during the Testbed by the team responsible for this deliverable, and the results will be captured in the preliminary draft Engineering Report (M09). The participant may also conduct a virtual workshop to gather initial results for the usability testing and include them in the DER, which will be ready for review at the M12 milestone. However, the D021 GDC Usability Testing Report will not be due at the M13 milestone deadline, as the results of the in-person workshop will be included in the final DER after the workshop; the M13 milestone will then be completed shortly after those results are incorporated.

D022 GDC Provenance Demo Engineering Report: An Engineering Report on the development of the provenance demonstrator, which should explicitly consider its use in both open science workflows and workflows where details on metadata and provenance may not be known or cannot be fully disclosed. Please note that the demonstrators created will be maintained for six months following the project, and the code and sample data will be archived in a trusted digital repository, e.g., Zenodo or osf.io.

2.4. High-Performance Geospatial Computing Optimized Formats

Note

This topic has not yet been funded; all work proposed for the HPGC Optimized Formats topic should be scoped as in-kind contributions. Some resource support will be available through OGC’s participation in the NSF-funded I-GUIDE project. I-GUIDE will provide access to HPC processing resources, which can be accessed through NSF ACCESS. For Testbed-20 participants unfamiliar with HPC, I-GUIDE can also offer tutorials and materials to help them work and interact with HPC resources. Additionally, I-GUIDE will be available to provide support for any issues that may arise while working with HPC resources throughout the Testbed-20 period.


2.4.1. Problem Statement and Research Questions

High Performance Computing (HPC) is proving pivotal in the geospatial sector for analyzing and processing the vast amounts of data inherent in fields like meteorology, disaster management, and climate change, enabling swift decision-making and deep insights. The challenge lies in the traditional inaccessibility or complexity of HPC resources for geospatial researchers. Yet, advancements highlighted by Testbed-19’s investigation into HPC reveal that Open Geospatial Consortium (OGC) standards and interfaces can effectively connect user frontends with HPC backends, optimizing High-Performance Geospatial Computing (HPGC) resources for analytical applications.

Optimizing the performance of HPGC APIs involves reducing data transfers, taking advantage of parallel processing, and implementing effective memory management techniques. To achieve this, traditional geospatial algorithms must be reengineered to fully utilize HPGC systems. Parallelization strategies must be implemented to enhance HPC workflows. However, these strategies’ success depends on robust data management practices. In the future, advancements in HPGC will likely focus on geospatial data indexing and partitioning, which are critical to unlocking the full potential of parallelism in HPGC and ensuring high-performance workflows can keep up with the ever-growing demands of the geospatial domain.

Cloud-optimized formats such as Cloud Optimized GeoTIFF (COG), Cloud Optimized Point Cloud (COPC), GeoZARR, and GeoParquet have been developed independently of HPC requirements and design principles. These various formats all support the efficient handling of large datasets by providing access to subsets of the data, i.e., they reduce the need to download entire datasets before processing them. This feature could be advantageous in HPGC settings as well, where data accessibility and I/O operations are often the bottlenecks to efficient computation. Testbed-20 will evaluate cloud-optimized formats and investigate their applicability to HPGC. This may lead to the development of new HPC-optimized formats.
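
The access pattern that makes these formats attractive carries over directly: a reader fetches only the file header and the tiles it needs. The sketch below (using the rasterio library and a hypothetical URL) performs a windowed read from a COG over HTTP; in an HPC job, each rank could read a disjoint window in parallel.

    import rasterio
    from rasterio.windows import Window

    # Hypothetical COG URL; GDAL's HTTP driver issues range requests, so only
    # the header and the tiles overlapping the window are transferred.
    url = "https://example.org/data/scene.tif"
    with rasterio.open(url) as src:
        block = src.read(1, window=Window(col_off=1024, row_off=2048,
                                          width=512, height=512))
        print(block.shape, src.crs)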

Key research questions may include:

  • Can the existing cloud-optimized formats also be used as HPC-optimized formats?

  • What are the best practices for geospatial data indexing and partitioning (e.g., adaptive indexing strategies, dynamic partitioning)? One simple partitioning strategy is sketched after this list.

  • What is(are) the optimal solution(s) for an HPC-optimized format?
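
As one simple illustration of the indexing and partitioning question above, the sketch below assigns coordinates to partitions via a Z-order (Morton) index, a space-filling curve that keeps nearby cells on nearby index values. It is one strategy among many, not a recommendation.

    def morton_index(x: int, y: int, bits: int = 16) -> int:
        """Interleave the bits of discretized x and y into a Z-order index."""
        z = 0
        for i in range(bits):
            z |= ((x >> i) & 1) << (2 * i)
            z |= ((y >> i) & 1) << (2 * i + 1)
        return z

    def partition(lon: float, lat: float, n_partitions: int, bits: int = 16) -> int:
        """Map a lon/lat coordinate to one of n_partitions via its Morton index.

        A modulus is used here for brevity; in practice, contiguous Morton
        index ranges preserve spatial locality within a partition better."""
        x = int((lon + 180.0) / 360.0 * ((1 << bits) - 1))
        y = int((lat + 90.0) / 180.0 * ((1 << bits) - 1))
        return morton_index(x, y, bits) % n_partitions

    # Usage: assign a point to one of 64 partitions.
    print(partition(8.54, 47.37, 64))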

2.4.2. Aim

  • This task investigates integrating cloud-optimized formats within HPC environments to scale geospatial workflows more effectively, improve computational efficiency, and advance the use of parallelism in HPGC implementations.

2.4.3. Previous Work

OGC’s approach towards cloud-native geospatial ecosystems includes advancing data encoding standards such as COG for tiled rasters and GeoZarr for data cubes. GeoParquet will complement these by making vector datasets easily accessible from the cloud.

  • Cloud Optimized GeoTIFF: Testbed-17 investigated COG with the aim of developing a specification that the GeoTIFF SWG can directly propose as an OGC standard. Testbed-17 also compared COG with other solutions for multi-dimensional data in the cloud context, with a focus on Zarr. The Testbed-17 task produced the OGC Testbed-17: Cloud Optimized GeoTIFF Specification Engineering Report (21-025).

  • GeoParquet: The GitHub repository for GeoParquet provides the draft specification and is the hub for ongoing development, documentation, and issue deliberation. It includes the format specifications and metadata details necessary for developers and those interested in contributing to the project.

  • GeoZarr: Zarr is an OGC Community Standard, while GeoZarr is intended to become an OGC Standard. The work on Zarr and its evaluation, including comparisons with other formats like COG, is detailed in the OGC Testbed 17: COG/Zarr Evaluation Engineering Report (21-032).

  • High-Performance Computing: The Testbed-19 High-Performance Computing Report (23-044) discusses the development and implementation of an API-based approach for HPGC. The HPGC API was designed and profiled based on the existing OGC API – Processes Standard. To make it easier for users to access, a client Python library based on the HPGC API was implemented. This library simplifies workflow deployment, execution, monitoring, and result visualization within the familiar environment of Jupyter notebooks. This demonstrates how effortless and flexible the HPGC API is when it comes to carrying out real-world geospatial computing tasks.

2.4.4. Work Items and Deliverables

The following graphic outlines all activities, work items, and deliverables in this task.


The following list identifies all deliverables in this task. The detailed requirements are stated above. All participants are required to participate in all technical discussions and contribute to the development of the Engineering Report.

Components

D160 HPC Optimized GeoTIFF Component: Develop one or more versions of HPC Optimized GeoTIFF based on COG.

D161 HPC Optimized GeoZarr Component: Develop one or more versions of HPC optimized GeoZarr.

D162 HPC Optimized GeoParquet Component: Develop one or more versions of HPC Optimized GeoParquet.

Engineering Report

D030 HPC Optimized Formats Engineering Report: An Engineering Report describing the results of investigating how cloud-optimized format concepts can be incorporated into HPC. Best practices for indexing and partitioning geospatial data are integral to this report.

3. Deliverables Summary

The following tables summarize the full set of Initiative deliverables. Technical details can be found in section Technical Architecture.

Please also note that not all work items were supported by sponsor funding at the time of CFP publication. Negotiations with sponsors are ongoing, but there is no guarantee that every item will ultimately be funded.

Bidders are invited to submit proposals on all items of interest under the assumption that funding will eventually become available.

Table 2. CFP Deliverables - Grouped by Task

Integrity, Provenance, and Trust (IPT) (Funding: Available)

  • D100 IPT Server

  • D001 Integrity, Provenance and Trust Engineering Report

GEOINT Imagery Media for ISR (GIMI) (Funding: Available)

  • D120 GIMI Implementations

  • D122 COG implementations and benchmarking

  • D124 COG/GIMI benchmarking environment

  • D010 GIMI Specification Engineering Report

  • D012 Coverage Selection Engineering Report

  • D013 GIMI Lessons Learned and Best Practices Engineering Report

Advancements on GeoDataCubes (Funding: Available)

  • D140 OGC GDC API Profile

  • D144 GDC Provenance Demo

  • D020 GDC API Profile Engineering Report

  • D021 GDC Usability Testing Report

  • D022 GDC Provenance Demo Engineering Report

High-Performance Computing Optimized Formats (Funding: In-kind only)

  • D160 HPC Optimized GeoTIFF Component

  • D161 HPC Optimized GeoZarr Component

  • D162 HPC Optimized GeoParquet Component

  • D030 HPC Optimized Formats Engineering Report

4. Miscellaneous

Call for Participation (CFP): The CFP includes a description of deliverables against which bidders may submit proposals. Several deliverables are more technical in nature, such as documents and component implementations. Others are more administrative, such as monthly reports and meeting attendance. The arrangement of deliverables on the timeline is presented in the Master Schedule.

Each proposal in response to the CFP should include the bidder’s technical solution(s), its cost-sharing request(s) for funding, and its proposed in-kind contribution(s) to the initiative. These inputs should all be entered on a per-deliverable basis, and proposal evaluations will take place on the same basis.

Once the original CFP has been published, ongoing updates and answers to questions can be tracked by monitoring the CFP Corrigenda Table and the CFP Clarifications Table. The HTML version of the CFP will be updated automatically and stored at the same URL as the original version. The PDF version will have to be re-downloaded with each revision.

Bidders may submit questions using the Testbed-20 CFP Questions Form. Question submitters will remain anonymous, and answers will be regularly compiled and published in the CFP clarifications.

A Bidders Q&A Webinar will be held on the date listed in the Master Schedule. The webinar is open to the public, but anyone wishing to attend must register using the provided link. Questions are due on the date listed in the Master Schedule.

Participant Selection and Agreements: Following the submission deadline, OGC will evaluate received proposals, review recommendations with Sponsors, and negotiate Participation Agreement (PA) contracts, including statements of work (SOWs). Participant selection will be complete once PA contracts have been signed with all Participants.

Kickoff: The Kickoff is a meeting where Participants, guided by the Initiative Architect, will refine the Initiative architecture and settle upon specific use-cases and interface models to be used as a baseline for prototype component interoperability. Participants will be required to attend the hybrid Kickoff, including breakout sessions, and will be expected to use these breakouts to collaborate with other Participants and confirm intended Component Interface Designs.

Regular Telecons and Meetings: After the Kickoff, Participants will meet frequently via weekly telecons and in person at OGC Member Meetings.

Development of Deliverables: Development of Components, Engineering Reports, Change Requests, and other deliverables will commence during or immediately after Kickoff.

Under the Participation Agreement contracts, ALL Participants will be responsible for contributing content to the ERs, particularly regarding their component implementation experiences, findings, and future recommendations. The ER Editor, however, will be the primary author of shared sections such as the Executive Summary.

More detailed deliverable descriptions appear under Types of Deliverables.

Final Summary Reports, Demonstration Event and Other Stakeholder Meetings: Participant Final Summary Reports will constitute the close of funded activity. Further development work might take place to prepare and refine assets to be shown at webinars, demonstration events, and other meetings.

Assurance of Service Availability: Participants selected to implement service components must maintain availability for a period of no less than six months after the Participant Final Summary Report milestone.

Demonstrator Maintenance: OGC staff may select a limited number of exceptional demonstrators each year for maintenance beyond the Testbed period of performance to enable ongoing public availability. These carefully chosen Participants will be recognized for their exemplary demonstrators, with OGC hosting the demonstrators and showcasing their capabilities to the public. The conditions and practical arrangements for this support will be arranged on a case-by-case basis, tailored to the unique needs of each selected Participant.

Appendix A: GIMI Specifications

The primary goal of Testbed-20 is to develop, implement, and validate content for a future GIMI Standard. The Testbed-20 GIMI Specification describes the standardization issues to be explored in this initiative.

A.1. Payload Optimization

This effort will explore options for how GIMI files should be organized to support the efficient extraction of payload content. Initially, this work should focus on the example provided by COG. HTTP protocols, such as the HTTP Range Request (RFC 9110), could be coupled with a tiled image pyramid (overviews) payload structure to access client-selected content. That content, and only that content, would be downloaded, thereby obviating the need to transfer the entire file.
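
A minimal sketch of this access pattern, using Python's standard library and a hypothetical URL and byte range: the client resolves a tile's byte span from the file's structural metadata, then issues a single RFC 9110 Range request for it.

    import urllib.request

    url = "https://example.org/imagery/scene.heif"  # hypothetical GIMI file
    # Byte span of one tile, as resolved from the file's structural metadata
    # (offsets here are placeholders).
    req = urllib.request.Request(url, headers={"Range": "bytes=65536-131071"})
    with urllib.request.urlopen(req) as resp:
        tile = resp.read()
        print(resp.status, len(tile))  # expect 206 Partial Content, 65536 bytes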

Additional efforts can be focused on asynchronous protocols such as MQTT Publish-Subscribe and streaming. Using these protocols, changes to the client-selected content would trigger a notification. The client would then be free to download that content as discrete tiles or as a stream of pixels.

The following parameters apply to this subtask:

  1. Work with imagery content encoded using ISO 23001-17 (Uncompressed Codec), ISO 15444-1 (J2K), ISO 15444-15 (HTJ2K), ISO 23008-2 (HEVC), and ISO 14496-10 (AVC).

  2. Explore various tiling, interleaving, blocking, and padding schemes, identifying the advantages and disadvantages of each.

  3. Investigate the configuration of GIMI files using image pyramids and overviews to support pan/zoom navigation across large datasets.

  4. Investigate options to download all structural metadata content (via the Metabox) within GIMI files with a minimal number of HTTP requests. This metadata provides the client with a “roadmap” to the GIMI content.

Results from this effort should be used to define the payload portion of the Testbed-20 GIMI Specification.

A.2. Metadata

The Testbed-20 GIMI Specification will also define “core” metadata for GIMI as well as the data architecture and techniques needed to extend that core in a manageable fashion. This core metadata should support common discovery and location metadata as well as the metadata required to support payload optimization, as described above. It must also be “future-proofed”: capable of supporting all current missions and extensible enough to adapt to most future missions.

It’s not just overhead: Many imaging sensors are not overhead. They are hand-held, vehicle-mounted, or have a highly oblique field of view. The GIMI metadata must be able to locate an image without ground coordinates for its four corners.

It’s not just about pictures: Many “imaging” collections are not intended for human viewing. These are coverages [1]: collections of measurements organized in a regular, or sometimes irregular, grid. Examples include hyperspectral (chemical signatures) and Sensor Independent Complex Data (phase shift and amplitude). These collections may be used to produce human-viewable pictures, but their primary consumer is software analytics. Support for exploiting this non-imagery data must be provided in the GIMI “imagery” metadata.

Promote commonality: Other OGC imagery standards also define metadata models to augment existing non-OGC imagery standards (GMLJP2, GeoTIFF, etc.). There have been sporadic attempts to harmonize across these models, but the installed base has made these efforts impractical. GIMI, however, does not have a large installed base. This is an opportunity to develop an imagery metadata model which is a conceptual and functional superset of the existing OGC imagery metadata models. Specifically, Testbed-20 should consider GMLJP2 (as recommended by the Code Sprint), GeoTIFF, Observations and Measurements, and the Semantic Sensor Network Ontology (SSN). In the interest of future-proofing, change requests under consideration for these standards should also be included.

A.3. COG and GIMI Benchmarks

Development of the Testbed-20 GIMI Specification requires a means to quantify the performance and quality impact of the design options. The development of executable benchmarks which are accurate and repeatable is essential to completing that task. In addition, such benchmarks would prove valuable to implementors who, faced with a collection of options, will want to select the option most suitable for their requirements. Therefore, this subtask will develop those benchmarks and propose their incorporation into the OGC Compliance and Interoperability Test Environment (CITE).

Since Cloud Optimized GeoTIFF and GIMI have overlapping capabilities, these benchmarks should be capable of performing apples-to-apples comparisons between COG and GIMI implementations.
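
A hedged sketch of what such a like-for-like measurement could look like: the same set of byte ranges is fetched from a COG and a GIMI rendition of the same scene, and latency statistics are compared. URLs, ranges, and repeat counts are placeholders; a real benchmark would also capture error rates and cloud service usage, as described above.

    import statistics
    import time
    import urllib.request

    def time_ranged_reads(url: str, ranges: list, repeats: int = 5) -> dict:
        """Fetch each byte range `repeats` times and report latency statistics."""
        samples = []
        for _ in range(repeats):
            start = time.perf_counter()
            for lo, hi in ranges:
                req = urllib.request.Request(url, headers={"Range": f"bytes={lo}-{hi}"})
                with urllib.request.urlopen(req) as resp:
                    resp.read()
            samples.append(time.perf_counter() - start)
        return {"mean_s": statistics.mean(samples),
                "stdev_s": statistics.stdev(samples)}

    # Hypothetical renditions of the same scene, and placeholder tile byte ranges.
    ranges = [(0, 16383), (1048576, 1114111)]
    for fmt_url in ("https://example.org/bench/scene_cog.tif",
                    "https://example.org/bench/scene_gimi.heif"):
        print(fmt_url, time_ranged_reads(fmt_url, ranges))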

The recommended approach for this topic is as follows:

  1. Identify systems, software, and capabilities required to implement, test, measure, and evaluate cloud-optimized GEOINT imagery solutions. Use them to assemble a repeatable benchmark environment.

  2. Set up and execute performance tests of a client requesting portions of GIMI and COG files hosted in a cloud architecture. Document the results regarding execution times, error rates, cloud service usage, and other statistics that demonstrate the effectiveness of each approach:

    1. Test files for interoperability with common codecs and browsers.

    2. Test files with different tiling and interleaving schemes to determine optimal settings for imagery content access.

    3. Test Still Imagery files for pan/zoom functionality and performance when implemented with image pyramids and overviews.

  3. Capture the results and document for use by future implementors.

A.4. Implementation Guide / Best Practices

The lessons learned from this effort will be captured in a Best Practices document for the purpose of guiding implementors of GeoTIFF and GIMI on the most effective solution for their specific requirements.

Compare the performance of cloud-optimized GIMI file interaction versus COG or other competing formats. Where possible, compare formats on an even basis in terms of capabilities and extensibility, cost, availability of tools, open-source versus commercial tooling, compute platforms, etc.

Identify software tools, libraries, and applications required to support each format. Determine and document requirements for cloud applications. Determine and document requirements for client applications.

A.5. Resources

The following resources are relevant to this task.

A.5.1. ISO Standards

  1. ISO/IEC 14496-12, Information technology - Coding of audio-visual objects - Part 12: ISO base media file format.

  2. ISO/IEC 23008-12:2022, Information technology - High efficiency coding and media delivery in heterogeneous environments - Part 12: Image File Format (Note that updates to this version have been created since 2022 and should be considered as part of this Testbed)

  3. ISO/IEC 15444-16:2021, Information technology - JPEG 2000 image coding system — Part 16: Encapsulation of JPEG 2000 images into ISO/IEC 23008-12.

  4. ISO/IEC 15444-1:2019, Information technology - JPEG 2000 image coding system: Part 1: Core coding system.

  5. ISO/IEC 15444-15, Information technology — JPEG 2000 image coding system — Part 15: High-Throughput JPEG 2000.

A.5.2. Draft Documents

  1. NGA.STND.0076, GEOINT Imagery Media for ISR, v0.6

  2. NGA.SIG.0045, Standard Information/Guidance (SIG) ISO Base Media File Format (ISOBMFF) Overview for NGA Applications, v0.6

  3. ISO/IEC 23008-12/Amd 3, Information technology − MPEG systems technologies - High efficiency coding and media delivery in heterogeneous environments - Part 12: Image File Format – WD Amd 3, October 2023

  4. ISO/IEC 23001-17, Information technology - MPEG systems technologies - Part 17: Carriage of uncompressed video and images in ISO Base Media File Format – FDIS, Nov 2023

  5. ISO/IEC 23001-17/Amd 1, Information technology — MPEG Systems technologies — Part 17: Carriage of uncompressed video and images in ISO Base Media File Format — Amendment 1: High precision time tagging – DAM1, 24 Jan 2024

  6. ISO/IEC 23001-17/Amd 2, Information technology — MPEG Systems technologies — Part 17: Carriage of uncompressed video and images in ISO Base Media File Format — Amendment 2: Generic Sample Compression – CDAM2, 24 Jan 2024

  7. m66529 [HEIF] Region Partitioning, Information technology − MPEG systems technologies - High efficiency coding and media delivery in heterogeneous environments - Part 12: Image File Format – input contribution, Jan 2024

  8. m66528 [HEIF] Combination of Regions, Information technology − MPEG systems technologies - High efficiency coding and media delivery in heterogeneous environments - Part 12: Image File Format – input contribution, Jan 2024

Appendix B: GeoDataCubes Use-case A: Data Visualization

B.1. Background

The use-cases of this task are based on requirements from the European Centre for Medium-Range Weather Forecasts (ECMWF). ECMWF is an intergovernmental organization whose main responsibility is operational numerical weather prediction (NWP). ECMWF runs four operational forecast cycles per day, each consisting of one high-resolution deterministic forecast and a 51-member ensemble forecast. The high-resolution deterministic forecast provides the best guess of future atmospheric conditions and extends to 10 days ahead. The ensemble forecasts are perturbed forecasts representing the uncertainty in the known state of the atmosphere and in model parameterization. They are used to create probabilistic forecasts and typically extend up to 15 days ahead.

The users of this data are national weather services, commercial users, and the general public. A large portion of ECMWF data is served as Open Data. All forecast data output is also archived in the Meteorological Archival and Retrieval System (MARS), which currently contains over 400 PiB of meteorological data.

In addition to operational forecasts, ECMWF also conducts research and is involved in projects such as Copernicus and Destination Earth, all of which produce data that is curated for semantic storage and access in the same systems.

All data at ECMWF, whether operational or research, is stored and accessed using a semantic data modelling language (the MARS language). The entirety of the data is addressed as a single, multi-dimensional datacube. The granularity of the datacube is two-dimensional global fields, where each field is represented on a grid (e.g., octahedral, HEALPix, ORCA).
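
To make this addressing model concrete, the sketch below expresses a single datacube slice as a MARS-language request. The key names (class, stream, type, levtype, param, date, time, step, number) follow the MARS language; the specific values shown are illustrative assumptions rather than a prescribed request.

```
# Illustrative only: a MARS-style request addressing one slice of the
# ECMWF datacube by semantic keys rather than by file paths or byte offsets.
# The key names follow the MARS language; the values are hypothetical.
request = {
    "class":   "od",    # operational data
    "stream":  "enfo",  # ensemble forecast
    "type":    "pf",    # perturbed forecast (one ensemble member)
    "number":  "1",     # ensemble member number
    "levtype": "sfc",   # surface-level fields
    "param":   "2t",    # 2-metre temperature
    "date":    "2024-06-01",
    "time":    "00",    # forecast base time (UTC)
    "step":    "24",    # forecast step in hours
}
# Each fully specified key combination resolves to one two-dimensional
# global field, i.e., one grain of the datacube described above.
print(request)
```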

B.2. Use-case Description

The objective of this use-case is to enable users to visualize combined datacubes, coming from both meteorology and Earth Observation, over a certain area.

B.3. Potential Issues

There will be three main challenges inherent in this use-case:

  • ECMWF meteorological data is not on a regular latitude-longitude grid, so some processing will be required to align the satellite data with the ECMWF data (see the regridding sketch after this list).

  • ECMWF data has a number of dimensions not found in Earth Observation data, such as multiple time dimensions. A full solution should allow users to select these different dimensions.

  • ECMWF data can be returned in a number of different formats but is mostly in the form of GRIB. Any data access will have to handle these formats.
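
As one illustration of the first challenge, the following sketch interpolates field values from scattered (non-regular) grid points onto a regular one-degree latitude-longitude grid using SciPy. The synthetic input merely stands in for an octahedral or HEALPix field; a production workflow would use dedicated regridding tooling and the real grid geometries.

```
# Minimal sketch of one possible alignment step: interpolating values from
# a non-regular grid (treated here as scattered lon/lat points) onto a
# regular latitude-longitude grid for overlay with satellite imagery.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)
src_lon = rng.uniform(-180.0, 180.0, 10_000)  # stand-in for octahedral/HEALPix points
src_lat = rng.uniform(-90.0, 90.0, 10_000)
src_val = np.sin(np.radians(src_lat))         # synthetic field values

# Target: a 1-degree regular latitude-longitude grid.
tgt_lon, tgt_lat = np.meshgrid(np.arange(-180.0, 180.0, 1.0),
                               np.arange(-90.0, 90.5, 1.0))
regridded = griddata((src_lon, src_lat), src_val,
                     (tgt_lon, tgt_lat), method="linear")
print(regridded.shape)  # (181, 360)
```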

Appendix C: GeoDataCubes Use-case B: Data Processing

In addition to the functionality provided in the Data Visualization use-case (Appendix B), after layering the weather and satellite data, users should be able to perform some processing between these layers. Simple operations such as addition, subtraction, or averaging are sufficient at this stage (see the sketch following this paragraph).
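
A minimal sketch of such between-layer processing, assuming the two layers have already been aligned onto a shared regular grid (see Appendix B), might look as follows; the arrays here are synthetic placeholders.

```
# Minimal sketch of simple between-layer operations once two layers share
# a grid: element-wise difference and average with xarray.
import numpy as np
import xarray as xr

coords = {"lat": np.arange(-90.0, 90.5, 1.0),
          "lon": np.arange(-180.0, 180.0, 1.0)}
weather = xr.DataArray(np.random.rand(181, 360), coords=coords, dims=("lat", "lon"))
satellite = xr.DataArray(np.random.rand(181, 360), coords=coords, dims=("lat", "lon"))

difference = weather - satellite          # per-cell subtraction
mean_layer = (weather + satellite) / 2.0  # per-cell average
print(float(difference.mean()), float(mean_layer.mean()))
```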

C.1. Potential Issues

The main challenge inherent in this use-case concerns data alignment:

  • In contrast to the Data Visualization use-case, where data alignment is not crucial, it is of utmost importance here. Since the ECMWF data is not on a standard latitude-longitude grid, aligning the data and using some form of interpolation will be necessary for processing between the layers.

Appendix D: Testbed Organization and Execution

D.1. Initiative Policies and Procedures

This initiative will be conducted within the policy framework of OGC’s Bylaws and Intellectual Property Rights Policy ("IPR Policy"), as agreed to in the OGC Membership Agreement, and in accordance with the OGC COSI Program Policies and Procedures and the OGC Principles of Conduct, the latter governing all related personal and public interactions.

Several key requirements are summarized below for ready reference:

  • Each selected Participant will agree to notify OGC staff if it is aware of any claims under any issued patents (or patent applications) which would likely impact an implementation of the specification or other work product which is the subject of the initiative. Participant need not be the inventor of such patent (or patent application) in order to provide notice, nor will Participant be held responsible for expressing a belief which turns out to be inaccurate. Specific requirements are described under the "Necessary Claims" clause of the IPR Policy.

  • Each selected Participant will agree to refrain from making any public representations that draft Engineering Report (ER) content has been endorsed by OGC before the ER has been approved in an OGC Technical Committee (TC) vote.

  • Each selected Participant will agree to provide more detailed requirements for its assigned deliverables, and to coordinate with other initiative Participants, at the Kickoff event.

D.2. Initiative Roles

The roles generally played in any OGC COSI Program initiative include Sponsors, Bidders, Participants, Observers, and the COSI Program Team. Explanations of the roles are provided in Tips for New Bidders.

The COSI Team for this Initiative will include an Initiative Director and an Initiative Architect. Unless otherwise stated, the Initiative Director will serve as the primary point of contact (POC) for the OGC.

The Initiative Architect will work with Participants and Sponsors to ensure that Initiative activities and deliverables are properly assigned and performed. They are responsible for scope and schedule control, and will provide timely escalation to the Initiative Director regarding any high-impact issues or risks that might arise during execution.

D.3. Types of Deliverables

All activities in this testbed will result in a Deliverable. These Deliverables generally take the form of Documents or Component Implementations.

D.3.1. Documents

Engineering Reports (ER) and Change Requests (CR) will be prepared in accordance with OGC published templates. Engineering Reports will be delivered by posting on the (members-only) OGC Pending directory once complete and once the document has achieved a satisfactory level of consensus among interested participants, contributors, and editors. Engineering Reports are the formal mechanism used to deliver results of the COSI Program to Sponsors and to the OGC Standards Program for consideration by way of Standards Working Groups and Domain Working Groups.

Tip

A common ER Template will be used as the starting point for each document. Various template files will contain requirements such as the following (from the 1-summary.adoc file):

The Executive Summary shall contain a business value statement that should describe the value of this Engineering Report to improve interoperability, advance location-based technologies or realize innovations.

Ideas for meeting this particular requirement can be found in the CFP Background as well as in previous ER content such as the business case in the SELFIE Executive Summary.

Document content should follow this OGC Document Editorial Guidance (scroll down to view PDF file content). File names for documents posted to Pending should follow this pattern (replacing the document name and deliverable ID): OGC Testbed-20: Integrity, Provenance and Trust Engineering Report (D001). For ERs, the words Engineering Report should be spelled out in full.

D.3.2. Component Implementations

Component Implementations include services, clients, datasets, and tools. A service component is typically delivered by deploying an endpoint via an accessible URL. A client component typically exercises a service interface to demonstrate interoperability. Implementations should be developed and deployed in all threads for integration testing in support of the technical architecture.

Important

Under the Participation Agreement contracts, ALL Participants will be responsible for contributing content to the ERs, particularly regarding their component implementation experiences, findings, and future recommendations. But the ER Editor will be the primary author on the shared sections such as the Executive Summary.

Component implementations are often used as part of outreach demonstrations near the end of the timeline. To support these demos, component implementations are required to include Demo Assets. For clients, the most common approach to meet this requirement is to create a video recording of a user interaction with the client. These video recordings may optionally be included in a new YouTube Playlist such as this one for Testbed-15.

Tip

Videos to be included in the new YouTube Playlist should follow these instructions:

  • Upload the video recording to the designated Portal directory (to be provided), and

  • Include the following metadata in the Description field of the upload dialog box:

    • A Title that starts with "OGC Testbed-20:", keeping in mind that there is a 100-character limit [if no title is provided, we’ll insert the file name],

    • Abstract: [1-2 sentence high-level description of the content],

    • Author(s): [organization and/or individuals], and

    • Keywords: [for example, OGC, Testbed-20, machine learning, analysis ready data, etc.].

Since server components often do not have end-user interfaces, participants may instead support outreach by delivering static UML diagrams, wiring diagrams, screenshots, etc. In many cases, the images created for an ER will be sufficient as long as they are suitable for showing in outreach activities such as Member Meetings and public presentations. A server implementer may still choose to create a video recording to feature their organization more prominently in the new YouTube playlist. Another reason to record a video might be to show interactions with a "developer user" (since these interactions might not appear in a client recording for an "end user").

Tip

Demo-asset deliverables are slightly different from TIE testing deliverables. The latter don’t necessarily need to be recorded (though they often appear in a recording if the TIE testing is demonstrated as part of one of the recorded weekly telecons).

D.4. Proposal Evaluation

Proposals are expected to be brief, broken down by deliverable and precisely addressing the work items of interest to the bidder. Details of the proposal submission process are provided under the General Proposal Submission Guidelines.

Proposals will be evaluated based on criteria in two areas: technical and management/cost.

D.4.1. Technical Evaluation Criteria

  • Concise description of each proposed solution and how it contributes to achievement of the particular deliverable requirements described in the Technical Architecture,

  • Overall quality and suitability of each proposed solution, and

  • Where applicable, whether the proposed solution is OGC-compliant.

D.4.2. Management/Cost Evaluation Criteria

  • Willingness to share information and work in a collaborative environment,

  • Contribution toward Sponsor goals of enhancing availability of standards-based offerings in the marketplace,

  • Feasibility of each proposed solution using proposed resources, and

  • Proposed in-kind contribution in relation to proposed cost-share funding request.

Note that all Participants are required to provide at least some level of in-kind contribution (costs for which no cost-share compensation has been requested). As a rough guideline, a proposal should include at least one dollar of in-kind contribution for every dollar of cost-share compensation requested. All else being equal, higher levels of in-kind contributions will be considered more favorably during evaluation. Participation may also take place by purely in-kind contributions (no cost-share request at all).

Once the proposals have been evaluated and cost-share funding decisions have been made, the COSI Team will begin notifying Bidders of their selection to enter negotiations to become an initiative Participant. Each selected Bidder will enter into a Participation Agreement (PA), which will include a Statement of Work (SOW) describing the assigned deliverables.

D.5. Reporting

Participants will be required to report the progress and status of their work; details will be provided during contract negotiation. Additional administrative details such as invoicing procedures will also be included in the contract.

D.5.1. Monthly Reporting

The COSI Team will provide monthly progress reports to Sponsors. Ad hoc notifications may also occasionally be provided for urgent matters. To support this reporting, each testbed participant must submit (1) a Monthly Technical Report and (2) a Monthly Business Report by the first working day on or after the 3rd of each month. Templates and instructions for both of these report types will be provided.

The purpose of the Monthly Business Report is to provide initiative management with a quick indicator of project health from each participant’s perspective. The COSI Team will review action item status on a weekly basis with assigned participants. Initiative participants must remain available for the duration of the timeline so these contacts can be made.

D.5.2. Participant Final Summary Reports

Each Participant should submit a Final Summary Report by the milestone indicated in the Master Schedule. These reports should include the following information:

  1. Briefly summarize Participant’s overall contribution to the testbed (for an executive audience),

  2. Describe, in detail, the work completed to fulfill the Participation Agreement Statement of Work (SOW) items (for a more technical audience), and

  3. Present recommendations on how we can better manage future OGC COSI Program initiatives.

This report may be in the form of email text or a more formal attachment (at the Participant’s discretion).

Appendix E: Proposal Submission

E.1. General Proposal Submission Guidelines

This section presents general guidelines for submitting a CFP proposal. Detailed instructions for submitting a response proposal using the Bid Submission Form web page can be found in the Step-by-Step Instructions below.

Important

Please note that the content of the "Proposed Contribution" text box in the Bid Submission Form will be accessible to all Stakeholders and should contain no confidential information such as labor rates.

Similarly, no sensitive information should be included in the Attached Document of Explanation.

Proposals must be submitted before the deadline indicated in the Master Schedule.

Bidders responding to this CFP must be organizational OGC members familiar with the OGC mission, organization, and process.

Proposals from non-members or individual members will be considered provided that a completed application for organizational membership (or a letter of intent) is submitted prior to or with the proposal.

Tip

Non-members or individual members should make a note regarding their intent to join OGC on the Organizational Background page of the Bid Submission Form and include their actual Letter of Intent as part of an Attached Document of Explanation.

The following screenshot shows the Organizational Background page:

Figure 1. Sample Organizational Background Page

Information submitted in response to this CFP will be accessible to OGC and Sponsor staff members. This information will remain in the control of these stakeholders and will not be used for other purposes without prior written consent of the Bidder. Once a Bidder has agreed to become a Participant, they will be required to release proposal content (excluding financial information) to all initiative stakeholders. Sensitive information other than labor-hour and cost-share estimates should not be submitted.

Bidders will be selected for cost share funds on the basis of adherence to the CFP requirements and the overall proposal quality. The general testbed objective is to inform future OGC standards development with findings and recommendations surrounding potential new specifications. Each proposed deliverable should formulate a path for (1) producing executable interoperable prototype implementations meeting the stated CFP requirements and (2) documenting the associated findings and recommendations. Bidders not selected for cost share funds may still request to participate on a purely in-kind basis.

Bidders should avoid attempts to use the initiative as a platform for introducing new requirements not included in the Technical Architecture. Any additional in-kind scope should be offered outside the formal bidding process, where an independent determination can be made as to whether it should be included in initiative scope or not. Out-of-scope items could potentially be included in another OGC COSI initiative.

Each selected Participant (even one not requesting any funding) will be required to enter into a Participation Agreement contract ("PA") with the OGC. The reason this requirement applies to purely in-kind Participants is that other Participants will likely be relying upon their delivery. Each PA will include a Statement of Work ("SOW") identifying specific Participant roles and responsibilities.

E.2. Questions and Clarifications

Once the original CFP has been published, ongoing updates and answers to questions can be tracked by monitoring the CFP Corrigenda Table and the CFP Clarifications Table.

Bidders may submit questions using the Testbed-20 CFP Questions Form. Question submitters will remain anonymous, and answers will be regularly compiled and published in the CFP clarifications.

A Bidders Q&A Webinar will be held on the date listed in the Master Schedule. The webinar is open to the public, but anyone wishing to attend must register using the provided link. Questions are due on the date listed in the Master Schedule.

E.3. Proposal Submission Procedures

The process for a Bidder to complete a proposal is essentially embodied in the online Bid Submission Form. Once this site is fully prepared to receive submissions (soon after the CFP release), it will include a series of web forms, one for each deliverable of interest. A summary is provided here for the reader’s convenience.

For any individual who has not used this form in the past, a new account will need to be created first. The user will be taken to a home page indicating the "Status of Your Proposal." If any defects in the form are discovered, this page includes a link for notifying OGC. The user can return to this page at any time by clicking the OGC logo in the upper left corner.

Any submitted bids will be treated as earnest submissions, even those submitted well before the response deadline. Be certain that you intend to submit your proposal before you click the Submit button on the Review page.

Important

Because the Bid Submission Form is still relatively new, it might contain some areas that are still brittle or in need of repair. Please notify OGC of any discovered defects. Periodic updates will be provided as needed.

Please consider making local backup copies of all inputs in case any need to be re-entered.

E.3.1. High-Level Overview

Clicking on the Propose link will navigate to the Bid Submission Form. The first time through, the user should provide organizational information on the Organizational Background Page and click Update and Continue.

This will navigate to an "Add Deliverable" page that will resemble the following:

Figure 2. Sample "Add Deliverables" Page

The user should complete this form for each proposed deliverable.

Tip

For component implementations having multiple identical instances of the same deliverable, the bidder needs to propose only one instance. For simplicity, each bidder should submit against the lowest-numbered deliverable ID. OGC will assign a unique deliverable ID to each selected Participant later (during negotiations).

On the far right, the Review link navigates to a page summarizing all the deliverables the Bidder is proposing. This Review tab won't appear until the user has submitted at least one deliverable under the Propose tab.

Tip

Consider regularly creating printed output copies of this Review page at various points during proposal creation.

Once the Submit button is clicked, the user will receive an immediate confirmation on the website that their proposal has been received. The system will also send an email to the bidder and to OGC staff.

Tip

In general, up until the time that the user clicks this Submit button, the proposal may be edited as many times as the user wishes. However, this initial version of the form contains no "undo" capability, so please use caution when overwriting existing information.

The user is afforded an opportunity under Done Adding Deliverables at the bottom of this page to attach an optional Attached Document of Explanation.

Figure 3. Sample Dialog for an "Attached Document of Explanation"
Important

No sensitive information (such as labor rates) should be included in the Attached Document of Explanation.

If this attachment is provided, it is limited to one per proposal and must be smaller than 5 MB.

This document could conceivably contain any specialized information that wasn’t suitable for entry into a Proposed Contribution field under an individual deliverable. It should be noted, however, that this additional documentation will only be read on a best-effort basis. There is no guarantee it will be used during evaluation to make selection decisions; rather, it could optionally be examined if the evaluation team feels that it might help in understanding any specialized (and particularly promising) contributions.

E.3.2. Step-by-Step Instructions

The Propose link takes the user to the first page of the proposal entry form. This form contains fields to be completed once per proposal such as names and contact information.

It also contains an optional Organizational Background field where Bidders (particularly those with no experience participating in an OGC initiative) may provide a description of their organization. It also contains a click-through check box where each Bidder will be required (before entering any data for individual deliverables) to acknowledge its understanding and acceptance of the requirements described in this appendix.

Clicking the Update and Continue button then navigates to the form for submitting deliverable-by-deliverable bids. On this page, existing deliverable bids can be modified or deleted by clicking the appropriate icon next to the deliverable name. Any attempt to delete a proposed deliverable will require scrolling down to click a Confirm Deletion button.

To add a new deliverable, the user would scroll down to the Add Deliverable section and click the Deliverable drop-down list to select the particular item.

The user would then enter the required information for each of the following fields (for this deliverable only). Required fields are indicated by an asterisk ("*"):

  • Estimated Projected Labor Hours* for this deliverable,

  • Funding Request*: total U.S. dollar cost-share amount being requested for this deliverable (to cover burdened labor only),

  • Estimated In-kind Labor Hours* to be contributed for this deliverable, and

  • Estimated In-Kind Contribution: total U.S. dollar estimate of the in-kind amount to be contributed for this deliverable (including all cost categories).

Tip

There’s no separate text box to enter a global in-kind contribution. Instead, please provide an approximate estimate on a per-deliverable basis.

Cost-sharing funds may only be used for the purpose of offsetting burdened labor costs of development, engineering, documentation, and demonstration related to the Participant’s assigned deliverables. By contrast, the costs used to formulate the Bidder’s in-kind contribution may be much broader, including supporting labor, travel, software licenses, data, IT infrastructure, and so on.

Theoretically there is no limit on the size of the Proposed Contribution for each deliverable (beyond the raw capacity of the underlying hardware and software). But bidders are encouraged to incorporate content by reference where possible (rather than inline copying and pasting) to avoid overloading the amount of material to be read in each proposal. There is also a textbox on a separate page of the submission form for inclusion of Organizational Background information, so there is no need to repeat this information for each deliverable.

Important

A breakdown (by cost category) of the "Inkind Contribution" may be included in the Proposed Contribution text box for each deliverable.

However, please note that the content of this text box will be accessible to all Stakeholders and should contain no confidential information such as labor rates.

Similarly, no sensitive information should be included in the Attached Document of Explanation.

The Proposed Contribution (Please include any proposed datasets) field should also be used to provide a succinct description of what the Bidder intends to deliver for this work item to meet the requirements expressed in the Technical Architecture. This language could potentially include a brief elaboration on how the proposed deliverable will contribute to advancing the OGC standards baseline, or how implementations enabled by the specification embodied in this deliverable could add specific value to end-user experiences.

A Bidder proposing to deliver a Service Component Implementation can also use this field to identify what suitable datasets would be contributed (or what data should be acquired from another identified source) to support the proposed service.

Tip

In general, please try to limit the length of each Proposed Contribution to about one text page per deliverable.

Note that images cannot be pasted into the Proposed Contribution textbox. Bidders should instead provide a link to a publicly available image.

A single bid may propose deliverables arising from any number of threads or tasks. To ensure that the full set of sponsored deliverables is covered, OGC might negotiate with individual Bidders to drop and/or add selected deliverables from their proposals.

E.4. Tips for New Bidders

Bidders who are new to OGC initiatives are encouraged to review the following tips:

  • In general, the term "activity" describes work to be performed in an initiative, and the term "deliverable" describes artifacts to be developed and delivered for inspection and use.

  • The roles generally played in any OGC COSI Program initiative are defined in the OGC COSI Program Policies and Procedures, from which the following definitions are derived and extended:

    • Sponsors are OGC member organizations that contribute financial resources to steer Initiative requirements toward rapid development and delivery of proven candidate specifications to the OGC Standards Program. These requirements take the form of the deliverables described herein. Sponsor representatives help serve as "customers" during Initiative execution, helping to ensure that requirements are being addressed and broader OGC interests are being served.

    • Bidders are organizations who submit proposals in response to this CFP. A Bidder selected to participate will become a Participant through the execution of a Participation Agreement contract with OGC. Most Bidders are expected to propose a combination of cost-sharing request and in-kind contribution (though solely in-kind contributions are also welcomed).

    • Participants are selected OGC member organizations that generate empirical information through the definition of interfaces, implementation of prototype components, and documentation of all related findings and recommendations in Engineering Reports, Change Requests and other artifacts. They might be receiving cost-share funding, but they can also make purely in-kind contributions. Participants assign business and technical representatives to represent their interests throughout Initiative execution.

    • Observers are individuals from OGC member organizations that have agreed to OGC intellectual property requirements in exchange for the privilege to access Initiative communications and intermediate work products. They may contribute recommendations and comments, but the COSI Team has the authority to table any of these contributions if there’s a risk of interfering with any primary Initiative activities.

    • Supporters are OGC member organizations who make in-kind contributions aside from the technical deliverables. For example, a member could donate the use of their facility for the Kickoff event.

    • The COSI Team is the management team that will oversee and coordinate the Initiative. This team is composed of OGC staff, representatives from member organizations, and OGC consultants. The COSI Team communicates with Participants and other stakeholders during Initiative execution, provides Initiative scope and schedule control, and assists stakeholders in understanding OGC policies and procedures.

    • The term Stakeholders is a generic label that encompasses all Initiative actors, including representatives of Sponsors, Participants, and Observers, as well as the COSI Team.

    • Suppliers are organizations (not necessarily OGC members) who have offered to supply specialized resources such as cloud credits. OGC's role is to assist in identifying an initial alignment of interests and performing introductions of potential consumers to these suppliers. Subsequent discussions would then take place directly between the parties.

  • Proposals from non-members or individual members will be considered provided that a completed application for organizational membership (or a letter of intent) is submitted prior to or with the proposal.

  • Any individual wishing to gain access to the Initiative’s intermediate work products in the restricted area of the Portal (or attend private working meetings / telecons) must be a member-approved user of the OGC Portal system.

  • Individuals from any OGC member organization that does not become an initiative Sponsor or Participant may still (as a benefit of membership) observe activities by registering as an Observer.

  • Prior initiative participation is not a direct bid evaluation criterion. However, prior participation could accelerate and deepen a Bidder’s understanding of the information presented in the CFP.

  • All else being equal, preference will be given to proposals that include a larger proportion of in-kind contribution.

  • All else being equal, preference will be given to proposed components that are certified OGC-compliant.

  • All else being equal, a proposal addressing all of a deliverable’s requirements will be favored over one addressing only a subset. Each Bidder is at liberty to control its own proposal, of course. But if it does choose to propose only a subset for any particular deliverable, it might help if the Bidder prominently and unambiguously states precisely what subset of the deliverable requirements are being proposed.

  • The Sponsor(s) will be given an opportunity to review selection results and offer advice, but ultimately the Participation Agreement (PA) contracts will be formed bilaterally between OGC and each Participant organization. No multilateral contracts will be formed. Beyond this, there are no restrictions regarding how a Participant chooses to accomplish its deliverable obligations so long as these obligations are met in a timely manner (whether a 3rd-party subcontractor provides assistance is up to the Participant).

  • In general, only one organization will be selected to receive cost-share funding per deliverable, and that organization will become the Assigned Participant upon which other Participants will rely for delivery. Optional in-kind contributions may be made provided that they don’t disrupt delivery of required, reliable contributions from the assigned Participants.

  • A Bidder may propose against any or all deliverables. Participants in past initiatives have often been assigned to make only a single deliverable. On the other hand, several Participants in prior initiatives were selected to make multiple deliverables.

  • In general, the Participant Agreements will not require delivery of any component source code to OGC.

    • What is delivered to OGC is the behavior of the component installed on the Participant’s machine, and the corresponding documentation of findings, recommendations, and technical artifacts contributed to Engineering Report(s).

    • In some instances, a Sponsor might expressly require a component to be developed under open-source licensing, in which case the source code would become publicly accessible outside the Initiative as a by-product of implementation.

  • Results of other recent OGC initiatives can be found in the OGC Public Engineering Report Repository.

Appendix F: Abbreviations

The following table lists all abbreviations used in this CFP.

ARA: Agile Reference Architecture
CFP: Call for Participation
COG: Cloud Optimized GeoTIFF
COSI: Collaborative Solutions and Innovation Program
DER: Draft Engineering Report
DWG: Domain Working Group
E2E: End-to-End
ECMWF: European Centre for Medium-Range Weather Forecasts
ER: Engineering Report
GDC: GeoDataCube
GIMI: GEOINT Imagery Media for ISR
HEIF: High Efficiency Image File Format
I-GUIDE: Institute for Geospatial Understanding through an Integrative Discovery Environment
IPT: Integrity, Provenance, and Trust
ISO/IEC: International Organization for Standardization / International Electrotechnical Commission
ISOBMFF: ISO Base Media File Format
ISR: Intelligence, Surveillance, and Reconnaissance
MARS: Meteorological Archival and Retrieval System
NGIIS: Next Generation ISR Imagery Standards
NITF: National Imagery Transmission Format
NWP: Numerical Weather Prediction
OGC: Open Geospatial Consortium
PA: Participation Agreement
POC: Point of Contact
Q&A: Questions and Answers
SOW: Statement of Work
SWG: Standards Working Group
TBD: To Be Determined
TC: OGC Technical Committee
TEM: Technical Evaluation Meeting
TIE: Technology Integration / Technical Interoperability Experiment
URL: Uniform Resource Locator
WG: Working Group (SWG or DWG)

Appendix G: Corrigenda & Clarifications

G.1. Corrigenda Table

The following table identifies all corrections that have been applied to this CFP compared to the original release. Minor editorial changes (spelling, grammar, etc.) are not included.

Table 3. Corrigenda Table

Section   Description                                                        Date of Change
2.1.1     Updated the link for the Building Blocks website                   May 13, 2024
2.3.4     Updated description for deliverable D021 regarding the schedule    May 23, 2024

G.2. Clarifications Table

The following table identifies all clarifications that have been provided in response to questions received from organizations interested in this CFP.

Please use this convenience link to navigate to the end of the table.

Table 4. Clarifications Table

Q: For the GeoDataCube API task, an in-person workshop for usability testing is foreseen in February 2025. However, recommendations from usability testing shall already be included in an Engineering Report, which needs to be ready by mid-November 2024. Shall a usability test already be conducted during the Testbed, or only at the workshop in February 2025? Shall the effort included in this task cover both the usability test during the Testbed and the in-person workshop afterwards?

A: The initial phases of usability testing will be conducted during the testbed by the team that is responsible for this deliverable. The results will be captured in the preliminary draft Engineering Report (M09). The participant may also conduct a virtual workshop to gather initial results for the usability testing and include them in the DER, which will be ready for review at the M12 milestone. However, the Usability Testing Report for D021 GDC will not be due at the M13 milestone deadline, as the results of the in-person workshop will be included in the final DER after the workshop. The M13 milestone will be completed shortly after the inclusion of the results.

Q: The GeoDataCubes work packages, especially D140, do not mention any work to advance clients for the GDC API. Is it foreseen to also fund client-related work that builds on top of Testbed-19, or is the sole focus on server-side implementations and specification extensions?

A: Development on the client side is also welcome.


G.3. End of Clarifications Table (convenience link)



1. Coverage: feature that acts as a function to return values from its range for any direct position within its domain (ISO 19123)