
1. Introduction

The Open Geospatial Consortium (OGC) is releasing this Call for Participation (CFP) to solicit proposals for the OGC Testbed-21 initiative. The initiative will explore the following tasks:

  • Data Quality for Integrity, Provenance, and Trust (DQ4IPT)

  • GEOINT Imagery Media for Intelligence, Surveillance, and Reconnaissance for OGC Testbed-21 (GIMI-T21)

  • Conformance Testing Tool Development (CTTG)

1.1. Background

The Open Geospatial Consortium (OGC) is a worldwide community of more than 400 organizations from industry, government, research, and academia that collaborate to make geospatial (location) information and services more accessible and usable. OGC’s Standards development process is member-driven and creates royalty-free, publicly available, and open geospatial standards. The standards development process moves at the pace of innovation, with constant input from technology forecasting, trends analysis, practical prototyping, real-world testing, and community engagement.

The OGC Testbed series is an annual research and development program that explores geospatial technology while taking current and potential future OGC Standards into account. OGC Testbeds provide a unique opportunity to explore how interoperability could be optimized within specific contexts. Combining technologies in a single initiative and bringing several organizations together creates an environment that closely resembles the interoperability challenges faced in real-world situations. Consequently, Testbeds produce synergistic effects by facilitating collaboration among several sponsors and experts from member organizations. These initiatives aim to advance open standards that make geospatial information more Findable, Accessible, Interoperable, and Reusable (FAIR). The initiatives are conducted under the Policies & Procedures of the OGC Collaborative Solutions and Innovation (COSI) Program.

1.2. Benefits of Participation

This initiative provides an outstanding opportunity to engage with the latest research on geospatial system design, concept development, and rapid prototyping with government organizations (Sponsors) across the globe. The initiative provides a business opportunity for stakeholders to mutually define, refine, and evolve service interfaces and protocols in the context of hands-on experience and feedback. The outcomes are expected to shape the future of geospatial software development and data publication. The Sponsors are supporting this vision with cost-sharing funds to partially offset the costs associated with development, engineering, and demonstration of these outcomes, offering selected Participants a unique opportunity to recoup a portion of their initiative expenses. Selected Testbed Participants benefit from:

  • access to funded research & development;

  • reduced development costs, risks, and lead-time for new products or solutions;

  • close relationships with potential customers;

  • a first-to-market competitive advantage on the latest geospatial innovations;

  • influence on the development of global standards;

  • partnership opportunities within our community of experts; and

  • broader market reach via the recognition that OGC standards bring.

Exceptional demonstrators may be selected for ongoing support and public showcasing beyond the Testbed period (see Miscellaneous).

1.3. Main Schedule

The following table details the major Initiative milestones and events. Dates are subject to change.

Table 1. Main schedule

Milestone | Date                     | Event
M1        | 2025-01-24               | Request for Information (RFI) is released
M2        | 2025-01-29               | RFI Responders Q&A; online answers will be provided as questions are submitted
M3        | 2025-02-07               | RFI responses due
M4        | 2025-02-12               | Call for Sponsors (CFS) is released
M5        | 2025-03-21               | Sponsor commitments due
M6        | 2025-08-04               | Call for Participation (CFP) is released
M7        | 2025-08-07               | Questions from CFP Bidders for the Q&A Webinar due
M8        | 2025-08-08               | Bidders Q&A Webinar, 03:00pm-04:00pm UTC (recording available online)
M9        | 2025-08-10               | Intentionally left blank
M10       | 2025-09-09               | CFP Proposal Submission Deadline (11:59pm AoE)
M11       | 2025-09-23               | All participants selected and signed
M12       | 2025-10-07 to 2025-10-08 | Testbed Kick-off meeting (in Virginia, USA), 09:00am-05:00pm EDT each day
M13       | 2025-11-06               | Proposed solution architecture documented
M14       | 2025-12-06               | Design of Technology Integration Experiments (TIE) documented
M15       | 2026-01-15               | Technology Integration Experiments (TIE) executed and documented
M16       | 2026-01-29               | Initial Draft Reports due
M17       | 2026-02-28               | Final demonstrations (see note (a) below)
M18       | 2026-03-30               | Final Draft Reports due (see note (b) below)
M19       | 2026-04-29               | Final Deliverables due (see note (c) below)

Note
(a) All work on the component/software deliverables must be completed before the Final demonstrations. (b) The Final Draft Reports must address internal and Working Group feedback. (c) Milestone M16 applies to all deliverable reports and summary reports.

2. Technical Architecture

This section provides the technical architecture and identifies all requirements and corresponding work items. It references the OGC standards baseline, i.e. the complete set of member-approved Abstract Specifications, Standards including Profiles and Extensions, and Community Practices where necessary.

Please note that some documents referenced below may not have been released to the public yet. These reports require a login to OGC file resources. If you do not have a login, please contact OGC using the Message textbox in the OGC Contact Form with Standards as the selected category.

The Testbed deliverables are organized in a number of tasks:

2.1. Data Quality for Integrity, Provenance, and Trust (DQ4IPT)

There is a growing recognition that robust approaches for validating data integrity, provenance, and trust (IPT) are needed. This is especially true for Earth Observation (EO) data, because much imagery-supported decision making relies on analyzing data from different sources. A prerequisite for achieving trust by assuring data integrity and provenance is to ensure that data quality is documented and communicated effectively. This is particularly important because, as noted by Bugbee et al. [1], "Incorrect or missing information may impede a user from determining the dataset’s fitness for a particular research question or new application." To facilitate the assessment of a dataset’s fitness for a particular application, data quality reports in metadata need to be expressed in a comparable way.

One of the standards relevant to this topic is ISO 19115, the international standard for geospatial metadata. The standard has historically been encoded in XML, but a new activity has begun to develop a JSON encoding of the metadata model it defines. The emerging ISO 19115-4 candidate standard describes an integrated JSON implementation of ISO 19115-1, ISO 19115-2, and ISO 19157-1. The specification provides a set of JSON schema files that define a JSON encoding of the concepts defined by the conceptual schemas in those standards.

Another specification relevant to this topic is the NASA Unified Metadata Model (UMM), an extensible metadata model that provides a crosswalk for mapping between metadata standards supported by the Common Metadata Repository (CMR). The UMM offers mappings between GCMD DIF10, ECHO 10, ISO 19115-1, ISO 19115-2, and GCMD SERF. The UMM JSON schemas support the validation of the structure of UMM metadata documents.

A group of specifications that may also be relevant to this topic is the Coalition for Content Provenance and Authenticity (C2PA) family of standards. C2PA is designed to provide an open specification for content creators, publishers, and consumers to establish the origin, provenance, and edits of digital content.
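As a rough illustration of what comparable, machine-readable quality reporting could look like, the sketch below assembles a minimal JSON metadata fragment in Python. The field names are illustrative placeholders loosely modeled on ISO 19157 concepts (the ISO 19115-4 JSON encoding is still a draft, so the real schema may differ), and the RAINBOW measure URI is a hypothetical example, not a registered term.

```python
import json

# Illustrative only: key names loosely follow ISO 19157 concepts; the
# ISO 19115-4 JSON encoding is still in draft, so the real schema may differ.
quality_report = {
    "scope": {"level": "dataset"},
    "report": [{
        "measure": {
            # Hypothetical reference to a registered data quality measure
            "measureReference": "https://defs.opengis.net/dqm/example-measure"
        },
        "result": {
            "type": "DQ_QuantitativeResult",
            "value": 0.92,
            "valueUnit": "percent"
        }
    }]
}

# Serialize and re-parse to confirm the fragment round-trips as JSON.
doc = json.dumps(quality_report, indent=2)
parsed = json.loads(doc)
print(parsed["report"][0]["result"]["value"])  # 0.92
```

The point of such a structure is that two systems comparing datasets can dereference the same measure URI and know they are comparing like with like.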

2.1.1. Problem Statement

Standards such as ISO 19115 (for Geospatial Metadata) and ISO 19157 (for Geospatial Data Quality) have been used to help record the lineage and quality of geospatial datasets. However, to date, the absence of a shared vocabulary for describing data quality measures has prevented many systems from achieving a greater level of interoperability when exchanging metadata about the quality of a dataset. Leveraging existing ontologies, such a shared vocabulary could also facilitate interoperability between systems that support ISO 19115/ISO 19157 models and sector-specific models such as the NASA Unified Metadata Model (UMM).

One of the areas that could significantly benefit from a shared vocabulary for describing data quality measures is Machine Learning (ML) for Artificial Intelligence (AI). The quality of training datasets plays a key role in the performance of an ML model and is thus a prerequisite for a successful AI application. This is especially true in Earth Observation applications, since 'bad data' can often lead to a deluge of false positives in feature extraction from satellite imagery. Therefore, the recent approval and release of the OGC Training Data Markup Language for AI (TrainingDML-AI) standard is expected to lead to better ML models and improvements in AI applications. To achieve such improvements, there is a need for a shared vocabulary for describing data quality measures and for the integration of those measures into an IPT framework.

2.1.2. Aim

The aim of this task is to:

  • explore the integration of data quality considerations into IPT frameworks in such a way that users of Earth Observation data can have confidence in the data they use for analysis.

The objectives of the task are therefore to:

  • develop a reference architecture that integrates data quality considerations into IPT frameworks

  • demonstrate how the reference architecture can improve confidence in Earth Observation data

  • identify the challenges likely to be encountered when adapting existing platforms to the approach proposed by the reference architecture

Note
It is not the intent of the testbed to assess data quality, but rather to provide a standards-based approach through which data producers such as space agencies can document and communicate the quality of their data products in a way that is both human and machine readable.

2.1.3. Research Questions

Key research questions may include:

  1. How should existing metadata platforms be adapted in order to offer data quality information that encourages trust in the data?

  2. How should IPT frameworks consume or leverage metadata that describes the quality of data?

  3. How effectively could a data quality-aware IPT framework facilitate interoperability across a data management system?

  4. How could an organization apply a data quality-aware IPT framework to reduce the risk of data contamination in their AI applications?

  5. What are the challenges and limitations likely to be encountered when adapting existing platforms to support data quality-aware IPT frameworks?

2.1.4. Previous Work

In an era of growing data volumes and geospatial analysis demands, maintaining IPT is critical, especially in distributed systems where data availability can fluctuate. The OGC Testbed-20 IPT activity addressed this need by developing resilient data services to safeguard IPT throughout the data lifecycle [2]. Two IPT Server instances were demonstrated in OGC Testbed-20, including the Federated Agile Collaborative Trusted System (FACTS) to manage Smart Certificates, and a Data Verification system that leveraged Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs).

A number of previous OGC Testbeds explored interoperability solutions for Imagery Quality [3] and Quality Assessment Services [4]. Some of the lessons from those initiatives have led to a joint effort by OGC and ISO/TC 211 to develop a machine-actionable data quality measures register. ISO 19157-3 Geographic information — Data quality — Part 3: Data quality measures register will be the standard defining the components and content structure of a register for data quality measures, and the registration and maintenance procedure (ISO/AWI 19157-3). OGC will host the ISO 19157-3 register on OGC RAINBOW, a registry of terms and definitions. The register is still in development; however, experimentation during the July 2024 OGC Code Sprint demonstrated the referencing of registered data quality measures in documents conforming to the TrainingDML-AI standard.

2.1.5. Work Items and Deliverables

The following subsections document all work items and deliverables of this task.

Components

D101 DQ-enabled IPT Server 1: An OGC API - Processes implementation, along with supporting software components, that together implement IPT capabilities. At least one of the processes offered by the API must offer outputs that include a processed image (originally collected by a satellite) and JSON-encoded ISO 19115 metadata that includes data quality reports that reference definitions of quality measures from OGC RAINBOW.

D102 DQ-enabled IPT Server 2: An OGC API - Records implementation, along with supporting software components, that together implement IPT capabilities. The metadata served by the OGC API - Records implementation must include links to JSON-encoded ISO 19115 metadata that includes data quality reports that reference definitions of quality measures from OGC RAINBOW.

D103 DQ-enabled software library: A software library capable of accessing metadata and data quality reports from the servers implemented by Components D101 and D102, and then retrieving data from the servers. The software library must be able to convert metadata from ISO 19115-3 conformant XML documents to ISO 19115-4 (draft) conformant JSON documents. The software library must also support conversion of metadata from NASA UMM conformant JSON documents to ISO 19115-4 (draft) conformant JSON documents. Furthermore, the software library must demonstrate conversion of metadata from TrainingDML-AI JSON documents to ISO 19115-4 (draft) conformant JSON documents.
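To make the conversion requirement concrete, here is a minimal sketch of the kind of field crosswalk D103 would implement, shown for a UMM-to-ISO mapping. The UMM-side keys (ShortName, EntryTitle, Abstract) are genuine UMM-C concepts; the ISO-side keys are hypothetical placeholders, since the ISO 19115-4 JSON encoding is still a draft, and the sample record is invented for illustration.

```python
# Hypothetical crosswalk table: UMM-C field names on the left are real UMM
# concepts; the ISO-side keys are placeholders standing in for whatever the
# draft ISO 19115-4 JSON encoding ultimately specifies.
UMM_TO_ISO = {
    "ShortName": "alternateTitle",
    "EntryTitle": "title",
    "Abstract": "abstract",
}

def umm_to_iso(umm: dict) -> dict:
    """Map only the fields the crosswalk knows about; skip the rest."""
    return {iso: umm[key] for key, iso in UMM_TO_ISO.items() if key in umm}

# Invented sample record: unmapped fields (e.g. "Version") are dropped.
record = {"ShortName": "MOD09GA", "EntryTitle": "Surface Reflectance", "Version": "6.1"}
print(umm_to_iso(record))
# {'alternateTitle': 'MOD09GA', 'title': 'Surface Reflectance'}
```

A production library would of course validate against both schemas and preserve unmapped content rather than discard it; the table-driven design is the part worth noting, since it keeps the crosswalk declarative and auditable.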

Report

D001 Data Quality for Integrity, Provenance and Trust Report: A report that captures all results and lessons learned.

2.2. GEOINT Imagery Media for ISR for OGC Testbed-21 (GIMI-T21)

Over a number of decades, several imagery formats have been developed across industry. In many cases, those formats were designed to support either Still imagery or Motion imagery, and rarely both. However, recent technological advances have led to a growing call to define an image format that can support both Still and Motion imagery. The Next Generation ISR Imagery Standards (NGIIS) initiative was established, in part, to answer this growing call. The NGIIS initiative aims to fundamentally transform the standards for Intelligence, Surveillance, and Reconnaissance (ISR) imagery. One of the key outputs of the NGIIS initiative is the GEOINT Imagery Media for ISR (GIMI) family of standards, pronounced "gimmie" [5].

The GIMI family of standards integrates advanced media standards such as the ISO/IEC 14496-12 ISO Base Media File Format (ISOBMFF) and the ISO/IEC 23008-12 High Efficiency Image File Format (HEIF). Some of the media that GIMI is envisaged to support are single still images, motion imagery, tiled imagery, audio, security marking, timing information, and content identification (id) information. The modular and unifying approach of GIMI is envisaged to offer a successor to older formats such as the National Imagery Transmission Format (NITF), which does not support many modern computing advances. OGC Testbed-21 looks to build on the progress made during OGC Testbed-20 by continuing permissively-licensed open source contributions to the GStreamer and libheif libraries to enable GIMI application development.
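The ISOBMFF container underlying GIMI and HEIF is a sequence of "boxes", each prefixed by a 4-byte big-endian size and a 4-byte ASCII type code. A minimal sketch of walking the top-level box structure (64-bit extended sizes and nested boxes are deliberately omitted; the synthetic bytes are invented for illustration):

```python
import struct

def parse_boxes(data: bytes):
    """Parse top-level ISOBMFF boxes: each box starts with a 4-byte
    big-endian size followed by a 4-byte ASCII type code. Returns
    (type, offset, size) tuples. size == 1 (64-bit size) is not handled."""
    boxes = []
    offset = 0
    while offset + 8 <= len(data):
        size, = struct.unpack_from(">I", data, offset)
        box_type = data[offset + 4:offset + 8].decode("ascii")
        if size < 8:  # malformed or 64-bit size; stop in this sketch
            break
        boxes.append((box_type, offset, size))
        offset += size
    return boxes

# Synthetic file: a 16-byte 'ftyp' box followed by an empty 'meta' box header.
ftyp = struct.pack(">I", 16) + b"ftyp" + b"heic" + struct.pack(">I", 0)
meta = struct.pack(">I", 8) + b"meta"
print(parse_boxes(ftyp + meta))
# [('ftyp', 0, 16), ('meta', 16, 8)]
```

This self-describing layout is what makes the cloud-optimization techniques discussed later possible: a client can read a small prefix of the file, locate the boxes it needs, and fetch only those byte ranges.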

2.2.1. Problem Statement and Research Questions

The primary goal of the OGC Testbed-21 GIMI task is to enable GIMI application development by continuing permissively-licensed open source contributions to the GStreamer and libheif libraries, building on the progress made during OGC Testbed-20. OGC Testbed-21 will include a new effort focused on investigating, implementing, and evaluating the GEOINT Media Standards Board’s (GMSB) emerging Imagery Domain Ontology (IDO) by implementing and testing ontology structured metadata sidecar files. These files provide descriptive metadata information for imagery inside GIMI files. The Imagery Domain Ontology is based on Basic Formal Ontology (BFO) and Common Core Ontologies (CCO).

The technical requirements for this testbed thread are listed below:

  1. Extensions to open-source library implementation to support GIMI tool developers & associated testing/evaluation of the capabilities: During OGC Testbed-20, OGC made contributions to the open source libheif and GStreamer projects to support foundational NGA.STND.0076 file read/write capabilities. The following requirements continue the addition of GIMI related capabilities to the libheif and GStreamer libraries.

    1. Uncompressed Imagery Coding: Implement missing ISO/IEC 23001-17 features in the open source libheif and GStreamer libraries:

      1. Unsigned integer components: Implement 10, 12, 14, and 16-bit read/write capabilities. Note: This continues the OGC Testbed-20 work to support unsigned integer data formatting options.

      2. Signed integer, floating point, and complex number component formats: Implement read/write support for all size options.

      3. Component description extensions: Implement the component description extensions listed in Section 6 of ISO/IEC 23001-17. A minimum implementation includes support for:

        1. Component Pattern Definition Box: Read/write support for Bayer imagery and all four quadrant possibilities (RGGB, BGGR, GRBG, GBRG).

        2. Sensor Non-Uniformity Correction Box

        3. Sensor Bad Pixels Map

        4. Polarization Pattern Definition Box

      4. ISO/IEC 23001-17 Amd1: TAI timestamps: Implement TAI Timestamp annotations of image items and image samples.

      5. ISO/IEC 23001-17 Amd2: Generic compression: Implement read/write support for numerically lossless (gzip, Brotli, deflate, etc.) compression and decompression.

    2. Libheif Interframe Sequence Support: Extend the image sequence implementation to support motion-compensated interframe sample compression when using ISO/IEC 14496-10 AVC and ISO/IEC 23008-2 HEVC. This effort leverages appropriate AVC and HEVC codecs, such as x264 and x265. Profile support shall focus on highly interoperable AVC and HEVC profiles, such as 24-bit YUV 4:2:0.

    3. GStreamer: Add necessary support to address remaining gaps for non-consumer video formats:

      1. JPEG 2000: Implement capabilities to read/write 8, 10, 12, 14, & 16-bit monochrome and three band (RGB & YUV) image sequence and/or video tracks.

      2. Uncompressed imagery coding: Implement capabilities to read/write 8, 10, 12, 14, & 16-bit monochrome and three band (RGB/YUV) image sequence and/or video tracks.

        1. Include support for uncompressed 8, 10, 12, 14, & 16-bit raw Bayer image sequence and/or video tracks.

      3. Fragmented files: Implement read/write capabilities using both fragmented and non-fragmented MP4 files.

  2. Implement & evaluate NGA Imagery Domain Ontology sidecar files

    1. Cloud-based GIMI/IDO player/viewer – single unitary image

      1. Acquire samples of GEOINT imagery with associated geospatial metadata in the form of corner points for the image. The imagery is to be formatted and stored in an NGA.STND.0076 GIMI file. Content IDs shall be generated for the image and its components. Of note, GIMI security markings are not required for the file.

      2. The geodetic corner points for the image shall be formatted into a TriG-formatted RDF file, in a manner conformant to the NGA IDO.

      3. Utilize available software where possible and implement otherwise, to store the GIMI and ontology metadata files on a web server. For example, an OGC API – Records instance extended to support storage of ontology metadata files along with the GIMI files they describe.

      4. Develop an open-source browser or stand-alone software client application to submit the necessary SPARQL queries to retrieve the ontology structured metadata associated with the geodetic corner points. The ontology structured metadata may be retrieved in JSON-LD format, instead of TriG, to enable parsing in a web browser.

      5. Use cloud-optimization techniques to probe the MetaBox of the GIMI file and retrieve the byte range addressing information for the image carried in the file. Use the byte range addressing information to retrieve the image content via HTTP byte range requests.

      6. Using the corner point information, overlay the image on a map display in the correct location.

      7. Evaluate performance of data queries, data interaction, image content download and display, etc. The evaluation should be benchmarked against GeoTIFF performance.

    2. Cloud-based GIMI/IDO player/viewer – single tiled 'grid' image

      1. Take a single unitary image (greater than 4K x 4K) and tile the image into 1Kx1K tiles. Write the individual tiles into a GIMI file as 1Kx1K image items. Generate a HEIF 'grid' item type, using the previously stored image items as the inputs to the grid. Generate Content IDs for each of the tile image items, and the grid item.

      2. Generate (estimate if necessary) corner point geodetics for the corners of each tile (image items). Generate an RDF file containing all the corner point geodetic information.

      3. Use cloud-optimization techniques to probe the MetaBox of the GIMI file and retrieve the information defining the arrangement of the image items in the grid and the byte range addressing information for the individual image items (tiles).

      4. Use the byte range addressing information to retrieve the image content for individual tiles via HTTP byte range requests.

      5. Using the corner point information, overlay each image tile on a map display in the correct location. Repeat, one tile at a time, until the complete grid is formed and overlaid on the map.

      6. Evaluate performance of data queries, data interaction, image content download and display, etc. The evaluation should be benchmarked against Cloud Optimized GeoTIFF (COG) performance.

  3. Reporting

    1. Generate a report (deliverable D205) on observations, findings, and evaluation of each primary capability implemented from both a performance and functionality perspective. Make recommendations where appropriate.
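The cloud-optimization steps above hinge on probing the MetaBox for byte-range addressing information and then fetching image content via HTTP range requests. A minimal sketch of the retrieval side, using only the Python standard library (the URL and byte offsets are placeholders, and no request is actually sent here; a real client would pass the request to urlopen):

```python
from urllib.request import Request

def build_range_request(url: str, offset: int, length: int) -> Request:
    """Build an HTTP GET restricted to one byte range. HTTP ranges are
    inclusive, hence offset .. offset + length - 1."""
    req = Request(url)
    req.add_header("Range", f"bytes={offset}-{offset + length - 1}")
    return req

# Placeholder URL and offsets, standing in for values recovered by
# probing the MetaBox of a GIMI file.
req = build_range_request("https://example.com/scene.heif", 4096, 1024)
print(req.get_header("Range"))  # bytes=4096-5119
```

For the tiled-grid case, a client would issue one such request per tile, which is exactly the access pattern the benchmarking against Cloud Optimized GeoTIFF is meant to evaluate.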

2.2.2. Previous Work

The GIMI format is very versatile because it is based on HEIF, which is structured in boxes that can contain both data and metadata. In the OGC Testbed-20 initiative, best practices for combining these boxes were identified and new boxes were proposed. During the OGC Testbed-20 initiative, several implementations were developed to support generating image files conforming to the Geographic High Efficiency Image Format (GeoHEIF) candidate standard that is proposed for use in GIMI. The Testbed also benchmarked the reading of GeoHEIF files with specific open source libraries. In addition, a motion imagery use case for GIMI video support was analyzed.

Prior to OGC Testbed-20, the OGC Code Sprint held in the United Kingdom in October 2023 demonstrated the viability of the draft GIMI Profile of ISOBMFF as an alternative container of GEOINT data. As documented in the resulting Engineering Report (OGC 23-059), the Code Sprint participants identified several recommended future actions to advance GIMI as an industry standard. Amongst the recommendations was that with minor changes to popular open-source base libraries, a wide range of software could quickly make use of GIMI capabilities. Some of those libraries were enabled to support some GIMI capabilities by the OGC Testbed-20 initiative.

2.2.3. Work Items and Deliverables

In detail, the following work items will be addressed:

Components

D210 OGC API - Records instance (in-kind only): OGC API – Records instance extended to support storage of ontology metadata files along with the GIMI files they describe.

D211 GIMI instance 1: Extension to GStreamer open source library to support the technical requirements 1a and 1c described in Problem Statement above. The software should also support the creation of GIMI files and the sidecar metadata files required to support technical requirements 2a and 2b.

D212 GIMI instance 2: Extension to libheif open source library to support the technical requirements 1a and 1b described in Problem Statement above. The software should also support the creation of GIMI files and the sidecar metadata files required to support technical requirements 2a and 2b.

D213 GeoTIFF and COG instances for GIMI benchmarking: The benchmarking environment (e.g. Jupyter Notebooks and sample files) for evaluating the performance of data queries, data interaction, image content download and display, etc. The sample files should include GIMI, GeoTIFF, and COG files used in the benchmarking.

D214 Cloud-based GIMI/IDO player/viewer: An open-source browser or stand-alone software client application for playing or viewing GIMI files as described in technical requirements 2a and 2b.

Reports

D201 Documentation of updates to libheif: Documentation describing the updates to libheif implemented during the Testbed. The report should uniquely identify and describe the Pull Requests / Contributions submitted. Those contributions that are accepted into libheif during the period of performance of the Testbed should also be identified.

D202 Documentation of updates to GStreamer: Documentation describing the updates to GStreamer implemented during the Testbed. The report should uniquely identify and describe the Pull Requests / Contributions submitted. Those contributions that are accepted into GStreamer during the period of performance of the Testbed should also be identified.

D203 Baseline OGC Testbed-21 Specification for GIMI: A report documenting the baseline specification of GIMI for this Testbed.

D204 Ontologies for Coverage Description Report: A report documenting the implementation and testing of IDO-based structured metadata sidecar files. The report should identify options for integrating RDF-encoded coverage descriptions based on the Coverage Implementation Schema (CIS) standard (OGC 09-146r8).

D205 Evaluation Report on the performance of GIMI data queries, interaction, download and display: A report on observations, findings, and evaluation of each primary capability implemented from both a performance and functionality perspective. Make recommendations where appropriate.

D206 Advancement of the GIMI Standard Report: Report and Presentation to the Testbed Sponsors, at the end of OGC Testbed-21, summarizing the work performed to advance the GIMI candidate standard, evaluation of results, and recommendations based on findings.

2.3. Conformance Testing Tool Development (CTTG)

The OGC Compliance Program is a certification process that ensures organizations' solutions are compliant with OGC Standards. It is a universal credential that allows agencies, industry, and academia to better integrate their solutions. OGC compliance provides confidence that a product will seamlessly integrate with other compliant solutions regardless of the vendor that created them.

The GEOINT Imagery Media for ISR (GIMI) family of standards integrates advanced media standards such as the ISO/IEC 14496-12 ISO Base Media File Format (ISOBMFF) and the ISO/IEC 23008-12 High Efficiency Image File Format (HEIF) [5]. GIMI is designed to support still images, motion imagery, tiled imagery, audio, security marking, timing information, and content identification (id) information.

Amongst the resources that software developers depend on to support their implementation of a standard are conformance testing tools. Such tools assess whether a software product correctly implements the requirements specified in a standard. For data and metadata encoding standards, the assessment is carried out on files generated by the software product. The Conformance Testing Tool Development (CTTG) task of this Testbed therefore seeks to facilitate the development of a tool for testing for conformance to the GIMI standard.

2.3.1. Problem Statement and Research Questions

OGC uses TEAM Engine (Test, Evaluation, And Measurement Engine) for conformance testing. TEAM Engine is a Java-based application for testing web services and other information resources. It executes test suites developed using the TestNG framework and, alternatively, OGC Compliance Test Language (CTL) scripts. TEAM Engine can be used to test almost any type of service or information resource. It is the official test harness used by the OGC’s Compliance Program.

The primary goal of the OGC Testbed-21 Conformance Testing Tool Development (CTTG) task is to provide a GIMI conformance testing tool that can be integrated into TEAM Engine.
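The eventual test suite will be a Java/TestNG application integrated into TEAM Engine. Purely to illustrate the kind of file-level check such a suite encodes, the Python sketch below tests one hypothetical requirement: that a GIMI file begins with an 'ftyp' box declaring an expected major brand. The brand value used here is an invented placeholder, not the actual GIMI brand code.

```python
import struct

def check_ftyp_major_brand(data: bytes, expected_brand: bytes) -> bool:
    """Conformance-style check: the file must start with an 'ftyp' box
    (4-byte big-endian size, 4-byte type code) whose 4-byte major brand
    matches the expected code."""
    if len(data) < 16:  # minimum ftyp: size + type + major brand + minor version
        return False
    size, = struct.unpack_from(">I", data, 0)
    return size >= 16 and data[4:8] == b"ftyp" and data[8:12] == expected_brand

# Synthetic pass/fail cases; 'geo1' is a placeholder brand, not a real one.
good = struct.pack(">I", 16) + b"ftyp" + b"geo1" + b"\x00\x00\x00\x00"
bad = b"\x00" * 16
print(check_ftyp_major_brand(good, b"geo1"), check_ftyp_major_brand(bad, b"geo1"))
# True False
```

In a TestNG suite, each such check would become one test method annotated against a specific requirement identifier from the GIMI specification, so that TEAM Engine can report pass/fail per requirement.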

The task shall address the following research questions:

  • What does a TEAM Engine-based conformance testing tool for GIMI look like?

  • What would training users on the use of the conformance testing tool involve?

  • What would documentation for the testing tool look like?

2.3.2. Previous Work

The previous OGC Testbed-17 initiative provided a thread to advance testing of implementations of OGC Standards. An executable test suite for OGC API – Processes — Part 1: Core was developed using the TestNG framework (the current recommended OGC approach), and another executable test suite was developed using NeoTL as an alternate approach [6].

2.3.3. Work Items and Deliverables

In detail, the following work items will be addressed:

Components

D301 The GIMI Conformance Test Tool: This will be implemented as a Java-based TestNG application for integration into TEAM Engine. If necessary, it is acceptable to use the Java Native Interface (JNI) or an API to integrate other utilities into the application. An initial version of the GIMI Conformance Test Tool should be demonstrated within 120 days of the Testbed Kick-Off, during the first of the code sprints (see the Code Sprints Appendix for details about the code sprints). An initial version of the GIMI Conformance Test Tool should then be delivered within 30 days after the first code sprint. Subsequent versions can then be delivered as and when needed during the testbed.

Reports

D302 Documentation for the Conformance Test Tool: The documentation will be designed to support the installation and execution of the tool. An initial version of the documentation should be delivered within 90 days of the Testbed Kick-Off. Subsequent versions can then be delivered as and when needed during the testbed.

D303 Training for the GIMI Conformance Test Tool: This will include a slide deck and an accompanying document to be used for providing training to end users. The participant is expected to conduct a training session during the second of the code sprints (see the Code Sprints Appendix for details about the code sprints). Therefore an initial version of the Training should be delivered within 240 days of the Testbed Kick-Off. Subsequent versions can then be delivered as and when needed during the testbed.

3. Deliverables Summary

The following tables summarize the full set of Initiative deliverables. Technical details can be found in section Technical Architecture.

Bidders are invited to submit proposals on all items of interest. Funding is available for all deliverables, apart from Deliverable D210.

Table 2. CFP Deliverables - Grouped by Task

Task: Data Quality for Integrity, Provenance, and Trust (DQ4IPT) (Funding: Available)

  • D101 DQ-enabled IPT Server 1

  • D102 DQ-enabled IPT Server 2

  • D103 DQ-enabled software library

  • D001 Data Quality for Integrity, Provenance and Trust Report

Task: GEOINT Imagery Media for ISR for OGC Testbed-21 (GIMI-T21) (Funding: Available)

  • D211 GIMI instance 1

  • D212 GIMI instance 2

  • D213 GeoTIFF and COG instances for GIMI benchmarking

  • D214 Cloud-based GIMI/IDO player/viewer

  • D201 Documentation of updates to libheif

  • D202 Documentation of updates to GStreamer

  • D203 Baseline OGC Testbed-21 Specification for GIMI

  • D204 Ontologies for Coverage Description Report

  • D205 Evaluation Report on the performance of GIMI data queries, interaction, download and display

  • D206 Advancement of the GIMI Standard Report

Task: Conformance Testing Tool Development (CTTG) (Funding: Available)

  • D301 The GIMI Conformance Test Tool

  • D302 Documentation for the Conformance Test Tool

  • D303 Training for the GIMI Conformance Test Tool

The following deliverable is unfunded:

  • D210 OGC API - Records instance (in-kind only)

4. Miscellaneous

OGC Testbeds are open to proposals from organizational members only. If you’re not yet a member but are interested in participating, we’d love to hear from you — learn more about membership and how to get involved here.

Call for Participation (CFP): The CFP includes a description of deliverables against which bidders may submit proposals. Some deliverables are more technical in nature, such as documents and component implementations. Others are more administrative, such as monthly reports and meeting attendance. The arrangement of deliverables on the timeline is presented in the Main Schedule.

Each proposal in response to the CFP should include the bidder’s technical solution(s), its cost-sharing request(s) for funding, and its proposed in-kind contribution(s) to the initiative. These inputs should all be entered on a per-deliverable basis, and proposal evaluations will take place on the same basis.

Once the original CFP has been published, ongoing updates and answers to questions can be tracked by monitoring the CFP Corrigenda Table and the CFP Clarifications Table. The HTML version of the CFP will be updated automatically and stored at the same URL as the original version. The PDF version will have to be re-downloaded with each revision.

Bidders may submit questions using the OGC Testbed-21 CFP Questions Form. Question submitters will remain anonymous, and answers will be regularly compiled and published in the CFP clarifications.

A Bidders Q&A Webinar will be held on the date listed in the Main Schedule. The webinar is open to the public. Questions are due on the date listed in the Main Schedule.

Participant Selection and Agreements: Following the submission deadline, OGC will evaluate received proposals, review recommendations with Sponsors, and negotiate Participation Agreement (PA) contracts, including statements of work (SOWs). Participant selection will be complete once PA contracts have been signed with all Participants.

Kickoff: The Kickoff is a meeting where Participants, guided by the Initiative Architect, will refine the Initiative architecture and settle upon specific use-cases and interface models to be used as a baseline for prototype component interoperability. Participants will be required to attend the Kickoff meeting in person, including breakout sessions, and will be expected to use these breakouts to collaborate with other Participants and confirm intended Component Interface Designs.

Regular Teleconferences: After the Kickoff, Participants will meet via weekly teleconferences.

Development of Deliverables: Development of Components, Reports, Change Requests, and other deliverables will commence during or immediately after Kickoff.

Under the Participation Agreement contracts, ALL Participants will be responsible for contributing content to the Reports, particularly regarding their component implementation experiences, findings, and future recommendations. But the Report Editor will be the primary author on the shared sections such as the Executive Summary.

More detailed deliverable descriptions appear under Types of Deliverables.

Final Summary Reports, Demonstration Event and Other Stakeholder Meetings: Participant Final Summary Reports will constitute the close of funded activity. Further development work might take place to prepare and refine assets to be shown at webinars, demonstration events, and other meetings.

Assurance of Service Availability: Participants selected to implement service components must maintain availability for a period of no less than six months after the Participant Final Summary Report milestone.

Demonstrator Maintenance: OGC staff may select a limited number of exceptional demonstrators each year for maintenance beyond the period of performance of the Testbed, enabling ongoing public availability. OGC will recognize the selected participants for their exemplary demonstrators by hosting these demonstrators and showcasing their capabilities to the public. The conditions and practical arrangements for this support will be agreed on a case-by-case basis, tailored to the needs of each selected demonstrator.

Appendix A: GIMI Specifications

The primary goal of the OGC Testbed-21 GIMI task is to enable GIMI application development by continuing open source software library implementation that builds on the progress made during OGC Testbed-20.

A.1. Resources

The following resources are relevant to this task.

A.1.1. ISO Standards

  1. ISO/IEC 14496-12, Information technology - Coding of audio-visual objects - Part 12: ISO base media file format.

  2. ISO/IEC 23008-12:2025 (Ed. 3), Information technology - High efficiency coding and media delivery in heterogeneous environments - Part 12: Image File Format.

  3. ISO/IEC 23001-17:2024, Information technology - Carriage of uncompressed video and images in ISO Base Media File Format.

  4. ISO/IEC 15444-16 Ed. 3, Information technology - JPEG 2000 image coding system - Part 16: Encapsulation of JPEG 2000 images into ISO/IEC 14496-12.

  5. ISO/IEC 15444-1:2019, Information technology - JPEG 2000 image coding system - Part 1: Core coding system.

  6. ISO/IEC 15444-15, Information technology - JPEG 2000 image coding system - Part 15: High-Throughput JPEG 2000.

A.1.2. GMSB Documents

  1. NGA.STND.0076, GEOINT Imagery Media for ISR, v1.0

  2. NGA.SIG.0045, Standard Information/Guidance (SIG) ISO Base Media File Format (ISOBMFF) Overview for NGA Applications, v1.0

Appendix B: Testbed Organization and Execution

B.1. Initiative Policies and Procedures

This initiative will be conducted within the policy framework of OGC’s Bylaws and Intellectual Property Rights Policy ("IPR Policy"), as agreed to in the OGC Membership Agreement, and in accordance with the OGC COSI Program Policies and Procedures and the OGC Principles of Conduct, the latter governing all related personal and public interactions.

Several key requirements are summarized below for ready reference:

  • Each selected Participant will agree to notify OGC staff if it is aware of any claims under any issued patents (or patent applications) which would likely impact an implementation of the specification or other work product which is the subject of the initiative. Participant need not be the inventor of such patent (or patent application) in order to provide notice, nor will Participant be held responsible for expressing a belief which turns out to be inaccurate. Specific requirements are described under the "Necessary Claims" clause of the IPR Policy.

  • Each selected Participant will agree to refrain from making any public representations that draft Report content has been endorsed by OGC before the draft report has been approved in an OGC Technical Committee (TC) vote.

  • Each selected Participant will agree to provide more detailed requirements for its assigned deliverables, and to coordinate with other initiative Participants, at the Kickoff event.

B.2. Initiative Roles

The roles generally played in any OGC COSI Program initiative include Sponsors, Bidders, Participants, Observers, and the COSI Program Team (led by OGC Staff). Explanations of the roles are provided in Tips for New Bidders.

OGC Staff for this Initiative will include an Initiative Director and an Initiative Architect. Unless otherwise stated, the Initiative Director will serve as the primary point of contact (POC) for the OGC.

The Initiative Architect will work with Participants and Sponsors to ensure that Initiative activities and deliverables are properly assigned and performed. They are responsible for scope and schedule control, and will provide timely escalation to the Initiative Director regarding any high-impact issues or risks that might arise during execution.

B.3. Types of Deliverables

All activities in this testbed will result in a Deliverable. These Deliverables generally take the form of Documents or Component Implementations.

B.3.1. Documents

Reports and Change Requests (CR) will be prepared in accordance with OGC published templates. Reports will be delivered by posting on the (members-only) OGC Pending directory when complete and the document has achieved a satisfactory level of consensus among interested participants, contributors and editors. Reports are the formal mechanism used to deliver results of OGC Initiatives to Sponsors and to the OGC Standards Program for consideration by way of Standards Working Groups and Domain Working Groups.

Tip

A common Report Template will be used as the starting point for each document. Various template files will contain requirements such as the following (from the 00-executive_summary.adoc file):

This Executive Summary, including the Overview, Future Outlook and Value proposition, is a mandatory section for all deliverable Reports. The Executive Summary is a high-level overview of the document and should be written in a way that is accessible to a non-technical audience.

Ideas for meeting this particular requirement can be found in the CFP Background as well as in previous Report content such as the business case in the SELFIE Executive Summary.

Document content should follow this OGC Document Editorial Guidance (scroll down to view PDF file content). File names for documents posted to Pending should follow this pattern (replacing the document name and deliverable ID): OGC Testbed-21: Ontologies for Coverage Description Report (D204).

B.3.2. Component Implementations

Component Implementations include services, clients, datasets, and tools. A service component is typically delivered by deploying an endpoint via an accessible URL. A client component typically exercises a service interface to demonstrate interoperability. Implementations should be developed and deployed in all threads for integration testing in support of the technical architecture.

Important

Under the Participation Agreement contracts, ALL Participants will be responsible for contributing content to the Reports, particularly regarding their component implementation experiences, findings, and future recommendations. But the Report Editor will be the primary author on the shared sections such as the Executive Summary.

Component implementations are often used as part of outreach demonstrations near the end of the timeline. To support these demos, component implementations are required to include Demo Assets. For clients, the most common approach to meet this requirement is to create a video recording of a user interaction with the client. These video recordings may optionally be included in a new YouTube Playlist such as this one for Testbed-15.

Tip

Videos to be included in the new YouTube Playlist should follow these instructions:

  • Upload the video recording to the designated OGC file system (to be provided), and

  • Include the following metadata in the Description field of the upload dialog box:

    • A Title that starts with "OGC Testbed-21:", keeping in mind that there is a 100-character limit [if no title is provided, we’ll insert the file name],

    • Abstract: [1-2 sentence high-level description of the content],

    • Author(s): [organization and/or individuals], and

    • Keywords: [for example, OGC, machine learning, OGC Testbed-21, analysis ready data, etc.].
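Taken together, a completed Description field might look like the following. All values shown are hypothetical placeholders, not required wording:

```
Title: OGC Testbed-21: GIMI Viewer Demonstration
Abstract: Short screen recording showing a user browsing and displaying GIMI imagery in the prototype viewer.
Author(s): Example Organization
Keywords: OGC, OGC Testbed-21, GIMI, imagery
```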

Since server components often do not have end-user interfaces, participants may instead support outreach by delivering static UML diagrams, wiring diagrams, screenshots, etc. In many cases, the images created for a Report will be sufficient as long as they are suitable for showing in outreach activities such as Member Meetings and public presentations. A server implementer may still choose to create a video recording to feature their organization more prominently in the new YouTube playlist. Another reason to record a video might be to show interactions with a "developer user" (since these interactions might not appear in a client recording for an "end user").

Tip

Demo-asset deliverables are slightly different from TIE testing deliverables. The latter don’t necessarily need to be recorded (though they often appear in a recording if the TIE testing is demonstrated as part of one of the recorded weekly telecons).

B.4. Proposal Evaluation

Proposals are expected to be brief, broken down by deliverable and precisely addressing the work items of interest to the bidder. Details of the proposal submission process are provided under the General Proposal Submission Guidelines.

Proposals will be evaluated based on criteria in two areas: technical and management/cost.

B.4.1. Technical Evaluation Criteria

  • Concise description of each proposed solution and how it contributes to achievement of the particular deliverable requirements described in the Technical Architecture,

  • Overall quality and suitability of each proposed solution, and

  • Where applicable, whether the proposed solution is OGC-compliant.

B.4.2. Management/Cost Evaluation Criteria

  • Willingness to share information and work in a collaborative environment,

  • Contribution toward Sponsor goals of enhancing availability of standards-based offerings in the marketplace,

  • Feasibility of each proposed solution using proposed resources, and

  • Proposed in-kind contribution in relation to proposed cost-share funding request.

Note that all Participants are required to provide at least some level of in-kind contribution (costs for which no cost-share compensation has been requested). As a rough guideline, a proposal should include at least one dollar of in-kind contribution for every dollar of cost-share compensation requested. All else being equal, higher levels of in-kind contributions will be considered more favorably during evaluation. Participation may also take place by purely in-kind contributions (no cost-share request at all).

Once the proposals have been evaluated and cost-share funding decisions have been made, OGC Staff will begin notifying Bidders of their selection to enter negotiations to become an initiative Participant. Each selected Bidder will enter into a Participation Agreement (PA), which will include a Statement of Work (SOW) describing the assigned deliverables.

B.5. Reporting

Participants will be required to report the progress and status of their work; details will be provided during contract negotiation. Additional administrative details such as invoicing procedures will also be included in the contract.

B.5.1. Monthly Reporting

OGC Staff will provide monthly progress reports to Sponsors. Ad hoc notifications may also occasionally be provided for urgent matters. To support this reporting, each testbed participant must submit (1) a Monthly Technical Report and (2) a Monthly Business Report by the first working day on or after the 3rd of each month. Templates and instructions for both of these report types will be provided.

The purpose of the Monthly Business Report is to provide initiative management with a quick indicator of project health from each participant’s perspective. OGC Staff will review action item status on a weekly basis with assigned participants. Initiative participants must remain available for the duration of the timeline so these contacts can be made.

B.5.2. Participant Final Summary Reports

Each Participant should submit a Final Summary Report by the milestone indicated in the Main Schedule. These reports should include the following information:

  1. Briefly summarize Participant’s overall contribution to the testbed (for an executive audience),

  2. Describe, in detail, the work completed to fulfill the Participation Agreement Statement of Work (SOW) items (for a more technical audience), and

  3. Present recommendations on how we can better manage future OGC Initiatives.

This report may be in the form of email text or a more formal attachment (at the Participant’s discretion).

B.6. Code Sprints

As part of OGC Testbed-21, OGC will hold two hybrid OGC Code Sprints to help accelerate the development of Testbed deliverables. An OGC Code Sprint is a collaborative developer workshop focused on advancing geospatial standards and their implementations through coding, testing, and documentation. Hybrid code sprints offer both in-person and remote modes of participation.

At a minimum, all bidders selected to participate in the Testbed are expected to participate remotely in the two hybrid OGC Code Sprints. During the course of the Testbed, Testbed participants will have an opportunity to apply for Travel Support funding to participate in the Code Sprints in person.

Each Code Sprint will be a three-day event. The first Code Sprint will be held within 120 days of the Testbed Kick-Off. The second Code Sprint will be held within 240 days of the Testbed Kick-Off.

Appendix C: Proposal Submission

C.1. General Proposal Submission Guidelines

This section presents general guidelines for submitting a CFP proposal. Detailed instructions for submitting a response proposal using the Bid Submission Form web page can be found in the Step-by-Step Instructions below.

Important

Please note that the content of the "Proposed Contribution" text box in the Bid Submission Form will be accessible to all Stakeholders and should contain no confidential information such as labor rates.

Similarly, no sensitive information should be included in the Attached Document of Explanation.

Proposals must be submitted before the deadline indicated in the Main Schedule.

Bidders responding to this CFP must be organizational OGC members familiar with the OGC mission, organization, and process.

OGC Testbeds are open to proposals from organizational members only. If you’re not yet a member but are interested in participating, we’d love to hear from you — learn more about membership and how to get involved here.

Information submitted in response to this CFP will be accessible to OGC and Sponsor staff members. This information will remain in the control of these stakeholders and will not be used for other purposes without prior written consent of the Bidder. Once a Bidder has agreed to become a Participant, they will be required to release proposal content (excluding financial information) to all initiative stakeholders. Sensitive information other than labor-hour and cost-share estimates should not be submitted.

Bidders will be selected for cost share funds on the basis of adherence to the CFP requirements and the overall proposal quality. The general testbed objective is to inform future OGC standards development with findings and recommendations surrounding potential new specifications. Each proposed deliverable should formulate a path for (1) producing executable interoperable prototype implementations meeting the stated CFP requirements and (2) documenting the associated findings and recommendations. Bidders not selected for cost share funds may still request to participate on a purely in-kind basis.

Bidders should avoid attempts to use the initiative as a platform for introducing new requirements not included in the Technical Architecture. Any additional in-kind scope should be offered outside the formal bidding process, where an independent determination can be made as to whether it should be included in initiative scope or not. Out-of-scope items could potentially be included in another OGC COSI initiative.

Each selected Participant (even one not requesting any funding) will be required to enter into a Participation Agreement contract ("PA") with the OGC. The reason this requirement applies to purely in-kind Participants is that other Participants will likely be relying upon their delivery. Each PA will include a Statement of Work ("SOW") identifying specific Participant roles and responsibilities.

C.2. Questions and Clarifications

Once the original CFP has been published, ongoing updates and answers to questions can be tracked by monitoring the CFP Corrigenda Table and the CFP Clarifications Table.

Bidders may submit questions using the OGC Testbed-21 CFP Questions Form. Question submitters will remain anonymous, and answers will be regularly compiled and published in the CFP clarifications.

A Bidders Q&A Webinar will be held on the date listed in the Main Schedule. The webinar is open to the public. Questions are due on the date listed in the Main Schedule. The recording of the webinar is here.

C.3. Proposal Submission Procedures

The process for a Bidder to complete a proposal is essentially embodied in the online Bid Submission Form. Once this site is fully prepared to receive submissions (soon after the CFP release), it will include a series of web forms, one for each deliverable of interest. A summary is provided here for the reader’s convenience.

For any individual who has not used this form in the past, a new account will need to be created first. The user will be taken to a home page indicating the "Status of Your Proposal." If any defects in the form are discovered, this page includes a link for notifying OGC. The user can return to this page at any time by clicking the OGC logo in the upper left corner.

Any submitted bids will be treated as earnest submissions, even those submitted well before the response deadline. Be certain that you intend to submit your proposal before you click the Submit button on the Review page.

Important

Because the Bid Submission Form is still relatively new, it might contain some areas that are still brittle or in need of repair. Please notify OGC of any discovered defects. Periodic updates will be provided as needed.

Please consider making local backup copies of all inputs in case any need to be re-entered.

C.3.1. High-Level Overview

Clicking on the Propose link will navigate to the Bid Submission Form. The first time through, the user should provide organizational information on the Organizational Background Page and click Update and Continue.

The following screenshot shows the Organizational Background page:

organizational background page
Figure 1. Sample Organizational Background Page

Clicking Update and Continue will navigate to an "Add Deliverable" page that will resemble the following:

proposal submission form AddDeliverable
Figure 2. Sample "Add Deliverables" Page

The user should complete this form for each proposed deliverable.

Tip

For component implementations having multiple identical instances of the same deliverable, the bidder needs to propose only one instance. For simplicity, each bidder should submit against the lowest-numbered deliverable ID. OGC will assign a unique deliverable ID to each selected Participant later (during negotiations).

On the far right, the Review link navigates to a page summarizing all the deliverables the Bidder is proposing. This Review tab won’t appear until the user has actually submitted at least one deliverable under the Propose tab first.

Tip

Consider regularly creating printed output copies of this Review page at various points during proposal creation.

Once the Submit button is clicked, the user will receive an immediate confirmation on the website that their proposal has been received. The system will also send an email to the bidder and to OGC staff.

Tip

In general, up until the time that the user clicks this Submit button, the proposal may be edited as many times as the user wishes. However, this initial version of the form contains no "undo" capability, so please use caution in over-writing existing information.

The user is afforded an opportunity under Done Adding Deliverables at the bottom of this page to attach an optional Attached Document of Explanation.

proposal submission form attached doc
Figure 3. Sample Dialog for an "Attached Document of Explanation"
Important

No sensitive information (such as labor rates) should be included in the Attached Document of Explanation.

If this attachment is provided, it is limited to one per proposal and must be smaller than 5 MB.

This document could conceivably contain any specialized information that wasn’t suitable for entry into a Proposed Contribution field under an individual deliverable. It should be noted, however, that this additional documentation will only be read on a best-effort basis. There is no guarantee it will be used during evaluation to make selection decisions; rather, it could optionally be examined if the evaluation team feels that it might help in understanding any specialized (and particularly promising) contributions.

C.3.2. Step-by-Step Instructions

The Propose link takes the user to the first page of the proposal entry form. This form contains fields to be completed once per proposal such as names and contact information.

It also contains an optional Organizational Background field where Bidders (particularly those with no experience participating in an OGC initiative) may provide a description of their organization. It also contains a click-through check box where each Bidder will be required (before entering any data for individual deliverables) to acknowledge its understanding and acceptance of the requirements described in this appendix.

Clicking the Update and Continue button then navigates to the form for submitting deliverable-by-deliverable bids. On this page, existing deliverable bids can be modified or deleted by clicking the appropriate icon next to the deliverable name. Any attempt to delete a proposed deliverable will require scrolling down to click a Confirm Deletion button.

To add a new deliverable, the user would scroll down to the Add Deliverable section and click the Deliverable drop-down list to select the particular item.

The user would then enter the required information for each of the following fields (for this deliverable only). Required fields are indicated by an asterisk ("*"):

  • Estimated Projected Labor Hours* for this deliverable,

  • Funding Request*: total U.S. dollar cost-share amount being requested for this deliverable (to cover burdened labor only),

  • Estimated In-kind Labor Hours* to be contributed for this deliverable, and

  • Estimated In-Kind Contribution: total U.S. dollar estimate of the in-kind amount to be contributed for this deliverable (including all cost categories).

Tip

There’s no separate text box to enter a global in-kind contribution. Instead, please provide an approximate estimate on a per-deliverable basis.

Cost-sharing funds may only be used for the purpose of offsetting burdened labor costs of development, engineering, documentation, and demonstration related to the Participant’s assigned deliverables. By contrast, the costs used to formulate the Bidder’s in-kind contribution may be much broader, including supporting labor, travel, software licenses, data, IT infrastructure, and so on.

Theoretically there is no limit on the size of the Proposed Contribution for each deliverable (beyond the raw capacity of the underlying hardware and software). But bidders are encouraged to incorporate content by reference where possible (rather than inline copying and pasting) to avoid overloading the amount of material to be read in each proposal. There is also a textbox on a separate page of the submission form for inclusion of Organizational Background information, so there is no need to repeat this information for each deliverable.

Important

A breakdown (by cost category) of the "Inkind Contribution" may be included in the Proposed Contribution text box for each deliverable.

However, please note that the content of this text box will be accessible to all Stakeholders and should contain no confidential information such as labor rates.

Similarly, no sensitive information should be included in the Attached Document of Explanation.

The Proposed Contribution field ("Please include any proposed datasets") should also be used to provide a succinct description of what the Bidder intends to deliver for this work item to meet the requirements expressed in the Technical Architecture. This description could include a brief elaboration on how the proposed deliverable will contribute to advancing the OGC standards baseline, or how implementations enabled by the specification embodied in this deliverable could add specific value to end-user experiences.

A Bidder proposing to deliver a Service Component Implementation can also use this field to identify what suitable datasets would be contributed (or what data should be acquired from another identified source) to support the proposed service.

Tip

In general, please try to limit the length of each Proposed Contribution to about one text page per deliverable.

Note that images cannot be pasted into the Proposed Contribution textbox. Bidders should instead provide a link to a publicly available image.

A single bid may propose deliverables arising from any number of threads or tasks. To ensure that the full set of sponsored deliverables are made, OGC might negotiate with individual Bidders to drop and/or add selected deliverables from their proposals.

C.4. Tips for New Bidders

Bidders who are new to OGC initiatives are encouraged to review the following tips:

  • In general, the term "activity" describes work to be performed in an initiative, and the term "deliverable" describes artifacts to be developed and delivered for inspection and use.

  • The roles generally played in any OGC COSI Program initiative are defined in the OGC COSI Program Policies and Procedures, from which the following definitions are derived and extended:

    • Sponsors are OGC member organizations that contribute financial resources to steer Initiative requirements toward rapid development and delivery of proven candidate specifications to the OGC Standards Program. These requirements take the form of the deliverables described herein. Sponsor representatives help serve as "customers" during Initiative execution, helping ensure that requirements are being addressed and broader OGC interests are being served.

    • Bidders are organizations who submit proposals in response to this CFP. A Bidder selected to participate will become a Participant through the execution of a Participation Agreement contract with OGC. Most Bidders are expected to propose a combination of cost-sharing request and in-kind contribution (though solely in-kind contributions are also welcomed).

    • Participants are selected OGC member organizations that generate empirical information through the definition of interfaces, implementation of prototype components, and documentation of all related findings and recommendations in Reports, Change Requests and other artifacts. They might be receiving cost-share funding, but they can also make purely in-kind contributions. Participants assign business and technical representatives to represent their interests throughout Initiative execution.

    • Observers are individuals from OGC member organizations that have agreed to OGC intellectual property requirements in exchange for the privilege to access Initiative communications and intermediate work products. They may contribute recommendations and comments, but OGC Staff have the authority to table any of these contributions if there’s a risk of interfering with any primary Initiative activities.

    • Supporters are OGC member organizations who make in-kind contributions aside from the technical deliverables. For example, a member could donate the use of their facility for the Kickoff event.

    • OGC Staff is the team of OGC employees tasked with managing, overseeing, and coordinating the Initiative.

    • The COSI Team is comprised of OGC staff, representatives from member organizations, and OGC consultants. The COSI Team communicates with Participants and other stakeholders during Initiative execution, provides Initiative scope and schedule control, and assists stakeholders in understanding OGC policies and procedures.

    • The term Stakeholders is a generic label that encompasses all Initiative actors, including representatives of Sponsors, Participants, and Observers, as well as OGC Staff.

    • Suppliers are organizations (not necessarily OGC members) who have offered to supply specialized resources such as cloud credits. OGC's role is to assist in identifying an initial alignment of interests and performing introductions of potential consumers to these suppliers. Subsequent discussions would then take place directly between the parties.

  • OGC Testbeds are open to proposals from organizational members only. If you’re not yet a member but are interested in participating, we’d love to hear from you — learn more about membership and how to get involved here.

  • Any individual wishing to gain access to the Initiative’s intermediate work products in the restricted area of the OGC file system (or attend private working meetings / telecons) must be a member-approved user of the OGC file system.

  • Individuals from any OGC member organization that does not become an initiative Sponsor or Participant may still (as a benefit of membership) observe activities by registering as an Observer.

  • Prior initiative participation is not a direct bid evaluation criterion. However, prior participation could accelerate and deepen a Bidder’s understanding of the information presented in the CFP.

  • All else being equal, preference will be given to proposals that include a larger proportion of in-kind contribution.

  • All else being equal, preference will be given to proposed components that are certified OGC-compliant.

  • All else being equal, a proposal addressing all of a deliverable’s requirements will be favored over one addressing only a subset. Each Bidder is, of course, at liberty to shape its own proposal. A Bidder that chooses to propose only a subset of any particular deliverable should, however, prominently and unambiguously state precisely which of the deliverable's requirements are being proposed.

  • The Sponsor(s) will be given an opportunity to review selection results and offer advice, but ultimately the Participation Agreement (PA) contracts will be formed bilaterally between OGC and each Participant organization. No multilateral contracts will be formed. Beyond this, there are no restrictions regarding how a Participant chooses to accomplish its deliverable obligations so long as these obligations are met in a timely manner (whether a 3rd-party subcontractor provides assistance is up to the Participant).

  • In general, only one organization will be selected to receive cost-share funding per deliverable, and that organization will become the Assigned Participant upon which other Participants will rely for delivery. Optional in-kind contributions may be made provided that they don’t disrupt delivery of required, reliable contributions from the assigned Participants.

  • A Bidder may propose against any or all deliverables. Participants in past initiatives have often been assigned to make only a single deliverable. On the other hand, several Participants in prior initiatives were selected to make multiple deliverables.

  • In general, the Participant Agreements will not require delivery of any component source code to OGC.

    • What is delivered to OGC is the behavior of the component installed on the Participant’s machine, and the corresponding documentation of findings, recommendations, and technical artifacts contributed to Report(s).

    • In some instances, a Sponsor might expressly require a component to be developed under open-source licensing, in which case the source code would become publicly accessible outside the Initiative as a by-product of implementation.

  • Results of other recent OGC initiatives can be found in the OGC Public Report Repository.

Appendix D: Abbreviations

The following table lists all abbreviations used in this CFP.

ARA      Agile Reference Architecture

CFP      Call for Participation

COG      Cloud Optimized GeoTIFF

COSI     Collaborative Solutions and Innovation Program

DR       Draft Report

DWG      Domain Working Group

E2E      End-to-End

ECMWF    European Centre for Medium-Range Weather Forecasts

GDC      GeoDataCube

GIMI     GEOINT Imagery Media for ISR

HEIF     High Efficiency Image File Format

I-GUIDE  Institute for Geospatial Understanding through an Integrative Discovery Environment

IPT      Integrity, Provenance, and Trust

ISO/IEC  International Organization for Standardization / International Electrotechnical Commission

ISOBMFF  ISO Base Media File Format

ISR      Intelligence, Surveillance, and Reconnaissance

MARS     Meteorological Archival and Retrieval System

NGIIS    Next Generation ISR Imagery Standards

NITF     National Imagery Transmission Format

NWP      Numerical Weather Prediction

OGC      Open Geospatial Consortium

PA       Participation Agreement

POC      Point of Contact

Q&A      Questions and Answers

SOW      Statement of Work

SWG      Standards Working Group

TBD      To Be Determined

TC       OGC Technical Committee

TEM      Technical Evaluation Meeting

TIE      Technology Integration / Technical Interoperability Experiment

URL      Uniform Resource Locator

WG       Working Group (SWG or DWG)

Appendix E: Corrigenda & Clarifications

E.1. Corrigenda Table

The following table identifies all corrections that have been applied to this CFP compared to the original release. Minor editorial changes (spelling, grammar, etc.) are not included.

Table 3. Corrigenda Table
Section  Description                                                               Date of Change

4        Removed requirement to register for the bidders' webinar                  2025-08-07

C.2      Removed requirement to register for the bidders' webinar                  2025-08-07

1.3      Added a link to the recording of the bidders' webinar                     2025-08-08

C.2      Added a link to the recording of the bidders' webinar                     2025-08-08

1.3      Clarified that the Kick Off meeting is a two-day event                    2025-08-11

2.1.4    Updated link to July 2024 Open Standards Code Sprint Engineering Report   2025-08-16

E.2      Updated the Clarifications Table                                          2025-09-05

E.2. Clarifications Table

The following table identifies all clarifications that have been provided in response to questions received from organizations interested in this CFP.

Please use this convenience link to navigate to the end of the table.

Table 4. Clarifications Table
Question Clarification

Q1: GIMI-T21 has a report requirement "D203 Baseline OGC Testbed-21 Specification for GIMI", which is described as documenting the baseline specification of GIMI for this Testbed. Given that the baseline specification of GIMI is NGA.STND.0076, can OGC provide more detail on this report, and which task 2 components it is expected to relate to?

A: Yes, indeed, the baseline for the GIMI specification is NGA.STND.0076-01_V1.0.0_GIM. NGA.STND.0076 is intended to be the starting point for a GIMI specification that could be adopted by the OGC Membership as an OGC Standard in the future. This means phrasing GIMI requirements and associated abstract tests in a way that can be documented in OGC Standards. For example, this involves assigning unique identifiers to requirements, grouping them into requirements classes, and then describing the abstract tests and grouping those into conformance classes.
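As an illustration of the structure this implies, the OGC Modular Specification convention assigns URI-style identifiers to requirements and their tests. The identifiers below are a hypothetical sketch, not actual GIMI requirements:

```
Requirement:         http://www.opengis.net/spec/gimi/1.0/req/core/file-signature
Requirements class:  http://www.opengis.net/spec/gimi/1.0/req/core
Abstract test:       http://www.opengis.net/spec/gimi/1.0/conf/core/file-signature
Conformance class:   http://www.opengis.net/spec/gimi/1.0/conf/core
```

Each abstract test in a conformance class verifies one requirement from the corresponding requirements class, which is what allows conformance to be claimed and tested per class.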

Q2: GIMI-T21 requires production of TriG formatted RDF conformant to the NGA IDO, which is an emerging activity. Is that ontology expected to be available by the kickoff meeting? If not, is there an estimate for where in the schedule it will be available?

A: Yes, the ontology will be ready by the kick-off meeting.
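For readers unfamiliar with the format, TriG serializes an RDF dataset as a set of named graphs, which is well suited to keeping provenance statements grouped per graph. A minimal sketch follows; the prefix and terms are placeholders, not the actual NGA IDO vocabulary:

```trig
@prefix ex: <http://example.org/ido#> .

ex:provenanceGraph {
    ex:image001 ex:capturedBy  ex:sensorA ;
                ex:captureTime "2025-10-07T12:00:00Z" .
}
```

Conformance to the NGA IDO would then be a matter of using that ontology's classes and properties in place of the placeholder terms shown here.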

Q3: Where are the code sprints likely to be held? While the travel costs (if approved) are outside of the proposal cost, the time zone and potential travel time / recovery time for in-person attendance is a planning and costing consideration.

A: The code sprints are likely to be held either in St Louis, Missouri (at NGA Moonshot Labs) or in London, England (at Ordnance Survey’s Geovation Hub). If neither of those two venues is available, then a possible third venue would be in the Washington DC area.

Q4: Could the kickoff meeting be made virtual? The kickoff meeting for Testbed 20 ran for one day as a virtual session, and seemed relatively successful. At this stage, it's very close to the final participant selection (less than two weeks, given travel time), and travel time, recovery time, and international flights / late-notice hotel bookings represent a significant cost element in the proposal.

A: Thank you for the suggestion. While we recognize that the virtual format used for the Testbed 20 kickoff was effective, for this particular kickoff meeting, we’ve determined that an in-person format remains essential. Given the nature of the discussions and the collaborative planning required at this stage, the benefits of face-to-face engagement outweigh the logistical and cost considerations. We understand the challenges related to travel time, recovery, and short-notice arrangements, and we’re doing our best to support participants through this process. We appreciate your understanding and flexibility as we move forward.

Q: Is the Kick Off meeting a single-day event?

A: No, the Kick Off meeting will be a two-day event from 2025-10-07 to 2025-10-08 in the Washington, DC area.

Q: On the Data Quality for IPT thread, it mainly lists servers and there are no client applications mentioned. Could you please confirm if there are no client applications expected in this thread?

A: There is no specific client application identified as a deliverable in the DQ4IPT thread; however, there is a DQ-enabled software library (deliverable D103) that would access the DQ-enabled IPT Servers (deliverables D101 and D102). There is no restriction on whether the specified deliverables may have a Graphical User Interface (GUI), so it is up to the participants to decide whether or not to implement one for those components.

Q: Are the code sprints mentioned the common ones that OGC normally organizes, like the OGC API Code Sprints or will they be independent code sprints?

A: We are planning to ensure that the code sprints are open to the general public. They will be organized in the same way as the OGC API Code Sprints.

Q: In some OGC Code Sprints, we used to develop a story to connect the different threads. Do you see any value in developing a story to integrate the different threads?

A: We intend to ensure that the testbed participants work across the threads as much as possible. So we will encourage knowledge sharing across the testbed. However, the threads are distinct, so they will have different obligations.


E.3. End of Clarifications Table (convenience link)


References

[1] Bugbee, K., Roux, J. le, Sisco, A., Kaulfus, A., Staton, P., Woods, C., Dixon, V., Lynnes, C., Ramachandran, R.: Improving discovery and use of NASA’s earth observation data through metadata quality assessments. Data Science Journal. 20, 17–17 (2021).

[2] Churchyard, P.: Testbed-20: Integrity, Provenance, and Trust (IPT) Report. OGC 24-033. Open Geospatial Consortium, available from https://docs.ogc.org/per/24-033.html.

[3] Masó, J., Zabala, A.: OGC Testbed-12 Imagery Quality and Accuracy Engineering Report. OGC 16-050. Open Geospatial Consortium, available from https://docs.ogc.org/per/16-050.html.

[4] Balaban, A.: OGC Testbed-13: Quality Assessment Service Engineering Report. OGC 17-025r2. Open Geospatial Consortium, available from https://docs.ogc.org/per/17-025r2.html.

[5] NGA: GEOINT Imagery Media for Intelligence, Surveillance, and Reconnaissance (ISR) (GIMI) Volume 1 Profile of ISOBMFF & HEIF (Version 1.0.0). NGA.STND.0076-01_V1.0.0_GIM. National Geospatial-Intelligence Agency, available from https://portal.ogc.org/files/?artifact_id=111555.

[6] Bermudez, L.: OGC Testbed 17: CITE Engineering Report. OGC 21-044. Open Geospatial Consortium, available from https://docs.ogc.org/per/21-044.html.