Posted to geospatial@apache.org by Martin Desruisseaux <ma...@geomatys.com> on 2016/09/26 16:32:00 UTC

Report from September 2016 OGC meeting

Hello all

Here is a report on some items that caught my attention at the Open
Geospatial Consortium (OGC) meeting that happened last week in Orlando.
As usual this is only a small subset of all the discussions that
happened, and reflects only my understanding. The content of this email is:

  * Open Architecture Board (OAB)
      o Open API
      o Web Integration Service
  * Coordinate Reference System (CRS)
  * JSON encoding of Moving Features
  * Aviation (WFS update proposition)
  * Meteorology and oceanography
  * Imagery, elevation and metadata standard for defence
  * Citizen science
  * Quality of services
  * Data formats
      o GeoTIFF
      o NetCDF and HDF5
      o GRIB 2

The full agenda of the OGC meeting is available at
http://www.opengeospatial.org/events/1609tcagenda


    Open Architecture Board

The role of the Open Architecture Board (OAB) is to address issues that
impact many OGC standards. The following aspects (among others) were
discussed:


      Open API

The OAB presented their latest draft of the API white paper. A motion
passed for making this draft public for comment; I will keep this list
informed when that happens. The reason for this paper is that the recent
proliferation of APIs (e.g. in the JavaScript world) has degraded the open
interoperability previously established by open standards. OGC is looking
for a way to address this issue, possibly by converging on a small set
of "essential" attributes usable across otherwise disparate APIs. OGC is
considering using the OpenAPI Specification (OAS), formerly known as
Swagger, for managing API specifications.

This topic is of particular importance to Apache SIS, which depends
strongly on GeoAPI, which itself depends on the OGC approach to APIs.


      Web Integration Service

The OGC Web Integration Service proposes a way for clients to
automatically discover which OGC web services are available at an
endpoint. The intent is not only to list which services are available on
a server (WMS, WCS, /etc./), but also to describe their relationships.
For example a Web Coverage Service (WCS) provides raster data while a Web
Map Service (WMS) provides a visual representation of data. A Web
Integration Service could say "Layer /XYZ/ served by that WMS is the
visual representation of raster /ABC/ served by that WCS". This kind of
information is useful for catalogues.
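Purely as an illustration of the idea (this sketch is my invention, not
taken from the draft specification), such a relationship could be
expressed along these lines:

```xml
<!-- Hypothetical sketch, not from the Web Integration Service draft:
     lists two services at an endpoint and states that a WMS layer is
     the visual representation of a WCS coverage. -->
<ServiceSet>
  <Service type="WCS" href="http://example.com/wcs"/>
  <Service type="WMS" href="http://example.com/wms"/>
  <Relationship type="visualizationOf"
                source="wms:XYZ" target="wcs:ABC"/>
</ServiceSet>
```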


    Coordinate Reference System (CRS)

The Coordinate Reference System (CRS) working group had a presentation
about 3- or 4-dimensional CRS with height measured as atmospheric or
oceanographic pressure. The issue is what to do when a client
application does not know how to handle such kinds of heights. For example
a client application may want heights in metres, but the formula for
transforming pressure into elevation depends on the data (e.g.
oceanographers use different formulas for the open ocean than for
the Black Sea; meteorologists may have transformation parameters that
change every day). So a server that publishes data relative to a
specialized vertical CRS may need a way to inform the client about the
coordinate transformation process from those heights to elevations in
metres. Potential changes to the ISO 19111 and 19162 standards needed for
addressing this use case were mentioned.
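As an illustration of why such transformations are data-dependent, here
is a minimal sketch of one pressure-to-height formula, the International
Standard Atmosphere (ISA) barometric formula used in meteorology. The
constants below are the ISA standard values; as noted above, real
datasets may need entirely different formulas or daily parameters.

```java
// Sketch: convert atmospheric pressure (Pa) to an approximate height (m)
// using the International Standard Atmosphere model. Illustration only:
// the appropriate formula and parameters depend on the dataset.
public class BarometricHeight {
    static final double T0 = 288.15;      // sea-level standard temperature (K)
    static final double L  = 0.0065;      // temperature lapse rate (K/m)
    static final double P0 = 101325.0;    // sea-level standard pressure (Pa)
    static final double G  = 9.80665;     // gravitational acceleration (m/s²)
    static final double M  = 0.0289644;   // molar mass of dry air (kg/mol)
    static final double R  = 8.31447;     // universal gas constant (J/(mol·K))

    /** Approximate geopotential height for the given pressure, in metres. */
    public static double pressureToHeight(double pressure) {
        return (T0 / L) * (1.0 - Math.pow(pressure / P0, (R * L) / (G * M)));
    }

    public static void main(String[] args) {
        System.out.println(pressureToHeight(101325.0));  // sea level: 0 m
        System.out.println(pressureToHeight(89875.0));   // roughly 1000 m
    }
}
```

A server using a different vertical CRS (e.g. ocean pressure in the Black
Sea) would have to communicate a different operation to the client, which
is exactly the problem discussed by the working group.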

Apache SIS is very close to supporting such a use case. I think it may be
the most advanced open source library in that respect, given its Well
Known Text (WKT) 2 support, multi-dimensional CRS and coordinate
operation chains.

Other subjects of discussion were: GeodeticCRS subtypes (e.g.
GeographicCRS and GeocentricCRS - names may vary); changes needed in the
definition of datum epoch in order to support dynamic datums (for taking
into account tectonic plate movement); "ensembles" or "collections" of CRS
or datums for those who do not care about dynamic datums.


    JSON encoding of Moving Features

Moving Features JSON encoding (MF-JSON) is based on IETF GeoJSON with
the addition of temporal geometric objects (whose location changes over
time) and dynamic non-spatial attributes (whose values vary with
time). The format uses GeoJSON "foreign members" for achieving this
goal. A REST API is defined for accessing those Moving Features objects.
Examples of API usage are available there:

    https://ksookim.github.io/mf-access/

If I understand correctly, we can expect a MF-JSON specification open
for public comment soon.


    Aviation (WFS update proposition)

The aviation working group finds limitations in the current Web Feature
Service (WFS) 2.0. Time is treated like every other property; plain WFS
2.0 only allows retrieving the full history of every element on the
map, not its status at a given time. The aviation working group wants
dynamic queries with a "time of evaluation" parameter. A "WFS
temporality extension" discussion paper is in preparation.
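For illustration, such a query could look like the hypothetical KVP
request below. The TIMEOFEVALUATION parameter name is my invention; the
discussion paper may choose a different form:

```
http://example.com/wfs?SERVICE=WFS&VERSION=2.0.0&REQUEST=GetFeature
    &TYPENAMES=aixm:Airspace&TIMEOFEVALUATION=2016-09-26T16:00:00Z
```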


    Meteorology and oceanography

The "MetOcean" profile is an extension of the OGC Web Coverage Service (WCS)
standard, together with recommendations about how to use that service in the
meteorological and oceanographic communities. This profile is derived
from the "Earth Observation Metadata" profile and the "Observations &
Measurements" model (ISO 19156), among others. Some of the main drivers for the
MetOcean profile are dimensionality (data are inherently 4D),
interoperability (describing data with community-controlled vocabularies, in
particular from the World Meteorological Organization (WMO) [1]) and
sub-setting of large datasets on the fly. In particular there is a need
to get four-dimensional data with a single identifier instead of
thousands of requests for 2D slices.

Some MetOcean concepts are applied directly to the core of the Web
Coverage Service (WCS) 2.1 standard. For example the concept of coverage
collections (e.g. many climate simulations, or a mosaic of radar images)
is included in WCS 2.1.

Inside a coverage, phenomena like temperature and salinity are no
longer considered as coverages of their own, but rather as "elements"
(ISO 19115 "attribute" in the current Apache SIS implementation) of a
"multi-band" coverage. Those elements are described in the metadata
associated with the coverage. The "MetOcean" profile also provides a
mechanism (cis:irregular) for handling non-regular axes, for example a
coverage where data are available at depths of 3, 10, 15, 50 and 100 metres.
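An irregular depth axis of that kind could be encoded along the lines of
the sketch below. The element and attribute names follow my reading of
the Coverage Implementation Schema drafts and may differ in the final
profile:

```xml
<!-- Hypothetical sketch of an irregular vertical axis: explicit
     coordinates instead of an origin and a regular step. -->
<cis:IrregularAxis axisLabel="depth" uomLabel="m">
  <cis:C>3</cis:C>
  <cis:C>10</cis:C>
  <cis:C>15</cis:C>
  <cis:C>50</cis:C>
  <cis:C>100</cis:C>
</cis:IrregularAxis>
```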

Future evolutions of the "MetOcean" profile may introduce a mechanism
allowing clients to get data in a corridor. The main use case is getting
meteorological conditions along a flight trajectory. The group also
proposes to add GRIB 2 (a compact WMO table-driven streaming format for the
exchange of meteorological data) to the list of formats used for data
exchange through WCS.

[1] http://codes.wmo.int/grib2/codeflag/4.5


    Imagery, elevation and metadata standard for defence

This group provides profiles for some commonly used formats. Those
profiles add restrictions in order to improve interoperability, while
still keeping the functionality needed for defence purposes. For
example the group uses the GeoTIFF format with restrictions on the way
to encode the transparency mask, internal tiling, multi-band data (4 to
8 bands), etc. Formats used by the group include:

  * GeoTIFF: with support for multi-spectral imagery and improvements to
    the vertical reference system based on EPSG codes. A major issue
    reported by the group is that the GeoTIFF standard references
    outdated EPSG codes.
  * JPEG 2000 + GML annotations: uses a simplified version of GML. A
    limitation is that the current profile cannot handle referenceable
    coverages or sensor images. An extension proposal addressing those
    needs is under submission at OGC.

The following future geospatial trends were considered of high
importance: text analytics, point clouds, big data technologies, agile
processing chains, image processing, ontologies and semantics, human
geography, the Internet of Things, uncertainty, and data science.


    Citizen science

The purpose of this new working group is to explore issues that need to
be resolved for using data provided by citizens in scientific research.
Some issues are: which vocabularies to use, what data quality assurance
processes look like and how they can be documented, how citizen science
data can be made persistent and accessible beyond the lifetime of the
original research project, how to simplify data access, etc. The group
uses the "Observations & Measurements" standard as the base model, then
adds specialized types. A real-world example of a data survey design was
presented.


    Quality of services

A new group is about to be formed on the technical reliability and
performance of network services. The metrics to be measured are
typically error rates, throughput, availability, and delay or request
response time. But it is difficult to automatically determine whether the
monitored services are behaving as expected, because different services
and operations may have very different response times in a normal situation.

This new group will not create standards on its own, but may propose
revisions of existing standards. For example there is no standardised way
to retrieve the operational status of a particular OGC Web Service
instance, and no standardised way to declare the operating hours and
scheduled maintenance. This group may propose extensions to existing OGC
web services.

The US Federal Geographic Data Committee (FGDC.gov) provides a service
for monitoring the performance and reliability of geospatial web
services. Their "FGDC Service Status Checker" has been working for years.
This checker has a list of services. Users can add a service either by
manually creating an XML file, or by metadata mining. Once added to the
list, the service is tested on a daily basis and summary reports can be
emailed to the service feed owner. A dashboard on the web reports new,
modified and deleted services.

The checker tests the web service "GetCapabilities" operation. If it
fails, the checker tries to generate it. Then the checker tests the
"GetMap" operation. The checker does not verify the content; it only
checks whether the operation succeeds. The meeting had a discussion about
whether the checker should also leverage the CITE tests to verify
the content. But verifying content would have a significant performance
cost, and checking conformance versus checking availability are two
different topics. Conformance usually does not need to be verified daily.
More information can be found on
http://external.opengeospatial.org/twiki_public/QualityOfService/WebHome


    Data formats

All data formats described below are defined outside OGC, but adopted by
OGC as standards. Consequently most of the engineering work for the next
topics happens outside OGC, and the work remaining at OGC is mostly about
the standardisation process and education.


      GeoTIFF image format

The OGC GeoTIFF standard group has been dormant for a while. However,
during this OGC meeting another group submitted an issue that needs to be
addressed (there is a strong need to update the version of the EPSG
dataset referenced by the GeoTIFF standard). There was a call to restart
the GeoTIFF working group, which got strong support.


      NetCDF and HDF5

We still need a clear, complete, unique, and well-documented
specification of how to encode ISO 19115 Core and Dublin Core
(discovery) metadata in the OGC netCDF encoding. The NetCDF Attribute
Convention for Dataset Discovery (ACDD) [1] is still our best reference as
far as I know, but we may need to incorporate it into current OGC
standards. This topic is relevant to the Apache SIS "NetCDF to ISO
19115" mapping [2].

NetCDF is a binary format, but OGC 14-100r2 (CF-netCDF 3.0 encoding
using GML) provides the same metadata structure in an XML format. This
specification seems to be an extension of NcML with the addition of GML
objects such as features.

HDF5 is another format serving similar purposes as NetCDF. Version 4
of NetCDF is basically a slightly restricted version of HDF5, so the
latter can be seen as an evolution of NetCDF 3 (also known as "NetCDF
classic"). A new working group has been created for specifying the HDF
format as an OGC standard.

[1] http://www.unidata.ucar.edu/software/thredds/current/netcdf-java/metadata/DataDiscoveryAttConvention.html
[2] http://sis.apache.org/apidocs/org/apache/sis/storage/netcdf/AttributeNames.html