Posted to geospatial@apache.org by Martin Desruisseaux <ma...@geomatys.com> on 2017/04/06 16:33:57 UTC

Report on OGC meeting in Delft

Hello all

An Open Geospatial Consortium (OGC) meeting was held in the
Netherlands two weeks ago. More than 230 people attended. A concurrent
meeting with the World Wide Web Consortium (W3C) was held by the joint
OGC-W3C Spatial Data on the Web Working Group (SDWWG) [1]. This OGC
meeting focused on linked data and on discussions about future
directions, with emphasis on the veracity and uncertainty of big data
analytics. Below is a summary of some points that I noted. Of course,
my notes cover only a small fraction of the subjects discussed in such
meetings, and I may have misunderstood some items. The list of topics
at that meeting can be viewed at [2]. Summary slides for each working
group are available at [3] (309 slides).

Content of this email:

  * GeoAPI and Moving Features
  * Coordinate Reference Systems (CRS)
  * Metadata and file formats: HDF, NetCDF
  * Consequence of geometric errors in 3D city models
  * Electromagnetic spectrum
  * Web Services: WMS, WCS, WPS
  * Future directions

Some other topics discussed at the meeting but not mentioned in this
email are INSPIRE, health, agriculture, hydrology, marine, unmanned
systems, Discrete Global Grid System (DGGS), Copernicus Data, security, etc.

[1] https://www.w3.org/2015/spatial/wiki/Main_Page
[2] https://portal.opengeospatial.org/public_ogc/sched/agenda.php?meeting=1703tc&my_session=all
[3] https://portal.opengeospatial.org/files/?artifact_id=73443


    GeoAPI and Moving Features

GeoAPI [1] is the set of interfaces implemented by Apache SIS. The
group is currently dormant, but we discussed with a few other OGC
members the possibility of restarting it, and we will soon propose a
new charter. We also talked about expanding the GeoAPI scope to other
languages, starting with Python. We are exploring the use of a web
site to help us define the types in a more language-neutral way [2].

An issue related to GeoAPI was discussed during the Moving Features
session. The Feature model currently proposed for the next GeoAPI
release is based on ISO 19109, which is a static model. The Moving
Features model is based on ISO 19141, which was published before the
ISO 19109 revision that we are using. How to reconcile the two models
is still an open question, which needs to be resolved before we can
complete the implementation in Apache SIS.

Moving Features Access has been approved as an OGC standard. This
standard defines a REST API. A difference between a REST API and
GeoAPI is that REST and Web Services are "high level" APIs, providing
responses encoded in some file format (XML, JSON, etc.); it is the
caller's task to decode those file formats. GeoAPI is a lower-level
API giving access to the components inside the data file. The UCAR
NetCDF library [3] already does that in a file-format independent way.
GeoAPI aims to pursue the same goal in an implementation-independent
way, with conceptual models close to the OGC/ISO standards.
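
As an illustration of that lower-level access, the sketch below
navigates the axes of a coordinate reference system through GeoAPI
interfaces (org.opengis.*), using Apache SIS as the implementation.
The class name is mine, but the GeoAPI and SIS calls are real:

    import org.apache.sis.referencing.CommonCRS;
    import org.opengis.referencing.crs.GeographicCRS;
    import org.opengis.referencing.cs.CoordinateSystemAxis;

    public class GeoApiDemo {
        public static void main(String[] args) {
            // The caller manipulates objects directly; no XML or JSON
            // response needs to be decoded.
            GeographicCRS crs = CommonCRS.WGS84.geographic();
            int dimension = crs.getCoordinateSystem().getDimension();
            for (int i = 0; i < dimension; i++) {
                CoordinateSystemAxis axis = crs.getCoordinateSystem().getAxis(i);
                System.out.println(axis.getName().getCode() + " (" + axis.getUnit() + ')');
            }
        }
    }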

The Moving Features JSON encoding [4] has also been approved as an OGC
best practice paper.

[1] http://www.geoapi.org/
[2] http://geographical-features.net/GeoAPI
[3] https://www.unidata.ucar.edu/software/netcdf/
[4] https://github.com/opengeospatial/mf-json


    Coordinate Reference Systems (CRS)

The ISO 19111 revision is making good progress. A draft submission may
be ready by the end of July.

About the handling of time dimension:

  * Temporal CRS will be part of the ongoing ISO 19111 revision.
  * Temporal Well Known Text (WKT) is already supported by ISO 19162
    as time on a continuous axis (no calendar). Calendar support in
    WKT could be an ISO 19162 extension.
  * One change that may be considered is to not define temporal
    quantities in terms of a base SI unit. For example, instead of
    TIMEUNIT["day", 86400] (because there are 86400 seconds in a day),
    we could have TIMECOUNT["day"]. The reason is that conversions,
    e.g. from days to seconds, may require context (e.g. for leap
    seconds). A parsing example is sketched below.
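
For concreteness, the sketch below parses a WKT 2 (ISO 19162) temporal
CRS whose "day" unit is defined as a fixed number of SI seconds, i.e.
the current approach that the proposal above would relax. The WKT
string is adapted from an ISO 19162 example, and CRS.fromWKT(...) is a
real Apache SIS method, but I did not verify which SIS versions accept
this exact string:

    import org.apache.sis.referencing.CRS;
    import org.opengis.referencing.crs.CoordinateReferenceSystem;

    public class TemporalWktDemo {
        public static void main(String[] args) throws Exception {
            String wkt =
                "TIMECRS[\"GPS Time\",\n" +
                "  TDATUM[\"Time origin\", TIMEORIGIN[1980-01-01T00:00:00.0Z]],\n" +
                "  CS[temporal,1], AXIS[\"time\",future],\n" +
                "  TIMEUNIT[\"day\",86400.0]]";    // 1 day = 86400 SI seconds
            CoordinateReferenceSystem crs = CRS.fromWKT(wkt);
            System.out.println(crs.getName().getCode());
        }
    }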

About the way CRS are stored in some file formats, I have learned that
the GeoPackage specification chose to use two separate fields for CRS
defined in Well Known Text (WKT) version 1 and in WKT version 2. Maybe
it would have been possible to avoid this redundancy (and the
associated inconsistency risk), since WKT 2 is mostly backward
compatible with WKT 1, but the single-field approach was not retained.


    Metadata and file formats: HDF, NetCDF

On the metadata side, we have done the discovery part. Future
developments now need to focus on the "how to use?" and "can I trust?"
questions. ISO 19115 already provides some fields for those questions,
but I think they are not yet widely used. For NetCDF files, the
CF-conventions need to be enriched in order to answer those questions.
There are at least two efforts toward that:

  * Uncertainty Conventions (OGC 11-163) discussion paper [1]
  * Earth Observation (EO) Metadata Conventions [2] by ESA

Related to the above EO metadata on top of the CF-conventions, there
is also EO metadata on top of the Observations & Measurements standard
[3]. The European Space Agency (ESA) is an author of both EO
conventions (if I understood correctly, ESA uses those EO conventions
for Sentinel data). The scope of those EO metadata is currently
limited to remote sensing data. There is no mapping yet from NetCDF
Earth Observation metadata to ISO 19115, but I think that the Apache
SIS mapping from the CF-conventions to ISO 19115 [4] could easily be
extended with the EO metadata listed in [2]. The sketch below shows
where that mapping surfaces in the Apache SIS API.
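
DataStores.open(...) and DataStore.getMetadata() are real Apache SIS
calls; the file name below is of course hypothetical:

    import org.apache.sis.storage.DataStore;
    import org.apache.sis.storage.DataStores;
    import org.opengis.metadata.Metadata;

    public class NetcdfMetadataDemo {
        public static void main(String[] args) throws Exception {
            // CF attributes (and maybe, one day, the EO attributes
            // listed in [2]) are exposed as ISO 19115 metadata.
            try (DataStore store = DataStores.open(new java.io.File("sentinel.nc"))) {
                Metadata metadata = store.getMetadata();
                System.out.println(metadata);
            }
        }
    }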

The Hierarchical Data Format (HDF) group had its first meeting. The
HDF format is defined outside OGC; the proposal is to adopt the
existing specification as an OGC standard, in the same way that OGC
adopted the NetCDF 3 specification. The HDF format can be described as
Attributes + Datasets + Groups, as sketched below. The HDF
specification defines computer-science data types; it does not include
the disciplinary data types defined by conventions.
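
As a mental model only (these are hypothetical types of my own, not
the real HDF API), the "Attributes + Datasets + Groups" structure can
be summarized like this:

    import java.util.List;

    class Attribute { String name; Object value; }    // small named value
    class Dataset {                                   // multi-dimensional array
        String name;
        int[] shape;
        Object values;
        List<Attribute> attributes;
    }
    class Group {                                     // hierarchical container
        String name;
        List<Attribute> attributes;
        List<Dataset> datasets;
        List<Group> subGroups;
    }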

The CF-conventions are widely used with NetCDF files, and can be
applied to HDF too. Recent activity in the NetCDF-CF community
includes draft proposals for satellite swath data, for observational
data, and for representing geometries in NetCDF files (the current
CF-conventions allow the representation of points, profiles or
trajectories). A draft proposal for geometries in NetCDF is available
at [5]. Other efforts are listed at [6].

New features are now more often designed for the NetCDF 4 format,
which is built on top of HDF 5. But because the classical NetCDF 3
format is still in wide use, we need to define a mapping from "NetCDF
4 features not supported in NetCDF 3" to "NetCDF 3 workarounds". An
example is the way enumerations are represented in NetCDF 3,
illustrated below.
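
NetCDF 3 has no enumeration type; a common CF-style workaround is an
integer variable whose "flag_values" and "flag_meanings" attributes
enumerate the codes and their labels (this is the usual convention,
not necessarily the mapping that will be standardized). The sketch
below reads such attributes with the UCAR library mentioned earlier;
file and variable names are hypothetical:

    import ucar.nc2.Attribute;
    import ucar.nc2.NetcdfFile;
    import ucar.nc2.Variable;

    public class EnumWorkaroundDemo {
        public static void main(String[] args) throws Exception {
            try (NetcdfFile nc = NetcdfFile.open("classified.nc")) {
                Variable v = nc.findVariable("land_cover");
                // e.g. flag_values   = 0, 1, 2
                //      flag_meanings = "water forest urban"
                Attribute values   = v.findAttribute("flag_values");
                Attribute meanings = v.findAttribute("flag_meanings");
                System.out.println(values + " -> " + meanings);
            }
        }
    }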

Most current rasters are "Referenced Grid Coverages", but there is
also a need to support "Referenceable Grid Coverages". Such an
extension fits naturally in the "GML in JPEG2000" (GMLJP2) format.

It seems to me that a GeoTIFF group was under consideration at a
recent OGC meeting, but I did not hear about it this month.

[1] https://portal.opengeospatial.org/files/?artifact_id=46702
[2] https://wiki.services.eoportal.org/tiki-download_wiki_attachment.php?attId=3271&download=y
[3] http://docs.opengeospatial.org/is/10-157r4/10-157r4.html
[4] http://sis.apache.org/apidocs/org/apache/sis/storage/netcdf/AttributeNames.html
[5] https://github.com/twhiteaker/netCDF-CF-simple-geometry
[6] https://github.com/Unidata/EC-netCDF-CF/wiki


    Consequence of geometric errors in 3D city models

There are two standards for city models: CityGML goes from the world
down to buildings, while /Industry Foundation Classes/ (IFC) goes in
the opposite direction, from the floor up to buildings. The two
standards overlap at the building scale, but with two different
models. How to interface the two standards is still under
investigation. As a side effect of their work at the building scale,
groups from the Delft University of Technology collected various
examples of issues caused by geometric errors in city models:

  * The lack of height in a building causes "fighting" between the
    floor and the roof at rendering time, which results in flickering.
  * Wrong surface orientations cause walls to appear as if they were
    missing. They also affect the computation of solar potential,
    because of roofs oriented toward the house interior.
  * Offsets between surfaces cause the shapes to not be topologically
    connected, which makes it impossible to compute the volume even if
    the house looks fine at rendering time.
  * Numerical models for wind simulations in cities (e.g. for
    pedestrians) based on fluid mechanics have very strict input
    requirements.

The group found no city model in GML which is 100% valid. Every model
that they tried required semi-manual corrections. To make that task a
little easier, the group created a tool for the geometric validation
of 3D primitives: val3dity [1]. There are many kinds of errors (e.g.
repeated vertices), but the most common error is non-planar polygons.
Most errors are not visible. According to the group, most
practitioners are not aware of the rules. To improve the situation,
one possible path could be an OGC CityGML quality interoperability
experiment. A planarity check is sketched below.
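
For illustration, a planarity test can be as simple as taking the
plane through three vertices and verifying that every other vertex
lies within a tolerance of that plane. This is a minimal sketch of the
idea, not the algorithm actually used by val3dity:

    public class PlanarityCheck {
        /** Returns whether all points lie within the given tolerance of
         *  the plane defined by the first three vertices, which are
         *  assumed non-collinear. Each point is a {x,y,z} array. */
        static boolean isPlanar(double[][] pts, double tolerance) {
            double[] u = sub(pts[1], pts[0]);
            double[] v = sub(pts[2], pts[0]);
            double[] n = cross(u, v);                    // plane normal
            double length = Math.sqrt(n[0]*n[0] + n[1]*n[1] + n[2]*n[2]);
            for (double[] p : pts) {
                double[] w = sub(p, pts[0]);
                double distance = Math.abs(w[0]*n[0] + w[1]*n[1] + w[2]*n[2]) / length;
                if (distance > tolerance) return false;
            }
            return true;
        }
        static double[] sub(double[] a, double[] b) {
            return new double[] {a[0]-b[0], a[1]-b[1], a[2]-b[2]};
        }
        static double[] cross(double[] a, double[] b) {
            return new double[] {a[1]*b[2] - a[2]*b[1],
                                 a[2]*b[0] - a[0]*b[2],
                                 a[0]*b[1] - a[1]*b[0]};
        }
    }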

[1] https://github.com/tudelft3d/geovalidation.server/blob/master/val3dity/md/about.md


    Electromagnetic spectrum

One justification for why OGC created a working group about the
electromagnetic spectrum is that the industry is moving from a static
way of assigning frequencies to a more dynamic one. There is also a
desire to mitigate interference with Earth Observation satellite
communications. The Standard Spectrum Resource Format (SSRF) version 3
specification could be used as a starting point (a Java implementation
under the Apache 2 license is provided by the OpenSSRF project [1]).
Their data model has Antenna, Transmitter, Station, Location and other
objects having location elements.

[1] http://openssrf.org/


    Web Services: WMS, WCS, WPS

The MetOcean group has done comparisons between two approaches for
handling multi-dimensional data in web services. A traditional
approach is to view the data as a stack of 2D slices, but this
approach results in very large getCapabilities documents, with as many
as 10000 individual coverages. The other approach, described in the
MetOcean /best practice/ paper, views the data as a 4D cube instead
and results in much more compact XML documents, with the number of
lines shrinking from 30000 to 200.
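
As an illustration of the 4D cube approach (a hypothetical fragment of
my own, not taken from the MetOcean paper), a WMS capabilities
document can advertise a single layer with time and elevation
dimensions instead of one layer per slice:

    <Layer>
      <Name>temperature</Name>
      <Dimension name="time" units="ISO8601">
        2017-01-01T00:00:00Z/2017-12-31T23:00:00Z/PT1H
      </Dimension>
      <Dimension name="elevation" units="hPa">1000,850,700,500,250</Dimension>
    </Layer>

With the stack-of-2D approach, each (time, elevation) combination
would instead be published as a separate layer or coverage.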

A specification for a Web Processing Service (WPS) REST API is being
drafted, using the OGC Testbed 12 Engineering Report (OGC 16-035) as a
starting point. The current plan is to submit the draft to public
comment after a JSON encoding has been defined. For verifying
implementation conformance, WPS 1 has 28 CITE tests, but WPS 2 does
not yet have any test. The group plans to continue working on WPS 1
tests and to create WPS 2 tests in parallel.

OWS Context needs a pagination approach. Such an approach is specified
in OpenSearch-EO and in the GeoJSON/JSON-LD encoding. Those approaches
need to be aligned.

We had a reminder that the availability of Web Services can be monitored
by http://www.spatineo.com/.


    Future directions

Streaming is increasingly important for OGC standards, especially
mobile streaming of GIS data in bandwidth-constrained environments. A
binary protocol for streaming data, more efficient than XML or JSON,
has been presented: the Layered Terrain Format (LTF), built on top of
Google Protocol Buffers (Apache Avro will be tested in OGC Testbed
13). Initial benchmarks with LTF suggest that the binary data are 2
times smaller and 15 times faster than compressed WFS/WCS data.
Experiments with streaming NetCDF data are also ongoing [1].

We had a discussion about the four "V"s of big data analytics: Volume,
Velocity, Variety, Veracity. It was emphasized that a common language
for expressing veracity and uncertainty is important. We also need a
way to understand the veracity of analytical results produced by deep
learning algorithms.

We had a discussion about the complexity of standards, and about the
roadmap to specification simplicity. There are two possible
approaches:

 1. Write more complete (which often implies more complex) standards
    first, then define simpler profiles for the common use cases.
 2. Write simple standards first, then complete them with the missing
    features later.

The preferred approach among attendees was the first one, because of
the risk, when beginning with too simple standards, of missing
important features that may be hard to retrofit in a compatible way.
A drawback is that there is often a delay between the publication of a
standard and the definition of its simpler profiles, which may
contribute to the perception that standards are complex.

Creation of a China forum has been approved for future discussions of
aspects more specific to China (other forums already exist for Europe
and other areas).

[1] https://www.unidata.ucar.edu/software/thredds/v4.3/netcdf-java/stream/NcStream.html


Martin