Re: [EXTERNAL] [gdal-dev] Static/Dynamic datum problems

Re: [EXTERNAL] [gdal-dev] Static/Dynamic datum problems

Cameron Shorter

Thanks Doug, you have pointed me to Matt Higgins, who is one of the Australians on this email's CC list; he has contributed toward identifying this problem. I believe we need to bring this conversation into an international forum, probably headed up by the OGC.

Could you suggest who we should connect with?

Warm regards, Cameron

On 24/6/19 11:30 pm, Newcomb, Doug wrote:
Cameron,
I just emailed someone working on this. He sent me the links below.


On Thu, Jun 20, 2019 at 5:15 PM Cameron Shorter <[hidden email]> wrote:
Hi folks,

Our Australian spatial data users are about to face a systematic mismatch challenge when trying to use multiple static datums (GDA2020, GDA94) with the dynamic datum (WGS84). At the moment, it is government agencies grappling with the problem, but it is about to become a mainstream issue.

Australia now has static datums for the years 1994 and 2020, and uses WGS84 (a time-dependent datum!) for web mapping and web services. We recognise:
1. Transforming from GDA94 to GDA2020 reflects Australia’s tectonic movement of ~1.8 metres to the north-east.
2. GDA94 was defined as ‘equal to WGS84’ in 1994.
3. GDA2020 was defined as ‘equal to WGS84’ in 2020.
All three statements can’t be accurate, and the result is misaligned maps in WGS84.

I believe this is a problem the whole world needs to address, given the upcoming modernisation of significant national datums, including in the U.S., and we need to bring this topic into an international conversation ASAP.
I'm interested to know if anyone here is looking into this and/or has opinions on how it should be solved. I'd like to incorporate your ideas into the recommendations that we are putting forward.
-- 
Cameron Shorter
Technology Demystifier
Open Technologies and Geospatial Consultant

M +61 (0) 419 142 254





--
Doug Newcomb - Cartographer
USFWS
551F Pylon Dr
Raleigh, NC
919-856-4520 ext. 14 [hidden email]
---------------------------------------------------------------------------------------------------------

NOTE: This email correspondence and any attachments to and from this sender is subject to the Freedom of Information Act (FOIA) and may be disclosed to third parties.
-- 
Cameron Shorter
Technology Demystifier
Open Technologies and Geospatial Consultant

M +61 (0) 419 142 254


Re: [EXTERNAL] [gdal-dev] Static/Dynamic datum problems

Nick Mein
Hi Even, Cameron,

See https://docs.opengeospatial.org/as/18-005r4/18-005r4.html#116  

That is a great document. Thanks for sharing!

At bottom of page 16 of this guidance note, there is even a quite funny
discussion about whether NAD83(2011) should be considered as a static or
dynamic CRS

It really depends on where you are working. Over most of the continental US you can consider NAD83(2011) to be a static reference frame. (I.e. you can consider the velocity of points to be zero.) But that isn't true if you are working in California.

But those terminology discussions don't solve Cameron's practical problem, which
is that when transforming data that is originally in GDA94 or GDA2020 to
WGS84 or WebMercator (using WGS84), the recommended transformations in EPSG,
ESRI, etc. are null transformations, causing misalignments when mixing
sources from GDA94 and GDA2020.

Right. But there is a solution to Cameron's practical problem, which is to transform his GDA94 dataset(s) to GDA2020.
In the longer term, we can all (practitioners, geospatial software providers) help clear up misunderstandings by being more precise with our terminology. That includes no longer pretending that there is such a thing as a precise WGS-84 coordinate, or that GDA, GDA2020, NAD83, etc. are equivalent to "WGS-84".

Let's define a "global mosaiced static CRS", that is, the union / patchwork of
the national/regional/continental CRS in current use.

I'm pretty sure that is what products such as Google Maps must do, though I'd love to get verification from someone who knows for sure.

A key point is: how do you access whatever reference frame you are using? Traditionally you would do that by locating physical marks on the ground and looking up their published coordinates. Today, you are likely to be using a real-time GNSS corrections network, a post-processing service such as OPUS/AUSPOS/etc., or a precise point positioning service. The coordinates that you get are going to be either in a local/regional reference frame such as GDA/NAD83/ETRF, or they are going to be ITRF. For (large scale?) web mapping applications, munging together data sets from different reference frames is fine. But ultimately you need to be able to drill down to the original data.

Regards,
Nick.


Re: [EXTERNAL] [gdal-dev] Static/Dynamic datum problems

Kristian Evers-2
This is a great discussion! I am happy to see that people outside of the
geodesy community are starting to realise that things can’t continue as they
have been. We’ve certainly done our fair share of the work here in the PROJ
community, but as Cameron points out there’s still some way to go. From the
discussion it seems to me that everyone is converging towards a common
understanding of dynamic reference frames and of what their challenges are.
I have worked quite extensively with the topic over the last couple of years
(see [0] for a summary) and there are two important things that have not been
touched on in this discussion yet:

1. WGS84 is practically equivalent to ITRF2008 and ITRF2014

The two frames coincide at the cm level [1] and hence there is a
null transformation between the two systems. This fact can be leveraged to
expand the number of direct transformations between WGS84 and regional/national
frames. As Even pointed out, often the transformation between WGS84 and a
national frame registered by EPSG is a null transformation; often there will
be a transformation to an ITRFxxxx that offers better accuracy. This
of course requires that coordinates come with a timestamp, which brings me to
my next point:

2. GIS software does not offer reliable ways to store the observation time of
coordinates

To me this is the core of the problem in bringing dynamic reference frames into
practical use. Dynamic reference frames are inherently spatiotemporal: all
coordinates must consist of three spatial components and one temporal component,
otherwise they simply are of no use. The timestamp of a coordinate has to be
the observation time of the coordinate, and the timestamp must not be changed
during transformations (otherwise you can’t do the reverse transformation). If
those two criteria are met you can reliably transform between all reference
frames that are based on ITRFxxxx, for example between GDA94, WGS84 and GDA2020,
which is exactly the combination mentioned above as being problematic.
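
For illustration, a minimal sketch with the PROJ command line tools (assuming PROJ 6 or later with its EPSG database installed): a "WGS84" coordinate is treated as ITRF2014 (EPSG:9000) tagged with its observation epoch and transformed to GDA2020 (EPSG:7844), so that the time-dependent Helmert transformation is applied rather than a null transformation:

$ echo "-30 120 0 2018.00" | cs2cs -f "%.9f" EPSG:9000 EPSG:7844
-29.999998944 120.000000758 -0.000339830 2018.00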

So, if you keep track of time, it’s not all that difficult to work with dynamic
reference frames. The problem is that it is impossible in practice, since there
is no standard that describes how to do this. The OGC Simple Features standard,
which most file formats are based on, simply doesn’t include time as a
dimension. At best we can store X, Y, Z and M, with the M value being a
“measure” of some kind. This could in principle be the observation time of the
coordinate, but how would you distinguish between a time measure and some other
type of measure (e.g. a velocity)?
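
To illustrate the ambiguity, a Simple Features well-known-text geometry can carry a fourth ordinate, but nothing in the encoding says whether that M value is a coordinate epoch or some other kind of measure (the values below are only an example):

POINT ZM (120.000000758 -29.999998944 0.0 2018.00)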

Cameron, if you want to take this up with the OGC, this is where you should
start. Most of what has been mentioned in this thread as problems is actually
solved in recent versions of PROJ and GDAL, but we need GIS data formats that
can handle 4D coordinates before all those nice new features can be used to
their full extent. Most important is the Simple Features standard, but I can
imagine that some minor tweaks will be needed in ISO 19111 as well.

/Kristian


[1] ftp://itrf.ensg.ign.fr/pub/itrf/WGS84.TXT 



Re: [EXTERNAL] [gdal-dev] Static/Dynamic datum problems

Martin Desruisseaux-3

Hello all

ISO 19111:2019 (Referencing by coordinates) already includes the timestamp of coordinates. The standard defines a CoordinateMetadata class with two properties:

  • crs: Identifier of the coordinate reference system to which a coordinate set is referenced.
  • coordinateEpoch: Epoch at which coordinates referenced to a dynamic CRS are valid.
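
For illustration, an instance of that class simply pairs a CRS identifier with an epoch, schematically (the values are only an example):

    CoordinateMetadata
        crs:             ITRF2014
        coordinateEpoch: 2020.45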

ISO 19107 (the standard that defines geometry objects) has been revised. I have not yet had a chance to look closely at it, but the last time I attended the discussions at OGC my understanding was that the new geometry objects would be associated with the ISO 19111 CoordinateMetadata class (and consequently include the coordinate epoch) rather than with only the CRS.

Simple Features is derived from ISO 19107 by the same author. I think I have seen emails on the OGC mailing list about drafts being available, but I have not yet had a chance to look at them. At least the current Simple Features editor is well aware of this coordinate epoch topic, since he was present at the OGC meetings that debated it. So I guess there is a good chance that the coordinate epoch will be present in some way in Simple Features (but I have not verified this).

    Martin




Re: [EXTERNAL] [gdal-dev] Static/Dynamic datum problems

Kristian Evers-2

Hi Martin,

That is good news. I find it hard to keep up with the standards while they are still in draft, so I’m glad you can give us an update from time to time. I hope you are right and that the Simple Features model will be expanded to include the temporal dimension as well. If that is the case, there is light at the end of the tunnel :-) It will probably still take quite a while before the complete GIS software stack absorbs the updated standards, but that is to be expected with such a radical change.

/Kristian

Re: [EXTERNAL] [gdal-dev] Static/Dynamic datum problems

Even Rouault-2
In reply to this post by Kristian Evers-2
>  The timestamp of a
> coordinate has to be the observation time of the coordinate and the
> timestamp must not be changed during transformations (otherwise you can’t
> do the reverse transformation).

Unless you do a point motion operation, right ?

Quoting http://docs.opengeospatial.org/as/18-005r4/18-005r4.html#68 :
"""The coordinates of a data set may be changed to any other epoch. Plate
motion or other crustal deformation models are often used for this when
estimated coordinate velocities are not available. Such models facilitate for
example the change of coordinates from being referenced to ITRF2008 at epoch
2017.53 to being referenced to ITRF2008 at epoch 2005.0."""

Which would be the case if you want to create, for example, a global raster
dataset: you want all tiles to be referenced to the same coordinate epoch.
But this would be the final step in the coordinate transformation process.
In a vector dataset, you may accept different features being at different
coordinate epochs.

Note that the EPSG dataset is currently rather sparse in the area of
point motion operations. There's just a single one available, for the Canadian
NAD83(CSRS)v6 case, using a velocity grid. So currently if you have a
coordinate referenced to an ITRF CRS, you can't easily transform it to another
epoch in that CRS. Or in the Australian case, you'd have to go back to GDA2020
using the ITRF2014<-->GDA2020 time-dependent Helmert transformation... :

From ITRF2014@2018.00 to GDA2020
$ echo "-30 120 0 2018.00" | cs2cs -f "%.9f" EPSG:9000 EPSG:7844
-29.999998944 120.000000758 -0.000339830 2018.00

From GDA2020 to ITRF2014@2025.00
$ echo "-29.999998944 120.000000758 -0.000339830 2025.00" | \
                 cs2cs -I -f "%.9f" EPSG:9000 EPSG:7844
-29.999996305 120.000002652 -0.001189402 2025.00

Which makes it obvious that, even when/if we have point motion operations
available to go directly from CRS A @ epoch 1 to CRS A @ epoch 2, we don't have a
clean way currently in PROJ of specifying the target coordinate epoch.

Actually I could do it in just one step by hacking the ITRF2014->GDA2020
pipeline and using t_epoch=2025.00 to make it an ITRF2014 @ input_epoch ->
ITRF2014 @ 2025.00 operation valid for Australia (probably only within a few years
around epoch 2020.00, which also highlights the lack in the EPSG
dataset of metadata indicating the time range over which a coordinate
operation is valid at the published accuracy...):

$ echo "-30 120 0 2018.00" | cct -d 9 +proj=pipeline \
    +step +proj=axisswap +order=2,1 \
    +step +proj=unitconvert +xy_in=deg +xy_out=rad \
    +step +proj=cart +ellps=GRS80 \
    +step +proj=helmert +x=0 +y=0 +z=0 +rx=0 +ry=0 +rz=0 +s=0 \
     +dx=0 +dy=0 +dz=0 +drx=0.00150379 +dry=0.00118346 +drz=0.00120716 +ds=0 \
     +t_epoch=2025 +convention=coordinate_frame \
    +step +inv +proj=cart +ellps=GRS80 \
    +step +proj=unitconvert +xy_in=rad +xy_out=deg \
    +step +proj=axisswap +order=2,1
-29.999996305  120.000002652  -0.001189396     2018.0000

Except that 2018.000 in the output should be read as 2025 given the hack...


> The problem is that it is impossible in practice
> since there is no standard that describes how to do this. The OGC Simple
> Features standard which most file formats are based on simply doesn’t
> include time as a dimension.

You probably want the time to be included as general metadata of the
geometry rather than as a per-vertex value (hopefully a single geometry is
referenced to the same epoch...), which would be along the lines of the
CoordinateMetadata class mentioned by Martin.
Yes, there is the issue of standards, but then it must percolate down to file
and geospatial database formats. GeoPackage and PostGIS, for example, are
extensions of the WKB encoding of Simple Features, so if the latter is updated,
then the former might be able to upgrade. For web services, GML / WFS should
be upgraded as well, both on the response side (you need the coordinate epoch)
and on the request side (you probably also need to be able to specify that you
want geometries at a given coordinate epoch).
The issue is the same with raster formats: how do you encode the
coordinate epoch in a GeoTIFF?
(I've just created https://github.com/opengeospatial/geotiff/issues/78), etc.
...

I guess that's why people keep creating new static CRS. That's so much more
convenient given the inertia on the file format side (how do you fix
shapefiles to include a coordinate epoch... a new sidecar file myshape.epoch :-)
?)

Even

--
Spatialys - Geospatial professional services
http://www.spatialys.com

Re: [EXTERNAL] [gdal-dev] Static/Dynamic datum problems

Kristian Evers-2
> Unless you do a point motion operation, right ?

As long as you work entirely in dynamic coordinates it is okay to change
the timestamps as part of a transformation. When you transform coordinates
from a dynamic/temporal frame to a static frame (e.g. ITRF2014 -> ETRS89)
you need to keep the timestamp of the coordinates in ITRF2014 if you want
to retain the ability to transform your data back to ITRF2014. Changing the
timestamp to 1989, or to the realization epoch of the local ETRS89 realization,
would make it impossible to return to the original coordinates, unless the
information is kept track of elsewhere. But where would you do that? Worst
case, a vector dataset has different observation times for all coordinates.

> Which makes it obvious that, even when/if we have point motion operation
> available to do directly CRS A @ epoch 1 -> CRS A @ epoch B, we don't have a
> clean way currently in PROJ of specifying the target coordinate epoch.

I agree with that. We could simply adapt the same convention as used by
geodesists:

        cs2cs  <source> <destination>@<destination epoch>
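
For example (hypothetical syntax, not something cs2cs accepts today), moving an ITRF2014 coordinate from its observation epoch to epoch 2025.0 could then look like:

        echo "-30 120 0 2018.00" | cs2cs EPSG:9000 EPSG:9000@2025.00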

I agree that it makes sense to change the coordinate timestamp when
transforming from one dynamic reference frame to another. But there's a
caveat:

> Except that 2018.000 in the output should be read as 2025 given the hack...

How do you make the inverse transformation if you change the coordinate
time from 2018.0 to 2025.0? For the particular PJ object that made that
transformation you will not be able to reverse it: since you specified
+t_epoch=2025.0, your inverse operation will now be a null transformation
moving the coordinate from 2025.0 to 2025.0. This is why I opted not to
change the timestamps when I implemented the temporal Helmert.

It is not set in stone that it must be possible to create the inverse of an operation,
but it definitely makes life simpler. It is a decision of principle whether we want to
break the possibility of creating the inverse of a temporal Helmert operation.

> I guess that's why people keep creating new static CRS. That's so much
> convenient given the inertia on the file format side (how do you fix
> shapefiles to include coordinate epoch... A new sidecar file myshape.epoch :-)
> ?)

It is an approach that is absolutely worth considering by geodetic authorities.
It is many times simpler to work with and you still keep the good stuff from
the dynamic frame.


/Kristian


Re: [EXTERNAL] [gdal-dev] Static/Dynamic datum problems

Even Rouault-2
> I agree with that. We could simply adapt the same convention as used by
> geodesists:
>
> cs2cs  <source> <destination>@<destination epoch>

Makes sense (there would be the risk that people believe the
@destination epoch is part of the CRS designation), but
http://docs.opengeospatial.org/as/18-005r4/18-005r4.html#68 indeed uses that notation.

> How do you make the inverse transformation if you change the coordinate
> time from 2018.0 to 2025.0?

Good question! Actually, in my ISO-related work of the past months I spotted this
issue, but set it aside a bit as there were so many other, hotter topics to
deal with.

ISO 19111 identifies the optional sourceCoordinateEpoch and
targetCoordinateEpoch as properties of the CoordinateOperation class.
See http://docs.opengeospatial.org/as/18-005r4/18-005r4.html#58
In PROJ we have a 4D X,Y,Z,time tuple instead, so there's a modelling gap
here, which is probably the core reason for the issue you mention.

If we have something like
+proj=point_motion_geocentric +src_epoch= +tgt_epoch= +vx= +vy= +vz=
then this can be reversible (a sketch of what such an operation computes follows below).

You could have different modes:
- t_epoch only: the input T is taken into account, and this is not
reversible
- s_epoch only: only possible to use in the reverse direction
- both: in the forward direction you must decide whether to error
out if the input T != src_epoch, or to just override it with src_epoch.
Similarly in the reverse direction.
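
For concreteness, the forward step of such a geocentric point motion operation (using the hypothetical parameters named above; this is not an existing PROJ operation) would essentially be:

    dt = tgt_epoch - src_epoch
    X_out = X_in + dt * vx
    Y_out = Y_in + dt * vy
    Z_out = Z_in + dt * vz

and the inverse applies -dt, which is why carrying both epochs keeps the operation reversible.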

If you chain a point motion operation with a 15-parameter Helmert, as in some examples of
IOGP Guidance Note 25, then you need to change the coordinate epoch after the
point motion operation (either by modifying the T in the coordinate tuple, or by
other internal means like filling the sourceCoordinateEpoch on the Helmert
transformation, which would then add a src_epoch parameter to the corresponding
PROJ string?), so that the 15-parameter Helmert operates on the new
coordinate epoch.

> For the particular PJ object that made that
> transformation you will not be able to reverse it, since you specified
> +t_epoch=2025.0 your inverse operation will now be a null transformation
> moving the coordinate from 2025.0 to 2025.0. This is why I opted to not
> change the timestamps when I implemented the temporal Helmert.

The temporal Helmert nominally does not change the coordinate epoch, so that was
a wise decision.

My example was clearly a hack to try to work around the lack of a point motion
operation for ITRF2014. In IOGP Guidance Note 25, point motion operations are
either a velocity vector applied in the geocentric or geographic domain:
https://www.epsg-registry.org/export.htm?gml=urn:ogc:def:method:EPSG::1064
https://www.epsg-registry.org/export.htm?gml=urn:ogc:def:method:EPSG::1067
or a grid:
https://www.epsg-registry.org/export.htm?gml=urn:ogc:def:method:EPSG::1070
But indeed there's perhaps a lack of a way to express rotational terms in a compact
way, as Helmert allows.

> It is not set in stone that the inverse of an operation should be able to be
> created  but it definitely makes life simpler.

Indeed, I use and abuse that a lot in the createOperations() code!

> It is a principal decision
> to make if we want break the possibility of creating the inverse of a
> temporal Helmert operation.

As said above, I think the issue would affect only dedicated point motion
operations, not the 15-parameter Helmert, whose central epoch parameter shouldn't
affect coordinate epochs.
By the way: in 't_epoch', how should the t_ be interpreted: Time, Target,
cenTral :-) ?

--
Spatialys - Geospatial professional services
http://www.spatialys.com

Re: [EXTERNAL] [gdal-dev] Static/Dynamic datum problems

Kristian Evers-2


> On 25 Jun 2019, at 17:38, Even Rouault <[hidden email]> wrote:
>
>> I agree with that. We could simply adapt the same convention as used by
>> geodesists:
>>
>> cs2cs  <source> <destination>@<destination epoch>
>
> Makes sense (there would be the risk that people believe that the
> @destination epoch is part of the CRS designation, but
> http://docs.opengeospatial.org/as/18-005r4/18-005r4.html#68 indeed uses it.
>
>> How do you make the inverse transformation if you change the coordinate
>> time from 2018.0 to 2025.0?
>
> Good question ! Actually in my ISO related work of past months, I spotted this
> issue but left it a bit apart as there were so many other hotter topics to
> deal with.
>
> ISO-19111 identifies the optional sourceCoordinateEpoch and
> targetCoordinateEpoch to be properties of the CoordinateOperation class.
> See http://docs.opengeospatial.org/as/18-005r4/18-005r4.html#58
> In PROJ, we have a 4D X,Y,Z,time tuple instead, so there's a modelling gap
> here, which is probably the core reason for the issue you mention.
>
> If we have
> +proj=point_motion_geocentric +src_epoch= +tgt_epoch= +vx= +vy= +vz=
> then this can be reversible.
>
> You could have different modes:
> - t_epoch only: then you take into account the input T, and this is not
> reversible
> - s_epoch only: only possible to use in the reverse direction
> - both: and then in the forward direction, you must decide if you error
> out if the input T != src_epoch, or if you just override it with src_epoch.
> Similarly in the reverse direction.
>
> If you chain a point motion with a 15-parameter Helmert as in some example of  
> IOGP guidance note 25, then you need to change the coordinate epoch after the
> point motion operation (either by modifying the T in the coordinate tuple, or
> other internal means like filling the sourceCoordinateEpoch on the Helmert
> transformation which would then add a src_epoch parameter to the corresponding
> PROJ string ?), so that the 15-parameter Helmert operates on the new
> coordinate epoch.

This could work, yes. Of course my suggestion above also extends to the source
epoch when applicable:

cs2cs  <source>@<source epoch>  <destination>@<destination epoch>

>
>> For the particular PJ object that made that
>> transformation you will not be able to reverse it, since you specified
>> +t_epoch=2025.0 your inverse operation will now be a null transformation
>> moving the coordinate from 2025.0 to 2025.0. This is why I opted to not
>> change the timestamps when I implemented the temporal Helmert.
>
> Temporal Helmert nominally does not change the coordinate epoch, so that was
> a wise decision.

Well, that’s a first for me :-)

>
> My example was clearly a hack to try to emulate the lack of a point motion
> operation for ITRF2014. In IOGP guidance note 25, point motion operations are
> either a velocity vector applied in the geocentric or geographic domains:
> https://www.epsg-registry.org/export.htm?gml=urn:ogc:def:method:EPSG::1064
> https://www.epsg-registry.org/export.htm?gml=urn:ogc:def:method:EPSG::1067
> or with a grid:
> https://www.epsg-registry.org/export.htm?gml=urn:ogc:def:method:EPSG::1070
> But indeed there's perhaps a lack of expressing rotational terms in a compact
> way like Helmert allows to do.
>
>> It is not set in stone that the inverse of an operation should be able to be
>> created  but it definitely makes life simpler.
>
> Indeed, I use and abuse of that a lot in the createOperations() code !
>
>> It is a principal decision
>> to make if we want break the possibility of creating the inverse of a
>> temporal Helmert operation.
>
> As said above, I think the issue would affect only dedicated point motion
> operations, not 15-parameter Helmert whose central epoch parameter shouldn't
> affect coordinate epochs.
> By the way: in 't_epoch', how should the t_ be interpreted: Time , Target,
> cenTral :-) ?
>

Time. In hindsight I think +t_0 would have been a better parameter name. When
I first wrote this code I thought it would be necessary to include more +t_
parameters, for example the +t_obs that was used in the deformation operation initially.
You live and learn, I guess.

> --
> Spatialys - Geospatial professional services
> http://www.spatialys.com
