[pdal] PDAL Python3 issues


[pdal] PDAL Python3 issues

jfprieur
Hello, I am following the documentation on


in order to test simply opening a LAS file with PDAL, as shown in the example, on a fresh vanilla Debian stretch install.

I think there is a small typo in the line

pipeline = pdal.Pipeline(pipeline)
should be
pipeline = pdal.Pipeline(json)

When I try to execute the script, I get the following errors

>>> pipeline = pdal.Pipeline(json)
>>> pipeline.validate()
Warning 1: Cannot find pcs.csv
True
>>> pipeline.loglevel = 9
>>> count = pipeline.execute()
>>> arrays = pipeline.arrays
RuntimeError: _ARRAY_API is not PyCObject object
Segmentation fault
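For anyone following along, here is a minimal, self-contained sketch of the corrected snippet. The filename `input.las` is a placeholder, and the pdal calls themselves are commented out since they only run where the python-pdal extension is importable:

```python
import json

# Hypothetical input file; substitute your own LAS tile.
pipeline_json = json.dumps({
    "pipeline": ["input.las"]
})

# The execution steps from the docs, usable only with the PDAL
# Python extension installed:
# import pdal
# pipeline = pdal.Pipeline(pipeline_json)
# pipeline.validate()
# pipeline.loglevel = 9
# count = pipeline.execute()
# arrays = pipeline.arrays

print(pipeline_json)
```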


For the first warning, I have my GDAL_DATA path set and the pcs.csv file is there
$ sudo find  / -name pcs.csv -type f
/usr/share/gdal/2.1/pcs.csv

$ echo $GDAL_DATA
/usr/share/gdal/2.1/

I have installed gdal, pdal, python3-gdal, python3-numpy, and python3-pdal, so I am not too sure why the arrays call fails.

Any help is appreciated; we are trying to replace libLAS because we have memory-usage problems with it. When we read multiple LAS files (opening and closing thousands of them), the memory eventually runs out, even with a close() statement. This happens on both Windows and Linux (we thought it was a Windows DLL problem at first). We need to solve this with PDAL, and I am pretty close ;)

Thanks for any help
JF Prieur

_______________________________________________
pdal mailing list
[hidden email]
https://lists.osgeo.org/mailman/listinfo/pdal

Re: [pdal] PDAL Python3 issues

Howard Butler-3

> On Jan 20, 2017, at 4:27 PM, Jean-Francois Prieur <[hidden email]> wrote:
>
>
> I think there is a small typo in the line
>
> pipeline = pdal.Pipeline(pipeline)
> should be
> pipeline = pdal.Pipeline(json)

Filed. https://github.com/PDAL/PDAL/issues/1476

>
> When I try to execute the script, I get the following errors
>
> >>> pipeline = pdal.Pipeline(json)
> >>> pipeline.validate()
> Warning 1: Cannot find pcs.csv
> True
> >>> pipeline.loglevel = 9
> >>> count = pipeline.execute()
> >>> arrays = pipeline.arrays
> RuntimeError: _ARRAY_API is not PyCObject object
> Segmentation fault

Hmm. I have tested the Python extension on both Python 2 and Python 3, and the Python extensions are built and tested as part of the Travis continuous integration runs [1]. I'm a bit stumped by this particular issue; I have never seen behavior like this before. Some wild guesses: there is a mix-up between the NumPy headers and the actually installed NumPy version, or a Python 3.x runtime is being used with a NumPy compiled against 2.x.


[1] https://travis-ci.org/PDAL/PDAL/jobs/193471435#L3786

> For the first warning, I have my GDAL_DATA path set and the pcs.csv file is there
> $ sudo find  / -name pcs.csv -type f
> /usr/share/gdal/2.1/pcs.csv
>
> $ echo $GDAL_DATA
> /usr/share/gdal/2.1/

Can you set CPL_DEBUG=ON and PROJ_DEBUG=ON in your environment before running?
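(If it is easier, those flags can also be set from inside the Python process, before pdal/gdal are imported; a small sketch:)

```python
import os

# Turn on verbose GDAL (CPL) and PROJ diagnostics. These must be
# set before the libraries are loaded to take effect.
os.environ["CPL_DEBUG"] = "ON"
os.environ["PROJ_DEBUG"] = "ON"

print(os.environ["CPL_DEBUG"], os.environ["PROJ_DEBUG"])
```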

> I have installed gdal, pdal, python3-gdal, python3-numpy, python3-pdal so not too sure why the arrays command fails.

Is there a python3-pdal package now?

> Any help is appreciated, trying to replace liblas as we have memory usage problems with it. When we read multiple LAS files (open and close thousands of LAS files) with liblas the memory just runs out eventually, even with a close() statement. Happens on both windows and linux (thought it was a windows dll problem perhaps). Need to solve this with PDAL and am pretty close ;)

A description of your workflow might also help. The Python extension is really about making it convenient for people to access the point data of a particular PDAL-readable file. A common workflow we use is to build up a pipeline in Python or JavaScript and then push it off to `pdal pipeline` for execution (with some kind of process task-queuing engine). Reading lots of data into the Python process is likely to be fraught.
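A rough sketch of that pattern, with the pipeline built in Python and execution handed to the `pdal` CLI (the filenames are illustrative, and the subprocess call is commented out since it requires pdal on the PATH):

```python
import json
import subprocess
import tempfile

# Build the pipeline document in Python: a reader, then a writer.
pipeline = {"pipeline": ["input.las", "output.las"]}

with tempfile.NamedTemporaryFile(
    "w", suffix=".json", delete=False
) as f:
    json.dump(pipeline, f)
    path = f.name

# Push execution off to the pdal CLI rather than reading the
# point data into this Python process:
# subprocess.run(["pdal", "pipeline", path], check=True)
```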

Howard


Re: [pdal] PDAL Python3 issues

jfprieur
Thanks again for your always fast replies.

For the Python debugging, I will take note of your comments (setting the environment flags) and try again a bit later; it is entirely possible I borked the setup.

Yes, there is a python3-pdal package on stretch (testing). I believe stretch is due for release in the next few weeks.


As for our workflow, pull up a chair ;) Apologies if this is long-winded, want to provide proper context.

We are a research lab working in precision forestry, trying to get our scientific solutions to scale up to operational levels. We use airborne laser scanner (ALS) data to perform individual tree crown (ITC) segmentation. We then calculate lidar features (height percentiles, first-order statistics, as well as crown kurtosis, shape, etc.) on each individual tree, and run balanced random forest classifications to determine the species (about 15 commercially important ones). There are about 30 lidar features calculated per tree crown.

This is a project that a master's student has worked on for the past 2 years for an industrial partner, on their 700-square-mile forest plantation. She has done an amazing job while learning a lot of things at the same time. Windows was used because it is the environment our department is comfortable with (due to ties with Esri, we need to run Arc and Office) and it let us produce results quickly. For my own PhD studies I am using Linux for my work (with a Windows VM ;) ) and will eventually port everything to that.

When you are dealing with smaller areas (as most scientific studies do), the number of crowns processed is not an issue, as you are usually dealing with fewer than a few thousand crowns.

In the case of this production forest, we have 2200 LAS tiles, 1 km × 1 km each. Each tile can have between 20,000 and 100,000 individual tree crowns, and those 30 lidar features need to be calculated for each one. libLAS runs into the aforementioned memory issues around 40,000 crowns.

When the student started (almost 2 years ago), we used OSGeo4W open-source tools for development. The initial workflow was awesome: read each file with PDAL, use pgwriter to send it to Postgres, and calculate all the metrics in the database. It worked like a charm until pgwriter disappeared from the OSGeo4W version of PDAL (we completely understand how this can happen, this is not a complaint!), so this production chain was broken. Neither of us had the time (at the time) to figure out how to install everything on Linux, so she decided to press forward using Python. The end product is still in Postgres; it is the initial 'reading the LAS file' part, which pgwriter performed flawlessly, that is causing issues now.

A Python 3 script using libLAS opens the LAS tile, runs through each crown to find the points associated with it, and stores the result as a LAS file. The issue is that an individual LAS file is created for each tree crown; when we have more than 40,000 crowns per tile the system starts swapping (Windows and Linux) and the process gets very slow. Another script then reads the LAS points and calculates the metrics, which are stored in the database. This 'clipping' operation for the tree crowns only happens once at the beginning, so that is not the problem; the problem is that it would currently take a month using libLAS, which is not acceptable.

So all I am looking for ;) is a Linux Python library that can write up to 100,000 'mini-LAS' tree crowns from a LAS tile without running out of memory like libLAS does. I believe PDAL could do that quite simply via Python, hence my attempts. I know that laspy exists, but it is only for Python 2.
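For what it's worth, the per-crown clipping described above could plausibly be expressed as one PDAL pipeline per crown using `filters.crop`; a hedged sketch (the polygon, filenames, and helper function are made up, and the option names should be double-checked against the filters.crop documentation):

```python
import json

def crown_pipeline(tile, crown_wkt, out_path):
    """Build a pipeline that clips one crown polygon from a tile."""
    return json.dumps({
        "pipeline": [
            tile,
            {"type": "filters.crop", "polygon": crown_wkt},
            {"type": "writers.las", "filename": out_path},
        ]
    })

# Illustrative square "crown"; real polygons would come from the
# ITC segmentation results stored in Postgres.
p = crown_pipeline(
    "tile_0001.las",
    "POLYGON ((0 0, 0 5, 5 5, 5 0, 0 0))",
    "crown_0001.las",
)
```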

Thanks for any insights the list may have, keeping in mind that we are relative programming-noob scientists who don't mind working and reading!
Sorry for the book!
JF Prieur






Re: [pdal] PDAL Python3 issues

Sebastiaan Couwenberg
In reply to this post by Howard Butler-3
On 01/23/2017 03:29 PM, Howard Butler wrote:
>> I have installed gdal, pdal, python3-gdal, python3-numpy, python3-pdal so not too sure why the arrays command fails.
>
> Is there a python3-pdal package now?

It has been there since I first packaged PDAL for Debian.

Unfortunately libpdal-plang is only built with Python 2, which may be
the source of this issue. I patched out the version check to keep
building both versions when that check was added, because I wouldn't
like to have to build PDAL for every Python version (which can be more
than two when a transition to a new Python 3 version is in progress).

If python-pdal can only work with the same Python version as
libpdal-plang was built with, we'll need to drop the python3-pdal
package or add a libpdal-plang3 for it.

Kind Regards,

Bas

--
 GPG Key ID: 4096R/6750F10AE88D4AF1
Fingerprint: 8182 DE41 7056 408D 6146  50D1 6750 F10A E88D 4AF1

Re: [pdal] PDAL Python3 issues

Sebastiaan Couwenberg
In reply to this post by jfprieur
On 01/23/2017 06:31 PM, Jean-Francois Prieur wrote:
> Yes, there is a python3-pdal package on stretch(testing). I believe stretch
> is due for release in the next few weeks.

The full freeze in preparation for the stretch release is in a few
weeks (February 5th, the geospatial devroom day @ FOSDEM), see:

 https://release.debian.org/#release-dates

Because of the ongoing transition to OpenSSL 1.1.0 amongst others, it
will take some time until the actual release. My expectation is Q3 2017,
but it may very well be later or earlier.

Because of the current soft freeze with 10-day migration delays, changes
for inclusion in stretch need to be uploaded before this Thursday.

So if I need to drop the python3-pdal package, I need to make that
change ASAP.

Kind Regards,

Bas

--
 GPG Key ID: 4096R/6750F10AE88D4AF1
Fingerprint: 8182 DE41 7056 408D 6146  50D1 6750 F10A E88D 4AF1

Re: [pdal] PDAL Python3 issues

Howard Butler-3
In reply to this post by Sebastiaan Couwenberg

> On Jan 23, 2017, at 11:53 AM, Sebastiaan Couwenberg <[hidden email]> wrote:
>
> If python-pdal can only work with the same Python version as
> libpdal-plang was built with, we'll need to drop the python3-pdal
> package or add a libpdal-plang3 for it.

This is a true statement. plang and the PDAL Python extension should not mix Python runtimes. There is a check in setup.py that attempts to prevent it.

Howard

Re: [pdal] PDAL Python3 issues

Howard Butler-3

> On Jan 23, 2017, at 12:19 PM, Howard Butler <[hidden email]> wrote:
>
>
>> On Jan 23, 2017, at 11:53 AM, Sebastiaan Couwenberg <[hidden email]> wrote:
>>
>> If python-pdal can only work with the same Python version as
>> libpdal-plang was built with, we'll need to drop the python3-pdal
>> package or add a libpdal-plang3 for it.
>
> This is a true statement. plang and PDAL Python extension should not use mixed Python runtimes. There is a check in setup.py that attempts to prevent it.

Followup ticket by Bas https://github.com/PDAL/PDAL/issues/1478

Re: [pdal] PDAL Python3 issues

Sebastiaan Couwenberg
On 01/23/2017 07:41 PM, Howard Butler wrote:

>
>> On Jan 23, 2017, at 12:19 PM, Howard Butler <[hidden email]> wrote:
>>
>>
>>> On Jan 23, 2017, at 11:53 AM, Sebastiaan Couwenberg <[hidden email]> wrote:
>>>
>>> If python-pdal can only work with the same Python version as
>>> libpdal-plang was built with, we'll need to drop the python3-pdal
>>> package or add a libpdal-plang3 for it.
>>
>> This is a true statement. plang and PDAL Python extension should not use mixed Python runtimes. There is a check in setup.py that attempts to prevent it.
>
> Followup ticket by Bas https://github.com/PDAL/PDAL/issues/1478

Since building PDAL for both Python 2 and Python 3 is non-trivial, and
with the full freeze deadline approaching, I've updated the python-pdal
(source) package to no longer build the python3-pdal (binary) package.
The other way around (building both pdal & python-pdal with Python 3
only) was possible too, but based on the popcon scores the Python 2
version is much more popular.

Kind Regards,

Bas

--
 GPG Key ID: 4096R/6750F10AE88D4AF1
Fingerprint: 8182 DE41 7056 408D 6146  50D1 6750 F10A E88D 4AF1

Re: [pdal] PDAL Python3 issues

Howard Butler-3
In reply to this post by jfprieur

> On Jan 23, 2017, at 11:31 AM, Jean-Francois Prieur <[hidden email]> wrote:
>
> When the student started (almost 2 years ago), we used OSGeo4W open source tools for development. The initial workflow was awesome. Read each file with PDAL, use pgwriter to send it to postgres, calculate all the metrics in the database. Worked like a charm until pgwriter dissapeared from the osgeo4w version of PDAL (we completely understand how this can happen, this is not a complaint!) so this production chain was broken. We both did not have the time (at the time) to figure out how to install everything in linux so she decided to press forward using Python. The end product is still in Postgres, it is the initial 'reading the LAS file' part that pgwriter performed flawlessly that is causing issues now.

Well that's a bummer. Your use case is actually a good one for pgpointcloud, and you had a good workflow going. Sorry to break things for you :(

I was recently contacted by NRCan about paying to get a 1.8.1 OSGeo4W64 libLAS build together, but I have not heard anything back after I gave a quote. I think 1.8.0 definitely had a memory-management issue where it leaked file handles. IIRC it was cleaned up in 1.8.1, but IMO pgpointcloud, which you already had working, is the better solution here.

An alternative that might give you traction is to use Docker (http://www.pdal.io/quickstart.html). The PDAL Docker build is feature-complete with pgpointcloud support (and most other filters), and you could use it to get data in and out of your database by calling Docker commands on Windows. See the Quickstart (http://www.pdal.io/quickstart.html) for a teaser and the workshop materials (http://www.pdal.io/workshop/exercises/index.html) for in-depth Docker-on-Windows examples. Docker might require Windows 10 for smooth usage, however.

A better solution, of course, is pgpointcloud support in current OSGeo4W64 binaries for Windows. If you are willing to live dangerously, PDAL's continuous integration build, based off of OSGeo4W64, builds pgpointcloud-enabled binaries. It's just that you can only get a .zip file of the binaries, and you will need to do some %PATH% plumbing and other junk to get them to work with a current OSGeo4W64 environment. After every successful AppVeyor build, the zip file is placed at https://s3.amazonaws.com/pdal/osgeo4w/pdal.zip This means a constantly-changing but constantly up-to-date build is available. No promises.

A note for others watching PDAL's Windows situation: the problem is not getting builds done -- they're available via AppVeyor. The problem is smooth integration with OSGeo4W64 and a convenient packaging script to push releases to OSGeo4W64. I used to maintain this manually for libLAS, and it was awful. The first few OSGeo4W64 builds were the same. The task is an integration one, not so much a development one.

> A python 3 script using libLAS opens the LAS tile, runs through each crown to find the points associated to it and stores the result as a LAS file. The issue is that an individual LAS file is created for each tree crown, when we have more than 40,000 crowns per tile the system starts swapping (windows and linux) and the process just gets very slow. Then another script reads the las points, calculates metrics which are then stored in the database. This 'clipping' operation for the tree crowns only happens once at the beginning, it is not a problem. But it would take a month right now using libLAS which is not acceptable.
>
> So all I am looking for ;) is a linux python library that can write up tp 100,000 'mini-LAS' tree crowns from a las tile without running out of memory like libLAS does. Believe PDAL could do that quite simply via Python hence my attempts. I know that laspy exists but it is only for Python 2.

So you do indeed want to "touch the points"... but I think it would be best and cleanest to get back to pgpointcloud. You can get back there with Docker for i/o or try to bleed on the bleeding edge with the AppVeyor build and feather it into your OSGeo4W64 build.

> Thanks for any insights the list may have, keeping in mind we are relative programming noob scientists that don`t mind to work and read!

> Sorry for the book!

On the contrary, this kind of feedback lets us know how well (or not) PDAL is doing the job for people. As I've said before, we have a particular set of use cases we use PDAL for, and it is encouraging that people are finding other ways to make it useful. We want to remove obvious blockers that prevent it from being so. Windows builds and integration are a tough one, due to the fact that none of the PDAL developers work natively on that platform.

Thanks for the feedback!

Howard



Re: [pdal] PDAL Python3 issues

jfprieur
Thank you for the insights on how the sausage is made ;), I am not tied to Windows and am actively trying to get away from it!

Will try the Docker tools if we must follow the Windows route, thanks again. Will keep you posted on our progress. Keep focused on Linux ;) as I stated, I am removing Windows from my workflow as much as possible. You did not break anything, rest assured!

JF


Re: [pdal] PDAL Python3 issues

Jennifer Simeon
Hi Jean-Francois,

Don't know if it is still relevant for you after the expert replies, but there is a laspy version I've been using with Python 3.4. You can clone it from GitHub.


Best, J.




--
Jennifer SIMEON
Data Scientist
Responsable Développement Big Data 3D
-----------------------------------------------------
Geosat - Société de Géomètres-Experts

17 rue Thomas Edison  
33600 Pessac, France
Tél: +33 5 56 78 14 33 ext 5011




Re: [pdal] PDAL Python3 issues

jfprieur
Hi Jennifer, yes, I found that GitHub page yesterday. We are using 3.5 (of course) but I am going to give it a shot next week.

Thanks for the link!

On Tue, Jan 24, 2017, 03:46 Jennifer Simeon <[hidden email]> wrote:
Hi Jean-Francois,

Don't know if it is still relevant for you after the expert replies, but there exists a laspy version I've been using with Python 3.4.  You can clone it from GitHub.


Best, J.

On 23 January 2017 at 20:46, Jean-Francois Prieur <[hidden email]> wrote:
Thank you for the insights on how the sausage is made ;), I am not tied to Windows and am actively trying to get away from it!

Will try the docker tools if we must follow the windows route, thanks again. Will keep you posted on our progress. Keep focused on linux ;) as I stated I am removing windows from my workflow as much as possible. You did not break anything rest assured!

JF

On Mon, Jan 23, 2017 at 2:18 PM Howard Butler <[hidden email]> wrote:

> On Jan 23, 2017, at 11:31 AM, Jean-Francois Prieur <[hidden email]> wrote:
>
> When the student started (almost 2 years ago), we used OSGeo4W open source tools for development. The initial workflow was awesome. Read each file with PDAL, use pgwriter to send it to postgres, calculate all the metrics in the database. Worked like a charm until pgwriter disappeared from the osgeo4w version of PDAL (we completely understand how this can happen, this is not a complaint!) so this production chain was broken. We both did not have the time (at the time) to figure out how to install everything in linux so she decided to press forward using Python. The end product is still in Postgres, it is the initial 'reading the LAS file' part that pgwriter performed flawlessly that is causing issues now.

Well that's a bummer. Your use case is actually a good one for pgpointcloud, and you had a good workflow going. Sorry to break things for you :(

I was recently contacted by NRCan about them paying to get a 1.8.1 OSGeo4W64 libLAS build together, but I have not heard anything back after I gave a quote. I think 1.8.0 definitely had a memory management issue where it leaked file handles. IIRC, it was cleaned up in 1.8.1, but IMO pgpointcloud, which you already had working, is the better solution here.

An alternative that might give you traction is to use Docker: http://www.pdal.io/quickstart.html The PDAL docker build is feature-complete with pgpointcloud support (and most other filters), and you could use it to get data in/out of your database by calling docker commands on windows. See the Quickstart http://www.pdal.io/quickstart.html for a teaser and the Workshop materials http://www.pdal.io/workshop/exercises/index.html for in-depth docker-on-windows examples. Docker might require Windows 10 for smooth usage, however.
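The docker route can also be driven from a script. A sketch under assumptions: the `pdal/pdal` image name follows the quickstart, while the mount path and filename are hypothetical placeholders; the command is only assembled here, not executed.

```python
# Sketch of invoking PDAL through Docker from a script.
# Assumptions: image name "pdal/pdal" (per the quickstart); the host data
# directory and LAS filename are hypothetical placeholders.
data_dir = "/data"  # host directory holding the LAS files

cmd = [
    "docker", "run", "--rm",
    "-v", f"{data_dir}:/work",   # mount host data into the container
    "pdal/pdal",
    "pdal", "info", "/work/tile_0001.las",
]

# To actually run it where Docker is installed:
#   import subprocess
#   subprocess.run(cmd, check=True)
print(" ".join(cmd))
```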

A better solution of course is pgpointcloud support and current OSGeo4W64 binaries for windows. If you are willing to live dangerously, PDAL's continuous integration build, based off of OSGeo4W64, builds pgpointcloud-enabled binaries. It's just that you can only get a .zip file of the binaries, and you will need to do some %PATH% plumbing and other junk to get them to work with a current OSGeo4W64 environment. After every successful AppVeyor build, the zip file is placed at https://s3.amazonaws.com/pdal/osgeo4w/pdal.zip so a constantly-changing but constantly up-to-date build is available. No promises.





Re: [pdal] PDAL Python3 issues

Albert Godfrind
Going back to the original issue:

> A python 3 script using libLAS opens the LAS tile, runs through each crown to find the points associated to it and stores the result as a LAS file. The issue is that an individual LAS file is created for each tree crown, when we have more than 40,000 crowns per tile the system starts swapping (windows and linux) and the process just gets very slow. Then another script reads the las points, calculates metrics which are then stored in the database. This 'clipping' operation for the tree crowns only happens once at the beginning, it is not a problem. But it would take a month right now using libLAS which is not acceptable.
>
> So all I am looking for ;) is a Linux Python library that can write up to 100,000 'mini-LAS' tree crowns from a LAS tile without running out of memory like libLAS does. Believe PDAL could do that quite simply via Python, hence my attempts. I know that laspy exists but it is only for Python 2

I assume you are writing out those 40,000 individual LAS files into the same directory? Very few file systems (actually I don't know of any) will gracefully handle that number of files in a single directory. Not to mention issues with access tools and shell expansion ("*" gets expanded into a massive command line). Maybe one thing to try is to build a directory hierarchy so that each level contains a reasonable number of sub-directories, with the files at the bottom. Something like:

- top level directory is called las_crowns
- it contains 10 directories, called 01 to 10.
- each of those contains 10 directories also called 01 to 10
- each of those contains 400 las files

So the full file spec of a random crown file is then like “./las_crowns/03/02/crown_nnnn.las” … Add more intermediate levels if the number of files to manage increases.

I assume you use some kind of meaningful naming convention for your files, so it should not be too difficult to expand it to include the sub-directory names. 

This may not actually solve the memory issue - but I think it is good general practice when dealing with large numbers of files.
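That layout can be captured in a small path helper. A sketch, assuming crowns are identified by an integer id; the bucketing (modulo on the id) and directory names are illustrative, not an exact reproduction of the 01-to-10 scheme above:

```python
from pathlib import Path

def crown_path(root: str, crown_id: int, fanout: int = 10) -> Path:
    """Two-level fan-out so no single directory holds tens of thousands
    of files: fanout * fanout leaf directories in total."""
    level1 = crown_id % fanout               # first directory level
    level2 = (crown_id // fanout) % fanout   # second directory level
    return (Path(root) / f"{level1:02d}" / f"{level2:02d}"
            / f"crown_{crown_id:05d}.las")

# Example: crown 1234 lands in las_crowns/04/03/
path = crown_path("las_crowns", 1234)
print(path.as_posix())  # las_crowns/04/03/crown_01234.las
```

Before writing each file, `path.parent.mkdir(parents=True, exist_ok=True)` creates the intermediate directories on first use.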

Albert


--
ORACLE
Albert Godfrind | Geospatial technologies | Tel: +33 4 93 00 80 67 | Mobile: +33 6 09 97 27 23 | Twitter: @agodfrin
Oracle Server Technologies
400 Av. Roumanille,
 BP 309  | 06906 Sophia Antipolis cedex | France
Everything you ever wanted to know about Oracle Spatial






Re: [pdal] PDAL Python3 issues

jfprieur
Thank you very much for the information, Albert. We are surely running into that problem with our current workflow. I will try the alternative you spell out and get back to the list with the results.

JF
