[Liblas-devel] Fast way to read compressed LAZ files


[Liblas-devel] Fast way to read compressed LAZ files

Alexander Biddulph
I am currently using liblas (with LASzip) to read compressed LAZ files.

My current workflow is to open the LAZ file, read in each point sequentially and place each point into a cell in a regular grid, then find the centroid of each cell.
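A minimal sketch of this sequential read-and-grid workflow, assuming the libLAS C++ Reader API; the file name, cell size, and std::map-based grid are illustrative placeholders:

    // Sketch only: read a LAZ file sequentially with libLAS (LASzip is handled
    // transparently by the reader), bin each point into a regular XY grid, and
    // accumulate per-cell centroids. Cell size and file name are assumptions.
    #include <liblas/liblas.hpp>
    #include <cmath>
    #include <fstream>
    #include <iostream>
    #include <map>
    #include <utility>

    struct Acc { double x, y, z; std::size_t n; Acc() : x(0), y(0), z(0), n(0) {} };

    int main()
    {
        std::ifstream ifs("input.laz", std::ios::in | std::ios::binary);
        liblas::ReaderFactory factory;
        liblas::Reader reader = factory.CreateWithStream(ifs);

        const double cell = 1.0;  // grid resolution in file units (placeholder)
        std::map<std::pair<long, long>, Acc> grid;

        while (reader.ReadNextPoint())  // one-by-one sequential decompression
        {
            liblas::Point const& p = reader.GetPoint();
            long ix = static_cast<long>(std::floor(p.GetX() / cell));
            long iy = static_cast<long>(std::floor(p.GetY() / cell));
            Acc& a = grid[std::make_pair(ix, iy)];
            a.x += p.GetX(); a.y += p.GetY(); a.z += p.GetZ(); ++a.n;
        }

        for (std::map<std::pair<long, long>, Acc>::const_iterator it = grid.begin();
             it != grid.end(); ++it)  // per-cell centroid
            std::cout << it->second.x / it->second.n << " "
                      << it->second.y / it->second.n << " "
                      << it->second.z / it->second.n << "\n";
        return 0;
    }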

However, this process is currently taking an extraordinarily long time, so I was wondering if there is some way that I could speed it up.

I was initially thinking about extracting points from the file using multiple threads (via OpenMP). However, based on the small tutorials provided, it is my understanding that a thread randomly accessing the Nth point in the file would have to decompress the first N points. I then considered decompressing the entire file first and randomly accessing the points in the decompressed file; however, I am unable to tell whether this would actually provide much of a performance increase over just reading sequentially from the compressed file.

Can anyone suggest a good way of speeding this process up?

Thanks,
Bidski

_______________________________________________
Liblas-devel mailing list
[hidden email]
http://lists.osgeo.org/mailman/listinfo/liblas-devel

Re: [Liblas-devel] Fast way to read compressed LAZ files

isenburg
Hello,

By default, LASzip compresses points in independent chunks of 50000. Hence LASzip decompression can be trivially parallelized by spawning multiple threads that each seek to a different multiple of 50000 and then decompress non-overlapping chunks of multiples of 50000 points.

Or, formulated otherwise: seeking to and decompressing a random point in the file will incur a decompression overhead of 0 to 49999 points, depending on where your target point falls within its chunk of 50000 points.
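A minimal sketch of that chunk-parallel scheme, assuming the libLAS C++ API (ReaderFactory, Reader::Seek, ReadNextPoint), OpenMP, and the default chunk size of 50000; the file name is a placeholder and the per-chunk work is left as a comment:

    // Sketch only: decompress non-overlapping chunks of 50000 points in parallel.
    // Each thread opens its own stream/reader (readers are not shared between
    // threads) and seeks to a multiple of the assumed default chunk size.
    #include <liblas/liblas.hpp>
    #include <algorithm>
    #include <cstddef>
    #include <fstream>

    int main()
    {
        const char* path = "input.laz";        // placeholder file name
        const std::size_t chunk = 50000;       // default LASzip chunk size

        std::size_t total = 0;
        {
            std::ifstream ifs(path, std::ios::in | std::ios::binary);
            liblas::ReaderFactory f;
            liblas::Reader r = f.CreateWithStream(ifs);
            total = r.GetHeader().GetPointRecordsCount();
        }
        const long nchunks = static_cast<long>((total + chunk - 1) / chunk);

        #pragma omp parallel for schedule(dynamic)
        for (long c = 0; c < nchunks; ++c)
        {
            std::ifstream ifs(path, std::ios::in | std::ios::binary);
            liblas::ReaderFactory f;
            liblas::Reader reader = f.CreateWithStream(ifs);

            const std::size_t begin = static_cast<std::size_t>(c) * chunk;
            const std::size_t end = std::min(begin + chunk, total);

            // Seeking to a multiple of 50000 should land on a chunk boundary,
            // so no preceding points need to be decompressed.
            reader.Seek(begin);
            for (std::size_t i = begin; i < end && reader.ReadNextPoint(); ++i)
            {
                liblas::Point const& p = reader.GetPoint();
                // ... bin p into a per-thread grid here and merge the grids
                //     after the parallel loop ...
                (void)p;
            }
        }
        return 0;
    }

With per-thread grids merged after the loop, the gridding and centroid step from the original question could be parallelized the same way.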

Regards.

Martin @rapidlasso



_______________________________________________
Liblas-devel mailing list
[hidden email]
http://lists.osgeo.org/mailman/listinfo/liblas-devel