Huge Dataset Upload


Huge Dataset Upload

Ana Silva
Hi everyone, I have a question. In the project I'm working on, we deal with a gigantic dataset: individual raster files larger than 18 GB each (in the best case), and a total dataset size of more than 3 TB.
I'm having a problem uploading this data: the uploader always shows me an "entity too large" message when I try to upload some of these files.
I'm using GeoNode 2.10.x, installed via Docker.
Would anyone have some kind of solution to this problem?

_______________________________________________
geonode-users mailing list
[hidden email]
https://lists.osgeo.org/mailman/listinfo/geonode-users

Re: Huge Dataset Upload

xbartolone
Hi Ana,

I've never handled files that big, but I would opt for making them available on the internal filesystem of the geoserver container, loading them from the GeoServer GUI into the geonode workspace, and finally running the updatelayers command [1] in the django container.

I don't know if Alessio, Paolo, or somebody else can add something more appropriate.

Hope this helps.

Francesco

[1] http://docs.geonode.org/en/master/tutorials/admin/admin_mgmt_commands/#updatelayers
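
For illustration, a rough sketch of those steps on a docker-compose deployment. The service/container names (geoserver, django), the data-directory path, and the file names are assumptions and will differ between setups:

    # Assumption: the GeoServer data directory is mounted inside the geoserver
    # container at /geoserver_data/data and the Django service is called "django".

    # 1. Copy the raster onto the GeoServer container's filesystem
    docker cp huge_raster.tif geoserver:/geoserver_data/data/huge_raster.tif

    # 2. Publish it as a layer in the "geonode" workspace via the GeoServer GUI

    # 3. Sync the newly published layer into GeoNode
    docker-compose exec django python manage.py updatelayers

(updatelayers accepts options to limit which layers get synced; see the admin documentation linked above.)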


Re: Huge Dataset Upload

toni_schoenbuchner
Hi Ana,

What Francesco suggests is exactly the way I would choose.

In addition, before that I would check whether lossy compression could be an option for your imagery; gdal_translate would be the tool I would choose. You can find a useful-looking post here:


Cheers,

Toni
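
For illustration, a hedged sketch of the kind of gdal_translate call meant here, in the style of the usual GeoTIFF compression advice. The file names, JPEG quality, and block sizes are assumptions, and JPEG/YCbCr compression only suits 8-bit RGB imagery:

    # Assumption: input.tif is an 8-bit RGB GeoTIFF.
    # Lossy JPEG-in-TIFF compression with internal tiling
    gdal_translate -of GTiff \
      -co COMPRESS=JPEG -co JPEG_QUALITY=85 -co PHOTOMETRIC=YCBCR \
      -co TILED=YES -co BLOCKXSIZE=512 -co BLOCKYSIZE=512 \
      input.tif output_compressed.tif

    # Add JPEG-compressed overviews so zoomed-out requests stay fast
    gdaladdo --config COMPRESS_OVERVIEW JPEG --config PHOTOMETRIC_OVERVIEW YCBCR \
      -r average output_compressed.tif 2 4 8 16 32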


Re: Huge Dataset Upload

Tweedle
Echoing Francesco and Toni, I have successfully deployed 100 GB+ datasets on GeoServer and then used updatelayers to make them available via GeoNode. The "GeoTIFF compression for dummies" article is good and I've tried to follow its advice, but I have to caution that running gdal_retile with those compression settings can take weeks to process your data, depending on the specifications of your machine. The issue is that it is single-threaded and image processing is all CPU. gdal2tiles is theoretically capable of multi-core processing, and the developer of that tool has also created MapTiler (https://www.maptiler.com), which is advertised as being multi-core.
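
For reference, a hedged sketch of the sort of gdal_retile.py run described above. The tile size, resampling method, pyramid levels, and directory names are assumptions, not settings taken from the article:

    # Assumption: compressed source GeoTIFFs live in ./rasters, output goes to ./mosaic.
    # Builds a tiled, pyramided mosaic with JPEG-compressed tiles
    # (single-threaded, so it can run for a very long time on large inputs)
    mkdir -p mosaic
    gdal_retile.py -v -r bilinear -levels 4 -ps 2048 2048 \
      -co "TILED=YES" -co "COMPRESS=JPEG" -co "PHOTOMETRIC=YCBCR" \
      -targetDir mosaic rasters/*.tif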


Re: Huge Dataset Upload

toni_schoenbuchner
+1

Thanks Michael for this addition!


