I have 4 million random tiles split into subdirectories of 5,000 tiles each.
The tiles are GeoTIFFs with dimensions of 1536 x 1536. Using GDAL, I ingested
the tiles into GeoServer the following way (the actual commands are sketched
after this list):
1. Create a virtual dataset (gdalbuildvrt -addalpha my-ds.vrt /tiles/1/*)
for each subdirectory. Note the "-addalpha" option, which ensures that the
gaps between the supplied tiles come out transparent.
2. List all the virtual datasets in a file (all-datasets.txt).
3. Use gdal_retile (gdal_retile.py ... --optfile=all-datasets.txt) to
generate an image pyramid.
4. Wait...
5. Create a store, ImagePyramid layer, etc. in GeoServer and view the tiles
via WMS.
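For reference, the whole pipeline looks roughly like the following. The tile
size, number of pyramid levels and target directory are placeholders here,
not the exact values I used:

    # Step 1: build one VRT per subdirectory of source tiles
    for d in /tiles/*/; do
        gdalbuildvrt -addalpha "my-ds-$(basename "$d").vrt" "$d"*.tif
    done

    # Step 2: list all the VRTs in a single option file
    ls my-ds-*.vrt > all-datasets.txt

    # Step 3: retile everything into an image pyramid for GeoServer
    gdal_retile.py -v -r near -levels 6 -ps 2048 2048 \
        -co "TILED=YES" -targetDir /pyramid --optfile all-datasets.txt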
At regular intervals, I will need to add more tiles to my existing dataset.
There may be a single tile, or another million. The tiles may also be of
variable sizes, e.g. 30000 x 30000.
My first question: can someone confirm whether this is the best way to ingest
such a large volume of tiles into GeoServer (in terms of efficiency,
scalability, etc.)?
My second question: how can I merge/append my new data to the existing
image pyramid without having to run gdal_retile over the whole dataset (new
and old) all over again? The initial set took days to ingest, and I want new
tiles to be added in (near) real-time as they become available so users can
see the results via WMS.
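To illustrate, the kind of incremental run I am hoping is possible would be
something like the sketch below. I am not sure whether gdal_retile can safely
write into an existing pyramid directory; /tiles/new, new-datasets.txt and
the other arguments are just placeholders:

    # Build a VRT only for the newly arrived tiles
    gdalbuildvrt -addalpha my-ds-new.vrt /tiles/new/*.tif
    ls my-ds-new.vrt > new-datasets.txt

    # Retile only the new data into the existing pyramid directory
    gdal_retile.py -v -r near -levels 6 -ps 2048 2048 \
        -co "TILED=YES" -targetDir /pyramid --optfile new-datasets.txt

Would that correctly merge with the tiles already in /pyramid, or would it
corrupt/overwrite them so that a full rebuild is unavoidable?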