I have two remarks that are probably best discussed at this stage of the project:
On 11/06/17 23:53, Zechariah Krautwurst wrote:
WEEK 02 REPORT: JUNE 05 - JUNE 09
GRASS GIS Locations from Public Data [1]
Zechariah Krautwurst
ACCOMPLISHED
- Reviewed Python scripting for GRASS documentation and existing GRASS
data import modules
- Wrote SRTM 30 download script using Python Elevation library and GRASS
output
IMHO it would be better if we avoided additional outside dependencies as much as possible. Is it really that hard to implement our own SRTM download routine?
- Researched USGS data download methods, REST APIs and HTTP protocol
- Reviewed The National Map (TNM) API documentation and tested
coordinate output methods from g.region as input to TNM API for SRTM ⅓
arc sec NED tiles
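The g.region-to-TNM step above could be sketched roughly like this; the endpoint and parameter names (`datasets`, `bbox`) are my reading of the public TNM Access API and should be verified against its current documentation:

```python
# Build a TNM Access API products query from GRASS region bounds.
# Endpoint and parameter names are assumptions to check against the TNM docs.
from urllib.parse import urlencode

TNM_PRODUCTS_URL = "https://tnmaccess.nationalmap.gov/api/v1/products"

def tnm_query_url(west, south, east, north,
                  dataset="National Elevation Dataset (NED) 1/3 arc-second"):
    """Compose a products query URL from lat/lon region bounds (degrees)."""
    params = {
        "datasets": dataset,
        "bbox": "{},{},{},{}".format(west, south, east, north),
        "outputFormat": "JSON",
    }
    return TNM_PRODUCTS_URL + "?" + urlencode(params)

# Inside a GRASS session the bounds would come from g.region, e.g.
#   import grass.script as gs
#   reg = gs.parse_command("g.region", flags="bg")  # lat/lon bounds
#   url = tnm_query_url(reg["ll_w"], reg["ll_s"], reg["ll_e"], reg["ll_n"])
print(tnm_query_url(-78.7, 35.7, -78.6, 35.8))
```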
I don't know if you have already made a list of target datasets, but I would plead for using global datasets, instead of USA-only datasets, whenever possible.
You can check the wiki page on global data sets for inspiration:
On 12 June 2017 at 09:48, Moritz Lennert <mlennert@club.worldonline.be> wrote:
Hi Zechariah,
Hi,
IMHO it would be better if we avoided additional outside
dependencies as much as possible. Is it really that hard to implement our
own SRTM download routine?
+1
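For what it's worth, a dependency-free routine doesn't need much code. A rough sketch using only the standard library; the 1x1 degree tile naming (e.g. N35W079.hgt) is the standard SRTM convention, while base_url is a placeholder, since the real source (USGS/NASA) still needs to be confirmed and may require authentication:

```python
# Minimal SRTM tile fetch sketch with no third-party dependencies.
# Tile naming follows the SRTM convention; base_url is a placeholder.
import math
import urllib.request

def srtm_tile_name(lat, lon):
    """Name of the 1x1 degree SRTM .hgt tile containing (lat, lon)."""
    ns = "N" if lat >= 0 else "S"
    ew = "E" if lon >= 0 else "W"
    return "{}{:02d}{}{:03d}.hgt".format(
        ns, abs(math.floor(lat)), ew, abs(math.floor(lon)))

def fetch_tile(lat, lon, base_url):
    """Download one zipped tile from an assumed mirror layout."""
    name = srtm_tile_name(lat, lon) + ".zip"
    urllib.request.urlretrieve(base_url + name, name)
    return name

print(srtm_tile_name(35.7, -78.6))  # tile covering Raleigh, NC
```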
- Researched USGS data download methods, REST APIs and HTTP protocol
- Reviewed The National Map (TNM) API documentation and tested
coordinate output methods from g.region as input to TNM API for SRTM ⅓
arc sec NED tiles
I don't know if you have already made a list of target datasets, but I would
plead for using global datasets, instead of USA-only datasets, whenever
possible.
+1 also here
You can check the wiki page on global data sets for inspiration:
These GDAL features might be relevant for importing external data.
In the blog post, Even demonstrates how to read data directly from web locations, even when the data are inside zip archives (I am not sure what the performance implications are, but it might help reduce the amount of local storage needed).
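As an illustration, GDAL's chained virtual file systems compose as path prefixes (/vsizip/ stacked on /vsicurl/); the URL below is just a placeholder:

```python
# Compose a GDAL virtual-filesystem path for remote (optionally zipped) data.
# /vsicurl/ streams over HTTP; /vsizip/ reads inside a zip archive.
def vsi_path(url, member=None):
    path = "/vsicurl/" + url
    if url.endswith(".zip"):
        path = "/vsizip/" + path          # chain: zip reader over HTTP reader
        if member:
            path += "/" + member          # file inside the archive
    return path

# With GDAL available, such a path can be opened directly, e.g.:
#   from osgeo import gdal
#   ds = gdal.Open(vsi_path("https://example.org/dem.zip", "dem.tif"))
print(vsi_path("https://example.org/dem.zip", "dem.tif"))
```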
More info on how to treat Sentinel in GRASS would be good.
Once you manage to get to Level 2A (from TOA to BOA), the rest is
similar to other imagery datasets (I think ESA was about to start
providing Level 2A data; I don't know if they have yet...).
MarkusN added support for Sentinel2 in i.atcorr, but I have not tested
that so far... Has anyone tested it?
Vero, didn't you write a tutorial on this ?
yes and no... MarkusN shared some info/scripts, and I further worked
on it to create a wiki, but did not finish it... I've been doing some
simple stuff with S2 recently that could be added... No time right
now, but definitely a TO-DO for FOSS4G-EU code sprint
More info on how to treat Sentinel in GRASS would be good.
Once you managed to get to Level 2A (from TOA to BOA), the rest is
similar to other imagery datasets (I think ESA was about to start
providing Level 2A data, dunno if they have…).
Yes, new scenes are (often? always?) now atmospherically corrected by ESA.
MarkusN added support for Sentinel2 in i.atcorr, but I have not tested
that, so far… Has anyone tested?
Sajid did to my knowledge.
Vero, didn’t you write a tutorial on this ?
yes and no… MarkusN shared some info/scripts, and I further worked
on it to create a wiki, but did not finish it… I’ve been doing some
simple stuff with S2 recently that could be added… No time right
now, but definitely a TO-DO for FOSS4G-EU code sprint
More info on how to treat Sentinel in GRASS would be good.
Once you managed to get to Level 2A (from TOA to BOA), the rest is
similar to other imagery datasets (I think ESA was about to start
providing Level 2A data, dunno if they have…).
Yes, new scenes are (often? always?) now atmospherically corrected by ESA.
I checked with the filter on our S2A maps system (maps.mundialis.de) and see that atmospherically corrected scenes are provided only for Europe, and only since May 2017. No idea if they will enlarge the supported area in the future.
Forgot to include GRASS-dev on this. Keep the good ideas coming!
On Mon, Jun 12, 2017 at 8:49 AM, Zechariah Krautwurst <zfkrautw@ncsu.edu> wrote:
Moritz,
Thank you for your feedback.
IMHO it would be better if we avoided additional outside dependencies as much as possible. Is it really that hard to implement our own SRTM download routine?
Agreed. I’m temporarily using the Elevation Python package to figure out how it indexes and clips SRTM tiles from user-supplied coordinate boundaries.
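The indexing part at least looks straightforward; here is a sketch of how tile selection from a bounding box can work (my own illustration, not the Elevation package's actual code):

```python
# Which 1x1 degree SRTM tiles intersect a user-supplied bounding box?
# My own illustration of the indexing step, not Elevation's actual code.
import math

def tiles_for_bbox(west, south, east, north):
    """List SRTM tile names covering the lat/lon bbox (degrees)."""
    names = []
    for lat in range(math.floor(south), math.ceil(north)):
        for lon in range(math.floor(west), math.ceil(east)):
            ns = "N" if lat >= 0 else "S"
            ew = "E" if lon >= 0 else "W"
            names.append("{}{:02d}{}{:03d}.hgt".format(
                ns, abs(lat), ew, abs(lon)))
    return names

print(tiles_for_bbox(-78.7, 35.7, -78.6, 35.8))
```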
I don’t know if you have already made a list of target datasets, but I would plead for using global datasets, instead of USA-only datasets, whenever possible.
Double agreed. The initial goal is to create the USGS pipeline as a proof of concept, then apply the framework to broader global datasets and APIs.
You can check the wiki page on global data sets for inspiration:
Also don’t hesitate to update this page if you come by any other interesting global datasets.
Definitely good to keep in mind. I’ve been keeping good documentation of my workflow and resources. I hope to spend some time updating the online GRASS documentation with that information as I go along.
Please let me know if you have any other thoughts!