Hi,

Not specifically a GRASS question: how do I download in bulk a number
of Landsat tiles from GLCF?
For example: ftp://ftp.glcf.umiacs.umd.edu/glcf/Landsat/WRS2/p114/r052/p114r052_7x20010909.ETM-EarthSat-Orthorectified/
I want to download only bands 1-5 and 7, plus the metadata.
I need to do this for 52+ Landsat tiles.

cheers,
maning
--
|---------|----------------------------------------------------------|
| __.-._ |"Ohhh. Great warrior. Wars not make one great." -Yoda |
| '-._"7' |"Freedom is still the most radical idea of all" -N.Branden|
| /'.-c |Linux registered user #402901, http://counter.li.org/ |
| | /T |http://esambale.wikispaces.com|
| _)_/LI
|---------|----------------------------------------------------------|
I just read the ESDI warning:
http://glcf.umiacs.umd.edu/esdi2-help/expert.html
"Do not use these layers to automate a bulk download (over 50 scenes)
without contacting us first. Violations will result in the ban of your
IP address or domain."

Maybe ten at a time?

maning
On Fri, May 23, 2008 at 1:03 PM, maning sambale
<emmanuel.sambale@gmail.com> wrote:
> Hi,
> Not specifically a GRASS question: how do I download in bulk a number
> of Landsat tiles from GLCF?
> For example: ftp://ftp.glcf.umiacs.umd.edu/glcf/Landsat/WRS2/p114/r052/p114r052_7x20010909.ETM-EarthSat-Orthorectified/
> I want to download only bands 1-5 and 7, plus the metadata.
> I need to do this for 52+ Landsat tiles.
>
> cheers,
> maning
maning:
> How do I download in bulk a number of Landsat tiles from GLCF.
> for example:
> ftp://ftp.glcf.umiacs.umd.edu/glcf/Landsat/WRS2/p114/r052/p114r052_7x20010909.ETM-EarthSat-Orthorectified/
> I want to download only bands 1-5 and 7, plus the
> metadata.
>
> I need to do this for 52+ Landsat tiles.
A little wget shell script loop is given here:
http://grass.osgeo.org/wiki/MODIS#Download
Maybe you could adapt that?
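Adapting that wiki loop to one GLCF scene might look like the sketch below. Note the per-band file names (`_nn10.tif.gz` through `_nn70.tif.gz`) and the `.met` metadata name are guesses inferred from the example directory name, not verified against the server; check the actual FTP listing first. The script only builds the URL list, and the commented-out wget line would do the fetching.

```shell
#!/bin/sh
# Build a URL list for one GLCF scene: bands 1-5, 7, plus metadata.
# ASSUMPTION: the _nn<band>0.tif.gz and .met file names are unverified
# guesses -- inspect the FTP directory listing before using them.
BASE="ftp://ftp.glcf.umiacs.umd.edu/glcf/Landsat/WRS2"
PATHROW="p114/r052"
SCENE="p114r052_7x20010909.ETM-EarthSat-Orthorectified"
PREFIX="${SCENE%%.*}"            # strips the suffix: p114r052_7x20010909

: > urls.txt                     # start a fresh URL list
for band in 1 2 3 4 5 7; do
    echo "$BASE/$PATHROW/$SCENE/${PREFIX}_nn${band}0.tif.gz" >> urls.txt
done
echo "$BASE/$PATHROW/$SCENE/${PREFIX}.met" >> urls.txt

# Then fetch politely; -nc skips files already on disk, so the run
# can be safely restarted:
# wget -nc -w 30 -i urls.txt
```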
> Just read the esdi warning:
> http://glcf.umiacs.umd.edu/esdi2-help/expert.html
> "Do not use these layers to automate a bulk download
> (over 50 scenes) without contacting us first. Violations will
> result in the ban of your IP address or domain."
The idea is to not abuse their server, so that others are not denied access by a single greedy user hogging the system. I assume they are working with finite resources, and FTP usually only allows a small number of connections at once.
> Maybe ten at a time
Maybe ten a day, and put a "sleep <number of seconds>" after each call to wget to give someone else a chance to connect. The ban sounds automated, and thus strict. No idea how long it takes for their download log to cycle; it could be once a day or once a week. But a 10-minute break between batches is probably not enough.
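A batch-of-ten scheme with long sleeps could be sketched like this. The batch size, per-file wait, and hour-long gap are guesses at what counts as polite, not numbers from GLCF, and the stand-in `urls.txt` here is dummy data (in practice it would come from whatever list-building step you use):

```shell
#!/bin/sh
# Split a URL list into batches of ten; fetch one batch, pause long,
# then move to the next. All timing values are arbitrary guesses.

# Stand-in list for illustration (normally 52+ scenes x 7 files each):
seq 1 25 | sed 's|^|ftp://example.invalid/file|' > urls.txt

split -l 10 urls.txt batch_          # -> batch_aa, batch_ab, batch_ac
for b in batch_*; do
    echo "would fetch $b"
    # wget -nc -w 30 -i "$b"         # uncomment for the real run
    # sleep 3600                     # long pause before the next batch
done
```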
Or, as they request, drop them a friendly email saying you'd like to download 52+ images in a way that suits them.
Hamish
p.s. there are other solutions for them: http://geotorrent.org/
maning sambale <emmanuel.sambale@gmail.com> writes:
>> Not specifically a grass question. How do I download in bulk a
>> number of Landsat tiles from GLCF. for example:
>> ftp://ftp.glcf.umiacs.umd.edu/glcf/Landsat/WRS2/p114/r052/p114r052_7x20010909.ETM-EarthSat-Orthorectified/
>> I want to download only bands 1-5 and 7, plus the metadata.
>> I need to do this for 52+ Landsat tiles.
GNU Wget [1] retrieves URLs specified either on the command line
or in a plain ASCII file, which can easily be created either
with a simple program (in almost any programming language)
or with a text editor.
[1] http://www.gnu.org/software/wget/
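For the 52+ scene case, that ASCII file can itself be generated from a list of scenes. One possible shape is below; the `scenes.txt` layout is made up for illustration, and the band/metadata file-name pattern is an unverified guess, as noted earlier in the thread.

```shell
#!/bin/sh
# Expand a scene list into a flat URL file suitable for "wget -i".
# HYPOTHETICAL input format: one "path/row scene-directory" per line.
cat > scenes.txt <<'EOF'
p114/r052 p114r052_7x20010909.ETM-EarthSat-Orthorectified
EOF

BASE="ftp://ftp.glcf.umiacs.umd.edu/glcf/Landsat/WRS2"
: > urls.txt
while read pathrow scene; do
    prefix="${scene%%.*}"
    for band in 1 2 3 4 5 7; do
        echo "$BASE/$pathrow/$scene/${prefix}_nn${band}0.tif.gz" >> urls.txt
    done
    echo "$BASE/$pathrow/$scene/${prefix}.met" >> urls.txt   # name assumed
done < scenes.txt

# Hand the whole list to wget:
# wget -nc -i urls.txt
```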
> Just read the esdi warning:
> http://glcf.umiacs.umd.edu/esdi2-help/expert.html "Do not use these
> layers to automate a bulk download (over 50 scenes) without
> contacting us first. Violations will result in the ban of your IP
> address or domain."
> Maybe ten at a time
You may consider using the `-w' Wget option then:
--cut: (wget) Download Options--
`-w SECONDS'
`--wait=SECONDS'
Wait the specified number of seconds between the retrievals. Use
of this option is recommended, as it lightens the server load by
making the requests less frequent. Instead of in seconds, the
time can be specified in minutes using the `m' suffix, in hours
using `h' suffix, or in days using `d' suffix.
Specifying a large value for this option is useful if the network
or the destination host is down, so that Wget can wait long enough
to reasonably expect the network error to be fixed before the
retry. The waiting interval specified by this function is
influenced by `--random-wait', which see.
--cut: (wget) Download Options--
[...]