From the r.series manual I understood that the *file* option allows one to analyze large amounts of raster maps without hitting the open files limit and the size limit of command line arguments. Yet, when running r.series with an input file listing 994 raster layers, I still get an error about "too many open files":
GRASS 7.1.svn (Data_latlon):~ > r.series --overwrite file=a.txt output=Spec_rich method=count range=1,2
WARNING: G__open(read): Unable to open
'/media/HD2/Data/GRASSdb/Data_latlon/VECEA_species/cellhd/maxent_v1_Hyparrhenia_rufa_rcp45_2055_ens_mean_wc60s_presabs':
Too many open files
The mentioned layer is the 551st layer out of a list of 996 layers.
Did I misunderstand this 'file' option?
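For context, the file given to file= lists one raster map name per line (optionally followed by a weight, separated by |). A minimal sketch of building such a list with g.list, assuming a hypothetical name pattern:

# Write all matching raster map names, one per line, to a.txt
g.list type=raster pattern="maxent_v1_*" output=a.txt
# Then pass the list to r.series via the file option
r.series --overwrite file=a.txt output=Spec_rich method=count range=1,2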
Try to use the -z flag. I think the manual is wrong:

"Use the *file* option to analyze large amounts of raster maps without hitting the open files limit and the size limit of command line arguments. The computation is slower than the *input* option method. For every single row in the output map(s), all input maps are opened and closed."
Anna
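Concretely, the suggested fix is to re-run the failing command with the -z flag added and everything else unchanged:

r.series -z --overwrite file=a.txt output=Spec_rich method=count range=1,2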
The -z flag works perfectly, thanks. I thought I would provide a patch with an update of the manual page, but to do so, I need to understand it correctly.

Do I understand correctly that the -z flag is there to avoid the open files limit, while the file option is there to avoid the command line size limit?

If that is true, to which of the two (or to both) does the following still apply:
"The computation is slower as for every single row in the output map(s) all input maps are opened and closed. The amount of RAM will rise linearly with the number of specified input maps."
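As a point of reference, the two limits in question can be inspected from the shell on Linux; these are standard system commands, not GRASS-specific:

# Per-process limit on simultaneously open file descriptors (what -z works around)
ulimit -n
# Upper bound on the combined size of command line arguments (what file= works around)
getconf ARG_MAX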