(moved to grass5)
On Wed, Jun 08, 2005 at 02:47:48PM +1200, Hamish wrote:
> > I would like to query more than 150 raster layers with r.what, to
> > extract the time evolution of a variable at a certain point of a map
> > (each layer is for a different time step), but I receive the
> > error:
> >
> > r.what: can only do up to 150 cell files, sorry.
> >
> > The call is:
> >
> > r.what input=$RASTERMAPS east_north=$myeast, $mynorth
> >
> > where $RASTERMAPS is the list of my files.
> >
> > How can I overcome this problem?
..
> Not sure if it's the best way but maybe a bash script will do the
> trick
>
> for i in <list of raster maps>
> do
>     r.what input=$i east_north=$myeast,$mynorth >> temp
> done
>
> This will (hopefully) append the values to the temp file. Just not
> entirely sure if the redirect (>>) should be inside the loop or after
> the done statement. Can't test it right now.

Or do less than 150 raster maps at a time and cat the resulting
pieces together...

grass6/raster/r.patch/nfiles.h says:
/* The number of cell files that can be patched together.
*
* All cell files will be opened at one time, so this number can not
* be arbitrarily large.
*
* Must be smaller than MAXFILES as defined in lib/gis/G.h which
* in turn must be smaller than the operating system's limit.
* (Given by `cat /proc/sys/fs/file-max` in Linux 2.4)
 */

(same issue for r.series, r.patch, r.what, ...)
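The "less than 150 maps at a time" workaround can be scripted. Here is a minimal sketch of the chunking logic only: the map list, coordinates, and the chunk size of 100 are made-up values, and `echo` stands in for the real `r.what` call so the script runs outside of GRASS.

```shell
#!/bin/sh
# Sketch: query a long map list in chunks below the NFILES=150 limit,
# then concatenate the partial results into one file.
# `echo` is a stand-in; in real use, uncomment the r.what line instead.

myeast=634000       # assumed example coordinates
mynorth=223000

# dummy comma-separated list of 350 map names (stand-in for $RASTERMAPS)
RASTERMAPS=$(seq -f "map%g" 1 350 | paste -sd, -)

rm -f temp
# split the comma-separated list into chunks of at most 100 names each
echo "$RASTERMAPS" | tr ',' '\n' | xargs -n 100 | tr ' ' ',' |
while read chunk; do
    # real use: r.what input=$chunk east_north=$myeast,$mynorth >> temp
    echo "r.what input=$chunk east_north=$myeast,$mynorth" >> temp
done

wc -l < temp    # number of r.what invocations needed for 350 maps
```

Appending with `>>` inside the loop (as in the script above) and redirecting once with `> temp` after `done` both work here; the append form just requires removing any stale `temp` file first.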
Wouldn't it be better to define a single value in gis.h
for all related r.* modules?
The first line of grass6/raster/r.what/main.c is:
#define NFILES 150

You can make that (a bit less than) MAXFILES in lib/gis/G.h
(currently 256) without too much worry.

Beyond that you'll need to change MAXFILES too. Probably 1024 files
is OK; my system says 200k is file-max. But it depends on your system.

Maybe I'll clean these up to reference G.h. The number can't be
'arbitrarily large' due to resource constraints. Search the mailing
lists for emails from Glynn on the subject for a better explanation.

Hamish
Markus
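For reference, the operating-system limits discussed above can be inspected like this (a sketch; the /proc path is the Linux-specific one quoted from nfiles.h, and is guarded because it does not exist on other systems):

```shell
#!/bin/sh
# Per-process limit on open file descriptors
# (the one MAXFILES ultimately has to stay under):
ulimit -n

# System-wide limit on Linux (as noted in nfiles.h for Linux 2.4);
# not available on non-Linux systems:
cat /proc/sys/fs/file-max 2>/dev/null || echo "file-max not available here"
```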