Hamish wrote:
> > Rather than writing lots of separate modules, it might be worth
> > extending r.mapcalc to support operations on a neighbourhood window
> > (i.e. so that things that you can theoretically do in r.mapcalc using
> > expressions with O(W x H) terms would actually become practical).
>
> how would this differ from r.mfilter, or r.mfilter functionality merged
> into r.neighbors?
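(For concreteness, the kind of O(W x H) expression referred to above
is something like this 3x3 mean, written with r.mapcalc's existing
neighbourhood modifier; "in" and "mean3" are made-up map names:

    r.mapcalc "mean3 = (in[-1,-1] + in[-1,0] + in[-1,1] + \
                        in[0,-1]  + in[0,0]  + in[0,1]  + \
                        in[1,-1]  + in[1,0]  + in[1,1]) / 9.0"

That's tolerable for 3x3, but the expression grows with the window
area, which is what makes larger windows impractical.)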
With r.mfilter + r.neighbors, how do you combine the weight with the
cell value to get the value passed to the aggregate?
r.mfilter multiplies the value by the weight, then sums the results.
Combining this with r.neighbors would allow additional aggregates, but
they would still operate upon products.
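(As a reminder of the mechanics, an r.mfilter filter file looks
roughly like the following 3x3 weighted smoothing; the output cell is
sum(weight * value) / DIVISOR, so with these weights it amounts to a
weighted mean:

    TITLE 3x3 weighted smoothing
    MATRIX 3
    1 2 1
    2 4 2
    1 2 1
    DIVISOR 16
    TYPE P

A merged r.mfilter/r.neighbors would presumably feed those same
weight-value products to whichever aggregate was selected.)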
In the course of writing r.resamp.aggregate (or whatever it's
eventually called), it had occurred to me that it might be useful to
support weighted aggregates (for boundary cells, which are currently
assigned in their entirety to the cell in which their centre lies).
Most of the existing aggregates can be extended in an "obvious"
manner.
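(For instance, a weighted mean would presumably become
sum(w * v) / sum(w), and a weighted count sum(w), where w is the
fraction of each source cell lying inside the target cell -- that's my
reading of "obvious", not something the module currently does.)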
However, there are still situations where you might want to compute
the aggregate over some arbitrary function of the <weight,value>
pairs: e.g. aggregating over the original values of all cells whose
weight-value product falls in some range, applying some other
non-linear transformation to the product, or even using something
other than a simple product.
This isn't something which can (easily[1]) be done by combining
existing modules, or even with a combined r.mfilter/r.neighbors
module.
[1] You can do it by creating WxH intermediate maps, each
corresponding to a single neighbourhood cell. This isn't particularly
efficient for large neighbourhood windows.
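(Roughly, for a 3x3 window the intermediate-map approach amounts to
something like this, with made-up map names, then aggregating the
shifted maps with r.series or a final r.mapcalc expression:

    r.mapcalc "n1 = in[-1,-1]"
    r.mapcalc "n2 = in[-1,0]"
    r.mapcalc "n3 = in[-1,1]"
    r.mapcalc "n4 = in[0,-1]"
    r.mapcalc "n5 = in[0,0]"
    r.mapcalc "n6 = in[0,1]"
    r.mapcalc "n7 = in[1,-1]"
    r.mapcalc "n8 = in[1,0]"
    r.mapcalc "n9 = in[1,1]"
    r.series input=n1,n2,n3,n4,n5,n6,n7,n8,n9 output=out method=median

Nine maps for a 3x3 window is bearable; a 15x15 window means 225 of
them.)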
You could probably still get quite a way by replacing r.mfilter's
built-in sum aggregate with the ability to use the aggregates from the
stats library, along with a set of combining functions.
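(Hypothetically -- neither option exists today -- that might end up
looking like:

    # "method=" and "combine=" are invented options, not part of r.mfilter
    r.mfilter input=in output=out filter=weights.txt method=median combine=product

i.e. combine each <weight,value> pair with "product", then feed the
results to "median" from the stats library.)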
> Rather than clutter r.mapcalc, could r.mfilter be extended to take
> r.mapcalc expressions in the matrix def'n?
Not without copying most of r.mapcalc; or turning it into a library.
Actually moving most of the functions (x*.c) into a library would be
feasible, as there aren't that many dependencies on the r.mapcalc
framework. Some functions (e.g. xcoor.c, xrowcol.c) wouldn't be
applicable elsewhere, and a fair chunk (e.g. mapcalc.y, evaluate.c)
would need to be rewritten for each module.
Essentially, the changes required to r.mapcalc would involve some or
all of:
+ Allowing array values.
+ A function to return the neighbourhood of the current cell as an array.
+ The ability to map expressions over arrays.
+ A set of common aggregates.
+ A set of functions operating upon arrays (extraction, replacement).
+ The ability to define new aggregates (inductively, similar to the
way that PostgreSQL's CREATE AGGREGATE works).
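(Putting those pieces together -- purely a sketch of hypothetical
syntax, not anything r.mapcalc currently accepts -- the
weighted-aggregate example from above might look something like:

    out = median(map(p, nbhd(in, 3, 3),
                     if(weight(p) * value(p) > 0.5, value(p), null())))

where nbhd() returns the neighbourhood as an array of <weight,value>
pairs, map() applies an expression to each element, and median() is
one of the common aggregates, ignoring nulls. All of those names are
invented for the illustration.)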
Hmm. This isn't trivial. But it isn't rocket science either.
OTOH, it might be better to think about a completely new language;
possibly a compiled language which can be extended through C.
[Note: Octave is way too slow for this; I tried using it for writing
some texture synthesis algorithms and it was roughly 50 times slower
than equivalent C code.]
I would like to see it made much easier to write raster processing
modules without all of the boilerplate involved in a typical r.*
module (look at r.example: 100+ lines to do the equivalent of
"r.mapcalc out = func(in)").
--
Glynn Clements <glynn@gclements.plus.com>