[GRASS-dev] r.shrink??

Can r.grow use negative numbers? Could it be made to do so or can we make an r.shrink?

A standard set of image processing routines for binary images includes dilate, erode, outline, and skeletonize. Two other compound routines are open (erode followed by dilate) and close (dilate followed by erode). GRASS can accomplish some of these processes:

dilate = r.grow
outline (sort of) = r.to.vect followed by v.to.rast, but a better raster version would be nice
skeletonize = r.thin (iterated)

But there is nothing to shrink areas (erode in image processing parlance).

Since GRASS already has such good image processing capabilities, it would be nice to round them out with these standard routines.
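For concreteness, the four primitives and the two compounds can be sketched in a few lines of pure Python on a binary grid (1 = data, 0 = null, 3x3 neighbourhood). This is only an illustration of the definitions, not GRASS code:

```python
# Morphological primitives on a binary grid; cells outside the grid
# count as null. Illustration only, not GRASS code.

def window(grid, r, c):
    """Values of the 3x3 window around (r, c); out-of-grid cells are 0."""
    h, w = len(grid), len(grid[0])
    return [grid[rr][cc] if 0 <= rr < h and 0 <= cc < w else 0
            for rr in range(r - 1, r + 2) for cc in range(c - 1, c + 2)]

def dilate(grid):   # grow: set a cell if any neighbour is set
    return [[1 if any(window(grid, r, c)) else 0
             for c in range(len(grid[0]))] for r in range(len(grid))]

def erode(grid):    # shrink: keep a cell only if every neighbour is set
    return [[1 if all(window(grid, r, c)) else 0
             for c in range(len(grid[0]))] for r in range(len(grid))]

def opening(grid):  # erode then dilate: removes isolated specks
    return dilate(erode(grid))

def closing(grid):  # dilate then erode: fills small holes
    return erode(dilate(grid))
```

An opening deletes a single isolated cell outright, while a closing leaves it intact, which is the usual quick sanity check for these operators.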

If people think this is a good idea, I can turn this into a wish.

Michael


Michael Barton, Professor of Anthropology
School of Human Evolution & Social Change
Center for Social Dynamics and Complexity
Arizona State University

phone: 480-965-6213
fax: 480-965-7671
www: http://www.public.asu.edu/~cmbarton

On Tue, 2006-08-22 at 11:46 -0700, Michael Barton wrote:

> Can r.grow use negative numbers? Could it be made to do so or can we
> make an r.shrink?
> [...]
> If people think this is a good idea, I can turn this into a wish.

This is very interesting, and I guess we don't really have an erode
equivalent anymore (not in CVS; in older modules, yes, IIRC).

Please submit a wish.

--
Brad Douglas <rez touchofmadness com> KB8UYR
Address: 37.493,-121.924 / WGS84 National Map Corps #TNMC-3785

Michael Barton wrote:

> Can r.grow use negative numbers?

No.

> Could it be made to do so

Sort of.

You could implement shrinking by inverting the sense of the
!G_is_d_null_value(...) tests (and making a couple of other changes).

The end result would be equivalent to "inverting" the map (swap
null<->non-null), running r.grow, then swapping back, e.g.:

  r.mapcalc 'tmp1 = if(isnull(inmap),1,null())'
  r.grow in=tmp1 out=tmp2 new=1
  r.mapcalc 'outmap = if(isnull(tmp2),inmap,null())'
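This invert / grow / invert identity can be checked on a toy binary grid. The sketch below is pure Python, not GRASS code; 1 stands for data and 0 for null, and the border convention is chosen so the identity holds exactly (outside the grid counts as null for dilation and as data for erosion):

```python
# Sketch: eroding a binary grid equals invert -> dilate (what r.grow
# does) -> invert. 1 = data, 0 = null; 3x3 neighbourhood.

def window(grid, r, c, pad):
    """The 3x3 window around (r, c); out-of-grid cells take `pad`."""
    h, w = len(grid), len(grid[0])
    return [grid[rr][cc] if 0 <= rr < h and 0 <= cc < w else pad
            for rr in range(r - 1, r + 2) for cc in range(c - 1, c + 2)]

def dilate(grid):   # a cell becomes data if any neighbour is data
    return [[1 if any(window(grid, r, c, 0)) else 0
             for c in range(len(grid[0]))] for r in range(len(grid))]

def erode(grid):    # a cell stays data only if no neighbour is null
    return [[1 if all(window(grid, r, c, 1)) else 0
             for c in range(len(grid[0]))] for r in range(len(grid))]

def invert(grid):   # swap null <-> non-null
    return [[1 - v for v in row] for row in grid]

grid = [[1, 1, 1, 0],
        [1, 1, 1, 0],
        [1, 1, 1, 1],
        [0, 0, 1, 1]]

assert erode(grid) == invert(dilate(invert(grid)))
```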

> or can we make an r.shrink?

That's also an option. A separate r.shrink would be slightly simpler
than r.grow, as you don't need to deal with ordering (r.grow needs to
determine the nearest non-null cell, while an r.shrink would just need
to test whether any of the neighbours are null).

> A standard set of image processing routines for binary images includes
> dilate, erode, outline, and skeletonize. [...]
> If people think this is a good idea, I can turn this into a wish.

Rather than writing lots of separate modules, it might be worth
extending r.mapcalc to support operations on a neighbourhood window
(i.e. so that things that you can theoretically do in r.mapcalc using
expressions with O(W x H) terms would actually become practical).

--
Glynn Clements <glynn@gclements.plus.com>

Glynn Clements wrote:

> Rather than writing lots of separate modules, it might be worth
> extending r.mapcalc to support operations on a neighbourhood window
> (i.e. so that things that you can theoretically do in r.mapcalc using
> expressions with O(W x H) terms would actually become practical).

how would this differ from r.mfilter, or from r.mfilter functionality
merged into r.neighbors?

Rather than clutter r.mapcalc, could r.mfilter be extended to take
r.mapcalc expressions in the matrix def'n?

Hamish

Hamish wrote:

> > Rather than writing lots of separate modules, it might be worth
> > extending r.mapcalc to support operations on a neighbourhood window
> > (i.e. so that things that you can theoretically do in r.mapcalc using
> > expressions with O(W x H) terms would actually become practical).

> how would this differ from r.mfilter, or from r.mfilter functionality
> merged into r.neighbors?

With r.mfilter + r.neighbors, how do you combine the weight with the
cell value to get the value passed to the aggregate?

r.mfilter multiplies the value by the weight, then sums the results.
Combining this with r.neighbors would allow additional aggregates, but
they would still operate upon products.

In the course of writing r.resamp.aggregate (or whatever it's
eventually called), it had occurred to me that it might be useful to
support weighted aggregates (for boundary cells, which are currently
assigned in their entirety to the cell in which their centre lies).
Most of the existing aggregates can be extended in an "obvious"
manner.

However, there are still situations where you might want to compute
the aggregate over some arbitrary function of the <weight,value>
pairs, e.g. aggregate over the original values of all cells where the
weight-value product falls in some range, apply some other non-linear
transformation to the product, or even use something other than a
simple product.
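A sketch of that generalisation, in pure Python with hypothetical names: both the per-cell combination of weight and value and the final aggregate become parameters, instead of being hard-wired to "multiply, then sum" as in r.mfilter.

```python
# Hypothetical generalisation of r.mfilter: both the <weight, value>
# combining function and the aggregate are pluggable. Cells outside
# the grid are simply skipped, as if null.

def filter2d(grid, weights, combine, aggregate):
    h, w = len(grid), len(grid[0])
    n = len(weights) // 2   # weights is a (2n+1) x (2n+1) matrix
    out = []
    for r in range(h):
        row = []
        for c in range(w):
            pairs = [(weights[i + n][j + n], grid[r + i][c + j])
                     for i in range(-n, n + 1) for j in range(-n, n + 1)
                     if 0 <= r + i < h and 0 <= c + j < w]
            row.append(aggregate(combine(wt, v) for wt, v in pairs))
        out.append(row)
    return out

# r.mfilter's fixed behaviour is combine = multiply, aggregate = sum;
# e.g. taking the maximum weight-value product instead is something
# it cannot express.
```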

This isn't something which can (easily[1]) be done by combining
existing modules, or even with a combined r.mfilter/r.neighbors
module.

[1] You can do it by creating WxH intermediate maps, each
corresponding to a single neighbourhood cell. This isn't particularly
efficient for large neighbourhood windows.

You could probably still get quite a way by replacing r.mfilter's
built-in sum aggregate with the ability to use the aggregates from the
stats library, along with a set of combining functions.

> Rather than clutter r.mapcalc, could r.mfilter be extended to take
> r.mapcalc expressions in the matrix def'n?

Not without copying most of r.mapcalc; or turning it into a library.

Actually moving most of the functions (x*.c) into a library would be
feasible, as there aren't that many dependencies on the r.mapcalc
framework. Some functions (e.g. xcoor.c, xrowcol.c) wouldn't be
applicable elsewhere, and a fair chunk (e.g. mapcalc.y, evaluate.c)
would need to be rewritten for each module.

Essentially, the changes required to r.mapcalc would involve some or
all of:

+ Allowing array values.
+ A function to return the neighbourhood of the current cell as an array.
+ The ability to map expressions over arrays.
+ A set of common aggregates.
+ A set of functions operating upon arrays (extraction, replacement).
+ The ability to define new aggregates (inductively, similar to the
  way that PostgreSQL's CREATE AGGREGATE works).

Hmm. This isn't trivial. But it isn't rocket science either.
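As a sketch of the last point, an inductively defined aggregate is just an initial state, a transition (step) function, and a final function, which is exactly the shape of PostgreSQL's CREATE AGGREGATE. The Python below is illustrative only, not a proposed r.mapcalc syntax:

```python
# Aggregates defined inductively: initial state, per-value transition
# function, and a final function applied to the accumulated state.

class Aggregate:
    def __init__(self, init, step, final=lambda s: s):
        self.init, self.step, self.final = init, step, final

    def __call__(self, values):
        state = self.init
        for v in values:
            state = self.step(state, v)
        return self.final(state)

total = Aggregate(0, lambda s, v: s + v)
count = Aggregate(0, lambda s, v: s + 1)
mean = Aggregate((0, 0),
                 lambda s, v: (s[0] + v, s[1] + 1),
                 lambda s: s[0] / s[1])
```

Note how mean falls out of total and count with no new machinery: the state is a pair, and the final function divides.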

OTOH, it might be better to think about a completely new language;
possibly a compiled language which can be extended through C.

[Note: Octave is way too slow for this; I tried using it for writing
some texture synthesis algorithms and it was roughly 50 times slower
than equivalent C code.]

I would like to see it made much easier to write raster processing
modules without all of the boilerplate involved in a typical r.*
module (look at r.example: 100+ lines to do the equivalent of
"r.mapcalc out = func(in)").

--
Glynn Clements <glynn@gclements.plus.com>