[GRASS-user] Raster histogram or r.info for raster min/max values

Hello All,

I am brand new to GRASS, so I am sure this is a newbie question. (In a
previous role I was an advanced ER Mapper user, so I know the task,
just not how to execute it in GRASS.)

I am trying to perform a dark pixel correction on a Landsat ETM mosaic
consisting of 15 scenes. For each band in each scene I need to
determine the minimum brightness value and then subtract that 'bias'
from the entire band. I have figured out how to use r.mapcalc to do
the subtraction, but I am getting what appear to be conflicting
results when I attempt to calculate the minimum value.
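
For reference, the subtraction I am running looks something like this
(the map name and the bias value of 5 here are placeholders, not my
real numbers):

r.mapcalc "band3_corrected = band3 - 5"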

My operating system is Windows Vista Business and I am using GRASS
6.4.0svn under the wxPython user interface. My dilemma is this:

If I run r.info on Band 3 (visible red) for a particular scene I get
this result:

Range of data: min = 1 max = 255

Given that this band is visible red I would expect a minimum value
somewhat greater than 1 -- a value between 5 and 10 would be more
appropriate for this region.

If I display the raster histogram (using the Analyze button on the Map
Display window), the x-axis extends anywhere from around 10 to around
235 (in a small window) or from 6 to around 247 (if I span the window
across both monitors). And, yes, I know the x-axis units are 'in
tens'.

Varying the size of the window has a major effect on the histogram! It
is not a visual error: the first tick clearly shifts relative to the
y-intercept.

Which result do I believe? I cannot get a consistent histogram display
because it changes according to the window size, and I do not trust
the minimum values from r.info.

What I would really like is to generate a summary of the number of
pixels with each brightness value from 0 to 255. Is there an easy way
to do that? I have only been using GRASS a couple of weeks and have not
figured out how to script it yet, so I would prefer a solution that
involves the UI.

Cheers,

Frank

Hi,

Frank wrote:

I am trying to perform a dark pixel correction on a Landsat ETM
mosaic consisting of 15 scenes. For each band in each scene I
need to determine the minimum brightness value and then subtract
that 'bias' from the entire band. I have figured out how to use
r.mapcalc to do the subtraction, but I am getting what appear to be
conflicting results when I attempt to calculate the minimum value.

did you try the i.landsat.rgb module yet? You can look inside the
script with a text editor to see or adjust the method it uses.

example: http://grass.osgeo.org/screenshots/rs.php

It uses the r.univar module's percentile= option to chop away the
bottom x% of pixel values, rather than assuming that the minimum value
seen is worth keeping and not just an outlier.

Note that it does not touch the data values; it only adjusts the colors.
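
e.g. something like this should report the value at the bottom 2%
cutoff for a band instead of its absolute minimum (the 2 here is just
an illustration; pick a percentile to suit your data):

g.region rast=your_map.red
r.univar -e your_map.red percentile=2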

My operating system is Windows Vista Business and I am using GRASS
6.4.0svn under the wxPython user interface. My dilemma is this:

If I run r.info on Band 3 (visible red) for a particular
scene I get this result:

Range of data: min = 1 max = 255

ok.
Note that r.info reads the map's metadata (so it reports the full
extent), while r.univar (and most other raster modules) work on the
current region settings (the zoom extent).

Also, if the current region resolution does not match the map's native
resolution, some resampling will happen, which may cause cells to be
skipped, and so the minimum value might be hidden.
(See the g.region help page and the raster intro help page.)

the command shell (DOS box) way would be:
g.region rast=your_map   # (zoom to match the extent of that map)
r.univar your_map

Given that this band is visible red I would expect a minimum value
somewhat greater than 1 -- a value between 5 and 10 would be more
appropriate for this region.

If I display the raster histogram (using the Analyze button on the Map
Display window), the x-axis extends anywhere from around 10 to around
235 (in a small window) or from 6 to around 247 (if I span the window
across both monitors).

...hmmm, so the wxGUI histogram works on the Map Display region instead
of the computational one? (I'm not an expert on the ins and outs of the
GUI, so I'm not sure whether that is true, nor which way is more
appropriate.)

try "zoom to current layer"

And, yes, I know the x-axis units are 'in tens'.

the axis scaling is dynamic, depending on the data.

Varying the size of the window has a major effect on the
histogram! It is not a visual error: the first tick clearly shifts
relative to the y-intercept.
Which result do I believe?

The d.histogram module, which generates that plot, is a little bit
crusty, but all it does is run r.stats and plot up the results. So I'd
suggest running r.stats directly.

I do not trust the minimum values from r.info.

Trust them, but realize that they represent the entire map, not just
the current zoom settings.
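
fwiw, if you only want the stored range and not the rest of the
metadata, r.info has a flag for that:

r.info -r your_map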

What I would really like is to generate a summary of the
number of pixels with each brightness value from 0 to 255.
Is there an easy way to do that?

sure:

g.region rast=your_map.red   # (e.g. the red band; all bands should be the same)
r.stats -c your_map.red      # print the cell count for each value

then cut and paste the output into your favorite plotting program to
create the histogram.
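
or redirect it straight into a text file instead of cutting and
pasting, e.g.:

r.stats -c your_map.red > red_counts.txt

(the file name is just an example; the first column is the cell value
and the second the count)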

I don't know whether this would work on Windows, but for Mac and Linux
users, here is a way to recreate d.histogram using the d.linegraph module:

g.region rast=your_map.red
r.stats -c your_map.red | cut -f1 -d' ' > Xs   # cell values
r.stats -c your_map.red | cut -f2 -d' ' > Ys   # cell counts
d.mon x0                                       # open an X display monitor
d.linegraph x_file=Xs y_file=Ys

(dev note: it seems d.linegraph and d.histogram share code; if either is
to remain for GRASS 7, recent bug fixes to d.histogram should be ported
over to d.linegraph. Personally, I wouldn't mind seeing PyPlot replace
them both.)

I have only been using GRASS a couple of weeks and have not
figured out how to script it yet, so I would prefer a
solution that involves the UI.

ok, so I have failed badly in that regard, but hopefully this helps
explain some of the underlying logic and gives you an idea of what to
look for in the menus.

(dev note: TODO: get the module synopsis PDF* to show the GUI menu
hierarchy location for each module name. Prototype code can be found in
find_menu_hierarchy() in tools/module_synopsis.sh)
[*] http://grass.ibiblio.org/gdp/grassmanuals/grass64_module_list.pdf

Hamish

Hello Hamish,

Thanks so much for the very comprehensive reply -- it's much
appreciated. I must confess it's a bit beyond me at the moment, but I'm
sure it will make more sense over time.

The distinction between the default region and the individual scenes
is very good to know -- thank you. I had set the default region to the
native resolution of the scenes and the extent of the entire mosaic,
but perhaps it would be better to maintain a separate region for each
scene while I'm pre-processing them?
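
(I see g.region has a save= option, so I am guessing something like
this would let me switch between per-scene regions -- the scene names
here are invented:

g.region rast=scene01.band3 save=scene01
g.region region=scene01

Is that the intended workflow?)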

I have been using i.landsat.rgb for display with good success, but
clipping histograms is a bit different from dark pixel correction,
which normalises the scenes in preparation for downstream calculations
like band ratios, classification, and so on. I've noticed i.atcorr,
but such a comprehensive correction is beyond my needs (and
understanding) at the moment.
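
(By band ratios I mean things like a simple NDVI computed with
r.mapcalc once the bands are corrected, along the lines of:

r.mapcalc "ndvi = float(band4 - band3) / float(band4 + band3)"

with band3 and band4 standing in for my corrected red and
near-infrared maps.)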

Thanks again. I will now go away and carefully contemplate your
answers before continuing ... which will no doubt raise the next set
of questions!

Cheers,

Frank
