[GRASSLIST:3463] Geophysical/Potential field modules for GRASS?

Hello, I will need to process/interpret some geophysical data (gravity, mag) and I would like to use GRASS for this purpose. Is anyone aware of any modules available for GRASS? I need to do things like trend removal, terrain corrections, Bouguer correction, reduction to pole, downward continuation, etc. I will also need to apply any number of 2-D filters.
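As a concrete example of one of these reductions, the simple (infinite-slab) Bouguer correction is just 2*pi*G*rho*h; a minimal Python sketch, independent of any GRASS module:

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def bouguer_slab_mgal(height_m, density_kg_m3=2670.0):
    """Infinite-slab Bouguer correction in mGal.

    2*pi*G*rho*h, converted from m/s^2 to mGal (1 mGal = 1e-5 m/s^2).
    The standard crustal density 2670 kg/m^3 gives ~0.1119 mGal per metre.
    """
    return 2.0 * math.pi * G * density_kg_m3 * height_m * 1e5

# e.g. a station 350 m above the datum:
correction = bouguer_slab_mgal(350.0)  # ~39.2 mGal
```

Terrain correction, reduction to pole, and continuation are more involved (the latter two are usually done in the wavenumber domain), but this illustrates the kind of small, scriptable building block a module set would be made of.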

If there are no suitable modules, I will likely write them. So please also let me know if anyone else is interested in such modules, and what features would be desired.

Thank you,
Craig

I have designed a simple approach to scripting (bash and Perl) and running from within the GRASS shell. I generate many (sometimes hundreds of) filtered Bouguer surfaces satisfying different extent and filter parameters in an iterative loop. It is easy to do, but I use Perl and the GMT tools for reducing, interpolating and filtering my raw data. I import this into GRASS and set up some of the support files, all from small master scripts.

I found the fft/ifft modules in GRASS awkward, having become accustomed to setting my filter parameters from the command line in GMT, but I could probably figure out an equivalent way using the GRASS tools in a script if I had to. I have ideas for a module related to this, including perhaps color palette creation and assignment support from a color-chooser-like interface. I have a problem where this would be useful. I am only a one-shot scripter myself, though; I don't think I could tackle this, even in Tcl/Tk.
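A minimal sketch of the command-line-parameterized filtering described above, done in the wavenumber domain with NumPy rather than GMT or i.fft/i.ifft (grid spacing and wavelength cutoffs here are made-up illustration values, not anyone's actual survey parameters):

```python
import numpy as np

def bandpass_2d(grid, dx, low_wl, high_wl):
    """Band-pass a regular grid in the wavenumber domain.

    Keeps wavelengths between low_wl and high_wl (same units as dx);
    the grid mean (zero wavenumber) is removed as well. A rough
    stand-in for an i.fft/i.ifft or GMT filtering step.
    """
    spec = np.fft.fft2(grid)
    kx = np.fft.fftfreq(grid.shape[1], d=dx)   # cycles per unit distance
    ky = np.fft.fftfreq(grid.shape[0], d=dx)
    k = np.hypot(*np.meshgrid(kx, ky))         # radial wavenumber
    mask = (k >= 1.0 / high_wl) & (k <= 1.0 / low_wl)
    return np.real(np.fft.ifft2(spec * mask))

# e.g. keep wavelengths between 500 m and 2 km on a 100 m grid:
# filtered = bandpass_2d(grid, dx=100.0, low_wl=500.0, high_wl=2000.0)
```

Looping such a function over a list of cutoff pairs would reproduce the "hundreds of filtered Bouguer surfaces" workflow in one script.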

Funkmeister wrote:

[...]

As an archaeologist, I also work with geophysical data regularly
(especially gradiometer data). So far, I have used proprietary software,
such as Geoplot, to view and filter data (trend removal/destriping,
low/high-pass filters, etc.). But I think that GRASS is, in principle,
an ideal platform for such tasks. I have already spent some time laying
out an infrastructure for geophysical data processing with GRASS.
I have also made plans to create some modules myself - in the near future :wink:
The wonderful "Scientific Applications on Linux" web repository
(http://sal.jyu.fi/index.shtml) has a category for "Computer Graphics,
Images & Signals: Processing & Visualization". In it, I found a link to
the "XITE" image processing tools. XITE also has an open-source C
library that seems to have every imaginable piece of high-level
functionality one would need. I think one could get started and get
functionality very quickly by integrating this library with GRASS.
So, if there is any serious interest in getting GRASS up to par with
Imagine and the like, maybe we should join forces and start coding?
Anyone else interested in this sort of functionality?

Cheers,

Benjamin.

On Fri, 21 May 2004 10:50:41 -0600
Funkmeister <funkmeister@lynxseismicdata.com> wrote:

[...]

Hi,
I would like to know how I can do this:
Suppose I have a mount (vector) that belongs to a city (vector), but the
mount area exceeds its border. If I use v.cutter, the result will be the
inside part of the intersection, no? But I want to know how I can obtain
only the part that doesn't belong to the city area (the outside part).
In raster mode I can do that, but I need it in vector mode. Probably if
I used the result of v.cutter, then made a v.patch with the mount vector
and finally used v.extract, I would obtain the result. Is there an easy
and logical way to do this?

thanks

Luis Gonçalves Seco

-----Original Message-----
From: owner-GRASSLIST@baylor.edu [mailto:owner-GRASSLIST@baylor.edu] On behalf
of Benjamin Ducke
Sent: Saturday, 22 May 2004 9:48
To: grasslist@baylor.edu
Subject: [GRASSLIST:3465] Re: Geophysical/Potential field modules for GRASS?

[...]

On Saturday 22 May 2004 23:47, Luis Gonçalves Seco wrote:

[...]

v.overlay operator=not (5.7)

Radim

Hi Radim,

Thanks, but I'm using version 5.0.3. Is there another way?

Regards,

Luis Gonçalves Seco

-----Original Message-----
From: owner-GRASSLIST@baylor.edu [mailto:owner-GRASSLIST@baylor.edu] On behalf
of Radim Blazek
Sent: Monday, 24 May 2004 9:21
To: Luis Gonçalves Seco; grasslist@baylor.edu
Subject: [GRASSLIST:3470] Re: v.cutter

[...]

Benjamin Ducke <benducke <at> compuserve.de> writes:

[...]
This is the first time I have posted a message to this user group, and I
hope it gets through this time; I have been rejected three times for
different reasons.

We are a geophysical contracting and consulting company and make use of
proprietary, in-house and open-source software to do our data processing
and interpretation. We also specialize in doing 3D inversions of magnetic,
gravity and IP data on our in-house cluster, using software developed by
UBC and modified for our use.
We are trying to move over to using open-source packages like GRASS, VTK,
etc. to process and display data, and are looking for funding to carry this
forward an extra step. As you mentioned, there is a lot of free software
available. Another place that I can think of is the USGS site
http://pubs.usgs.gov/fs/fs-0076-95/FS076-95.html. The problem, or at least
I think the problem, is not only finding the freely available software but
putting it together into a useful package. We would certainly like to see
geophysical/potential-field functionality added to a package like GRASS.
Although we are extremely busy at this point, we would try to help anyone
starting such a project.

Syd
www.sjgeophysics.com

(First of all: my apologies for the length of this message
and my inability to say more with fewer words)

Thanks to everyone who has joined the discussion
about GRASS and geophysics so far. I think this is an
issue that a lot of us are interested in.

A valid point by Syd: if we go through all the work of creating
a full set of GRASS modules, we will want to make sure that we
use the full power of GRASS data integration, not just create
a bunch of disconnected tools, each one with a different set
of command line parameters.
My ideas of a GRASS-based geophysics workflow include:

( - possibly provide device drivers to read measurement
    data from the serial interface in the field )

- provide consistent storage for raw measurement data as
  dedicated database elements in the working location.
  This way I won't have my ASCII files floating around
  random folders in the file system and can always fall
  back to the original data

- provide a simple and comfortable way of setting up a mesh
  of adjacent, regular measurement grids with dimension,
  orientation and origin in reference to the working location.
  Each raw measurement file could then be referenced to
  this mesh by simply specifying its row and column location

- a set of modules to apply filters to any GRASS raster map.
  These modules should share a common set of command-line
  parameters and, instead of applying filters immediately,
  store each filter and its parameters in a processing list.
  The processing list should be a simple ASCII file, which
  in turn resides in its own database element

- a set of modules (possibly a simple GUI) to manipulate
  the processing list, ensuring that only valid filters
  and parameters get written into it

- a set of modules to easily manipulate the color ramp
  for the entire mesh

- For the final output:
  a module to create a GRASS raster map from the ENTIRE mesh:

  1. read the individual raw data files associated with
     each grid location,
  2. convert each individual grid to a GRASS raster map and
     store it in a cache database element,
  3. apply all specified filters in the processing list and
     color ramps to the grid maps,
  4. patch the grids together into one GRASS raster map,
  5. rotate and move the entire thing to the specified origin
     in the working location.

This is how I would prefer it to be done, based
on my field experience with gradiometer and caesium data.
I don't know a whole lot about other geophysics and have
left out any plotting capability for e.g. resistivity
data or geo-electrics.
So, what do you think should be added to or changed about this concept?
I would like us to assemble a white paper on the targeted workflow
and functionality ASAP, so we can, as a next step, decide
on what technology to use.

Best regards,

Benjamin
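One possible shape for the processing-list idea sketched in this message: a plain ASCII file with one "filter param=value ..." line per step, applied in order. The file format and the two filter names below are invented purely for illustration:

```python
# Hypothetical processing list: one "filter param=value ..." line per step.
PROCESSING_LIST = """\
subtract_mean
clip min=-50 max=50
"""

def parse_list(text):
    """Parse the ASCII processing list into (name, params) tuples."""
    steps = []
    for line in text.splitlines():
        if not line.strip() or line.startswith('#'):
            continue  # skip blank lines and comments
        name, *params = line.split()
        steps.append((name, dict(p.split('=', 1) for p in params)))
    return steps

def apply_steps(values, steps):
    """Apply each listed filter, in order, to a flat list of readings."""
    for name, p in steps:
        if name == 'subtract_mean':
            m = sum(values) / len(values)
            values = [v - m for v in values]
        elif name == 'clip':
            lo, hi = float(p['min']), float(p['max'])
            values = [min(max(v, lo), hi) for v in values]
    return values
```

Because the list is just text, it can live in its own database element, be edited by a GUI, or be validated before anything touches the raster data, exactly as proposed above.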

On Tue, 25 May 2004 14:28:39 +0000 (UTC)
Syd Visser <sydv@sjgeophysics.com> wrote:

[...]

Well, I underestimated how much interest there is in geophysics apps. I am rather new to GRASS, but I also feel that GRASS is an ideal platform for such apps.

I think the white paper is a great idea - and it will serve as a nice working document to keep efforts coordinated and development on target.

I am a geophysicist with a strong programming and database background, and my employer is quite open to the idea of developing some modules for GRASS. The oil industry is also very profitable now, so the timing is good. It would be really nice if other people could get involved with development. We are a very small company, so to begin with I have to focus on meeting my employer's immediate needs, and I am hoping that this effort won't be wasted. By coordinating needs and documenting them in the form of a white paper, development can proceed without having to do things over again at a later time.

Getting the framework right from the start is essential. That way it is easy for others to contribute to it, and the software can grow nicely without others having to reinvent the wheel. I also feel that we should start with GRASS 5.7 because of its seamless integration with ODBC, MySQL, Postgres and others... Any comments on this?

My comments are dispersed below:

On 25-May-04, at 10:52 AM, Benjamin Ducke wrote:

(First of all: my apologies for the length of this message
and my inability to say more with fewer words)

Thanks to everyone who joined in to the discussion
about GRASS and geophysics so far. I think this is an
issue that a lot of us are interested in.

A valid point by Syd: if we go through all the work of creating
a full set of GRASS modules, we will want to make sure that we
use the full power of GRASS data integration, not just create
a bunch of disconnected tools, each one with a different set
of command line parameters.
My ideas of a GRASS-based geophysics workflow include:

( - possibly provide device drivers to read measurement
    data from the serial interface in the field )

I have limited experience with device drivers, but I do know that they can be a pain because of platform issues. It might be smart to approach the problem the way GRASS does with the display drivers: develop a device-independent protocol and then write platform-specific "drivers" that know how to communicate with the OS and the field devices.

For example, Mac OS X could be really problematic because new Macs do not have serial ports, and I assume that most field devices use a serial port for communication. I could be wrong here... are there USB interfaces?

- provide a consistent storage for raw measurement data as
  dedicated database elements in the working location.
  This way I won't have my
  ASCII files floating around random folders in the file
  system and can always fall back to the original data

Were you thinking about a file-system database here, or a relational database? It would be really nice if this could be transparent to the end user - like in GRASS 5.7, where you choose your driver and then don't worry about where the data is actually stored.

- provide a simple and comfortable way of setting up a mesh
  of adjacent, regular measurement grids with dimension,
  orientation and origin in reference to the working location.
  Each raw measurements file could then be referenced to
  this mesh by simply specifying its row and column location

I think this would be critical to seamless data import.

- a set of modules to apply filters to any GRASS raster map
  These modules should share a common set of command line
  parameters and instead of applying filters immediately,
  store each filter and its parameters in a processing list.
  The processing list should be a simple ASCII file, which
  in turn resides in its own database element

XML might be ideal for this (the processing list)?
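If XML were chosen for the processing list, it might look something like this (element and attribute names are invented for illustration):

```xml
<processing-list grid="survey_A">
  <filter name="destripe" direction="y"/>
  <filter name="highpass" wavelength="20"/>
  <filter name="clip" min="-50" max="50"/>
</processing-list>
```

An XML schema would also give the validation step ("only valid filters and parameters get written into it") for free.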

- a set of modules (possibly a simple GUI) to manipulate
  the processing list, ensuring that only valid filters
  and parameters get written into it

This could work much like the new GUI for GRASS, where each processing element would be synonymous with a layer. When one clicks on a node, a panel would be populated with the attributes of that node, and the user could edit them. As long as all the filters share a similar schema, this should be fairly straightforward to implement.

- a set of modules to easily manipulate the color ramp
  for the entire mesh

I have not taken the time to look at this, but it would seem that many GRASS users could benefit from such a module if one does not exist.

- For the final output:
  a module to create a GRASS raster map from the ENTIRE mesh:

  1. read the individual raw data files associated with
  each grid location,
  2. convert each individual grid to a GRASS raster map and
  store it in a cache database element
  3. apply all specified filters in the processing list and
  color ramps to the grid maps
  4. Patch grids together into one GRASS raster map
  5. rotate and move the entire thing to the specified origin
  in the working location,

This is what I would prefer to have it done like, based
on my field experience with gradiometer and caesium data.
I don't know a whole lot about other geophysics and have
left out any plotting capability for e.g. resistivity
data or geo-electrics.

I also do not have a need to work with resistivity; however, we do work with seismic reflection data quite a bit. WRT 2-D seismic, I would likely use other software to process and interpret the data. Then I would import time/depth picks as site/vector data and use kriging or other interpolation methods to obtain DEMs. Many tools from R could probably be used to work with 2-D data, although I have not looked at this yet.
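As a sketch of that gridding step, here is inverse-distance weighting, a much simpler stand-in for kriging (GRASS itself provides interpolation modules such as s.surf.idw and s.surf.rst); the function and argument names are made up:

```python
def idw(x, y, points, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from (xi, yi, zi) picks.

    Shown only to illustrate turning scattered time/depth picks into a
    surface; kriging would additionally model spatial covariance.
    """
    num = den = 0.0
    for xi, yi, zi in points:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return zi  # exact hit on a data point
        w = d2 ** (-power / 2.0)  # weight ~ 1 / distance^power
        num += w * zi
        den += w
    return num / den
```

Evaluating this on every cell of a regular grid yields the DEM-style surface described above.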

So, what do you think should be added/changed about this concept?
I would like us to assemble a white paper on targeted workflow
and functionality asap, so we can, as a next step, decide
on what technology to use.

I'm in. I have goals I need to meet for my work in the near term; however, I think the white paper is a great idea and is essential to getting the architecture right from the start.

Craig

Best regards,

Benjamin

Craig,

I will write you directly as well as the list. Perhaps you can coordinate several of the replies.

Most of the things being requested below already exist in GRASS. I will comment briefly but will be happy to expand if you'd like. As is the gist of the comments below, I'll focus on GRASS 5.7.

1. I agree with Benjamin about device drivers. These are a lot of trouble and beyond what a GIS is designed to do. They have to be platform-specific and device-specific to be useful. On the other hand, most data collectors for geophysical data produce standard ASCII output files (e.g., CSV). These CAN be read into GRASS using v.in.ascii.

2. The idea of a GIS is to have an integrated data management system. This is quite good in GRASS 5.7. Internally it uses dbf files. This would be more than sufficient for standard geophysical data and easily readable by a variety of other common programs (e.g., Excel, Access, Filemaker). However, as someone pointed out below, 5.7 also has easy links to other database systems like MySQL and PostgreSQL.

3. A ...

  mesh of adjacent, regular measurement grids with dimension,
  orientation and origin in reference to the working location

...is provided by the GRASS raster data model. It is a regular grid by default and you can specify the size of each grid in the g.region module. Point data can be easily attached to the grid, though a simple script might make this easier.

If the original data are not oriented according to geographic directions, they can be imported into an XY location. If a set of points are known that relate an arbitrary XY location to a georeferenced (i.e., real world) grid system, GRASS provides tools (i.rectify, v.transform, r.proj, v.proj) to georeference the original working grid so that it can be combined with any other grids.

4. GRASS has a series of fairly sophisticated filter modules. More could be written if there are specific ones for particular kinds of uses. The ones that exist include: r.neighbors (many neighborhood functions like mean, median, diversity, etc.), r.mfilter (user-defined matrix/convolution filter), i.fft and i.ifft (fast Fourier transform), and i.zc (zero-crossing edge detection). It also includes a reasonably decent set of image classification (supervised and unsupervised) and analysis functions (e.g., PCA, canonical components analysis, tasseled cap and Brovey transformations, etc.). Of course, there are all the other many raster and vector analysis modules that can be applied to multiple grids.
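The kind of user-defined matrix filter that r.mfilter applies can be sketched as follows. This is a generic sliding-window stand-in, not r.mfilter's actual rules-file format, and its edge/NULL handling is deliberately simplified:

```python
import numpy as np

def mfilter(grid, matrix, divisor=None):
    """Apply a user-defined filter matrix, r.mfilter-style.

    Each interior cell becomes the matrix-weighted sum of its window,
    divided by `divisor` (defaulting to the matrix sum, as a mean
    filter would). Edge cells are left unchanged for simplicity;
    r.mfilter's own edge and NULL handling is more involved.
    """
    m = np.asarray(matrix, dtype=float)
    if divisor is None:
        divisor = m.sum() or 1.0  # fall back to 1 for zero-sum kernels
    pad = m.shape[0] // 2
    out = grid.astype(float).copy()
    for i in range(pad, grid.shape[0] - pad):
        for j in range(pad, grid.shape[1] - pad):
            win = grid[i - pad:i + pad + 1, j - pad:j + pad + 1]
            out[i, j] = (win * m).sum() / divisor
    return out

# a 3x3 mean filter:
# smoothed = mfilter(grid, [[1, 1, 1], [1, 1, 1], [1, 1, 1]])
```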

5. r.colors gives very flexible and powerful ways to create and modify color tables across the grids. You can specify by values, ranges of values, percents of total, etc. using standard GRASS colors and/or rgb triplets.
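For example, r.colors can be driven by simple rules mapping cell values (or percentages of the value range) to standard GRASS colors or RGB triplets; the values below are arbitrary:

```
-50  blue
0    white
50   red
```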

6. On top of this, you can easily render any grid and vector data into 2.5D surfaces using NVIZ, AND you can create true 3D volumes using the Grid3D modules. The latter still need to be better developed but are as or more sophisticated than anything else I know of in a GIS package. 3D volumes can also be displayed and manipulated in NVIZ. Finally, using NVIZ, you can create 'fly-through' key-frame animations.

I am sure that GRASS could benefit from new modules or shell scripts that could enhance its ability to be used in geophysical analysis (e.g., new filters, or scripts to automate the import and georeferencing of XY data). However, it is interesting that it already does all the things requested above.

Michael Barton
______________________________
Michael Barton, Professor & Curator
School of Human Origins, Cultures, & Societies
Arizona State University
Tempe, AZ 85287-2402
USA

voice: 480-965-6262; fax: 480-965-7671
www: http://www.public.asu.edu/~cmbarton
On May 25, 2004, at 11:49 AM, Craig Funk wrote:

Well I underestimated how much interest there is in geophysics apps. I am rather new to GRASS but I also feel that GRASS is an ideal platform for such apps.

I think the white paper is a great idea - and it will serve as a nice working document to keep efforts coordinated and development on target.

I am a Geophysicist with a strong programming and database background and my employer is quite open to the idea of developing some modules for GRASS. The oil industry is also very profitable now - so the timing is good. It would be really nice if other people could get involved with development. We are a very small company so to begin with I have to focus on meeting my/employers immediate needs. And I am hoping that this effort won't be wasted. So by coordinating needs and documenting them in the form of a white paper, development can proceed without having to do things over again at a latter time.

Getting the framework right from the start is essential. That way it is easy for others contribute to it, and the software can grow nicely without other having to reinvent the wheel. I also feel that that we should start with GRASS 5.7 because of it's seamless integration with ODBC, MySQL, Postgres and others...Any comments on this?

My comments are dispersed below:

On 25-May-04, at 10:52 AM, Benjamin Ducke wrote:

(First of all: my apologies for the length of this message
and my inability to say more with fewer words)

Thanks to everyone who joined in to the discussion
about GRASS and geophysics so far. I think this is an
issue that a lot of us are interested in.

A valid point by Syd: if we go through all the work of creating
a full set of GRASS modules, we will want to make sure that we
use the full power of GRASS data integration, not just create
a bunch of disconnected tools, each one with a different set
of command line parameters.
My ideas of a GRASS-based geophysics workflow include:

( - possibly provide device drivers to read measurement
    data from the serial interface in the field )

I have limited experience with device drivers - but I do know that it can be a pain because of platform issues. It might be smart to approach the problem like GRASS does with the display drivers. Develop a device independent protocol and then write platform specific "drivers" that know how to communicate with the OS, and the field devices.

For example Mac OS X could be really problematic because new Macs do not have serial ports and I assume that most field devices use a serial port for communication. I could be wrong here...is there USB interfaces?

- provide a consistent storage for raw measurement data as
  dedicated database elements in the working location.
  This way I won't have my
  ASCII files floating around random folders in the file
  system and can always fall back to the original data

Where you thinking about a file system database here, or a relational database? It would be really nice if this could be transparent to the end user - like in GRASS 5.7. Where you choose your driver and then don't worry about where the data is actually stored.

- provide a simple and comfortable way of setting up a mesh
  of adjacent, regular measurement grids with dimension,
  orientation and origin in reference to the working location.
  Each raw measurements file could then be referenced to
  this mesh by simply specifying its row and column location

I think this would be critical to seamless data import.

- a set of modules to apply filters to any GRASS raster map
  These modules should share a common set of command line
  parameters and instead of applying filters immediately,
  store each filter and its parameters in a processing list.
  The processing list should be a simple ASCII file, which
  in turn resides in its own database element

XML might be ideal for this? (processing list).

- a set of modules (possibly a simple GUI) to manipulate
  the processing list, ensuring that only valid filters
  and parameters get written into it

This could work much like the new GUI for GRASS, where each processing element would be synonymous with a layer. When one clicks on a node a panel would be populated with the attributes of that node and the user could edit them. As long as all the filters share a similar schema, this should be fairly straight forward to implement.

- a set of modules to easily manipulate the color ramp
  for the entire mesh

I have not taken the time to look at this but it would seem like many GRASS users could benefit from such a module if one does not exist.

- For the final output:
  a module to create a GRASS raster map from the ENTIRE mesh:

  1. read the individual raw data files associated with
  each grid location,
  2. convert each individual grid to a GRASS raster map and
  store it in a cache database element,
  3. apply all specified filters in the processing list and
  color ramps to the grid maps,
  4. patch the grids together into one GRASS raster map,
  5. rotate and move the entire thing to the specified origin
  in the working location.

This is what I would prefer to have it done like, based
on my field experience with gradiometer and caesium data.
I don't know a whole lot about other geophysics and have
left out any plotting capability for e.g. resistivity
data or geo-electrics.

I also do not have a need to work with resistivity; however, we do work with seismic reflection data quite a bit. WRT 2-D seismic, I would likely use other software to process and interpret the data. Then I would import time/depth picks as site/vector data and use kriging or other interpolation methods to obtain DEMs. Many tools from R could probably be used to work with 2-D data, although I have not looked at this yet.

So, what do you think should be added/changed about this concept?
I would like us to assemble a white paper on targeted workflow
and functionality asap, so we can, as a next step, decide
on what technology to use.

I'm in. I have goals I need to meet for my work in the near term; however, I think the white paper is a great idea and is essential to getting the architecture right from the start.

Craig

Best regards,

Benjamin

All right, this is getting to be a rather verbose discussion.
Just some short clarifications and comments:

- a "device driver" in our context would simply receive a stream
  of bytes from the serial line/USB parallel port, whatever,
  using Linux /dev/* entries. I don't know about MacOS and windows
  but it surely is not much harder to do it in one of those
  environments. On top of that, there would be a device-dependent
  part which actually makes sense of the byte stream by applying
  a vendor-specific protocol. This critical information can
  really only be got from the manufacturer/manual. Other than that, I don't
  think any OS-level programming would be necessary.

- I have used the term 'GRASS database element' several times.
  This actually refers to a directory below the LOCATION/MAPSET
  directory in GRASS terminology. E.g. 'fcell' is the element
  (directory) that stores floating point raster maps.
  The GRASS programming API has very convenient functions to
  create/delete and read/write user-definable elements. I deem
  this an easy and clean way to integrate all the additional
  info and data into the grass location. If someone wants to
  share his/her geophysical analysis with someone else, all
  that is required is to zip the needed elements into a package
  and copy them into the other user's location.

- I agree that XML is a very nice format to store the processing
  list, as long as we use only GRASS modules/GUI to make any changes
  to it. If we keep it XML, we can use the excellent cross-platform
  open source 'libxml2' and won't have any trouble parsing module
  parameters.
  A plain ASCII list would have the benefit that the user could
  edit it with a simple text editor, but would make file corruption
  more likely and mean a lot of additional work to parse module
  parameters when reading the list.

- I don't think there will be too much vector programming involved
  in the basic framework, so it does not really matter which
  GRASS 5 version we use -- they all share the same raster API.
  I personally am using 5.3 right now but am slowly switching
  over to 5.7. Site files are officially banned from GRASS 5.7, and
  everything to do with them would have to be replaced with
  vector points and the vector API.

  Craig: I don't use relational DBs too often, so my creativity in
  this field is a bit limited: where would you envision the
  usefulness of a DB connection in this framework?

- I would imagine a basic GUI in the same way Craig outlined it

- GRASS has color ramp tools, but these are -- well (cough).
  We need something more convenient, that allows the user to easily
  e.g. rescale 100 gray scale values to +/- 3 STD of the image data.

Well, that's all I can think of, right now.

Benjamin

I will attempt to summarize the current thread,

1) Device interfaces to field instruments would be a *nice to have*, but the functionality is already there, although that requires a few extra steps. As I indicated previously, my experience is limited; device drivers on Windows are a mess, not nearly as simple as on Linux.

2) Database means the GRASS database, not a relational one. I don't think a relational database would add any value; it would probably make it more difficult to move data around. You can't just zip it up and mail it, for example. And for large datasets there would be considerable overhead when loading the data. So the GRASS file-system database is the best approach.

3) The raster data model is the way to go. Michael, how do you associate point data with a raster grid? Also would the GRASS architects (Neteler et al.) frown on adding new database structures?

4) GRASS has many powerful filtering modules. Any convolution based filter can be realized through r.mfilter, and r.mapcalc can be used to implement many filters. However there seems to be a need for a more seamless "integration" of filters. Also specific corrections - like the terrain correction - do not exist. To implement such a correction with r.mapcalc would be tedious. I think this is where the white paper could be useful in that it should highlight the filters that are needed for geophysical apps; identify those filters that already exist in GRASS; and then we are left with the filters/corrections that need to be implemented.

I really like the idea of the filtering stream (Benjamin). Maybe this could be a new GRASS module that is a seamless front end to existing GRASS modules and possibly some yet-to-be-determined modules. For example, a commonly applied low-pass 2-D filter would be predefined in this module without the user having to write the kernel function in an ASCII file. It would call r.mfilter on the user's behalf with the predefined kernel function. Also, by having a schema and an XML structure, these streams should be easy to manage, maybe even just stored in the database. For new/casual users this would probably simplify their GRASS experience and lower the barrier to use. This module would also have many other uses for all types of raster data.
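A minimal sketch of what such a front end could do internally: write a predefined 3x3 low-pass (mean) kernel in r.mfilter's filter-file format and pass it to the module. The map names are placeholders, and the guard around the r.mfilter call is there because the sketch only makes sense inside a GRASS session:

```shell
#!/bin/sh
# Sketch: generate a predefined low-pass kernel file for r.mfilter
# so the user never has to write one by hand. Map names below are
# hypothetical placeholders.
cat > lowpass3x3.txt <<'EOF'
TITLE 3x3 low-pass
MATRIX 3
1 1 1
1 1 1
1 1 1
DIVISOR 9
TYPE P
EOF

# only call the module if we are inside an actual GRASS session
if command -v r.mfilter >/dev/null 2>&1; then
    r.mfilter input=bouguer output=bouguer.lp filter=lowpass3x3.txt
fi
```

A real module would keep a library of such kernels (low-pass, high-pass, edge detection, ...) and select one by name.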

5) Color management routines exist, but they could maybe use some improvement. I have not looked into this personally so I am speculating here, but other users (non-geophysical) would also likely be interested in an easy-to-use color map generating module.

My 2 cents worth:

I think it is always best to start small and build from there. So it is important to get the architecture right from the start. Plus if my boss sees some results early, he will be willing to continue funding of my portion of the work :wink:

Also, a white paper would be a very nice way of starting this. Anyone have a document we can start with? *Or* I can just start one based on this thread. How about a wiki?

Cheers,
Craig

- a "device driver" in our context would simply receive a stream
  of bytes from the serial line/USB parallel port, whatever,
  using Linux /dev/* entries. I don't know about MacOS and windows
  but it surely is not much harder to do it in one of those
  environments. On top of that, there would be a device-dependent
  part which actually makes sense of the byte stream by applying
  a vendor-specific protocol. This critical information can
  really only be got from the manufacturer/manual. Other than that, I
  don't think any OS-level programming would be necessary.

I'm not sure this is really something that should be part of the GIS.
The GIS should be able to import flat x,y,z,data1,data2,string1,string2
style data in a generic sense. Any hardware drivers should in turn be
generic and dump their data into a flat ascii file which many programs
can access. The two should run concurrently but independently.
?

For ideas, see how the 'gpsd' GPS interface can talk to GRASS in real
time: http://op.gfz-potsdam.de/GRASS-List/Archive/msg10661.html

It isn't wonderful, but it works. Note NMEA is a fairly generic
standard,
so this was pretty easy to do.

I'm sure someone would be happy to host a repository for add-on Free
software somewhere though.. who knows, maybe it should be added to our
giant pile of code.

- GRASS has color ramp tools, but these are -- well (cough).
  We need something more convenient, that allows the user to easily
  e.g. rescale 100 gray scale values to +/- 3 STD of the image data.

such as 'r.colors color=grey.eq'?
Specific suggestions for improvements are welcome.
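One concrete way to get the "mean +/- 3 STD" stretch today would be to build an r.colors rules file from map statistics. A sketch (in a real session the mean and standard deviation would come from `r.univar -g`; here they are hard-coded so the sketch is self-contained, and note that the exact r.colors option names differ between GRASS versions):

```shell
#!/bin/sh
# Sketch: stretch a grey ramp over mean +/- 3 standard deviations.
# The statistics below are hard-coded placeholders; a real script
# would parse them from `r.univar -g map=...` output.
mean=120.0
stddev=15.0

lo=$(awk -v m="$mean" -v s="$stddev" 'BEGIN { printf "%.1f", m - 3*s }')
hi=$(awk -v m="$mean" -v s="$stddev" 'BEGIN { printf "%.1f", m + 3*s }')

cat > grey3sd.rules <<EOF
$lo black
$hi white
nv white
EOF

# apply the rules only if we are inside a GRASS session
if command -v r.colors >/dev/null 2>&1; then
    r.colors map=mymap rules=grey3sd.rules
fi
```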

Hamish

Thanks for all the hints, Michael

Although I have been using GRASS for several years now,
I still am not aware of all the capabilities it has.
I am especially thankful for pointing out the possibility
of using GRASS's rubber-sheeting capabilities for geo-referencing
the data. I had completely forgotten all about that functionality...
We should make sure that we make as much use as possible
of what's already in place.
I completely agree about the importance of volumetric representations
of data, especially concerning archaeological stratigraphic reasoning.
Unfortunately, I have never dealt much with this form of data
representation and have no experience.
Do you have any specific thoughts on how to integrate volumetric
measurements into a geophysics infrastructure?

If everyone agrees, I would like to get started on summing up
the discussed items so far and creating a first draft white paper.
I have a heavy workload right now but am pretty confident
that I would be able to finish it over the weekend and mail
it around at the start of next week.
Good enough?

Cheers,

Benjamin

On Tue, 25 May 2004 16:10:10 -0700
Michael Barton <michael.barton@asu.edu> wrote:

Benjamin,

Certainly a module or set of modules to make this easier would be nice.
And I certainly don't want to discourage you from doing this. I just
wanted to point out that almost all of this is already built into
GRASS. You can make a slick application by using existing commands in a
shell script rather than writing them again from the ground up in C++.
Using TclTk, for example, scripts can be very sophisticated. The
display manager is a script. An alternative is to write an external
program that is geared toward geophysical survey using grasslib,
allowing you to use grass commands in your application.

I've actually done exactly what you describe last fall on one of my
sites in Spain for Cesium magnetometry data. All of it was done in
GRASS except calculating the xy coordinates of the data points (I did
this in Excel). Of course, I did each step one at a time. You could
speed this up by automating the data flow from one module to another.
The creation of xy coordinates for a stream of data points along a
transect would take new coding (the part I did in Excel). If you
incorporated a metagrid reference in this translation module so that it
assigns x&y correctly, the points for each survey grid would
automatically be referenced into your mesh.

With regularly spaced data points you could use fast v.surf.idw
interpolation (or r.binlinear) to create the grid. However, the
distance between data points is often much less than the distance
between transects. In this case, it might be necessary to use some of
the more sophisticated options of v.surf.rst. If another interpolation
routine works better for you (e.g., some form of kriging), you would
need to create a new module (this would be nice for a variety of
things). r.patch will put the maps together.

You can subset using g.region (specify extents) and/or masks (each grid
being a potential mask) for analysis. Since GRASS and most GIS programs
always create new maps from raster analysis routines, rollback is never
a problem, though accumulating files is. g.mlist will give you batch
lists of your analysis files if you name them in a consistent fashion,
so your application can track what has been done and display the
results of different filtering sessions.

You can create a set of reference points (ground control points), save
them in the format used by i.rectify. Again, you could script or
program this to make it easier. For a rectilinear grid, you only need
3-4 for a 1st order transformation. If the grids are patched, this will
georeference and rectify the entire set to whatever coordinate system
you want. Alternatively, you can do this outside GRASS with gdalwarp if
you have gdal on your system.

Clearly, if you do this a lot, it would be very handy to automate
and/or enhance these various routines in a systematic fashion. However,
you don't have to start from scratch, but can use tools already built
into GRASS to give you more bang for your buck so to speak.

One thing not mentioned is the use of true 3D volumetric modeling. This
is an area that definitely COULD use newly programmed modules. It seems
highly appropriate for archaeology and geological applications where we
actually deal with volumes of sediment (or rock) rather than surfaces.
GPR and coring data are naturals for this, but GRASS lacks much in the
way of analysis or query ability for G3D data--only a map calculator,
though this gives something to start with for someone who wanted to
work with it.

These are just some thoughts. I'd love to see more ways to use this for
archaeology and will be very interested in where you go with this. I
want to encourage you to work with this and hope you will keep me in
the loop. Thanks.

Michael
______________________________
Michael Barton, Professor & Curator
School of Human Origins, Cultures, & Societies
Arizona State University
Tempe, AZ 85287-2402
USA

voice: 480-965-6262; fax: 480-965-7671
www: http://www.public.asu.edu/~cmbarton
On May 25, 2004, at 2:34 PM, Benjamin Ducke wrote:

> Hmmm, I think these are somewhat different ideas about
> the functional level of working with geophysical data in GRASS.
> My idea (and I think also Craig's) was to actually have a high-level
> infrastructure that makes working with a lot of measurements and
> filter setups a breeze.
>
> by (quote myself)
>>> mesh of adjacent, regular measurement grids with dimension,
>>> orientation and origin in reference to the working location
>
> I was referring to this:
> When I take gradiometer measurements in the field, I first superimpose
> a mesh of adjacent, regular and equal-sized grids on the site.
> Each of these grids might be, say 20x20 m.
> I complete a series of measurements by walking zig-zag or parallel
> lines in this grid. The grid gets saved as an ASCII-list of raw
> measurement data (or transferred directly to GRASS via a serial
> device driver).
> Now, after I get home from the field, I have these things on my
> agenda:
>
> - convert each grid's ASCII data to a raster map
> if I took a measurement each 10 cm, each grid would
> result in a 200 x 200 cell raster map
> - assemble the entire mesh from all the grids
> and georeference it to my other site data
> - apply filters to all or a subset of the grids
> try out different filter combinations, roll-back
> to original data if I messed up
> - create color ramp(s)
> - output the final image
>
> Now, if there was a place to store each grid's position
> in a mesh meta structure, plus some additional information,
> I could easily do the following
> things:
>
> - read raw data into one of the grids and turn it
> into a raster map at, say, position 3 (instead
> of having to create a raster, then editing its
> header to move it to the precise position I want it)
>
> - run a number of filters on all or a selection
> of grids (say low pass filter over 1,2 and 6)
>
> - keep several sets of processing lists with different
> filters and options and switch between them
> to quickly compare results.
>
> - move the whole bunch of grids around and rotate
> them into the right position, by specifying
> coordinates of the MESH instead of every single
> grid -- and keep the overall structure intact
>
> +-+-+-+
> |1|2|3|
> +-+-+-+
> |5|6|7|
> +-+-+-+
>
> Say -- wouldn't that save a lot of work and be
> so much more fun than batch scripting?
>
>
> Cheers,
>
> Benjamin

Michael

Benjamin

[mix of replies here]
..............

> The creation of xy coordinates for a stream of data points along a
> transect would take new coding (the part I did in Excel).

If I get your meaning right, see 'r.profile -g'. This could be easily
added to r.transect.

see also 5.0/5.3's v.mkgrid.

Alternatively, some creative use of g.region + r.to.sites might be
useful for setting up a grid.

On the filtering side of things, may I suggest a g.parser shell script
with a filter option to launch them:

filter options:
all Run all filters sequentially
1 Run first filter
2 Run second filter
3 Run third filter
4 ...
5
6
7
8
9 Run ninth filter

e.g. 'r.filter.sh in=inputmap out=outputmap filter=all'
(or)
   ... filter=1,3,5,7-9
   ... filter=6,4,1

You need an option instead of flags if you want to be able to order the
running of discrete filters.
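The filter= option syntax above implies a small parsing step: expanding a comma-separated list with ranges into discrete, ordered filter numbers. A self-contained sketch of that expansion (the script and function names are hypothetical):

```shell
#!/bin/sh
# Sketch: expand a filter list such as "1,3,5,7-9" into the
# discrete, ordered filter numbers a g.parser wrapper script
# would dispatch one after another.
expand_filters() {
    echo "$1" | awk -F, '{
        out = ""
        for (i = 1; i <= NF; i++) {
            # a token containing "-" is a range like 7-9
            if (split($i, r, "-") == 2)
                for (j = r[1]; j <= r[2]; j++)
                    out = out (out ? " " : "") j
            else
                out = out (out ? " " : "") $i
        }
        print out
    }'
}

filters=$(expand_filters "1,3,5,7-9")
echo "$filters"    # 1 3 5 7 8 9
```

The wrapper would then loop over `$filters` and invoke each filter in the order given, which is exactly why an option (ordered) beats flags (unordered).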

> > Say -- wouldn't that save a lot of work and be
> > so much more fun than batch scripting?

Someone has to do the scripting at least once....
The idea is to make the low level scripts good enough so the day to day
scripts can be small and easy. Making use of existing modules means less
chance of bugs, versus writing a whole new filtering app. Also has the
side effect of making the existing modules better with time.

> GRASS has some pretty good interpolation
> modules. But there is a big hole in its lack of kriging.

I think with the R-geostats interface, there is less motivation to
reinvent that.

Hamish

This is part of what is needed, but not all of it. What happens in geophysical survey is that someone walks up and down transects within a grid; the data recorder ticks off the readings at each point; it may let the operator record the transect. As so...

1 5 1 5 1
2 4 2 4 2
3 3 3 3 3
4 2 4 2 4
5 1 5 1 5

OR

1.1 2.5 3.1 4.5 5.1
1.2 2.4 3.2 4.4 5.2
1.3 2.3 3.3 4.3 5.3
1.4 2.2 3.4 4.2 5.4
1.5 2.1 3.5 4.1 5.5

The data recorder outputs a data file more or less like this.....

1 time date value
2 time date value
3 time date value
...

These need to be translated into xy coordinates for each point given the (x) spacing between transects and the (y) spacing between data reading along transects.
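The translation itself is mechanical once the spacings are known. A self-contained sketch (the input format "transect reading value" is a simplifying assumption; real recorders differ, as discussed below; the zig-zag flip handles transects walked in alternating directions):

```shell
#!/bin/sh
# Sketch: assign x,y to each reading from the transect spacing (dx)
# and the along-transect sample spacing (dy). Input columns
# "transect reading value" are an assumed, simplified format.
dx=1.0      # metres between transects
dy=0.1      # metres between readings along a transect
rows=2      # readings per transect (needed to flip zig-zag lines)

xyz=$(printf '1 1 9.7\n1 2 9.9\n2 1 10.3\n2 2 10.1\n' | \
awk -v dx="$dx" -v dy="$dy" -v rows="$rows" '{
    t = $1; r = $2
    # on even transects the operator walks back, so reverse the order
    if (t % 2 == 0) r = rows - r + 1
    printf "%.1f %.1f %s\n", (t - 1) * dx, (r - 1) * dy, $3
}')
echo "$xyz"
```

For the four sample readings this yields x,y,value triples ready for import as sites/vector points; a real module would read the recorder file instead of an inline `printf`.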

I do think it would be handy for other reasons to add a -g flag to r.transect
____________________
C. Michael Barton, Professor
School of Human Origins, Cultures, & Societies
PO Box 872402
Arizona State University
Tempe, AZ 85287-2402
USA

Phone: 480-965-6262
Fax: 480-965-7671
www: <www.public.asu.edu/~cmbarton>
On May 27, 2004, at 1:49 AM, Hamish wrote:

If I get your meaning right, see 'r.profile -g'. This could be easily
added to r.transect.

see also 5.0/5.3's v.mkgrid.

This is part of what is needed, but not all of it. What happens in
geophysical survey is that someone walks up and down transects within
a grid; the data recorder ticks off the readings at each point; it may
let the operator record the transect. As so...

[...]

1.1 2.5 3.1 4.5 5.1
1.2 2.4 3.2 4.4 5.2
1.3 2.3 3.3 4.3 5.3
1.4 2.2 3.4 4.2 5.4
1.5 2.1 3.5 4.1 5.5

The data recorder outputs a data file more or less like this.....

1 time date value
2 time date value
3 time date value
...

These need to be translated into xy coordinates for each point given
the (x) spacing between transects and the (y) spacing between data
reading along transects.

Sounds like a job for Matlab/Octave to me .. We don't get to lay nice
grids over our study sites at sea (or manage to get the ship to go in a
straight line anyway), so what I've done in the past is to log the NMEA
output from a GPS into a laptop while our instruments spit out
"1,time,date,value" style data which gets recorded into another file[*].
A Matlab postprocessing script does a linear interpolation of position
from the NMEA log for each data point, and directly writes a GRASS sites
file. Any spatial interpolations of the data can then be done from
there in the GIS. [It's not as hard as it might sound]
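The interpolation step described above is ordinary linear interpolation in time. A self-contained sketch for a single reading bracketed by two GPS fixes (all values are placeholders, timestamps in seconds, clocks assumed synchronised; a real script would loop over the whole NMEA log):

```shell
#!/bin/sh
# Sketch: interpolate a position for one instrument reading from
# the two GPS fixes that bracket it in time.
t0=100; e0=500.0; n0=1000.0     # earlier GPS fix: time, easting, northing
t1=110; e1=510.0; n1=1004.0     # later GPS fix
tm=105                          # instrument reading timestamp

pos=$(awk -v t0="$t0" -v e0="$e0" -v n0="$n0" \
          -v t1="$t1" -v e1="$e1" -v n1="$n1" -v tm="$tm" 'BEGIN {
    f = (tm - t0) / (t1 - t0)               # fraction of the interval
    printf "%.1f %.1f", e0 + f*(e1-e0), n0 + f*(n1-n0)
}')
echo "$pos"    # 505.0 1002.0
```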

[*] all clocks synchronized to GPS time beforehand, of course

If you already have a grid set out, you can skip all the GPS stuff & the
script is all the easier.

I don't see how this could be implemented as a generic GIS module.
Each job is custom..

Note you can deal with no-data points (NULL) in the GRASS 5.0/5.3 sites
format by using a numerical value of 'nan'. e.g.:
easting|northing|#1 %nan @"date/time"

This probably won't work with most sites modules, but it works with more
than you might think as 'nan' is a valid floating point number on many
systems. e.g. 'd.site.labels attr=double' doesn't mind it.

I do think it would be handy for other reasons to add a -g flag to
r.transect

I think so too; done.
also fixed a bug (G_find_cell() was stripping the mapset off the map name).

Hamish

On May 28, 2004, at 10:25 PM, Hamish wrote:

Sounds like a job for Matlab/Octave to me .. We don't get to lay nice
grids over our study sites at sea (or manage to get the ship to go in a
straight line anyway), so what I've done in the past is to log the NMEA
output from a GPS into a laptop while our instruments spit out
"1,time,date,value" style data which gets recorded into another file[*].
A Matlab postprocessing script does a linear interpolation of position
from the NMEA log for each data point, and directly writes a GRASS sites
file. Any spatial interpolations of the data can then be done from
there in the GIS. [It's not as hard as it might sound]

Hamish

This seems like a very nice setup for your case. For the geophysical survey I've been involved with (admittedly limited), the scale is too small for GPS to be of use unless high-resolution differential correction is applied (data points along transects separated by centimeters and transects separated by a few meters). The grid is set up in advance, however (a luxury you don't have at sea). So the xy coordinate values have to be assigned by 'dead reckoning' based on the predefined transect grid. I haven't used Matlab or Octave, but I suppose they could do this kind of transformation. It's pretty straightforward in a spreadsheet. However, if this is the common way this kind of work is done for magnetometry, resistivity, conductivity, and GPR, then it would be feasible to design some kind of pre-processing module. What I don't know is the extent to which the ASCII output from the relevant data recorders used by different manufacturers is sufficiently standardized so as to plug into such a module. If everyone records 'new transect' and 'data point' in very different ways, this could be difficult, as you suggest.

[*] all clocks synchronized to GPS time beforehand, of course

If you already have a grid set out, you can skip all the GPS stuff & the
script is all the easier.

I don't see how this could be implemented as a generic GIS module.
Each job is custom..

Michael
____________________
C. Michael Barton, Professor
School of Human Origins, Cultures, & Societies
PO Box 872402
Arizona State University
Tempe, AZ 85287-2402
USA

Phone: 480-965-6262
Fax: 480-965-7671
www: <www.public.asu.edu/~cmbarton>

Is anyone in the GRASS community familiar with the
Industrial Source Complex Long Term (ISCLT3) Model of the
U.S. Environmental Protection Agency? The ISCLT3 model
combines industrial source data with meteorological data
(the input data) and computes concentrations of pollutants
at specified receptor locations (the output data). (I think
that it is a frequently used implementation of the plume
model genre. See http://www.weblakes.com/lakeepa3.html for
the ISCLT3 model itself, which seems to be about 20,000
lines of Fortran.)

Would it make sense to port the ISCLT3 model to GRASS?
Would there be interested users? What would be involved?
Is this project realistic and interesting? I am an
intermediate GRASS user and intermediate programmer but have
not done any development for GRASS. Should I start by
examining the GRASS simulation models at http://grass.itc.it/modelintegration.html ?

BTW, are archives of GRASSClippings, the Journal of Open
GIS, available anywhere on the web? I found a reference to
a plume-modeling article and wanted to follow up. Citation
to Fred Limp (GrassClippings 1992, Vol 6, no. 2, pg. 15).
Citation at http://grass.itc.it/pipermail/grassuser/1994-October/022603.html

I'd be grateful for encouragement or discouragement and
suggestions.

Best regards,
Michael Ash

P.S. Notes to myself

GIS and Risk Assessment: A fruitful combination
http://gis.esri.com/library/userconf/proc96/TO50/PAP028/P28.HTM

The OLAF algorithm
http://www.mathematik.uni-dortmund.de/lsx/research/projects/fliege/olaf/olaf.html

For the geophysical survey I've been involved with (admittedly
limited), the scale is too small for GPS to be of use unless
high-resolution differential correction is applied (data points along
transects separated by centimeters and transects separated by a few
meters). The grid is set up in advance, however (a luxury you don't
have at sea). So the xy coordinate values have to be assigned by
'dead reckoning' based on the predefined transect grid.

Ok.
We do set up transects in advance btw, it's just the boat doesn't always
want to go in a straight line.

I haven't used Matlab or Octave, but I suppose they could do this kind
of transformation. It's pretty straightforward in a spreadsheet.

But not automated, is susceptible to typos, and it isn't as easy to pass
off to a student to use.

However, if this is the common way this kind of work is done for
magnetometry, resistivity, conductivity, and GPR, then it would be
feasible to design some kind of pre-processing module.

I was thinking a script could be made around r.profile/r.transect -g.

e.g.:
g.mkgrid.sh start_point=e,n size=1 columns=25 rows=30 rotation=360

To make a 30x25 grid of 1m cells, with one corner at [e,n] and oriented
X degrees from true North.

It could output either a set of coordinates in a sites/points file, a
vector file like v.mkgrid, etc. depending on your needs.
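The coordinate arithmetic such a script would need is standard 2-D rotation about the start corner (the `g.mkgrid.sh` name and options above are hypothetical). A self-contained sketch that emits one point per cell corner:

```shell
#!/bin/sh
# Sketch: emit the cell-corner coordinates of a rows x columns grid
# of 'size'-metre cells, rotated 'rot' degrees about the start
# corner [east,north]. All values are placeholders.
east=1000; north=5000
size=1; columns=3; rows=2; rot=30

points=$(awk -v e="$east" -v n="$north" -v s="$size" \
    -v c="$columns" -v r="$rows" -v a="$rot" 'BEGIN {
    rad = a * atan2(0, -1) / 180          # degrees -> radians
    for (i = 0; i <= r; i++)
        for (j = 0; j <= c; j++) {
            x = j * s; y = i * s
            # rotate about the origin corner, then translate
            printf "%.2f %.2f\n", e + x*cos(rad) - y*sin(rad), \
                                  n + x*sin(rad) + y*cos(rad)
        }
}')
echo "$points" | head -n 1    # 1000.00 5000.00
```

From there the point list could be written as a sites/points file or turned into vector lines, much as v.mkgrid does for the unrotated case.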

Use r.transect to get the x coordinates, then run again to get the y
coordinates, then fill in the blanks. Or if rotated, use the results
from the first profile as starting points for running r.transect again
for each 'y' line sequentially.

It would probably need r.profile's res= option ported to r.transect,
and maybe a dummy raster to use as a backdrop if you didn't want to make
a new C module.

It's all possible.. good luck.

What I don't know is the extent to which the ascii output from
the relevant data recorders used by different manufacturers is
sufficiently standardized so as to plug into such a module. If
everyone records 'new transect' and 'data point' in very different
ways, this could be difficult as you suggest.

The probability of different manufacturers using the same output format
is near zero. The probability of the same manufacturer using the same
output format twice is pretty slim as well. This situation is what paid
my way through university. :wink:

Hamish