[GRASS-user] r.proj use of memory

I have a large DEM that I am reprojecting from lon/lat to Lambert Conformal
Conic, and I specified the lanczos_f method to minimize smoothing (the west
edge of the DEM is at the coast).

With 32G RAM in the host I specified memory=4000 on the r.proj command line.
When I look at top I see the process consuming 97.5%-100.0% of a CPU (the
host has 8 cores/16 threads) but only 1.5% of available memory. Since I
allocated about 12% of available memory to this process, why isn't r.proj
using it?
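For reference, the invocation looks something like this (the location and map
names here are placeholders, not my actual ones):

```shell
# Reproject a 10 m lon/lat DEM into the current LCC location.
# 'll_location' and 'dem10m' are hypothetical names; memory= is in MB.
r.proj location=ll_location mapset=PERMANENT input=dem10m \
       method=lanczos_f memory=4000 output=dem10m_lcc
```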

So far the process has been running for 2 hours; I thought that 4G of memory
(rather than the default 300M) would decrease processing time.

I would like to learn how to minimize processing time when modules accept a
higher-than-default amount of memory dedicated to them and the maps are very
large.

Thanks in advance,

Rich

On Thu, 12 Sep 2019, Rich Shepard wrote:

I have a large DEM reprojecting from lon/lat to Lambert Conformal Conic and
specified the lanczos_f method to minimize smoothing (the west edge of the
DEM is at the coast).

I would like to learn how to minimize processing time when modules accept
a higher-than-default amount of memory dedicated to them and the maps are
very large.

The wiki page on parallel GRASS jobs, in its "Working with tiles" section,
looks like the way to go.

Does anyone have a more complete example of using g.region and WIND_OVERRIDE
so I can see what I need to do with the 31 remaining large 10 m DEMs that
need to be reprojected? The ellipses in the wiki example leave me clueless
about dividing a raster of 8*10^7 cells.
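From the wiki, my best guess at the pattern is below; the region bounds and
names are hypothetical placeholders, which is exactly the part I'd like
confirmed:

```shell
# Save a named region for one tile, then run the module with
# WIND_OVERRIDE set so only that tile is processed.
g.region n=46:00n s=45:30n w=124:00w e=123:30w res=0:00:01 save=tile_sw
export WIND_OVERRIDE=tile_sw
r.proj location=ll_location mapset=PERMANENT input=dem10m \
       method=lanczos_f memory=4000 output=dem10m_lcc_sw
unset WIND_OVERRIDE   # restore the normal current region
```

Presumably I would repeat that per tile (perhaps in parallel, one mapset per
job) and then patch the pieces back together with r.patch?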

I'm looking forward to learning this,

Rich