[GRASS-user] r.walk usage

Good Morning Everyone,

I’m looking for some help with the r.walk tool. I’m finding some tutorials and man pages with examples but having trouble applying what I’ve found. I’m trying to create mobility models for use with lost person analytics.

To start off, I’m running QGIS 2.18.13 64-bit on a laptop with Windows 10 64-bit. I’ve posted a similar question on the QGIS Dev Listserv and they recommended I re-post here.

Q1: Does it matter what format the input layers are in? Should they all be UTM since the result is in meters and seconds? Or is the tool smart enough to mix and match WGS84 and UTM layers? And along the same lines, does my DEM need to have cells in meters? Most of what I get comes native in WGS84 (lat/lon) with altitudes in feet, so if I don’t need to add extra steps converting to UTM and cells to meters that’s a good thing.

For the friction layer, the example on the man page appears to use landclass96, which has only 7 classifications. The NLCD 2011 land classification file I have has 30-40 classifications, with values ranging from 11 to 95. This was initially hard for me to understand, but then I stumbled upon the man page for r.reclass and saw that the NLCD data was being generalized into the major sub-groups. So I did the same: I first converted the large set of classifications into the 7 groups, then created a friction layer using the example for r.walk.

Q2: Now, with everything in UTM and my DEM's cells in meters, I'm trying to run r.walk, but the algorithm is never able to finish; I assume it's because I'm running out of HD space. Is it normal for an r.walk over a DEM covering an area of 10,500 acres to use up 12 GB of HD space and want more? That's using up every bit of space I have left, so it's crashing. I tried setting the maximum cumulative cost to 10 and then to 1, but the algorithm keeps running for a long time, creating these really large files. My understanding from the man page is that the values calculated are in seconds, so when I set the maximum cumulative cost to 1 or 10 seconds it should finish really quickly, right? Maybe I'm missing something.

Here's the command as run from within QGIS:

r.walk elevation="tmp151153693667" friction="tmp151153693668" start_points="tmp15115369366310" walk_coeff="0.72,6.0,1.9998,-1.9998" lambda="1" slope_factor="-0.2125" max_cost="1" null_cost="0" memory="2048" output=output8468f646ce7b4b838f8c319a88ad3d73 outdir=outdir8468f646ce7b4b838f8c319a88ad3d73 --overwrite

Thanks,

Josh Q

On Fri, 24 Nov 2017 10:46:11 -0500,
"Joshua Quesenberry" <engnfrc@gmail.com> wrote:

> Good Morning Everyone,
>
> I'm looking for some help with the r.walk tool. I'm finding some
> tutorials and man pages with examples but having trouble applying
> what I've found. I'm trying to create mobility models for use with
> lost person analytics.
>
> To start off, I'm running QGIS 2.18.13 64-bit on a laptop with
> Windows 10 64-bit. I've posted a similar question on the QGIS Dev
> Listserv and they recommended I re-post here.
>
> Q1: Does it matter what format the input layers are in? Should they
> all be UTM since the result is in meters and seconds? Or is the tool
> smart enough to mix and match WGS84 and UTM layers? And along the
> same lines, does my DEM need to have cells in meters? Most of what I
> get comes native in WGS84 (lat/lon) with altitudes in feet, so if I
> don't need to add extra steps converting to UTM and cells to meters
> that's a good thing.

GRASS GIS works with "locations", each of which has a defined projection; all data in a location must be in that projection. You cannot combine data from different locations in one module call.

I don't know how QGIS handles multiple inputs to one GRASS module. If the data is imported into a temporary location using v.import with reprojection, then this could work, provided that the original projections of the inputs can be read from the files.

So this is something we cannot answer here on the GRASS list.
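If you do the import yourself inside a GRASS session (rather than letting QGIS handle it), a sketch along these lines is one way to bring everything into a single metric location. The file names and the 10 m resolution are placeholders, not values from the original post, and this assumes GRASS 7's r.import/v.import, which reproject on the fly:

```shell
# Run inside a GRASS session whose location uses a metric CRS (e.g. UTM).
# File names and the target resolution are illustrative placeholders.

# Import and reproject the WGS84 DEM into the current (UTM) location;
# r.import reads the source CRS from the GeoTIFF and reprojects on import.
r.import input=dem_wgs84.tif output=dem resolution=value resolution_value=10

# If the DEM's elevations are in feet, convert them to meters afterwards.
r.mapcalc "dem_m = dem * 0.3048"

# Vector inputs (e.g. the start points) go through v.import the same way.
v.import input=start_points.shp output=start_points
```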

> For the friction layer, the example on the man page appears to use
> landclass96 and only has 7 classifications... the NLCD 2011 land
> classification file I have has 30-40 classifications ranging from 11
> to 95, this was initially hard for me to understand but then I
> stumbled upon the man page for r.reclass and saw that the NLCD data
> was being generalized into the major sub-groups so I did the same by
> first converting the large set of classifications into the 7 groups
> and then I created a friction layer using the example for r.walk.

You can create a friction map from your 30-40 class classification, as
long as you have an idea of the values to use for each class.
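For example, something along these lines. The class groupings follow NLCD's major groups, but the friction values themselves are invented placeholders; pick numbers that reflect lost-person travel. Note also that r.reclass produces integer categories only, so if you need floating-point friction you would use r.recode instead:

```shell
# Collapse NLCD 2011 classes directly into friction values.
# The friction numbers here are made-up placeholders, not recommendations.
r.reclass input=nlcd2011 output=friction rules=- << EOF
21 22 23 24 = 1  developed
31          = 2  barren
41 42 43    = 5  forest
51 52       = 4  shrubland
71 72 73 74 = 3  herbaceous
81 82       = 3  cultivated
90 95       = 8  wetlands
11 12       = NULL  open water / ice
EOF
```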

> Q2: Now with everything in UTM and my DEM's cells being in meters I'm
> trying to run r.walk and the algorithm is never able to finish I
> assume because I'm running out of HD space. Is it normal for an
> r.walk over a DEM covering an area of 10,500 acres to use up 12GB of
> HD space and want more? That's using up every bit of space I have
> left, so it's crashing out. I tried setting the maximum cumulative
> cost to 10 and 1 also, but the algorithm keeps running for a long
> time making these really large files. My understanding from the man
> page is that the values calculated are in seconds, so when I set
> maximum cumulative cost to 1 or 10 seconds that should be really
> quick, right? Maybe I'm missing something.
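On the units: as I read the r.walk manual, the cumulative cost is a modified Naismith walking time in seconds, built from the walk_coeff values a,b,c,d in the command above and blended with the friction map via lambda. A small sketch in plain Python; the exact handling of the slope_factor cutoff between moderate and steep downhill is my reading of the manual, not verified against the source:

```python
# Sketch of r.walk's per-step cost as a modified Naismith rule.
# The slope_factor cutoff logic for downhill steps is an assumption.

def walk_seconds(ds, dh, a=0.72, b=6.0, c=1.9998, d=-1.9998,
                 slope_factor=-0.2125):
    """Estimated walking time in seconds for one step.

    ds : horizontal distance moved (metres)
    dh : elevation change (metres, negative = downhill)
    """
    t = a * ds                      # flat-ground term
    if dh > 0:
        t += b * dh                 # uphill penalty
    elif dh < 0:
        if dh / ds > slope_factor:  # gentle downhill: walking gets faster
            t += c * dh
        else:
            t += d * dh             # steep downhill slows you down again
    return t

def total_cost(ds, dh, friction, lam=1.0):
    # r.walk combines time and friction roughly as time + lambda*friction*ds
    return walk_seconds(ds, dh) + lam * friction * ds

print(walk_seconds(100, 0))  # flat 100 m at a=0.72 -> 72.0 seconds
```

This is why a max_cost of 1 second covers almost no ground: on flat terrain with zero friction, 1 second buys you well under 2 m of travel from a start point.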

This sounds like an issue with the computational region, i.e. the grid (extent and resolution) defined for creating rasters: probably a resolution that is much too high for the given extent, leading to an enormous number of pixels. Normally you can define the resolution in the module GUI in QGIS. Try setting this to a higher value (i.e. a lower resolution).
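The point about resolution can be made concrete with a little arithmetic (plain Python, no GRASS required; the ~24 bytes/cell working figure is a rough assumption, not a number from r.walk's documentation):

```python
# How the computational region's resolution drives the raster size
# for the ~10,500-acre area from the question.
# The ~24 bytes/cell memory figure is a rough assumption.

ACRE_M2 = 4046.86            # square metres per acre
area_m2 = 10_500 * ACRE_M2   # ~42.5 km^2

def n_cells(res_m):
    """Number of raster cells covering the area at a given resolution (m)."""
    return area_m2 / (res_m * res_m)

for res in (10, 1, 0.1):
    cells = n_cells(res)
    print(f"{res:>4} m resolution -> {cells:>13,.0f} cells, "
          f"~{cells * 24 / 1e9:6.1f} GB at 24 bytes/cell")
```

At 10 m the area is under half a million cells; at 1 m it is ~42 million; at 0.1 m it balloons to ~4.2 billion, which would easily explain 12 GB of temporary files.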

Moritz