[GRASS-user] r.sun use - automatically stopped process?

Hi all,

I'm using r.sun to calculate shadows and solar irradiance.

I've tried to run both commands on my raster:

r.sun -s elevin=calcul_rast incidout=shadow day=80 time=17

r.sun -s elevin=calcul_rast aspin=SOLaspect slopein=SOLslope day=80
beam_rad=b_rad.080 diff_rad=d_rad.080 refl_rad=r_rad.080

The script starts running, but after a while I always get the same error
message: *Processus arrêté* (process stopped).

Here are my raster file's properties:

zone: 0
datum: towgs84=0,0,0,0,0,0,0
ellipsoid: grs80
north: 6834498.4129157
south: 6818891.28705492
west: 658872.26712997
east: 672363.87733357
nsres: 0.19999905
ewres: 0.20000015
rows: 78036
cols: 67458
cells: 5264152488

Is my raster file too big? Is there a workaround? I really do need to work
on a large area, since I also want to compute mountain shadows.

Thanks for your help,
simo

--
View this message in context: http://osgeo-org.1560.x6.nabble.com/r-sun-use-automaticcaly-stopped-process-tp5047682.html
Sent from the Grass - Users mailing list archive at Nabble.com.

simogeo wrote:

I'm using r.sun to calculate shadows and solar irradiance.

...

rows: 78036
cols: 67458
cells: 5264152488

Is my raster file too big? Is there a workaround?

how much RAM do you have? Running 32- or 64-bit? Which operating
system?

Do the same commands work at a coarser resolution? A 20000x20000
region size is known to work ok; anything bigger than about
45000x45000 gets into 64-bit/LFS territory.
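A quick back-of-the-envelope check of where that 45000x45000 figure comes
from (assuming the limit in play is the signed 32-bit cell count, 2^31):

```python
# Where the ~45000x45000 danger zone comes from: the total cell
# count approaches the signed 32-bit limit (2^31) that non-LFS
# builds are constrained by.
print(45000 * 45000)          # 2025000000, just under 2**31
print(2**31)                  # 2147483648
print(46341 * 46341 > 2**31)  # True: ~46341 cells per side is the crossover
```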

Hamish

ps- `g.region res=0.2 -a` will clean up/round off the messy
resolution values, if you like.

Hi Hamish,

Thanks for your reply.

Indeed, my post was incomplete - sorry for that. I'm running the script
under Ubuntu 12.04, 64-bit, with an Intel® Core™ i7 M 640 CPU @ 2.80GHz × 4
and 5.8GB RAM.

I assume the hardware is able to do the hard work I'm asking of it.

A lack of memory should produce a segmentation fault, no? The current error
message is not very explicit.

I've tried changing the resolution to 0.5 but I still have the same issue
... I will keep investigating. Any clue/idea would be appreciated.


simogeo wrote:

I'm running the script under Ubuntu 12.04, 64-bit, with an Intel®
Core™ i7 M 640 CPU @ 2.80GHz × 4 and 5.8GB RAM.

ok

I assume the hardware is able to do the hard work I'm
asking of it.

A lack of memory should produce a segmentation fault, no?
The current error message is not very explicit.

a seg fault happens due to programmer error: the program was
told to write memory outside of where it should have. A seg
fault should not normally happen due to an out-of-memory
error.

when a program eats up all the system memory, the OS is left
with a choice of what to do, as it needs memory too, and the
choice it makes is to protect itself and kill the process that
is eating up all the memory. I'm not sure what the translation
would look like; for me I think it's like "Process killed."

I've tried changing the resolution to 0.5 but I still have
the same issue
... I will keep investigating. Any clue/idea would be
appreciated.

try watching the process in `top` in a Terminal window to
see how much memory it uses, and keep going and try changing
the region res= to be as coarse as 5m.

Hamish

hamish-2 wrote

a seg fault happens due to programmer error: the program was
told to write memory outside of where it should have. A seg
fault should not normally happen due to an out-of-memory
error.

when a program eats up all the system memory, the OS is left
with a choice of what to do, as it needs memory too, and the
choice it makes is to protect itself and kill the process that
is eating up all the memory. I'm not sure what the translation
would look like; for me I think it's like "Process killed."

Thanks for the clarification. Indeed, it should be "process killed".

hamish-2 wrote

try watching the process in `top` in a Terminal window to
see how much memory it uses, and keep going and try changing
the region res= to be as coarse as 5m.

I haven't seen memory usage reach 100%, but that must indeed be the cause,
since it works at a lower resolution (for example region res=2).

This issue is *kind of solved*, since it appears to be a limitation of the
hardware itself.

By the way, I use a raster having the following characteristics :
-----------
rows: 55109
cols: 54792
cells: 3019532328
-----------

but it has lots of null values (see image below). Is there a way to speed up
the execution by ignoring the null values (I guess they are null values -
they display "*" when queried)?

<http://osgeo-org.1560.x6.nabble.com/file/n5047964/snap.png>

Many thanks for sharing all this, Hamish.

Bye,
simon


On Thu, Apr 18, 2013 at 10:52 AM, simogeo <simon.georget@gmail.com> wrote:

under Ubuntu 12.04 - 64bits with Intel® Core™ i7 CPU M 640 @ 2.80GHz × 4 and
5.8GB RAM.

Please also tell us which GRASS version you use.

On Thu, Apr 18, 2013 at 2:57 PM, simogeo <simon.georget@gmail.com> wrote:


By the way, I use a raster having the following characteristics :
-----------
rows: 55109
cols: 54792
cells: 3019532328
-----------

2^31 = 2147483648, but you have 3019532328 cells.

Probably you exceed the maximum file size limit for non-LFS-enabled GRASS GIS:
http://grasswiki.osgeo.org/wiki/Large_raster_data_processing
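The arithmetic, spelled out (a quick Python check; the 2^31 figure is the
signed 32-bit limit mentioned above):

```python
# Compare the region's cell count against the signed 32-bit limit
MAX_INT32 = 2**31         # 2147483648
cells = 55109 * 54792     # rows * cols from the region above
print(cells)              # 3019532328
print(cells > MAX_INT32)  # True -> past the non-LFS limit
```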

It depends on the version you use and how it was compiled.

Markus

Hi Markus,

I use version 6.4.2 as a package on a 64-bit system. From what you said,
LFS is not enabled - the wiki page mentions only 32-bit systems. Now it's
clear enough.

I've just installed GRASS 7 using the ubuntugis PPA. Could you tell me if I
can share the GRASS database folder between both GRASS versions (6.4 and 7)
without messing up files?

Thanks,
simo


I found my answer on the wiki itself. Sorry for the noise.
I'm posting the links for others:

http://grasswiki.osgeo.org/wiki/Upgrading_GRASS_database
http://grasswiki.osgeo.org/wiki/Convert_all_GRASS_6_vector_maps_to_GRASS_7


Me again,

As said before, I use Ubuntu 12.04, 64-bit. The wiki page
<http://grasswiki.osgeo.org/wiki/Large_raster_data_processing> mentions
memory usage limits only for 32-bit systems. After your first reply,
Markus, I thought the 2^31 limit might perhaps also affect 64-bit systems.

I've since installed GRASS 7.0 (from the ubuntugis-testing PPA), since LFS
is enabled there natively. But I've run some tests on the same raster file
and got exactly the same result.

With res=5 or res=1 it works well, but at the raster's native resolution
(0.2) the process is killed again.

How can I be sure that LFS is enabled? I added swap space (2GB) to my
partition as described on the wiki
<http://grasswiki.osgeo.org/wiki/Memory_issues> and tried again, but the
issue is still there.

Did I miss or misunderstand something? Thanks.


Hi simo,

On Mon, Apr 22, 2013 at 10:51 AM, simogeo <simon.georget@gmail.com> wrote:

Hi Markus,

I use version 6.4.2 as a package on a 64-bit system. From what you said,
LFS is not enabled - the wiki page mentions only 32-bit systems. Now it's
clear enough.

So you got it precompiled? Then it might be a good idea to notify the
package maintainer about this.

This page seems to be updated:
http://grasswiki.osgeo.org/wiki/Compile_and_Install_Ubuntu#GRASS_GIS

I've just installed GRASS 7 using the ubuntugis PPA. Could you tell me if I
can share the GRASS database folder between both GRASS versions (6.4 and 7)
without messing up files?

Basically yes: the raster format is the same, and vectors are easily convertible:

http://grasswiki.osgeo.org/wiki/Convert_all_GRASS_6_vector_maps_to_GRASS_7
(also back, see there)

Markus

simogeo wrote:

As said before, I use Ubuntu 12.04, 64-bit. The wiki
page <http://grasswiki.osgeo.org/wiki/Large_raster_data_processing> mentions
memory usage limits only for 32-bit systems. After your
first reply, Markus, I thought the 2^31 limit might
perhaps also affect 64-bit systems.

...

With res=5 or res=1 it works well, but at the raster's
native resolution (0.2) the process is killed again.

Hi,

just looking in the code, I see a few things which might be
suspicious in the INPUT_part() function,
  https://trac.osgeo.org/grass/browser/grass/trunk/raster/r.sun/main.c#L761
maybe something there needs to be off_t instead?

but mainly I think it's just that the module wants a lot of RAM,
and the process gets killed when it asks for too much.

here are some tests on a few months old 6.4.svn build
(6.4.3svn.50937) on 64bit linux.

# Mode 2 (integrated daily irradiation) at spearfish
g.region rast=elevation.10m res=${*}
r.sun -s elevin=elevation.10m lin=2.5 alb=0.2 day=172 \
   beam_rad=b.172 diff_rad=d.172 \
   refl_rad=r.172 insol_time=it.172

as you can see, allocating >4gb RAM works ok for me, so it
is likely not a LFS/32/64bit problem.

rows: 27960
cols: 37980
cells: 1061920800
-> in swap, >16gb RAM, (~19gb?)

rows: 22368
cols: 30384
cells: 679629312
-> 12gb

rows: 18640
cols: 25320
cells: 471964800
-> 8.8g

rows: 13980
cols: 18990
cells: 265480200
-> 5GB

rows: 6990
cols: 9495
cells: 66370050
-> 1.2g

rows: 2796
cols: 3798
cells: 10619208
-> 0.2gb time: 37m42s

plotting it out, memory use seems to grow linearly with
number of cells. from earlier experiments, time does as well.

by my calcs, a 60000x60000 cell region would want ~64gb RAM.
However it would take a long, long time to get there: in the
last example above, a region a bit bigger than 3000x3000 cells
took over half an hour on a few-months-old fast i7 CPU w/ 16GB RAM.
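Turning those benchmark figures into a rough per-cell rate (a
back-of-the-envelope sketch, assuming memory use really is linear in
cell count as the runs suggest):

```python
# Estimate bytes of RAM per cell from the benchmark runs above,
# assuming memory use is linear in the number of cells.
cells  = [10619208, 66370050, 265480200, 471964800, 679629312]
ram_gb = [0.2, 1.2, 5.0, 8.8, 12.0]

# average GB-per-cell rate over the runs
gb_per_cell = sum(r / c for r, c in zip(ram_gb, cells)) / len(cells)

print(gb_per_cell * 1e9)       # ~18.4 bytes of RAM per cell
print(gb_per_cell * 60000**2)  # ~66 GB for a 60000x60000 region
```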

I don't think Seth's GPU OpenCL acceleration is going to help
there, since GPU RAM is often limited and the I/O to it is a
bottleneck; even running 8 or 16x faster with multithreading
would still take too long for Mode 2 daily integration runs.

So I think your best bet is to use a coarser resolution, then
make it finer until you hit a time or RAM limit. Do you have
that fine of a DEM anyway? LiDAR is often binned to a 2m cell
size...

Hamish

Hamish wrote:

by my calcs, a 60000x60000 cell region would want ~64gb RAM.

and your 0.2m resolution run would want ~94gb RAM.

To fit in 5.8gb RAM you could have at most about a 17860x17860
region size. I see your region bounds are 15607m x 13492m, so 1m cell
resolution would fit well, and would probably finish overnight.
Is your DEM as fine as 1m res?
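The same back-of-the-envelope rate (~18.4 bytes per cell, derived from the
benchmarks in the previous message; an assumed figure, not measured on this
dataset) reproduces both estimates:

```python
# Sanity-check the two estimates above with an assumed rate of
# ~18.4 bytes of RAM per cell (from the earlier benchmark runs).
GB_PER_CELL = 1.84e-8

cells_02m = 78036 * 67458       # the 0.2 m region from the first post
print(GB_PER_CELL * cells_02m)  # ~97 GB, in the ballpark of ~94

# largest square region that fits in 5.8 GB of RAM
side = int((5.8 / GB_PER_CELL) ** 0.5)
print(side)                     # 17754, close to 17860
```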

Hamish

On Tue, Apr 23, 2013 at 6:24 AM, Hamish <hamish_b@yahoo.com> wrote:

simogeo wrote:

As said before, I use Ubuntu 12.04, 64-bit. The wiki
page <http://grasswiki.osgeo.org/wiki/Large_raster_data_processing> mentions
memory usage limits only for 32-bit systems. After your
first reply, Markus, I thought the 2^31 limit might
perhaps also affect 64-bit systems.

...

With res=5 or res=1 it works well, but at the raster's
native resolution (0.2) the process is killed again.

Hi,

just looking in the code, I see a few things which might be
suspicious in the INPUT_part() function,
  https://trac.osgeo.org/grass/browser/grass/trunk/raster/r.sun/main.c#L761
maybe something there needs to be off_t instead?

but mainly I think it's just that the module wants a lot of RAM,
and the process gets killed when it asks for too much.

That's why there is the numpartitions option. Try r.sun with
numpartitions=N for some N > 1.

HTH,

Markus M

_______________________________________________
grass-user mailing list
grass-user@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/grass-user

Hi,

Hamish: thanks again for all the details regarding resolutions and RAM
use. All the examples in your last message give me a better idea of what
I can do with my dataset/system configuration.

The reason I want the best resolution is that I'm calculating solar
irradiation on buildings (roofs). A 0.2m resolution is ideal, 0.5 can be
accurate enough ... 1m is just acceptable! ;-)
I will try to optimize my processing by running it not on the full
extent but on each administrative boundary, and then merging the results.

Markus Metz-3 wrote

That's why there is the numpartitions option. Try r.sun with
numpartitions=N for some N > 1.

Indeed, but I'm calculating shadows, and the numpartitions option does
not work with the -s flag (shadow) or the horizon option. Thanks anyway.
