[GRASS-dev] r.sun at a landscape scale

Hi folks,
Just forwarding the conversation to add to the record.

Doug

----- Forwarded by Doug Newcomb/R4/FWS/DOI on 07/27/2011 09:23 AM -----

Doug Newcomb <newcomb.ttg@gmail.com>
04/29/2011 09:15 AM

To: Doug Newcomb <Doug_Newcomb@FWS.gov>
Subject: Fwd: Re: r.sun at a landscape scale

-------- Original Message --------
Subject: Re: r.sun at a landscape scale
Date: Tue, 26 Apr 2011 01:16:05 -0700 (PDT)
From: Hamish <hamish_b@yahoo.com>
To: Doug Newcomb <newcomb.ttg@gmail.com>

Hi Doug,

> Contacting you from the home email. Work webmail non-functional.
> If you recall the 60ft canopy height layer I created for the state of
> North Carolina, I've been using that canopy height layer as the base
> elevation layer for calculating total global solar irradiation on a
> 60ft grid for the state of North Carolina. I have created slope and
> aspect layers, as well as r.horizon layers at eight 45-degree
> increments starting at 0 (East) and going counter-clockwise. I
> left the albedo and Linke options at the defaults. I have calculated
> the global solar irradiation for all 365 days using these parameters.

After a number of tests (see below), I'm of the opinion that the
pre-made slope, aspect, lat, lon, and r.horizon maps either are no
faster or introduce enough inaccuracy that they're not worth the trouble.

In particular I question the usefulness of r.horizon maps: the sun is in
a slightly different place on the compass each morning, and the overhead
of keeping/loading 360 horizon maps is greater than that of just
recalculating on the fly (and on the fly you get the exact value, not a
nearby estimate). Also, having 360 pre-made r.horizon maps limits you to
1-degree precision.

I value accuracy over processing time (I'm happy to wait a week for
an answer if I know the model setup is good), so take that with a grain
of salt. I like to use step=0.05 instead of the default step=0.5,
even though it takes 10 times longer.
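
For what it's worth, a minimal sketch of that kind of run (untested;
option names as in the GRASS 6 manual, and the map names "canopy_elev"
and "global_rad.100" are placeholders), with everything computed on the
fly rather than from pre-made maps:

    # accuracy-first r.sun run for one day; slope, aspect, lat/lon, and
    # horizons are all left for r.sun to compute internally
    import grass.script as grass

    grass.run_command('r.sun',
                      elevin='canopy_elev',       # canopy-height elevation map
                      day=100,                    # day of year
                      step=0.05,                  # time step in hours (default 0.5)
                      glob_rad='global_rad.100')  # output: global irradiation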

> I would like to sharpen the accuracy of the calculations by
> creating grid layers of the Linke turbidity similar to how you do it
> in your Python script. My efforts seem to be complicated by the
> landscape approach I have taken, which includes calculations in
> coastal plain, piedmont, and mountain areas which have a mixture of
> urban and rural land uses.
>
> I've been to the HelioClim
> http://www.helioclim.net/linke/index.html and SoDa
> http://www.soda-is.com/eng/services/climat_free_eng.php#climatCielClair
> web sites to try to get more exact estimates of the Linke turbidity,
> but the best they can do is monthly estimates at 5 arc-second
> resolution. Perhaps I want too much :-)

For me the SoDa Linke DB wasn't too useful; AFAIU it's rather
elevation-dependent, and I'm modeling light in fjords with 1000 m
vertical cliff faces. Their spatial elevation model resolution was just
too coarse for me.

What we did instead was to pick the value at each model point (~1 deg)
for each month, and then use v.surf.rst to make an interpolated Linke
map from those. I'm not sure if 5 arc-sec data was available then, or
if that was just a reinterpolation from a coarser grid, so we ignored
it..?
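
To be concrete, the point->surface step for one month might look
roughly like this (untested sketch; the vector map "linke_points_apr"
and its "linke" column are made-up names, and the option names follow
the GRASS 6 manual):

    # interpolate one month's scattered Linke station values to a raster
    import grass.script as grass

    grass.run_command('v.surf.rst',
                      input='linke_points_apr',  # ~1 deg sample points
                      zcolumn='linke',           # column holding the Linke value
                      elev='linke_apr')          # interpolated output raster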

I suppose for each position you could uncomment the couple of lines
at the bottom of the Linke month->day interpolation Python script,
which give the full year's worth of Linke values, and use the
collection of those to make 365 v.surf.rst Linke coverage maps (a
sketch of that interpolation follows). You could also mix and match
mountainous and urban values that way.
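
As a hedged sketch of that month->day step at a single station (not the
actual script; the mid-month day numbers and the Linke values here are
placeholders, and numpy is assumed):

    # linearly interpolate 12 mid-month Linke values to 365 daily values,
    # wrapping across the December/January boundary
    import numpy as np

    mid = np.array([15, 46, 74, 105, 135, 166, 196, 227, 258, 288, 319, 349])
    monthly = np.array([3.2, 3.4, 3.7, 4.0, 4.2, 4.3,
                        4.3, 4.2, 3.9, 3.6, 3.3, 3.1])  # example values only

    # pad with December of the "previous" year and January of the "next"
    # so days 1-14 and 350-365 interpolate across the year boundary
    x = np.concatenate(([mid[-1] - 365], mid, [mid[0] + 365]))
    y = np.concatenate(([monthly[-1]], monthly, [monthly[0]]))

    daily = np.interp(np.arange(1, 366), x, y)  # one Linke value per day

The collection of per-station daily values would then feed 365
v.surf.rst runs like the one sketched above.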

> In any event, my thought was to try to take the global monthly
> estimate tifs, georeference them, and pull them into GRASS; chop out
> the NC areas and reproject them into the same workspace as the canopy
> height data; take the monthly values as the mid-month values and
> interpolate to 365 days; and use those calculations as the basis for
> the Linke turbidity input to the calculation.
>
> Does that seem like a reasonable approach to you?

Currently r.series doesn't support linear interpolation extractions.
I suppose something could be whipped together using r.mapcalc's
graph() linear interpolation (a rough sketch is below); maybe hack
that to do cubic interpolations too?
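
The bracketing-months version can already be done with plain r.mapcalc
arithmetic; a rough, untested sketch (the monthly maps
linke_m01..linke_m12 are placeholder names for the reprojected grids):

    # build linke_dayDDD by linear interpolation between the two monthly
    # maps whose mid-month days bracket the given day of year
    import grass.script as grass

    MID = [15, 46, 74, 105, 135, 166, 196, 227, 258, 288, 319, 349]

    def linke_for_day(doy):
        # mid-month days paired with month numbers, padded at both ends
        # so the year boundary wraps
        months = [(d, i + 1) for i, d in enumerate(MID)]
        months = [(MID[-1] - 365, 12)] + months + [(MID[0] + 365, 1)]
        for (d0, m0), (d1, m1) in zip(months, months[1:]):
            if d0 <= doy <= d1:
                break
        w = float(doy - d0) / (d1 - d0)
        grass.mapcalc(
            "linke_day%(doy)03d = linke_m%(m0)02d * %(a)f + linke_m%(m1)02d * %(b)f"
            % dict(doy=doy, m0=m0, m1=m1, a=1.0 - w, b=w))

    for doy in range(1, 366):
        linke_for_day(doy)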

> BTW, how much does it speed things up to have lat and long layers
> in the input?

I don't think by much at all, maybe a few percent, and at the risk of
user error. Read through:
https://trac.osgeo.org/grass/ticket/498 and http://grass.osgeo.org/wiki/r.sun

If you have a new graphics card you can vastly speed it up by using
Seth's GPU version:
http://grass.osgeo.org/wiki/R.sun#OpenCL

(which I still need to work on merging into trunk)

Seth wrote:
>> The OpenCL version of r.sun runs over 20x faster than the original
>> version on my machine (2.26 GHz Mac Pro vs. GeForce GTX 285). However,
>> it is hampered by the low memory on your GPU, so you may need to
>> partition your raster.

> Just some performance FYI: running the r.sun analysis in 1 chunk
> (n=1) takes about 4 hours and requires about 16 GB of RAM. Running a
> calculation in 8 chunks (n=8) knocks the memory requirement down to
> 4.6 GB and allows me to run multiple days of r.sun simultaneously.

As in breaking up the region into smaller areas? What are the region's
rows and columns? Make sure you don't chop away a shading mountain just
outside of the region (see the sketch below)...
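
If it is spatial chunking, one way to keep the shading terrain is to
grow each chunk's region by a buffer before running r.sun; a rough,
untested sketch (the chunk count, 5 km buffer, and map names are
placeholders), where the overlapping output strips would be cropped and
patched together (e.g. with r.patch) afterwards:

    # split the full region into horizontal strips, each padded by `buf`
    # map units so terrain just outside a strip can still cast shadows
    import grass.script as grass

    n_chunks = 8
    buf = 5000.0  # extra context, in map units

    reg = grass.region()  # full-region bounds
    height = (reg['n'] - reg['s']) / n_chunks

    for i in range(n_chunks):
        s = reg['s'] + i * height
        n = s + height
        grass.run_command('g.region',
                          n=min(n + buf, reg['n']),
                          s=max(s - buf, reg['s']),
                          e=reg['e'], w=reg['w'])
        grass.run_command('r.sun', elevin='canopy_elev', day=100,
                          step=0.05, glob_rad='glob_rad.day100.chunk%d' % i)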

> I should be back to my regular email address on Monday. Feel free
> to forward any parts of this to the grass-users / grass-devel
> lists as you deem appropriate.

Ditto. In particular, some new version of r.series or r.regression[.line]
to do linear and cubic interpolations; maybe the task piques some
developer's imagination as an interesting project.

Hamish