On Sun, Dec 28, 2008 at 1:18 PM, Markus Neteler <neteler@osgeo.org> wrote:
Hi,
I am planning to calculate energy maps based on a high res
DEM (458890698 cells, FCELL type). I'll use a multi-core
computer with several GB of RAM in a cluster environment.
How should I choose the "numpartitions" value? What is the expected
memory demand? I have to tell the job scheduler how much
memory I need for each process - I want to calculate daily
energy maps, submitting each day as a separate job to a cluster node.
Here the answer from the author:
On Tue, Dec 30, 2008 at 9:34 PM, <thomas.huld@jrc.it> wrote:
Ciao Markus,
I had a look at the source code. Most of the memory is used to store the
rasters (or part of the rasters if you use numpartitions>1).
How much depends on the options you use. I have made a little table of the
input and output options and the amount of memory they use per pixel. The
annotation (if) means the raster is optional, so you may have from 1 to 6 output
rasters. The output rasters are not partitioned; only the input rasters are:
Output rasters:
    incidout     (if)   float
    insol_time   (if)   float
    beam_rad     (if)   float
    diff_rad     (if)   float
    refl_rad     (if)   float
    glob_rad     (if)   float
Input rasters (size should be divided by numpartitions):
    elevation           float
    slopein      (if)   float
    aspin        (if)   float
    linkein             float
    albedo       (if)   float
    latin        (if)   float
    longin       (if)   float
    coefbh              float
    coefdh              float
    horizon      (if)   numhorizonstep bytes
A float is 4 bytes, so you can calculate the memory usage yourself.
Horizon values are stored in the program as 1-byte integers to save memory.
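The per-pixel figures above can be turned into a quick estimator. Here is a minimal Python sketch based on Thomas's description (the function name, defaults, and the exact set of required inputs are my assumptions, not part of r.sun itself):

```python
# Rough r.sun memory estimator based on the per-pixel figures above.
# Function name and defaults are illustrative assumptions.

FLOAT_BYTES = 4  # an FCELL raster stores one 4-byte float per cell

def rsun_memory_bytes(cells, numpartitions=1, n_outputs=1,
                      n_optional_inputs=0, horizon_steps=0):
    """Estimate peak memory (bytes) for one r.sun run.

    Assumed required float inputs: elevation, linkein, coefbh,
    coefdh (4 maps).  Each optional input (slopein, aspin, albedo,
    latin, longin) adds one more float map; horizon maps add 1 byte
    per step per cell.  Output rasters are never partitioned; only
    inputs are divided by numpartitions.
    """
    output_bytes = cells * n_outputs * FLOAT_BYTES
    input_bytes = cells * (4 + n_optional_inputs) * FLOAT_BYTES
    input_bytes += cells * horizon_steps  # horizon values: 1 byte each
    return output_bytes + input_bytes // numpartitions

# Markus's DEM: 458890698 cells, one output raster, required inputs
# only, 4 partitions -> roughly 3.4 GiB
gib = rsun_memory_bytes(458890698, numpartitions=4) / 2**30
print(f"{gib:.1f} GiB")
```

This is only a lower bound for the raster arrays; as noted below, pointers, code, and smaller arrays add a few percent on top.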
I just tried one example: the expected memory use was 252000 KB while
the actual memory used (according to "ps aux") was 259000 KB. I guess the
rest is pointers, code, and smaller arrays.
Hope this helps.
Thomas
I'll add that to the manual in condensed form.
Markus