[GRASS-dev] Memory consumption using pygrass.utils.get_raster_for_points

Hello devs,

When running pygrass.utils.get_raster_for_points repeatedly, Python's memory consumption appears to grow continuously until all RAM is consumed, even when the extracted values are not being retained (or simply overwrite the same variable).

I noticed this when extracting raster data at the locations of a large point dataset, even though I had pre-allocated a numpy array to receive the results.

Below is an example on the nc_spm_08_grass7 sample data (in the landsat mapset), repeating the operation 50 times on the same point vector dataset. I wouldn't have expected memory consumption to increase continuously for this operation, because the 'arr' variable is overwritten on each iteration. However, repeat this enough times and you will run out of system memory; the allocated memory does not appear to be released, even if you manually force garbage collection.

Any suggestions?

from grass.pygrass.vector import VectorTopo
from grass.pygrass.raster import RasterRow
from grass.pygrass.modules.shortcuts import raster as r
from grass.pygrass.gis.region import Region
from grass.pygrass.utils import get_raster_for_points

# set region
reg = Region()
reg.from_rast("landclass96")
reg.write()
reg.set_raster_region()

# generate a large point dataset
r.random(input="landclass96", npoints=200000, vector="landclass96_roi",
         overwrite=True)

points = VectorTopo("landclass96_roi")
points.open("r")

# repeat spatial query of raster
for i in range(50):
    with RasterRow("lsat5_1987_10") as src:
        arr = get_raster_for_points(points, src)
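By manually forcing garbage collection I mean something along these lines, run after the loop (a minimal check; psutil is just one convenient way to read the process RSS and is not part of the example above):

import gc

import psutil  # third-party, only used here to read the resident set size

proc = psutil.Process()
print("RSS before collect: %.1f MB" % (proc.memory_info().rss / 1e6))
gc.collect()
print("RSS after collect:  %.1f MB" % (proc.memory_info().rss / 1e6))
# the RSS barely changes; the allocations are not returned to the OS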

As a follow-up, I tried tracking the memory usage with the Python Pympler package.

I ran the following code block:

from grass.pygrass.vector import VectorTopo
from grass.pygrass.raster import RasterRow
from grass.pygrass.modules.shortcuts import raster as r
from grass.pygrass.gis.region import Region
from grass.pygrass.utils import get_raster_for_points

# set region
reg = Region()
reg.from_rast("landclass96")
reg.write()
reg.set_raster_region()

# generate a large point dataset
r.random(input="landclass96", npoints=200000, vector="landclass96_roi",
         overwrite=True)

# memory tracking
from pympler.tracker import SummaryTracker
tracker = SummaryTracker()

points = VectorTopo("landclass96_roi")
points.open("r")

# repeat spatial query of raster
for i in range(10):
    print(i)
    with RasterRow("lsat5_1987_10") as src:
        arr = get_raster_for_points(points, src)
points.close()

tracker.print_diff()

The memory tracker results are:

                                                       types |   # objects |   total size
============================================================ | =========== | ============
                <class 'grass.pygrass.raster.buffer.Buffer'> |     2000000 |      4.16 GB
                                                        dict |     6000022 |      1.56 GB
                <class 'grass.lib.ctypes_preamble.LP_c_int'> |     2000000 |    274.66 MB
                                   <class 'ctypes.c_void_p'> |     2000000 |    274.66 MB
               <class 'numpy.core._internal.c_char_Array_0'> |     2000000 |    274.66 MB
            <class 'numpy.core._internal.LP_c_char_Array_0'> |     2000000 |    274.66 MB
                                               numpy.ndarray |     2000000 |    152.59 MB
<class 'numpy.core._internal._unsafe_first_element_pointer'> |     2000000 |    122.07 MB
                                                         int |     4001700 |     91.59 MB
                                                        list |       14032 |      2.99 MB
                                                         str |       14072 |    845.98 KB
                                                     StgDict |           2 |      1.20 KB
                                                     weakref |          12 |      1.03 KB
                                      _ctypes.PyCPointerType |           1 |        904 B
                                        _ctypes.PyCArrayType |           1 |

So, grass.pygrass.raster.buffer.Buffer is still using 4.16 GB despite the RasterRow object having been closed, and 2,000,000 of those objects remain in memory (200,000 points times 10 iterations). I think this means that for each point, the Buffer object holding the raster row read at that point's coordinate has remained in memory. There is also a dict consuming a large amount of memory; I can see that the grass.pygrass.raster.raster_type module, used by Buffer, stores the cell type of the Buffer in a dict.
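In case it is useful to anyone hitting the same problem, below is a possible workaround sketch (not a fix): run each extraction in a short-lived child process so that the operating system reclaims everything when the worker exits. This assumes a fork start method, so that the child inherits the running GRASS session environment; the names mirror the example above.

import multiprocessing as mp


def extract(vector_name, raster_name):
    # runs in the child process, so whatever memory pygrass retains
    # is released when the worker exits
    from grass.pygrass.vector import VectorTopo
    from grass.pygrass.raster import RasterRow
    from grass.pygrass.utils import get_raster_for_points

    points = VectorTopo(vector_name)
    points.open("r")
    with RasterRow(raster_name) as src:
        arr = get_raster_for_points(points, src)
    points.close()
    return arr


if __name__ == "__main__":
    ctx = mp.get_context("fork")  # fork, so the GRASS session env is inherited
    # maxtasksperchild=1 means every task gets a fresh worker process
    with ctx.Pool(processes=1, maxtasksperchild=1) as pool:
        for i in range(50):
            arr = pool.apply(extract, ("landclass96_roi", "lsat5_1987_10"))

This trades process start-up cost for bounded memory use, so it is only worthwhile when the per-call leak is large, as it is here.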

Steve
