[GRASS5] [bug #4164] (grass) v.clean: uses 1 GB RAM, 1 GB swap for a vector with only 628 boundaries

this bug's URL: http://intevation.de/rt/webrt?serial_num=4164
-------------------------------------------------------------------------

Subject: v.clean: uses 1 GB RAM, 1 GB swap for a vector with only 628 boundaries

Platform: GNU/Linux/x86
grass obtained from: CVS
grass binary for platform: Compiled from Sources
GRASS Version: 2006-02-20

v.buffer used all my memory (1 GB RAM and 1 GB swap) when building a clean buffer for my vector file, and I had to kill it to continue my work.

For a test, I forced v.buffer to skip cleaning, using option debug=buffer, and tried to clean this "dirty" output myself:

v.clean input=rogow_parcels_06_water_buff100 output=rogow_parcels_06_water_buff100_cl type=boundary tool=break

Again, it ate all my memory, although rogow_parcels_06_water_buff100 is very small:

Number of boundaries: 628

v.clean reaches the memory limit at about:

Intersections: 78194 (line 150883)

There must be some problem with memory handling in the vector code.

I'm posting a location containing (only) my problematic vector file, in case somebody is interested in fixing the problem:

http://www.biol.uni.wroc.pl/sieczka/udostepnione/grass/huha.tar.bz2 (364 KB)

I'm not looking for a workaround - I already accomplished my task using r.buffer and r.to.vect.
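For reference, the raster detour mentioned above can be sketched roughly like this (a minimal sketch only: the input map name, resolution, and output names are placeholders I made up, and the option names assume GRASS 6-era modules - adjust to your own data and version):

g.region vect=rogow_parcels_06_water res=1
v.to.rast input=rogow_parcels_06_water output=water_rast use=val value=1
r.buffer input=water_rast output=water_buf100 distances=100
r.to.vect input=water_buf100 output=water_buff100_vect feature=area

The idea is simply to rasterize the vector, buffer in raster space (which uses bounded memory), and convert the buffered areas back to vector.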

Maciek

-------------------------------------------- Managed by Request Tracker


_______________________________________________
grass5 mailing list
grass5@grass.itc.it
http://grass.itc.it/mailman/listinfo/grass5

Although this may not be related to your problem, v.buffer may be calling V_build, which has a memory management problem/feature: it eats up all your memory. But that should not happen with only 628 boundaries - we see that problem with about 600,000+ points.
Maybe Radim can explain whether this may be the same issue. If you dig through the emails related to v.in.ascii and the handling of large files, there was some discussion between Hamish and Radim on how to solve it.

Helena