[GRASSLIST:7100] Upper file size limit to dbf?

I am returning to GRASS after being away for some time. I was
wondering whether the new default vector format (dbf) has a low file
size limit. I am trying to load a 15 MB x,y,z file of point data
(about 550,000 points) and I get a segmentation fault (Signal 11). I
am on Windows/Cygwin; the file is 2114.dat:

GRASS 6.0.0 (HOOD):/cygdrive/c/temp/lowfall_complete/lidar > head 2114.dat
1184773.43 293979.22 000.27
1184764.46 293986.36 000.26
1185098.46 293894.99 005.44
1185098.06 293860.95 040.08
1185099.80 293846.70 059.70
1185103.26 293834.33 071.42
1185108.47 293823.20 075.91
1185108.56 293809.44 094.81
1185118.10 293801.21 082.97
1185125.92 293791.81 075.55
GRASS 6.0.0 (HOOD):/cygdrive/c/temp/lowfall_complete/lidar >
v.in.ascii in=2114.dat out=lowfall_points fs="\t" col="northing
double, easting double, depth double" x=1 y=2 cat=0
WARNING: DOS text format found, attempting import anyway
Maximum input row length: 30
Maximum number of columns: 3
Minimum number of columns: 3
column: 1 type: double
column: 2 type: double
column: 3 type: double
Signal 11
dbmi: Protocol error (invalid table/column name or unsupported column type)
      7 [main] v.in.ascii 3436 fork_copy: linked dll data/bss pass 0
failed, 0x850000..0x850030, done 0, windows pid 2808, Win32 error 487
ERROR: Cannot insert values: insert into lowfall_points values ( 461001,
       1191105.31, 300844.76, 119.52)

I should add that it uses an enormous (>1.5 GB!) amount of RAM for
this operation before crashing. I have broken the file up into
100,000-point chunks, and those import fine. However, I want to
interpolate the points into a raster surface and will eventually need
them all in one file. So, what can I do?

Thanks,

David

--
David Finlayson
Marine Geology & Geophysics
School of Oceanography
Box 357940
University of Washington
Seattle, WA 98195-7940
USA

Office: Marine Sciences Building, Room 112
Phone: (206) 616-9407
Web: http://students.washington.edu/dfinlays

David Finlayson wrote:

> I am returning to GRASS after being away for some time. I was
> wondering whether the new default vector format (dbf) has a low file
> size limit. I am trying to load a 15 MB x,y,z file of point data
> (about 550,000 points) and I get a segmentation fault (Signal 11). I
> am on Windows/Cygwin; the file is 2114.dat:
>
> GRASS 6.0.0
>
> ..
>
> I should add that it uses an enormous (>1.5 GB!) amount of RAM for
> this operation before crashing. I have broken the file up into
> 100,000-point chunks, and those import fine. However, I want to
> interpolate the points into a raster surface and will eventually need
> them all in one file. So, what can I do?

This was a memory leak bug in the DBF driver which was fixed after
6.0.0 was released. The fix is in CVS, so you can wait for the next
GRASS release, or you can try using a different database backend such
as PostgreSQL or MySQL.

There may still be a bug in the "Registering lines" part of the import,
so you might have to load your 15M points in 2-3 passes of 5M each.
Give it a try and let us know how it goes. Join the pieces with v.patch,
or perhaps v.append from the GRASS Wiki AddOns page.
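The split-and-patch workflow above can be sketched in the shell. The
awk line below only fabricates a synthetic stand-in for 2114.dat so
the example is self-contained; in practice you would split the real
file. The GRASS steps are shown as comments because they need a
running GRASS session, and the map/column names there are taken from
David's original command.

```shell
# Fabricate a stand-in for the 550,000-point x,y,z file
# (synthetic tab-separated values, for illustration only).
awk 'BEGIN { for (i = 0; i < 550000; i++)
               printf "%.2f\t%.2f\t%.2f\n",
                      1184000 + i / 100, 293000 + i / 100, i % 100 }' > 2114.dat

# Split into 100,000-line chunks: chunk_00 .. chunk_05.
split -l 100000 -d 2114.dat chunk_
ls chunk_*

# Then, inside GRASS (not runnable here), import each chunk and merge:
#   for f in chunk_*; do
#       v.in.ascii in="$f" out="pts_$f" fs="\t" \
#           col="northing double, easting double, depth double" x=1 y=2 cat=0
#   done
#   v.patch input=pts_chunk_00,pts_chunk_01,pts_chunk_02,pts_chunk_03,pts_chunk_04,pts_chunk_05 \
#           output=lowfall_points
```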

Hamish

David Finlayson wrote:

> I am trying to load a 15 MB x,y,z file of point data (about 550,000
> points).

Ok, sorry, I misread that; I thought you wrote 15M points. 550,000
points should import fine with the latest version. I've imported 1.5M
LIDAR points; it took about 400 MB of RAM and a minute or so to run.

Hamish