[GRASS-user] v.surf.rst limitations?

Hello, I have this LIDAR dataset with about 5.5 million points that I
imported with v.in.ascii -ztb, and I want to interpolate it into a DEM
with 5 m resolution, which gives me about 13.5 million cells. When I
try to run v.surf.rst (GRASS 6.3 CVS), it gets killed after 30-40
minutes. Is there a limitation on v.surf.rst?
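
The workflow is roughly as follows (the file name and column numbers are
placeholders, and the exact v.surf.rst options may need adjusting for
your version):

    # import the x,y,z points without topology or attribute table
    v.in.ascii -ztb input=lidar.txt output=lidar_pts fs=" " x=1 y=2 z=3
    # set the working resolution to 5 m
    g.region res=5 -ap
    # interpolate to a raster DEM (check the manual of your version for
    # how to make it use the z coordinate of the 3D points)
    v.surf.rst input=lidar_pts elev=lidar_dem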

thanks

--
+-----------------------------------------------------------+
              Carlos Henrique Grohmann - Guano
  Geologist M.Sc - Doctorate Student at IGc-USP - Brazil
Linux User #89721 - carlos dot grohmann at gmail dot com
+-----------------------------------------------------------+
_________________
"Good morning, doctors. I have taken the liberty of removing Windows
95 from my hard drive."
--The winning entry in a "What were HAL's first words" contest judged
by 2001: A SPACE ODYSSEY creator Arthur C. Clarke

Hello,

I am using GRASS 6.1 and tried to draw some text surrounded by a
colored border. Is this possible? The only solution I found was to
place the text in a colored box (and there it is possible: the box
can have a border...) (d.vect bgcolor=... bcolor=...).

Is it possible to set the transparency of (shape) objects?

Thanks for any help,
Hans Braxmeier

--

Departments SAI
and Stochastics

Universität Ulm
Helmholtzstr. 18
Room E22
0731/50-23575

Carlos,

To the best of my knowledge, the only limitation on v.surf.rst is imposed by the limits of your hardware, i.e. you're probably running into memory allocation problems. You can try increasing your swap space, but be aware that v.surf.rst will take a painfully long time to process that many points. Alternatively, I'd recommend you give r.in.xyz and v.surf.bspline a try.
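
For example, something along these lines (untested; the file name,
separator and column numbers are placeholders, and the v.surf.bspline
option names may differ in the current CVS version):

    # bin the raw x,y,z text file straight into a 5 m raster
    g.region res=5 -ap
    r.in.xyz input=lidar.txt output=lidar_binned fs=" " method=mean type=FCELL
    # or run the spline interpolation on the imported vector points
    v.surf.bspline input=lidar_pts raster=lidar_dem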

Cheers,

Mike

On 4-Dec-06, at 3:49 AM, Carlos "Guâno" Grohmann wrote:

> Hello, I have this LIDAR dataset with about 5.5 million points that I
> imported with v.in.ascii -ztb, and I want to interpolate it into a DEM
> with 5 m resolution, which gives me about 13.5 million cells. When I
> try to run v.surf.rst (GRASS 6.3 CVS), it gets killed after 30-40
> minutes. Is there a limitation on v.surf.rst?
>
> thanks

--
+-----------------------------------------------------------+
             Carlos Henrique Grohmann - Guano
Geologist M.Sc - Doctorate Student at IGc-USP - Brazil
Linux User #89721 - carlos dot grohmann at gmail dot com
+-----------------------------------------------------------+
_________________
"Good morning, doctors. I have taken the liberty of removing Windows
95 from my hard drive."
--The winning entry in a "What were HAL's first words" contest judged
by 2001: A SPACE ODYSSEY creator Arthur C. Clarke


Hi Carlos and Mike,
At the moment, there is actually a limitation with v.surf.bspline. There is a bug and I think it is not possible to work with 13.5 million cells. I'm working on it, but for the moment it will only reach about 4 million cells. Yes, I know that is not enough, sorry. I hope this will also be fixed in the near future. Maybe if you try a lower resolution (about 20 m), v.surf.bspline should work.

However, I think v.surf.bspline will be faster than v.surf.rst. ;)

Regards,
Roberto.

Michael Perdue wrote:

> Carlos,
>
> To the best of my knowledge, the only limitation on v.surf.rst is
> imposed by the limits of your hardware, i.e. you're probably running
> into memory allocation problems. You can try increasing your swap
> space, but be aware that v.surf.rst will take a painfully long time to
> process that many points. Alternatively, I'd recommend you give
> r.in.xyz and v.surf.bspline a try.
>
> Cheers,
>
> Mike


--
Roberto Antolín Sánchez
Politecnico di Milano – Polo Regionale di Como
(Laboratorio di Geomatica V2.8)
Via Valleggio, 11 – 22100 Como, Italy
tel: +39 031 332 7533 || fax: +39 031 332 7519 email: roberto.antolin@polimi.it

I don't think this is a HW limitation. I am working on a Core Duo
1.87 GHz, 1.5 GB RAM, 60+ GB free disk space, 1 GB swap.

Carlos

On 12/4/06, Michael Perdue <michael_perdue@yahoo.ca> wrote:

> Carlos,
>
> To the best of my knowledge, the only limitation on v.surf.rst is
> imposed by the limits of your hardware, i.e. you're probably running
> into memory allocation problems. You can try increasing your swap
> space, but be aware that v.surf.rst will take a painfully long time
> to process that many points. Alternatively, I'd recommend you give
> r.in.xyz and v.surf.bspline a try.
>
> Cheers,
>
> Mike

>


Hey Roberto,
Can you tell me more about this bug? I ran a tile through v.surf.bspline that was significantly larger than this and I never ran into any problems. Nor can I find any defects in the output.
GRASS 6.2.0 (Mudflats):~ > r.info map=BSP_FF
+----------------------------------------------------------------------------+
 | Layer:    BSP_FF                         Date: Sun Dec  3 18:53:28 2006
 | Mapset:   temp                           Login of Creator: maperdue
 | Location: Mudflats
 | DataBase: /mnt/Storage1/grass
 | Title:    ( BSP_FF )
 |
 | timestamp: none
 | Type of Map:  raster                     Number of Categories: 255
 | Data Type:    DCELL
 | Rows:         6878
 | Columns:      4798
 | Total Cells:  33000644
 |   Projection: UTM (zone 10)
 |       N:  5275187    S:  5268309   Res: 1
 |       E:   584612    W:   579814   Res: 1
 | Range of data:    min = 3.222074  max = 250.353352
 | Data Source:
 | Data Description:
 |   generated by v.surf.bspline
+----------------------------------------------------------------------------+

This grid was 33 million cells with ~30 million input data points.

However, I do receive a segfault when I try to define a set of sparse data nodes to be interpolated (input_ext=).

Cheers,

Mike

PS: Yes, v.surf.bspline is much faster than v.surf.rst. But more importantly, it seems to step through the data to prevent memory allocation errors. Is this true? Nice!!! :)

Roberto Antolin <roberto.antolin@polimi.it> wrote:

> Hi Carlos and Mike,
> At the moment, there is actually a limitation with v.surf.bspline.
> There is a bug and I think it is not possible to work with 13.5 million
> cells. I'm working on it, but for the moment it will only reach about
> 4 million cells. Yes, I know that is not enough, sorry. I hope this
> will also be fixed in the near future. Maybe if you try a lower
> resolution (about 20 m), v.surf.bspline should work.
>
> However, I think v.surf.bspline will be faster than v.surf.rst. ;)
>
> Regards,
> Roberto.




Carlos "Guâno" Grohmann wrote:

> Hello, I have this LIDAR dataset with about 5.5 million points that I
> imported with v.in.ascii -ztb, and I want to interpolate it into a DEM
> with 5 m resolution, which gives me about 13.5 million cells. When I
> try to run v.surf.rst (GRASS 6.3 CVS), it gets killed after 30-40
> minutes. Is there a limitation on v.surf.rst?

Carlos,

You might try r.surf.nnbathy from the GRASS AddOns WIKI. If nnbathy (a
program the script relies on) is built with serial processing, as
outlined in the r.surf.nnbathy manual, it is able to output virtually
any big grid your system can handle. The "only" memory limitation is
circa 0.2 KB of overhead per *input* point. Thus, your 5.5 million
points should require about 1.1 GB for most of the nnbathy run, plus an
additional ~600 MB at the very beginning. Given your 1.87 GHz CPU, the
whole interpolation should probably take about 5-6 hours.
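
The arithmetic, roughly, as a quick shell check:

    # 5.5 million points * 0.2 KB of overhead per point, in GB
    echo "5500000 * 0.2 / 1024 / 1024" | bc -l    # ~1.05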

Hope this helps. If you go for nnbathy - I'm eager to know how it
worked for you.

Best,
Maciek

Hi Michael and all

Michael Perdue <michael_perdue@yahoo.ca> wrote:

> Hey Roberto,
> Can you tell me more about this bug? I ran a tile through v.surf.bspline that was significantly larger than this and I never ran into any problems. Nor can I find any defects in the output.

...

> This grid was 33 million cells with ~30 million input data points.

Good to know! :) I had some problems with memory allocation weeks ago and I was probably wrong about the number of cells v.surf.bspline can manage. I'll check again. Thank you for the information.

> However, I do receive a segfault when I try to define a set of sparse data nodes to be interpolated (input_ext=).

Yes, this is one of the "many" bugs. But at least this one is also fixed. A new patch will be available tomorrow (I think). This new patch will also include better stdout info. Well, it will actually include stdout info :P

Just out of curiosity, how much time did it take to run? Maybe the new version makes it run faster ;) Let me know, please.

> Cheers,
>
> Mike
>
> PS: Yes, v.surf.bspline is much faster than v.surf.rst. But more importantly, it seems to step through the data to prevent memory allocation errors. Is this true? Nice!!! :)

Yes, that's totally true. If the current region is too large, it divides the whole region into smaller regions and works on them one at a time. The module also considers overlapping zones between sub-regions to avoid discontinuities in the DTM heights.

Regards,
Roberto.

PS: I think the mailing list didn't receive my last mail, did it? If so, I'm sorry; I have tons of problems with my mail server. You can find it below.

> At the moment, there is actually a limitation with v.surf.bspline.
> There is a bug and I think it is not possible to work with 13.5 million
> cells. I'm working on it, but for the moment it will only reach about
> 4 million cells. Yes, I know that is not enough, sorry. I hope this
> will also be fixed in the near future. Maybe if you try a lower
> resolution (about 20 m), v.surf.bspline should work.
>
> However, I think v.surf.bspline will be faster than v.surf.rst. ;)
>
> Regards,
> Roberto.

Hans Braxmeier wrote:

> I am using GRASS 6.1 and tried to draw some text surrounded by a
> colored border. Is this possible?

Yes, but you should get GRASS 6.2 and use v.label + d.labels to have
it work well.
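
e.g. something like this (untested; the map, column and label-file
names are made up, option names as in the 6.2 manual):

    v.label map=mysites column=name labels=mysites_lbl \
        color=white background=grey border=red
    d.labels labels=mysites_lbl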

> Is it possible to set the transparency of (shape) objects?

Yes, but again it's something new for GRASS 6.2's GIS manager (gis.m).

Hamish

Thanks for your answer Hamish

Now I am trying to install grass-6.2.0-1.fdr.fc4.i386.rpm on
SuSE 10.0. Two failed dependencies are left:

        freetype >= 2.0.0 is needed by grass-6.2.0-1.fdr.fc4
        xorg-x11-Mesa-libGL >= 6.8 is needed by grass-6.2.0-1.fdr.fc4

Freetype version 2.1.10-4 is already installed (as libfreetype.so.6),
as is xorg-x11-mesa version 6.8.2-100 (including e.g. libMesaGL.so.3).

(I also tried to install xorg-x11-Mesa-libGL-6.8.2-31.i386.rpm ->
XFree86-libs < 4.2.0-50.5 conflicts with xorg-x11-Mesa-libGL-6.8.2-31)

Is there a way to set softlinks to get rid of this problem? Or any
other solution? (After compiling the sources and installing, the
system crashed...)

Thanks, Hans


Roberto, can v.surf.bspline work with point data imported with
v.in.ascii -vbt? I mean, without topology or a table? Or do I need a
table?

thanks

Carlos


Hi Maciek,

I compiled nnbathy with serial processing and imported the LIDAR
dataset with r.in.xyz (which is very fast). Then I tried to interpolate
with r.surf.nnbathy, but after about 30 minutes it crashes with a
message like "Error in column 1 row 1".

Carlos


Hi Carlos and all

"Carlos \"Guâno\" Grohmann" <carlos.grohmann@gmail.com> ha escrito:

> Roberto, can v.surf.bspline work with point data imported with
> v.in.ascii -vbt? I mean, without topology or a table? Or do I need a
> table?
>
> thanks
>
> Carlos

Yes, of course. In fact, this module was developed to work with LiDAR points, although it will be able to interpolate other kinds of points in the next version (*). GRASS can't create the topology for the amount of data that LiDAR supplies. Indeed, it is also unnecessary to create it if you work with a point vector map.

So, v.in.ascii -zbt should work (sorry, I don't know what the -v flag of v.in.ascii means). Anyway, you should use the -z flag in order to create a 3D vector map, i.e. in order to record the z coordinates. Otherwise, v.surf.bspline will interpolate zeros.
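
For example, something like this should do (the file name, separator
and column numbers are only placeholders):

    v.in.ascii -zbt input=lidar.txt output=lidar_pts fs=" " x=1 y=2 z=3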

Cheers,
Roberto.

PS: (*) I sent the new version to Markus this morning (in Europe). So, CVS update will be ready very soon.

Hi Roberto,

> > This grid was 33 million cells with ~30 million input data points.

> Just out of curiosity, how much time did it take to run?

It took just under 4 hrs to grid 30 million data points.

To give an idea of how that compares, with a smaller data set consisting of ~10 million nodes and 12 million cells:

v.surf.rst took ~4100 min (just under 3 days)
v.surf.bspline took 55 min

That's almost 2 orders of magnitude faster.

(Dell Precision 530,
2 x 1.7 GHz Xeons,
2 GB RAM,
8 GB swap,
3 x 15k SCSI drives in a RAID0 array.)

> Maybe the new version makes it run faster ;) Let me know, please.

I'm going to be gone for a little over a week but I'll run some more tests and benchmarks when I get back.

Cheers,

Mike

Carlos "Guâno" Grohmann wrote:

> I compiled nnbathy with serial processing and imported the LIDAR
> dataset with r.in.xyz (which is very fast). Then I tried to interpolate
> with r.surf.nnbathy, but after about 30 minutes it crashes with a
> message like "Error in column 1 row 1".

Hi Carlos,

Thanks for the feedback!

Probably nnbathy failed and r.in.ascii couldn't import its output. I
should add error trapping to r.surf.nnbathy for such cases (I've got
several fixes and improvements waiting; I'll probably do that over
Christmas).

What platform, compiler and nnbathy version are you using?

Can I have a sample of your data crashing the nnbathy, along with your
region settings and the command syntax?

Can you add the following command at line 140 of the script:

cp $TMP.${PROG}.output_xyz $HOME/nnbathy_out_xyz

and send me the $HOME/nnbathy_out_xyz file?

Thanks,
Maciek

roberto.antolin@polimi.it wrote:

> So, v.in.ascii -zbt should work (sorry, I don't know what the -v flag
> of v.in.ascii means). Anyway, you should use the -z flag in order to
> create a 3D vector map, i.e. in order to record the z coordinates.
> Otherwise, v.surf.bspline will interpolate zeros.

fyi, I just changed v.in.ascii in 6.3-cvs to automatically trigger the
-z flag if zcol= was used. I can't think of a case where you wouldn't
want that. If there is one, let me know and I'll revert the change.

Hamish

> PS: Yes, v.surf.bspline is much faster than v.surf.rst. But more
> importantly, it seems to step through the data to prevent memory
> allocation errors. Is this true? Nice!!! :)

do you mean steps through input points?

> Yes, that's totally true. If the current region is too large, it
> divides the whole region into smaller regions and works on them
> one at a time.

does this use GRASS's lib/segment/ library?

> The module also considers overlapping zones between sub-regions to
> avoid discontinuities in the DTM heights.

this is something that has frustrated me about the RST modules when
dealing with data which rapidly changes density. it's hard to get the
tuning right, auto-balance would be nice.

> you didn't update the CVS, did you? Don't do it.

no, I didn't.

> I uploaded it on my version and I'll send it all together today :)

it seems to be there now.

> Now one question. Due to memory allocation problems with rasters on my
> PC some weeks ago, I introduced a limit on the raster dimensions,
> only in my version of v.surf.bspline.

..

> So, I decided to set a limit on the number of raster cells: about
> 30 million. I think it is high enough.
>
> But which is the best way to control these kinds of things? Should I
> set a limit on the number of cells? If so, which one? Maybe consider
> float cells instead of double? If possible, I would like to allocate
> the whole raster (as it works now) instead of working with smaller
> rasters and merging them at the end.

Please remove the limitation. If the user has 28 GB RAM and compiled with
large file support, they should be able to use it. If you use G_alloc*()
and friends, they should exit with a nice "out of memory" message if you
don't have enough; let them do their job. If you don't alloc the memory
until the end of your program, you might alloc() then free() it right at
the beginning as a test, to save the annoyance of waiting for 30 minutes
before it crashes.

Is it really necessary to allocate memory for the entire raster map at
one time? Can you allocate and free|clear+reuse for each segment?

From your statement that it runs in segments this sounds plausible. (?)

If the method demands that you must store the entire raster in memory,
you might be able to use a percent= option like r.in.xyz, then run it in
multiple passes.

You can allocate float instead of double to save memory, but if you do
that you should give users a choice. e.g. r.in.xyz defaults to
type=FCELL.
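
For example (made-up file name; percent= and type= are existing
r.in.xyz options):

    # read the x,y,z file in five passes, keeping ~20% of the raster in
    # memory at a time, and store the result as FCELL instead of DCELL
    r.in.xyz input=lidar.txt output=lidar_binned method=mean percent=20 type=FCELL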

Hamish

Hi Michael and all

Michael Perdue <michael_perdue@yahoo.ca> wrote:

> Hi Roberto,
>
> To give an idea of how that compares, with a smaller data set
> consisting of ~10 million nodes and 12 million cells:
>
> v.surf.rst took ~4100 min (just under 3 days)

Did you try with the latest v.surf.rst CVS version? Maybe I'm wrong, but I think a new quad-tree index was implemented for this module. Helena Mitasova said so at Lausanne. I also took a look at the source and saw this new quad-tree index. So it should be faster than before; 3 days seems like too much time.

Roberto.

Well, today it decided to work. I used another dataset with ~16.5
million points, at a resolution of 2.5 m, ~30 million cells. It took
~8 hours to interpolate.

really nice!

Carlos


Carlos "Guâno" Grohmann wrote:

> Well, today it decided to work. I used another dataset with ~16.5
> million points, at a resolution of 2.5 m, ~30 million cells. It took
> ~8 hours to interpolate.
>
> really nice!

That's great. As to timing, it performed better than I expected. Cool.

But first of all, I would really appreciate it if you could provide the
details I asked about in my previous email, and, if possible, the data
it failed with. I want to make my script really good (or help Pavel
Sakov fix nnbathy), so that it doesn't fail with some data while
working with other data.

A thought: maybe your disk just filled up during the first run, when it
failed? Note that nnbathy creates 3 *huge* temp files: plain ASCII x,y,z
lists of the input and the output, and the output converted into GRASS
ASCII format. Then it makes a GRASS raster from the latter, and only
when the script terminates does it remove the 3 temporary files. So at
runtime you need several times more space than the resulting GRASS
raster actually occupies. It is easy to run out of disk space when
interpolating into a grid of several million cells, using millions of
points as input...
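
An easy way to keep an eye on that during a run, for example:

    # watch free space on the filesystem holding the GRASS database and
    # the temporary files while the script runs
    df -h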

Maciek