Hi all,
to provide a high-resolution vector coastline dataset I have
converted the
GSHHS
Global Self-consistent Hierarchical High-resolution Shorelines
from Paul Wessel and Walter H. F. Smith
(ftp://gmt.soest.hawaii.edu/pub/wessel/gshhs/).
A problem arises with v.support, which doesn't run on large files (not at
all, or not in reasonable time; the process is still running here). If you
are interested in testing GRASS with a *large* vector file, I have prepared
the GMT dataset for you:
To describe its resolution (from the GMT page): the file gshhs_h.c (high
resolution) is at 0.2 km; gshhs_f.c (full resolution) is at 0.?km (no value
given). So I have converted the *full* resolution dataset gshhs_f to GRASS
ASCII vector format, available for download at:
ftp://www.geog.uni-hannover.de/outgoing/gshhs_f_grassascii.tar.gz (58MB)
-rw-r--r-- 1 neteler users 226115131 Dez 8 17:42 dig_ascii.gshhs_f
-rw-r--r-- 1 neteler users 10561544 Dez 8 17:42 dig_att.gshhs_f
-rw-r--r-- 1 neteler users 2166637 Dez 8 17:42 dig_cats.gshhs_f
You can use this lat/long world coastline dataset within the "global
dataset" location, which you can get from here (or you can define an empty
world lat/long location):
http://www.geog.uni-hannover.de/grass/data.html
Add the missing directories:
$LOCATION/dig_ascii
$LOCATION/dig_att
$LOCATION/dig_cats
and move the files into their directories (renaming them to gshhs_f or
similar, of course). Then:
v.in.ascii in=gshhs_f out=gshhs_f
v.support gshhs_f op=build
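For reference, the whole setup might be scripted roughly like this (the
$LOCATION path is a placeholder for your own location/mapset; the GRASS
commands are commented out since they must run inside a GRASS session):

```shell
#!/bin/sh
# Placeholder -- point this at your actual location/mapset directory:
LOCATION=${LOCATION:-$HOME/grassdata/global/PERMANENT}

# Create the directories the ASCII vector files live in:
mkdir -p "$LOCATION/dig_ascii" "$LOCATION/dig_att" "$LOCATION/dig_cats"

# Unpack the archive and move the files under their GRASS map name
# (assumes the tarball was downloaded into the current directory):
# tar xzf gshhs_f_grassascii.tar.gz
# mv dig_ascii.gshhs_f "$LOCATION/dig_ascii/gshhs_f"
# mv dig_att.gshhs_f   "$LOCATION/dig_att/gshhs_f"
# mv dig_cats.gshhs_f  "$LOCATION/dig_cats/gshhs_f"

# Then, inside a GRASS session:
# v.in.ascii in=gshhs_f out=gshhs_f
# v.support gshhs_f op=build
```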
However, this takes very much time... Does anyone see a chance to speed up
the v.support topology build function? Maybe it would be useful to add
progress output (a percentage) to v.support.
Regards
Markus
----------------------------------------
If you want to unsubscribe from GRASS Development Team mailing list write to:
minordomo@geog.uni-hannover.de with
subject 'unsubscribe grass5'
Markus Neteler wrote:
> [...]
> However, this takes much time... Does anyone see a chance to speed
> up the v.support topology build function? Maybe it would be useful to
> add progress output (percentage) to v.support.
I don't know the details, but execution time grows `more than linearly'
with file size for large files. So large files, like vector grids, build
very slowly. I think the answer is segmentation. Perhaps the technique of
`walls', where a triangulation topology is built from a set of points,
could be modified to build areas. Then only small regions are built at a
time, working out from the segment boundaries. To be efficient, this needs
to load the respective segments into a network in memory. A rework of the
build function is desperately needed; this could be in place for 5.1. Has
anyone else had thoughts about this?
David
Hi Andreas,
On Fri, Dec 08, 2000 at 08:25:59PM +0100, Andreas Lange wrote:
Hello Markus,
could you perhaps post the script/program you use to convert the data?
I would like to save myself the 58 MB download; I have the GSHHS here
on CD-ROM.
Sure: I have put the sources of Simon Cox's conversion program here:
ftp://www.geog.uni-hannover.de/outgoing/
gshhstograss.tar.gz
It's Simon's program with a few additions (missing includes added) to
make it compile.
It reads the gshhs binary data files and outputs ASCII vectors.
Perhaps we should turn gshhstograss into a proper GRASS module
(v.in.gshhs, say, since it produces vector data)?
For details please refer to
ftp://gmt.soest.hawaii.edu/pub/wessel/gshhs/README
Cheers
Markus
--
Andreas Lange, 65187 Wiesbaden, Germany, Tel. +49 611 807850
Andreas.Lange@Rhein-Main.de - A.C.Lange@GMX.net
--
Dipl.-Geogr. Markus Neteler * University of Hannover
Institute of Physical Geography and Landscape Ecology
Schneiderberg 50 * D-30167 Hannover * Germany
Tel: ++49-(0)511-762-4494 Fax: -3984
Markus,
Great to hear about a program to import the GMT shoreline files. I have
been using these shorelines for a while now. To get them into GRASS I
use the GMT pscoast program which will dump to a multi-segment file. I
then have to run it through some awk routines to get it into a dig_ascii
format. A program to do this and maintain attributes, etc., would be a
nice addition. I have downloaded the program and will give it a try.
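The awk step Bob describes might look roughly like this (a sketch only:
it turns GMT multi-segment output, where a leading ">" starts a new
segment, into "L <npoints>" blocks with one " lat lon" pair per line; the
real dig_ascii format also needs its header block, and the pscoast call
and filenames are just placeholders):

```shell
# Demo input standing in for "pscoast ... -M" multi-segment output:
cat > coast.txt <<'EOF'
> segment 1
-10.0 50.0
-9.5 50.2
> segment 2
5.0 60.0
5.1 60.1
5.2 60.2
EOF

# Buffer each segment's "lon lat" lines, then emit them as an
# "L  <npoints>" block with coordinates swapped to " lat lon":
awk '
function flush(i) {
    if (n > 0) {
        printf "L  %d\n", n
        for (i = 1; i <= n; i++) print pts[i]
    }
    n = 0
}
/^>/ { flush(); next }                             # segment separator
NF >= 2 { pts[++n] = sprintf(" %s %s", $2, $1) }   # store as "lat lon"
END { flush() }
' coast.txt > dig_body.txt

cat dig_body.txt
```

On the demo input this produces one "L  2" block and one "L  3" block,
ready to be pasted below a hand-written dig_ascii header.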
I am still looking at NVIZ and the color issue. I am trying the
suggestions from Bill Brown with respect to the GL_NORMALIZE routines,
with interesting results. I will keep you posted.
--
Bob Covill
Tekmap Consulting
P.O. Box 2016
Fall River, N.S.
B2T 1K6
Canada
E-Mail: bcovill@tekmap.ns.ca
Phone: 902-860-1496
Fax: 902-860-1498