On Sat, 2008-11-29 at 19:01 -0800, Hamish wrote:
> Nikos wrote:
> > While QGIS (unstable, revision 9711) can open and report GRASS' region
> > setting for a lat-long location and load/view the above mentioned
> > "coastline" dataset (both the shapefile and the GRASS vector, the one
> > after v.in.ogr -c and NOT after v.clean!!), GRASS reports the error:
> >
> > g.region -p
> > ERROR: default region is invalid
> > line 4: <south: 90:22:00.17015S>
> latitude > 90deg cannot exist.
What might be the reason for this "illegal" latitude value?
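The offending value can be checked by converting GRASS' D:M:S notation to decimal degrees; a minimal Python 3 sketch (the parser is ad hoc, only meant for strings like the ones above):

```python
def dms_to_deg(s):
    # e.g. "90:22:00.17015S"; S/W hemispheres are negative
    hemi = s[-1]
    deg, minutes, seconds = (float(p) for p in s[:-1].split(":"))
    value = deg + minutes / 60 + seconds / 3600
    return -value if hemi in "SW" else value

south = dms_to_deg("90:22:00.17015S")
print(south)  # about -90.3667, i.e. beyond the -90 limit
```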
# the shapefile seems to be "legal"
# ogrinfo coastlines.shp -al -so
INFO: Open of
`/geo/geodata/world/coastlines/coastlines_HR/coastlines.shp'
using driver `ESRI Shapefile' successful.
Layer name: coastlines
Geometry: Polygon
Feature Count: 181148
Extent: (-180.000000, -90.000000) - (180.000000, 83.633286)
Layer SRS WKT:
GEOGCS["GCS_WGS_1984",
DATUM["WGS_1984",
SPHEROID["WGS_1984",6378137.0,298.257223563]],
PRIMEM["Greenwich",0.0],
UNIT["Degree",0.0174532925199433]]
[...]
> fix it in line 4 of $LOCATION/PERMANENT/DEFAULT_WIND or from within the
> PERMANENT mapset run "g.region -s" to set the current (valid) region
> to be the default one.
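For reference, DEFAULT_WIND is a plain-text file; an illustrative excerpt (layout written from memory and possibly version-dependent; only the south value is taken from the error message above):

```
proj:       3
zone:       0
north:      ...
south:      90:22:00.17015S      <- line 4: |latitude| > 90, invalid
...
```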
Instead, I created a new location (g.proj -c georef=TheShapefile
location=...) and it works.
# region report
# no rasters present, so resolution is of no interest (right?)
g.region -p
projection: 3 (Latitude-Longitude)
zone: 0
datum: wgs84
ellipsoid: wgs84
north: 83:37:59.82985N
south: 90S
west: 180W
east: 180E
nsres: 8:40:53.991492
ewres: 18
rows: 20
cols: 20
cells: 400
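The reported resolutions are just the default 20x20 grid arithmetic (extent divided by rows/cols); a quick Python 3 check:

```python
# region extent from the report above, in decimal degrees
north = 83 + 37 / 60 + 59.82985 / 3600   # 83:37:59.82985N -> 83.633286
south = -90.0
west, east = -180.0, 180.0

nsres = (north - south) / 20   # 20 rows
ewres = (east - west) / 20     # 20 cols

print(ewres)   # 18.0, matching "ewres: 18"
print(nsres)   # about 8.68166 deg, i.e. 8:40:53.991..., matching "nsres"
```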
> WRT large shapefiles: your buffer overflow/segfault probably has little
> to do with the size of the file. Others regularly load much bigger
> shapefiles into/out of GRASS. Large file errors typically start to show
> themselves around the 2GB mark. You need to run a GDB backtrace to see
> the cause.
I want to believe that the segfaults were related to Ubuntu 8.10 + gdal
1.5.3. I think I have found another recent reference to this in the
archive.
the "florida coastline" problem (processing huge single polyline
boundaries) does not result in the program breaking. it is just an
inefficient method which takes a very very long time. It will not result
in a buffer overflow or a segfault. That is something different.
* The shapefile I am trying to "work out" is a coastline dataset as
well. But it never really makes it through the plain v.in.ogr process or
the "v.in.ogr -c" + "v.split" + "v.clean tool=break" sequence (see
previous posts in this thread). It always "hangs" at the "breaking
boundaries" step.
----------------------------------------------------------------------
* Question: What's the difference between "polygon" and "boundary"?
* Why the "break polygons" step during the "building" process? I thought
that there are *only* nodes and the primitives: points, centroids,
lines, boundaries, areas, isles.
----------------------------------------------------------------------
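As I understand it (an outside illustration, not gospel): an OGR "polygon" is a standalone closed ring in the simple-features sense, while a GRASS "boundary" is a topological primitive that adjacent areas share; "break polygons" splits rings where they touch so that duplicated shared edges can become single boundaries. A toy Python 3 sketch of why that step exists:

```python
# two adjacent unit squares as simple-feature polygons (closed rings)
poly_a = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]
poly_b = [(1, 0), (2, 0), (2, 1), (1, 1), (1, 0)]

def edges(ring):
    # undirected edges of a closed ring
    return {tuple(sorted((ring[i], ring[i + 1]))) for i in range(len(ring) - 1)}

shared = edges(poly_a) & edges(poly_b)
print(shared)  # the edge (1,0)-(1,1) is stored twice in the rings;
               # topologically it becomes a single shared boundary
```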
* After stopping the process (Ctrl+C), v.info complains (naturally, I
guess) about:
ERROR: Unable to open vector map <cstlns_global@PERMANENT> on level 2.
       Try to rebuild vector topology by v.build.
* Then, v.build warns:
WARNING: Coor files of vector map <cstlns_global@PERMANENT> is larger
         than it should be (29979221 bytes excess)
and starts working.
* Question: How long should I leave the system working? As long as it
takes? My last attempt was >30 hours.
> How about "v.in.ogr spatial=" ?
> Hamish
It works, at least over Greece :-). But this does not resolve the
"problem", does it? If one needs the whole vector map (in my case a
global dataset), it shouldn't be necessary to import it step-wise.
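For what it's worth, if step-wise import were the workaround, the spatial= windows (xmin,ymin,xmax,ymax, as I read the v.in.ogr manual) could at least be generated instead of typed; a rough Python 3 sketch with an arbitrary 90-degree tile size:

```python
def tiles(xmin, ymin, xmax, ymax, step):
    # yield (xmin, ymin, xmax, ymax) windows covering the full extent
    x = xmin
    while x < xmax:
        y = ymin
        while y < ymax:
            yield (x, y, min(x + step, xmax), min(y + step, ymax))
            y += step
        x += step

for w in tiles(-180, -90, 180, 90, 90):
    print("v.in.ogr dsn=coastlines.shp spatial=%g,%g,%g,%g ..." % w)
```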
Regards, Nikos