#3319: v.overlay: incorrect results with "not" operator
-------------------------+---------------------------------
Reporter: sbl | Owner: grass-dev@…
Type: defect | Status: new
Priority: major | Milestone: 7.2.1
Component: Vector | Version: svn-releasebranch72
Keywords: v.overlay | CPU: Unspecified
Platform: Unspecified |
-------------------------+---------------------------------
In a project I want to clip away the sections of a river network that
lie within lakes.
Unfortunately, some streams vanished from the result completely (which
they should not)...
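For reference, the workflow in question is a v.overlay call along these lines (map names are placeholders, not the actual project data):

```shell
# Clip away river sections inside lakes: keep only the parts of the
# line map "rivers" that do NOT fall inside the area map "lakes".
# "rivers", "lakes" and "rivers_clipped" are placeholder map names.
v.overlay ainput=rivers atype=line binput=lakes \
          operator=not output=rivers_clipped
```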
Replying to [ticket:3319 sbl]:
> In a project I want to clip away the sections of a river network that
lie within lakes.
>
> Unfortunately, some streams vanished from the result completely (which
they should not)...
Please try v.overlay in trunk r70800.
The problem lies with Vect_break_lines() and Vect_break_lines_list(),
which create new vertices that are sometimes too close to existing
vertices (floating point representation error). Radim was aware of that
problem in v.overlay. I (hopefully) fixed the symptom in v.overlay. The
real fix would be not to use a fixed value for rethresh in
Vect_line_intersection() and Vect_line_intersection2() (Radim was also
aware of that problem), but instead to set rethresh to the smallest
representable difference (ULP).
Thanks Markus for looking into this! \\
I just tested and after a visual inspection of the results the issue looks
fixed. \\
Will you backport it, or do you want me to test more systematically
before backporting?
Replying to [comment:2 sbl]:
> Thanks Markus for looking into this! \\
> I just tested and after a visual inspection of the results the issue
looks fixed. \\
> Will you backport it, or do you want me to test more systematically
before backporting?
I have backported the fix to relbr72 in r70703. In theory it is still
possible that lines are missing or are kept erroneously, but fixing that
would require more substantial changes to Vect_break_lines() and
Vect_break_lines_list(). Reducing priority.
In the particular case where I stumbled upon the issue, the fix seems to
cover all cases.\\
However, if these errors can still occur, it is probably only a question
of the number of features processed before the problem shows up again?\\
Thus, I would prefer to keep the ticket open and re-target it. But of
course I would not object if you close it...