
Average distance between two line segments in 2D/3D

Intro: I'm working on an image processing task trying to find the two borders of an object, which can be described by two straight line segments. I'm using some variant of the Hough line transform to find line segments in the target image. There are multiple lines found by the Hough transform per border of the object (sharing a very small angle), and there might be some lines found elsewhere in the image which don't correspond to the borders of the object (false positives). Since the spatial relationship (angle) between the two borders of the object is approximately known, I figured I'd go with some kind of clustering approach to leave out the false positives and calculate an average line segment from the multiple line segments found per border.

Approach: In order to cluster the line segments, one needs to define a similarity measure for the location of each segment. I figured I'd go with a tuple of the angle between two line segments and some sort of average distance between them. This is also where I'm wondering what the best approach would be to compute this average distance measure. A somewhat simple approach would be to sample each segment at discrete locations, measure the closest (L2) distance of each sampled point to the other line segment, sum the distances up and divide the sum by the number of samples. I'm sure there is a more clever way to do this; any suggestions?
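
For reference, a minimal sketch of that sampling idea in plain C++ (the Point2/Segment structs and the default of 16 samples are my own assumptions, not anything taken from OpenCV):

#include <algorithm>
#include <cmath>

struct Point2 { double x, y; };
struct Segment { Point2 a, b; };

// Closest (L2) distance from point p to line segment s.
double pointToSegmentDistance(const Point2& p, const Segment& s) {
    const double dx = s.b.x - s.a.x;
    const double dy = s.b.y - s.a.y;
    const double len2 = dx * dx + dy * dy;
    double t = 0.0;
    if (len2 > 0.0) {
        // Project p onto the infinite line, then clamp to the segment.
        t = ((p.x - s.a.x) * dx + (p.y - s.a.y) * dy) / len2;
        t = std::max(0.0, std::min(1.0, t));
    }
    return std::hypot(p.x - (s.a.x + t * dx), p.y - (s.a.y + t * dy));
}

// Sample s1 at nSamples evenly spaced points and average their closest
// distances to s2.
double averageDistance(const Segment& s1, const Segment& s2, int nSamples = 16) {
    double sum = 0.0;
    for (int i = 0; i < nSamples; ++i) {
        const double t = (nSamples > 1) ? double(i) / (nSamples - 1) : 0.5;
        const Point2 p{ s1.a.x + t * (s1.b.x - s1.a.x),
                        s1.a.y + t * (s1.b.y - s1.a.y) };
        sum += pointToSegmentDistance(p, s2);
    }
    return sum / nSamples;
}

To make the measure symmetric, you could evaluate it in both directions and average the two results.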

Hint: I'm working in C++ with a couple of LGPL/BSD-licensed toolkits (OpenCV, Boost), so somewhat special mathematical operations like symbolic integration in Mathematica might be hard to implement.


Assuming that I understand the problem correctly, what about the following solution: when you have the lines, you could try to determine the starting point and end point of each line. Once you have calculated those, you simply measure the distance between both starting points and between both end points, and calculate the average of the two distances.

I assume that you have the lines as pixel values. Start and end points could be calculated by finding the maximum and minimum x and y values of each line's pixels.
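
A rough sketch of this endpoint-based measure, assuming the segments come straight from cv::HoughLinesP as cv::Vec4i values (x0, y0, x1, y1); the OpenCV 3/4 header path and the ordering-by-x convention are my own assumptions:

#include <cmath>
#include <utility>
#include <opencv2/core.hpp>  // OpenCV 3/4 header layout assumed

// Order the endpoints of a cv::HoughLinesP segment (x0, y0, x1, y1) so that
// corresponding ends of two segments can be paired up.
static void orderEndpoints(cv::Vec4i& s) {
    if (s[2] < s[0] || (s[2] == s[0] && s[3] < s[1])) {
        std::swap(s[0], s[2]);
        std::swap(s[1], s[3]);
    }
}

// Average of the start-to-start and end-to-end distances of two segments.
double endpointAverageDistance(cv::Vec4i s1, cv::Vec4i s2) {
    orderEndpoints(s1);
    orderEndpoints(s2);
    const double dStart = std::hypot(double(s1[0] - s2[0]), double(s1[1] - s2[1]));
    const double dEnd   = std::hypot(double(s1[2] - s2[2]), double(s1[3] - s2[3]));
    return 0.5 * (dStart + dEnd);
}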


Instead of average distance, how about minimum distance? There's an extended discussion of how to compute this here.
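
One common way to compute the minimum distance between two 2D segments (a sketch using the same Point2/Segment structs as above, not taken from the linked discussion): if the segments properly cross, the distance is zero; otherwise it is the smallest of the four endpoint-to-segment distances.

#include <algorithm>
#include <cmath>

struct Point2 { double x, y; };
struct Segment { Point2 a, b; };

// Closest (L2) distance from point p to segment s (same helper as in the
// sampling sketch further up).
double pointToSegmentDistance(const Point2& p, const Segment& s) {
    const double dx = s.b.x - s.a.x;
    const double dy = s.b.y - s.a.y;
    const double len2 = dx * dx + dy * dy;
    double t = 0.0;
    if (len2 > 0.0)
        t = std::max(0.0, std::min(1.0, ((p.x - s.a.x) * dx + (p.y - s.a.y) * dy) / len2));
    return std::hypot(p.x - (s.a.x + t * dx), p.y - (s.a.y + t * dy));
}

// Signed area test: positive if b lies to the left of the directed line o->a.
static double cross(const Point2& o, const Point2& a, const Point2& b) {
    return (a.x - o.x) * (b.y - o.y) - (a.y - o.y) * (b.x - o.x);
}

// True only for a proper crossing; touching and collinear-overlap cases are
// covered by the endpoint distances below, which are then zero anyway.
static bool segmentsCross(const Segment& s1, const Segment& s2) {
    const double d1 = cross(s2.a, s2.b, s1.a);
    const double d2 = cross(s2.a, s2.b, s1.b);
    const double d3 = cross(s1.a, s1.b, s2.a);
    const double d4 = cross(s1.a, s1.b, s2.b);
    return ((d1 > 0) != (d2 > 0)) && ((d3 > 0) != (d4 > 0));
}

double minimumDistance(const Segment& s1, const Segment& s2) {
    if (segmentsCross(s1, s2)) return 0.0;
    return std::min({ pointToSegmentDistance(s1.a, s2),
                      pointToSegmentDistance(s1.b, s2),
                      pointToSegmentDistance(s2.a, s1),
                      pointToSegmentDistance(s2.b, s1) });
}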


Given a set of n line segments, with the i-th line segment extending from point (x0_i, y0_i) to (x1_i, y1_i):

Look at the first line segment and see if it is more nearly vertical or horizontal. If abs(y0_0 - y1_0) > abs(x0_0 - x1_0), then set a flag and interchange the x and y coordinates. This will prevent the infinite-slope problem. (I guess you could still have a problem if two line segments were perpendicular, but if your line segments are that different, the average line doesn't have much meaning.)

Using all 2n endpoints, calculate a least-squares fit to a straight line

y = a*x + b

For each endpoint, calculate abs(a*x_ij + b - y_ij). This measures the distance, parallel to the y axis, of that endpoint to the average line. I guess that if this is bigger than some amount, you could reject that line segment and repeat the fit without it. If it is less than a couple of pixels, you could replace the y coordinate with the fitted one to move that end of the line segment onto the fitted line.

If the interchange flag is set, then swap x and y back.
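
A compact sketch of this procedure, assuming the 2n endpoints are collected into a flat list whose first two entries belong to the first segment; the LineFit struct and helper names are mine:

#include <cmath>
#include <utility>
#include <vector>

struct LineFit { double a, b; bool swapped; };  // y = a*x + b, possibly in swapped coordinates

// pts holds all 2n endpoints as (x, y); the first two entries are assumed to
// be the endpoints of the first segment, which decides the coordinate swap.
LineFit fitAverageLine(std::vector<std::pair<double, double>> pts) {
    const bool swapped = std::abs(pts[0].second - pts[1].second) >
                         std::abs(pts[0].first - pts[1].first);
    if (swapped)
        for (auto& p : pts) std::swap(p.first, p.second);

    // Ordinary least squares for y = a*x + b over all endpoints.
    const double n = static_cast<double>(pts.size());
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (const auto& p : pts) {
        sx  += p.first;
        sy  += p.second;
        sxx += p.first * p.first;
        sxy += p.first * p.second;
    }
    const double a = (n * sxy - sx * sy) / (n * sxx - sx * sx);  // assumes the x values are not all equal
    const double b = (sy - a * sx) / n;
    return { a, b, swapped };
}

// Distance of an endpoint to the fitted line, measured parallel to the
// (possibly swapped) y axis, as described above.
double residual(const LineFit& fit, double x, double y) {
    if (fit.swapped) std::swap(x, y);
    return std::abs(fit.a * x + fit.b - y);
}

Endpoints whose residual exceeds your chosen threshold could then be dropped and fitAverageLine called again on the remaining points, as suggested above.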
