Detecting the axis of rotation from a pointcloud
I'm trying to auto-detect the axis of rotation on a 3d pointcloud.
In other words, if I took a small 3D pointcloud, chose a single axis of rotation, and made several copies of the points at different rotation angles, the result would be a larger pointcloud.
The input to my algorithm is the larger pointcloud, and the desired output is the single axis of symmetry. And eventually I'm going to compute the correspondences between points that are rotations of each other.
The size of the larger pointcloud is on the order of 100K points, and the number of rotational copies made is unknown.
The rotation angles in my case have constant deltas, but don't necessarily span 360 degrees. For example, I might have 0, 20, 40, 60. Or I might have 0, 90, 180, 270. But I won't have 0, 13, 78, 212 (or if I do, I don't care to detect it).
This seems like a computer vision problem, but I'm having trouble figuring out how to precisely find the axis. The input will generally be very clean, close to float accuracy.
I don't have the original smaller pointcloud that was rotated/copied to make the larger pointcloud. I do know that the data is synthetic with very little noise (it's generally the output of another program).
We can't readily compute the possible numbers of points in the smaller cloud, because right along the axis the points aren't duplicated, unfortunately. If we knew which points were along the axis then we could come up with possible factors, but then we'd have already solved the problem.
--
Thanks everyone for your suggestions. It looks like my final algorithm is going to build cliques of matching points using a k-NN metric. Each clique gives a candidate axis, and I can then use RANSAC to fit a single axis to the results from all the cliques.
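A minimal sketch of that last aggregation step, assuming each clique has already been reduced to a candidate axis (a point on the axis plus a unit direction); instead of a full RANSAC it just scores each candidate by how many other candidates agree with it, which may be enough given how clean the data is:

```python
import numpy as np

def pick_consensus_axis(axis_points, axis_dirs, angle_tol_deg=1.0, dist_tol=1e-2):
    """Given one candidate axis per clique (a point on the axis and a unit
    direction), return the candidate that the most other candidates agree with."""
    axis_points = np.asarray(axis_points, dtype=float)
    axis_dirs = np.asarray(axis_dirs, dtype=float)
    axis_dirs = axis_dirs / np.linalg.norm(axis_dirs, axis=1, keepdims=True)
    cos_tol = np.cos(np.radians(angle_tol_deg))
    best_idx, best_score = 0, -1
    for i, (p, d) in enumerate(zip(axis_points, axis_dirs)):
        # A candidate agrees if its direction is parallel (up to sign) and its
        # anchor point lies close to the line through p with direction d.
        parallel = np.abs(axis_dirs @ d) >= cos_tol
        dist_to_line = np.linalg.norm(np.cross(axis_points - p, d), axis=1)
        score = np.count_nonzero(parallel & (dist_to_line < dist_tol))
        if score > best_score:
            best_idx, best_score = i, score
    return axis_points[best_idx], axis_dirs[best_idx], best_score
```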
Well, the following approach might be useful, but it depends on the specifics of your data. It is based on the assumptions that the gaps between neighbouring positions are big enough (20 degrees is probably fine) and that the small point cloud approximates a surface (the latter assumption could be relaxed). I suggest using local feature matching, a popular technique in computer vision.
First, for each point of the large cloud you should compute local descriptors (like SIFT or SURF for images). The most popular one for point clouds is the spin image:
Johnson, A., & Hebert, M. (1999). Using spin images for efficient object recognition in cluttered 3D scenes. IEEE Transactions on Pattern Analysis and Machine Intelligence, 21(5), 433–449. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.23.8816&rep=rep1&type=pdf
An advanced modification is used here:
Endres, F., Plagemann, C., Stachniss, C., & Burgard, W. (2009). Unsupervised Discovery of Object Classes from Range Data using Latent Dirichlet Allocation. In Robotics: Science and Systems. Seattle, USA.
If this turns out to be computationally hard, ask me how to reduce the dimensionality without losing much discriminative power; I've done that once.
Then you should match the descriptors, i.e., find the nearest neighbours for each of them in the high-dimensional descriptor space. If the small cloud has been rotated 3 times, there should be 3 good nearest neighbours for each descriptor. However, because of the cloud's self-intersections, the matches will probably contain noise. You still have to find the axis that fits a large number of the matches well (though not all of them). Here you can use robust fitting like RANSAC (you will need some math to define the likelihood of an axis position with respect to the found matches). Note that this differs from the methods the others suggested: you fit just one line instead of a family of planes, and you fit it to the descriptors, not the original points (RANSAC will probably fail to fit a plane given 4-5 correct points and 100K outliers).
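For the nearest-neighbour search over descriptors, a KD-tree works as long as the descriptor dimensionality is modest. A rough sketch, where `descriptors` is assumed to be an array with one row per point of the large cloud, produced by whatever spin-image implementation you use:

```python
import numpy as np
from scipy.spatial import cKDTree

def match_descriptors(descriptors, k=3, max_dist=None):
    """Return candidate correspondences (i, j): point j is one of the k nearest
    neighbours of point i in descriptor space (excluding i itself)."""
    descriptors = np.asarray(descriptors, dtype=float)
    tree = cKDTree(descriptors)
    # Query k + 1 neighbours because the nearest neighbour of each descriptor is itself.
    dists, idx = tree.query(descriptors, k=k + 1)
    matches = []
    for i in range(len(descriptors)):
        for d, j in zip(dists[i, 1:], idx[i, 1:]):
            if max_dist is None or d <= max_dist:
                matches.append((i, j))
    return matches
```

For the RANSAC scoring, note that if point j really is a rotational copy of point i, every point on the axis is equidistant from i and j, so each accepted match constrains the axis to lie in the perpendicular bisector plane of the segment between the two points.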
Also note:
If your small cloud does not approximate a surface, you should come up with a different rotation-invariant descriptor instead of spin images.
In order to compute normals and do the retrieval, you might check out this library: http://graphics.cs.msu.ru/en/science/research/3dpoint/lidark (a major release is coming soon).
The following assumes that there are 3 or more copies. Consider the convex hull of the large pointcloud. Find two of its faces that are parallel. The rotation axis will be perpendicular to these. If you find more than one pair, just test each orientation.
Obviously, this doesn't work if the most extreme points along the axis lie right on the axis, but in the general case that is very rare.
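A sketch of that test with scipy, relying on the fact that Qhull stores an outward unit normal per (triangulated) facet; two facets whose normals are nearly opposite give a candidate axis direction:

```python
import numpy as np
from scipy.spatial import ConvexHull

def candidate_axis_directions(points, tol=1e-5):
    """Return candidate rotation-axis directions: normals shared by pairs of
    (anti-)parallel convex-hull facets."""
    hull = ConvexHull(points)
    normals = hull.equations[:, :3]          # outward unit normal of each facet
    directions = []
    for i in range(len(normals)):
        for j in range(i + 1, len(normals)):
            # dot ~ -1 means the facets face opposite ways, i.e. they are parallel.
            if np.dot(normals[i], normals[j]) < -1.0 + tol:
                directions.append(normals[i])
    return directions
```

Since Qhull triangulates flat faces, one pair of parallel faces can show up as several near-identical directions, so the result usually needs deduplicating before testing each orientation.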
Pick any point, then find two other points that are equidistant from it. This should take O(n^2) worst-case, but heuristics can greatly reduce this. These three points uniquely determine a circle. If there is a fourth point at the same distance from either the first or the third point, that greatly increases your confidence in the circle.
The center of the circle is a point on the axis, and the normal vector to the circle is the directional vector of the axis.
Given the axis, you can determine the angle between rotations and check your guess with some other points. If it's wrong, pick another point and start over.
Edit: By the way, obviously this is non-deterministic, but if your point data is as clean as you say and you use good heuristics, it should be quite accurate.
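A sketch of that construction, assuming p1, p2, p3 really are rotational copies of one original point: the circumcenter of the triangle they span lies on the axis, and the plane normal gives the axis direction (the helper name is mine):

```python
import numpy as np

def axis_from_three_copies(p1, p2, p3):
    """Given three rotational copies of the same point, return a point on the
    rotation axis (the circumcenter of the triangle) and the unit axis direction
    (the normal of the plane containing the three points)."""
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))
    a, b = p2 - p1, p3 - p1
    n = np.cross(a, b)                     # normal of the circle's plane = axis direction
    n_sq = np.dot(n, n)
    if n_sq < 1e-12:
        raise ValueError("points are (nearly) collinear")
    # Standard circumcenter formula for a triangle in 3D.
    center = p1 + (np.dot(a, a) * np.cross(b, n) + np.dot(b, b) * np.cross(n, a)) / (2.0 * n_sq)
    return center, n / np.sqrt(n_sq)
```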
1) If you find the centroid C of the larger pointcloud, ISTM the original rotation axis would have to pass through that point.
Never mind: I did not see the requirement that the rotations might not span a complete circle. For your 20,40,60 example, the centroid would not be on the rotational axis.
Maybe the following ref would help?
"Reconstruction of surfaces of revolution with partial sampling" http://portal.acm.org/citation.cfm?id=980064
Take a look at the techniques used in stereo vision to compute the homography between two images -- your problem with sets of point clouds seems analogous to matching points in multiple images of the same object/scene. It seems like you could apply a RANSAC algorithm to compute a transformation between your sets of point clouds.
Some Points:)
- If we have 1 initial point that is never rotated, then there are an infinite number of possible axes.
- If we have 1 initial point rotated once, there are still an infinite number of possible axes.
- If we have 1 initial point and rotate it 2 times, we can find the axis: the 3 points (1 initial + 2 rotated) uniquely determine a plane, and the axis is the perpendicular to that plane through the point equidistant from all 3 points (the circumcenter).
- Please note that rotating through 360 degrees makes no sense (it maps every point onto itself).
- Choose any one point (P) from the cloud.
- Trace the locus (L) of the point P (as P rotates about some axis, L should be a circle).
- The perpendicular to the plane of the circle (L) that passes through its center is the axis of rotation of the cloud.
What I mean is that the axis of rotation of a rigid body is the same as the axis of rotation of any single particle of the body; we need not care about all the other points.
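If the locus L is traced with more than three copies, one way to recover the circle (a sketch under the assumption that the copies of P are already known) is to fit the circle's plane with an SVD and then fit a 2D circle to the projected points by linear least squares:

```python
import numpy as np

def fit_circle_3d(points):
    """Least-squares circle fit to >= 3 roughly coplanar 3D points.
    Returns (center, unit normal, radius); the normal is the axis direction."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # Plane of the circle: the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[2]
    u, v = vt[0], vt[1]                    # orthonormal basis of the plane
    # Project into the plane and fit x^2 + y^2 + D*x + E*y + F = 0 linearly (Kasa fit).
    xy = np.column_stack(((pts - centroid) @ u, (pts - centroid) @ v))
    A = np.column_stack((xy, np.ones(len(xy))))
    b = -(xy ** 2).sum(axis=1)
    (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = -D / 2.0, -E / 2.0
    radius = np.sqrt(cx * cx + cy * cy - F)
    center = centroid + cx * u + cy * v
    return center, normal, radius
```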
A crazy idea ...
If the same point is rotated about the same axis several times, all of its copies will lie in the same plane. Depending on your dataset, it may be possible to detect this plane using the RANSAC method.
The axis of rotation will be perpendicular to this plane, and its location should be relatively easy to determine once its orientation is known.
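A minimal RANSAC plane-fit sketch along those lines (parameter names and tolerances are mine; the detected plane's normal is a candidate axis direction):

```python
import numpy as np

def ransac_plane(points, n_iters=1000, inlier_tol=1e-3, rng=None):
    """RANSAC plane detection: returns (point_on_plane, unit_normal, inlier_mask)
    for the plane supported by the most points."""
    pts = np.asarray(points, dtype=float)
    rng = np.random.default_rng(rng)
    best = (None, None, np.zeros(len(pts), dtype=bool))
    for _ in range(n_iters):
        i, j, k = rng.choice(len(pts), size=3, replace=False)
        normal = np.cross(pts[j] - pts[i], pts[k] - pts[i])
        norm = np.linalg.norm(normal)
        if norm < 1e-12:                    # degenerate (collinear) sample
            continue
        normal /= norm
        inliers = np.abs((pts - pts[i]) @ normal) < inlier_tol
        if inliers.sum() > best[2].sum():
            best = (pts[i], normal, inliers)
    return best
```

Keep in mind the caveat from the spin-image answer above: a plane supported by only the 4-5 copies of a single point among 100K others has very little consensus, so this works best once candidate correspondences have already narrowed the set of points down.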
There are 2 things that you should consider:
- The angular span of the point cloud.
- The angle of rotation.
Now if (rotation > span), the solution is simpler, because you can look for a sub-pattern and, based on its occurrence, try to match a larger pattern.
In the case (rotation < span), you will have to overlook the edge effects, because within the interior region (after the first rotation, prior to the last rotation) you will still have a symmetric point cloud, with an angle of symmetry equal to the span angle, as in the above case.
If you are not sure which category you fall into, it is safe to assume the second.
As mentioned before, RANSAC is the best means of pattern matching, because it takes little time and gives decent results. The only issue you are left with is estimating the span angle of the mini point cloud during initialization, and this estimate is difficult to make. So if you have enough computation power/time, I would suggest iterating in 1-degree steps, starting from a modest 5 degrees and going up to, say, 45 degrees. As the results start converging, increase the angular precision.
Since the original point cloud is small, the simplest solution might be RANSAC:
- Pick three points at random
- Compute the axis of rotation for these points (the line perpendicular to the circle through the three points, passing through its center)
- Do the other points agree?
- If not, iterate until the correct axis is found
The probability of a correct estimate in a single iteration is 1/((n-1)(n-2)), where n is the number of points in the original cloud. Since each test can be done very quickly, this might be a useful approach.
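A sketch of that loop, reusing the axis_from_three_copies helper sketched after the three-point answer above and using a nearest-neighbour test as the "do the other points agree" check: rotate the whole cloud about the candidate axis by the angle the sampled triple implies and count how many points land back on the cloud (the tolerances are assumptions):

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.spatial.transform import Rotation

def ransac_axis(points, n_iters=500, match_tol=1e-3, rng=None):
    """RANSAC over random point triples: hypothesise an axis from the triple's
    circumcircle, rotate the cloud about it by the triple's angular step, and
    score by how many points land back on the cloud."""
    pts = np.asarray(points, dtype=float)
    tree = cKDTree(pts)
    rng = np.random.default_rng(rng)
    best_axis, best_score = None, -1
    for _ in range(n_iters):
        i, j, k = rng.choice(len(pts), size=3, replace=False)
        try:
            center, direction = axis_from_three_copies(pts[i], pts[j], pts[k])
        except ValueError:                  # collinear sample, no circle
            continue
        # Signed angle from point i to point j about the hypothesised axis.
        a = pts[i] - center - ((pts[i] - center) @ direction) * direction
        b = pts[j] - center - ((pts[j] - center) @ direction) * direction
        angle = np.arctan2(np.cross(a, b) @ direction, a @ b)
        rot = Rotation.from_rotvec(angle * direction)
        rotated = rot.apply(pts - center) + center
        # "Do the other points agree?": count points that map back onto the cloud.
        dists, _ = tree.query(rotated, k=1)
        score = np.count_nonzero(dists < match_tol)
        if score > best_score:
            best_axis, best_score = (center, direction), score
    return best_axis, best_score
```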