Example 'Build a Map from Lidar Data Using SLAM' Confusion

Hi,
I'm trying to follow the example 'Build a Map from Lidar Data Using SLAM' from the Computer Vision Toolbox.
In the function helperComputeInitialEstimateFromINS, there are three lines that I have really tried to understand, but without success.
My problem is not with the theory but with this particular implementation. Here are the lines:
% The INS readings are provided with X pointing to the front, Y to the left
% and Z up. Translation below accounts for transformation into the lidar
% frame.
insToLidarOffset = [0 -0.79 -1.73]; % See DATAFORMAT.txt
Tnow = [-insData.Y(end), insData.X(end), insData.Z(end)].' + insToLidarOffset';
Tbef = [-insData.Y(1) , insData.X(1) , insData.Z(1)].' + insToLidarOffset';
  1. The DATAFORMAT.txt file says that the transform from the IMU to the lidar is [0.79 0 1.73]. Why is it not implemented that way, then?
  2. Why are the X and Y coordinates swapped and negated in Tnow and Tbef? DATAFORMAT.txt clearly states that the X coordinate points front, the Y coordinate points left, and the Z coordinate points up.
What makes even less sense to me is that the transformation applied to insToLidarOffset (i.e., swapping X and Y, then negating both) and the transformation applied to Tnow and Tbef (i.e., swapping X and Y, then negating only Y) are not even the same.
I appreciate the time you take to answer this question,
Marc
  1 Comment
Marc Tison on 26 Nov 2020
I think there is in fact an error in this example. I know that the point cloud is not oriented in the same direction as the IMU in the inertial frame of reference, but the transformations here still make no sense.
Try running this example with the insToLidarOffset term removed from Tnow and Tbef:
% The INS readings are provided with X pointing to the front, Y to the left
% and Z up.
Tnow = [-insData.Y(end), insData.X(end), insData.Z(end)].';
Tbef = [-insData.Y(1) , insData.X(1) , insData.Z(1)].';
You will see that you get exactly the same relative transformation at the end.
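The cancellation Marc describes is easy to check numerically. Here is a small sketch in Python/NumPy (rather than MATLAB, purely for illustration; the INS values are made up, only the offset comes from the example): adding the same constant offset to both translations leaves their difference unchanged.

```python
import numpy as np

# Hypothetical INS translations at the first and last timestamps
# (values invented for illustration).
t_bef = np.array([-2.0, 10.0, 0.5])
t_now = np.array([-3.5, 42.0, 0.6])

offset = np.array([0.0, -0.79, -1.73])  # insToLidarOffset from the example

# Relative translation with and without the constant offset.
rel_with_offset    = (t_now + offset) - (t_bef + offset)
rel_without_offset = t_now - t_bef

print(np.allclose(rel_with_offset, rel_without_offset))  # True
```

Because the offset is added identically to both endpoints, it drops out of the relative translation, regardless of the actual INS values.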


Answers (1)

Vinoth Venkatesan on 2 Dec 2020
Edited: Vinoth Venkatesan on 2 Dec 2020
Hi Marc,
There are two things to note here:
  1. The lidar coordinate system (the one used in the pointCloud objects returned by helperReadPointCloudFromFile) has its X-axis pointing to the right, its Y-axis to the front (in the direction of motion of the ego vehicle), and its Z-axis pointing up. You can see this when visualizing the point cloud data in the "Load and Explore Recorded Data" section of the example. To account for this, the INS readings are remapped so that INS-X -> Lidar-Y and INS-Y -> -(Lidar-X).
  2. The extrinsics provided in DATAFORMATS.txt give the offset of the Velodyne frame w.r.t. the IMU (vehicle) frame. Since we want the offset of the IMU frame w.r.t. the Velodyne frame, the signs are flipped as well, and so the offset is set to [0 -0.79 -1.73].
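The axis remapping in point 1 can be written as a single rotation matrix. Below is a sketch in Python/NumPy (not MATLAB, purely for illustration; the matrix and test vectors are my own, not from the example): mapping INS coordinates [x, y, z] to lidar coordinates [-y, x, z] is a +90 degree rotation about Z.

```python
import numpy as np

# INS frame: X front, Y left, Z up.  Lidar frame: X right, Y front, Z up.
# Remapping lidar = [-ins_y, ins_x, ins_z] as a rotation matrix:
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])

forward_ins = np.array([1.0, 0.0, 0.0])  # a point ahead of the vehicle (INS +X)
left_ins    = np.array([0.0, 1.0, 0.0])  # a point to the left (INS +Y)

print(R @ forward_ins)  # [0. 1. 0.]  -> lidar +Y, the front direction
print(R @ left_ins)     # [-1. 0. 0.] -> lidar -X, the left direction
```

This matches the code in the example: INS-X lands on Lidar-Y, and INS-Y lands on the negative Lidar-X axis.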
  3 Comments
Vinoth Venkatesan on 3 Dec 2020
I have a couple of follow-up questions:
  • Could you provide more information about what you mean by "Velodyne frame transformation is being cancelled at the last line of the function"? Since the offset is small relative to the actual INS data points, the results you observe with and without insToLidarOffset may appear similar, but they are not the same. Try inspecting the initTform values after setting format long in the Command Window.
  • Could you provide details about the better odometry results that you were able to obtain? Also, by fixing the problem, do you mean removing the offset completely?
Marc Tison on 3 Dec 2020
Here are the algebraic manipulations done in the helperComputeInitialEstimateFromINS function. You could substitute the following for helperComputeInitialEstimateFromINS and obtain the same results.
T_imu2lidar = [1 0 0 0.79;
               0 1 0 0;
               0 0 1 1.73;
               0 0 0 1]; % homogeneous transform: the bottom row must be [0 0 0 1]
M = [0 1 0 0;
     -1 0 0 0;
     0 0 1 0;
     0 0 0 1]; % rotation about Z by -pi/2
insToLidar = inv(M)*inv(T_imu2lidar)*M;
% X = -Y and Y = X because we are projecting the INS data (aligned with X)
% onto the point cloud data (aligned with Y)
Tfnow = [rotmat(quaternion([insData.Heading(end) 0 0],'euler','ZYX','point'),'point') [-insData.Y(end),insData.X(end),insData.Z(end)]';
         0 0 0 1];
Tfbef = [rotmat(quaternion([insData.Heading(1) 0 0],'euler','ZYX','point'),'point') [-insData.Y(1),insData.X(1),insData.Z(1)]';
         0 0 0 1];
Tfnow_lid = insToLidar*Tfnow;
Tfbef_lid = insToLidar*Tfbef;
% Note that this cancels insToLidar: inv(insToLidar*Tfbef)*(insToLidar*Tfnow) = inv(Tfbef)*Tfnow
T = inv(Tfbef_lid)*Tfnow_lid;
Line 22 of the original helperComputeInitialEstimateFromINS function computes T exactly as I am doing here. As you can see, insToLidar is cancelled. This means, for example, that a rotation in the INS frame only rotates the lidar, which is wrong: the lidar should also translate.
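The cancellation identity in the comment above, inv(A*B)*(A*C) = inv(B)*C for any invertible A, can be verified numerically. Here is a sketch in Python/NumPy (not MATLAB, purely for illustration; the random rigid transforms stand in for insToLidar, Tfbef, and Tfnow):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_se3(rng):
    # Build a random 4x4 homogeneous rigid transform: a rotation obtained
    # via QR decomposition of a random matrix, plus a random translation.
    Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
    if np.linalg.det(Q) < 0:
        Q[:, 0] *= -1  # ensure a proper rotation (det = +1)
    T = np.eye(4)
    T[:3, :3] = Q
    T[:3, 3] = rng.standard_normal(3)
    return T

A = random_se3(rng)  # stands in for insToLidar
B = random_se3(rng)  # stands in for Tfbef
C = random_se3(rng)  # stands in for Tfnow

lhs = np.linalg.inv(A @ B) @ (A @ C)  # inv(Tfbef_lid) * Tfnow_lid
rhs = np.linalg.inv(B) @ C            # inv(Tfbef) * Tfnow

print(np.allclose(lhs, rhs))  # True
```

Any transform that is left-multiplied onto both endpoints drops out of the relative transform, which is the crux of Marc's argument.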
For your second question, try checking the sum of the RMSE values from pcregisterndt after the error is fixed. In my solution, I brought everything into a frame of reference where the ego vehicle faces the X direction. If you are interested in the solution, I could share it.

