Improve Ego Vehicle Localization
This example shows how to improve ego vehicle localization by fusing global positioning system (GPS) and inertial measurement unit (IMU) sensor data. GPS and IMU sensors suffer from noise and inaccuracies, such as drift in position and orientation. Because you must localize the ego vehicle accurately to interpret its sensor information, this example shows how to generate an accurate ego trajectory for creating a virtual scenario from recorded sensor data.
Introduction
Using recorded vehicle data, you can generate virtual driving scenarios to recreate a real-world scenario. Virtual scenarios enable you to study and visualize these recorded scenarios. To generate a reliable virtual scenario, you must have accurate ego vehicle localization data. However, GPS and IMU sensor data often suffers from noise, as well as other inaccuracies such as drift in the position and orientation of the ego vehicle. This example shows how to perform information fusion using GPS and IMU sensor data, as well as the prior vehicle position, to correct the drift in the ego position and improve ego vehicle localization. You then use these accurate ego trajectories to create a virtual driving scenario. This example also shows how to construct a map with estimated poses and lidar data to visually analyze the estimated trajectory of the ego vehicle.
This example uses Udacity® data recorded using GPS, IMU, camera, and lidar sensors. You preprocess and align the data recorded from each sensor to its respective reference frame. Then, you follow these steps to improve ego vehicle localization through accurate estimation of the position and orientation of the ego vehicle.
Fuse GPS and IMU sensor data.
Convert poses from the north-east-down (NED) to the east-north-up (ENU) frame.
Correct drift using start and end pose information.
This example uses OpenStreetMap® to get the road network of the area where the data was recorded. Using this road network and the generated ego trajectories, you create a virtual driving scenario.
This flowchart gives an overview of the workflow presented in this example.
The Sensor Data block consists of preprocessed and aligned GPS, IMU, and lidar data.
The Ego Vehicle Pose Extraction block extracts the pose of the ego vehicle. Pose information consists of both position and orientation. The block obtains the orientation of the ego vehicle from IMU sensor data. If IMU sensor data is not present, then the block predicts the ego orientation using the position and GPS timestamps.
If the ego trajectory obtained from the Ego Vehicle Pose Extraction block shows visible drift with respect to the road network, then perform drift correction.
The Drift Correction block corrects drift in the ego poses obtained from the Ego Vehicle Pose Extraction block, based on the initialized start and end poses. If initialized poses are unavailable, then you can correct slight drift by averaging a few poses at the start and end of the trajectory to obtain the start and end poses, respectively.
The Scenario Composition block reads OpenStreetMap (OSM) data to get information about the road network. The block then adds the drift-corrected, localized ego pose information to generate the scenario.
Download and Prepare Data Set
In this example, the recorded vehicle data is collected from the Udacity data set and stored as MAT files. The recorded data includes:
GPS data — Contains the latitude, longitude, altitude, and velocity of the ego vehicle at each GPS timestamp.
IMU data — Contains the linear acceleration and angular velocity values at each IMU timestamp.
Lidar data — Contains the point cloud, saved as a pointCloud object, at each lidar timestamp.
Camera data — Contains a cell array of image data, as 480-by-640-by-3 arrays, at each camera timestamp.
Note: The download time of the data depends on your internet connection. MATLAB will be temporarily unresponsive during the execution of this code block.
downloadFolder = fullfile(tempdir,"AutomotiveDataset");
if ~isfolder(downloadFolder)
    mkdir(downloadFolder)
end
if ~exist("gps","var")
    url = "https://ssd.mathworks.com/supportfiles/driving/data/UdacityHighway/gps.zip";
    filePath = fullfile(downloadFolder,"gps/gps.mat");
    if ~isfile(filePath)
        unzip(url,downloadFolder)
    end
    load(filePath);
end
if ~exist("imu","var")
    url = "https://ssd.mathworks.com/supportfiles/driving/data/UdacityHighway/imu.zip";
    filePath = fullfile(downloadFolder,"imu/imu.mat");
    if ~isfile(filePath)
        unzip(url,downloadFolder)
    end
    load(filePath);
end
if ~exist("lidar","var")
    url = "https://ssd.mathworks.com/supportfiles/driving/data/UdacityHighway/lidar.zip";
    filePath = fullfile(downloadFolder,"lidar/lidar.mat");
    if ~isfile(filePath)
        unzip(url,downloadFolder)
    end
    load(filePath);
end
if ~exist("centerCamera","var")
    url = "https://ssd.mathworks.com/supportfiles/driving/data/UdacityHighway/centerCamera.zip";
    filePath = fullfile(downloadFolder,"centerCamera/centerCamera.mat");
    if ~isfile(filePath)
        unzip(url,downloadFolder)
    end
    load(filePath);
end
Process GPS Data
Compute the waypoints for the ego vehicle trajectory from the recorded GPS coordinates. Use the latlon2local function to convert the raw GPS coordinates to the local frame. The GPS local frame is ENU, and the IMU local frame is NED. To align the GPS and IMU frames, convert the local coordinates to the NED frame.
referenceLocation = [gps.Latitude(1,1) gps.Longitude(1,1) gps.Altitude(1,1)];
[currentEast,currentNorth,currentUp] = latlon2local(gps.Latitude,gps.Longitude,gps.Altitude,referenceLocation);
Convert the GPS local frame to the NED frame using the helperNED2ENU function.
gpsPositionNED = helperNED2ENU([currentEast,currentNorth,currentUp]);
gpsX = gpsPositionNED(:,1);
gpsY = gpsPositionNED(:,2);
gpsZ = gpsPositionNED(:,3);

waypoints = [gpsX gpsY gpsZ];
wp = waypointTrajectory(Waypoints=waypoints,TimeOfArrival=seconds(gps.Time),ReferenceFrame="NED");

% Estimate orientation information
[~,orientationGPS] = lookupPose(wp,seconds(imu.Time));
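The helperNED2ENU function is a supporting function attached to this example. For reference, the following sketch is a minimal, hypothetical stand-in that covers only the position part of the conversion; the function name and the N-by-3 input assumption are illustrative, and the actual helper also converts yaw-pitch-roll angles.

% Minimal sketch of the position portion of a NED-to-ENU conversion
% (hypothetical stand-in, not the shipped helperNED2ENU). Swapping the
% first two axes and negating the third maps NED to ENU, and applying the
% same operation again maps ENU back to NED.
function positionOut = sketchNED2ENU(positionIn)
    % positionIn: N-by-3 matrix of [x y z] positions in the source frame
    positionOut = [positionIn(:,2) positionIn(:,1) -positionIn(:,3)];
end

For heading angles, a yaw measured clockwise from north in the NED frame corresponds to 90 degrees minus that yaw, measured counterclockwise from east, in the ENU frame.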
Process Lidar Frames
Select lidar frames, and align them so that the ego vehicle points along the positive x-axis in the lidar point cloud.
% Select lidar frames for drift correction and scenario creation
lidarFrames = 50:100;

% Rotation matrices for rotations around the x-, y-, and z-axes
rotX = @(t) [1 0 0; 0 cosd(t) -sind(t); 0 sind(t) cosd(t)];
rotY = @(t) [cosd(t) 0 sind(t); 0 1 0; -sind(t) 0 cosd(t)];
rotZ = @(t) [cosd(t) -sind(t) 0; sind(t) cosd(t) 0; 0 0 1];

% Transform to align lidar frames
lidarData = lidar.PointCloud;
rot = rotZ(-90)*rotY(0)*rotX(0);
tform = rigid3d(rot',[0 0 0]);

% Preallocate variable
lidarDataAlign = lidarData(lidarFrames);

% Align lidar frames
for i = 1:size(lidarDataAlign,1)
    lidarDataAlign(i) = pctransform(lidarDataAlign(i),tform);
end
Combine GPS, IMU and Lidar Data
The GPS, IMU, and lidar data are stored in the timetable format. Combine the data into a single synchronized timetable, inputDataMatrix, for use in fusion.
gpsTable = timetable(gps.Time,[gps.Latitude gps.Longitude gps.Altitude],gps.Velocity);
gpsTable.Properties.VariableNames{1} = 'latLonAlt';
gpsTable.Properties.VariableNames{2} = 'gpsVelocity';

imuTable = timetable(imu.Time,imu.LinearAcceleration,imu.AngularVelocity,compact(orientationGPS));
imuTable.Properties.VariableNames{1} = 'linearAcceleration';
imuTable.Properties.VariableNames{2} = 'angularVelocity';
imuTable.Properties.VariableNames{3} = 'orientation';

lidarTable = timetable(lidar.Time,ones(size(lidar.Time,1),1));
lidarTable.Properties.VariableNames{1} = 'lidarFlag';

inputDataMatrix = synchronize(gpsTable,imuTable,lidarTable);
inputDataMatrix.Properties.VariableNames{5} = 'orientation';
inputDataMatrix.Properties.VariableNames{1} = 'latLonAlt';
Initialization
Initialize the ego vehicle start and end poses.
% Initialize ego vehicle yaw, pitch, and roll angles
initialYaw = atan2d(median(gpsY(1:12,:)),median(gpsX(1:12,:)));
initialPitch = 0;
initialRoll = 0;

% Initial orientation
initialEgoVehicleOrientationNED = eul2quat(deg2rad([(initialYaw),initialPitch,initialRoll]));

% Accurate vehicle start and end positions for drift correction
startPosition = [7.5 0.1752 1.15968];          % In meters
startOrientationAngle = [-104.9690 0 0];       % [Yaw, Pitch, Roll] in degrees
endPosition = [-21.9728 -109.8728 0.798131];   % In meters
endOrientationAngle = [-105.3025 0 0];         % [Yaw, Pitch, Roll] in degrees
If an accurate initial orientation is available, then alignment is not required. Otherwise, set these flags to 0 to align angles using the GPS position.
groundStartPositionFlag = 1;   % Flag 1 to use initialized start position
groundEndPositionFlag = 1;     % Flag 1 to use initialized end position
Define the measurement noise for each sensor. This example obtains the noise parameters using experimentation and by autotuning an insfilterAsync (Sensor Fusion and Tracking Toolbox) object. For more information, see Automatic Tuning of the insfilterAsync Filter (Sensor Fusion and Tracking Toolbox).
% Velocity
Rvel = 1;
% Acceleration
Racc = 1e+5;
% Angular acceleration
Rgyro = 1e+5;
% GPS
Rpos = [1 1 1e+5].^2;
% RollPitchHead
Rcorr = 1;
Preallocate variables for position and orientation.
sizeInputDataMatrix = size(inputDataMatrix,1);
fusedPosition = zeros(sizeInputDataMatrix,3);
fusedOrientation = zeros(sizeInputDataMatrix,1,"quaternion");
egoPositionLidar = zeros(size(lidar.Time,1),3);
egoOrientationLidar = zeros(size(lidar.Time,1),1,"quaternion");
GPS and IMU Sensor Data Fusion
This example asynchronously fuses GPS, accelerometer, and gyroscope data by using an insfilterAsync filter, which is based on a Kalman filter. If IMU sensor data is not available, then you can estimate orientation information using the lookupPose function. To visualize the ego vehicle position along with the heading direction, use the helperPlotPositionAndHeading function.
% Create an INS filter to fuse asynchronous GPS and INS data to estimate pose
filt = insfilterAsync(ReferenceFrame="NED");

% GPS reference location
filt.ReferenceLocation = referenceLocation;
filt.State = [initialEgoVehicleOrientationNED,[0 0 0],[gpsX(1,1) gpsY(1,1) gpsZ(1,1)],[0 0 0], ...
    imu.LinearAcceleration(1,:),[0 0 0],[0 0 0],[0 0 0],[0 0 0]]';

imuFs = 1/seconds(imu.Time(2)-imu.Time(1));
gpsFs = 1/seconds(gps.Time(2)-gps.Time(1));
prevTime = seconds(inputDataMatrix.Time(1));

% Obtain poses at lidar timestamps
lidarStep = 1;

% Fusion starts with GPS data
startRow = find(~isnan(inputDataMatrix.latLonAlt),1,"first");
for row = startRow:size(inputDataMatrix,1)
    currTime = seconds(inputDataMatrix.Time(row));

    % Predict the filter forward by the time difference
    predict(filt,currTime-prevTime);

    if any(~isnan(inputDataMatrix.latLonAlt(row,:)))
        % Fuse GPS with velocity readings
        fusegps(filt,inputDataMatrix.latLonAlt(row,:),Rpos,inputDataMatrix.gpsVelocity(row,:),Rvel);
    end

    if any(~isnan(inputDataMatrix.angularVelocity(row,:)))
        % Fuse accelerometer and gyroscope readings
        fuseaccel(filt,inputDataMatrix.linearAcceleration(row,:),Racc);
        fusegyro(filt,inputDataMatrix.angularVelocity(row,:),Rgyro);

        if any(~isnan(inputDataMatrix.orientation(row,:)))
            % Correct orientation on the basis of orientation obtained from
            % GPS data at the IMU timestamp
            correct(filt,1:4,inputDataMatrix.orientation(row,:),Rcorr);
        end
    end

    % Get poses at lidar timestamps
    if ~isnan(inputDataMatrix.lidarFlag(row))
        [egoPositionLidar(lidarStep,:),egoOrientationLidar(lidarStep),~] = pose(filt);
        lidarStep = lidarStep+1;
    end

    prevTime = currTime;

    % Log the current pose estimate for visualization
    [fusedPosition(row,:),fusedOrientation(row)] = pose(filt);
end
Visualize the fused estimated trajectory with orientation.
helperPlotPositionAndHeading(fusedPosition,fusedOrientation,100);
title("Fused Trajectory Using GPS and IMU")
xlabel("North (m)")
ylabel("East (m)")
zlabel("Down (m)")
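helperPlotPositionAndHeading is a supporting function attached to this example. If you want to reproduce a similar plot without the helper, the following minimal sketch is a hypothetical substitute: it scatters the positions and draws a heading arrow every step samples, assuming the orientations are quaternion objects that rotate the navigation frame to the body frame, as returned by pose.

% Minimal sketch of a position-and-heading plot (hypothetical substitute
% for helperPlotPositionAndHeading). positions is an N-by-3 matrix and
% orientations is an N-by-1 quaternion array.
function scatterHandle = sketchPlotPositionAndHeading(positions,orientations,step)
    scatterHandle = scatter3(positions(:,1),positions(:,2),positions(:,3));
    hold on
    idx = (1:step:size(positions,1))';
    % Rotate the body x-axis into the navigation frame to obtain the
    % heading direction at each selected pose
    headings = rotatepoint(orientations(idx),[1 0 0]);
    quiver3(positions(idx,1),positions(idx,2),positions(idx,3), ...
        headings(:,1),headings(:,2),headings(:,3),0.5)
    hold off
end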
Pose Conversion from NED to ENU Frame
Convert poses from the NED frame to the ENU frame using the helperNED2ENU function. In this example, the ENU frame is the reference frame for road network import.
% Convert to map frame (ENU frame)
gpsENU = helperNED2ENU([gpsX gpsY gpsZ]);
gpsXENU = gpsENU(:,1);
gpsYENU = gpsENU(:,2);
gpsZENU = gpsENU(:,3);

% Conversion for orientation
yawPitchRoll = rad2deg(quat2eul(initialEgoVehicleOrientationNED));
[~,yawPitchRollENU] = helperNED2ENU([0 0 0],yawPitchRoll);
initialEgoVehicleOrientationENU = eul2quat(deg2rad(yawPitchRollENU));

% Select poses at lidar timestamps
yawPitchRollLidarNED = zeros(size(egoOrientationLidar,1),3);
for i = 1:size(egoOrientationLidar,1)
    yawPitchRollLidarNED(i,:) = rad2deg(quat2eul(egoOrientationLidar(i,:)));
end
[egoPositionLidarENU,yawPitchRollENU] = helperNED2ENU(egoPositionLidar,yawPitchRollLidarNED);
egoOrientationLidarENU = eul2quat(deg2rad(yawPitchRollENU));
Visualize the poses at the selected lidar timestamps from the vehicle trajectory information.
figure
scatterPlot1 = helperPlotPositionAndHeading(egoPositionLidarENU(lidarFrames,:),egoOrientationLidarENU(lidarFrames,:),1);
hold on
scatterPlot2 = scatter3(gpsXENU,gpsYENU,gpsZENU);
axis([-600 200 -1400 50 -10 10])
view(0,90)
hold off
legend([scatterPlot1 scatterPlot2],"Selected Lidar Pose","GPS Position",Location="NorthEastOutside")
title("Selected Pose from Lidar Timestamp")
xlabel("East (m)")
ylabel("North (m)")
zlabel("Up (m)")
Drift Correction Using Start and End Pose Information
Sensor noise and inaccuracies introduce drift in the estimated trajectory. Correct the drift using specified start and end pose information. Create a poseGraph3D (Navigation Toolbox) object by using the helperComputeRelativePose function. The helper function computes relative poses between local frames using the global frame of reference. Match the initialized and estimated final poses using loop-closure-based graph optimization.
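helperComputeRelativePose is a supporting function attached to this example. As a reference, the following sketch shows the standard relative-pose computation that such a helper can perform; it is a hypothetical stand-in that assumes each pose is a 1-by-7 vector, [x y z qw qx qy qz], expressed in a common global frame.

% Minimal sketch of a relative-pose computation (hypothetical stand-in for
% helperComputeRelativePose). poseA and poseB are 1-by-7 vectors,
% [x y z qw qx qy qz], in the same global frame. The output is the pose of
% B expressed in the local frame of A, in the 1-by-7 format expected by
% addRelativePose.
function relPose = sketchComputeRelativePose(poseA,poseB)
    qA = quaternion(poseA(4:7));
    qB = quaternion(poseB(4:7));
    % Express the translation from A to B in the local frame of A
    translation = rotateframe(qA,poseB(1:3)-poseA(1:3));
    % Relative orientation of B with respect to A
    relOrientation = conj(qA)*qB;
    relPose = [translation compact(relOrientation)];
end

With this convention, chaining the relative poses through addRelativePose reproduces the original trajectory up to the initial pose.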
Visualize drift correction by comparing corrected and estimated trajectories against road networks.
% Calculate relative poses at lidar timestamps
% Convert pose from global to local reference frame
globalPose = [egoPositionLidarENU(lidarFrames,:) egoOrientationLidarENU(lidarFrames,:)];

% Normalize poses to take origin as reference
normGlobalPose = [globalPose(:,1:3)-globalPose(1,1:3) globalPose(:,4:7)];

% Initialize relative pose
relativePose = zeros(lidarFrames(1)+size(normGlobalPose,1)-1,7);
relativePose(lidarFrames(1),:) = [[0 0 0] initialEgoVehicleOrientationENU];

% Calculate relative poses
for i = 1:(size(normGlobalPose,1)-1)
    computedPose = helperComputeRelativePose(normGlobalPose(i,:),normGlobalPose(i+1,:));
    relativePose(lidarFrames(1)+i,:) = computedPose;
end

% Calculate start pose
if groundStartPositionFlag == 1
    % Use initialized start position
    startOrientation = eul2quat(deg2rad(startOrientationAngle));
    startPose = [startPosition startOrientation];
else
    % Align position by taking average of estimated poses
    startPosition = normGlobalPose(1,1:3);
    startOrientationAngle = [atan2d(median(normGlobalPose(1:12,2)),median(normGlobalPose(1:12,1))) 0 0];
    startOrientation = eul2quat(deg2rad(startOrientationAngle));
    startPose = [startPosition startOrientation];
end

poseGraphDrift = poseGraph3D;

% Add start pose
addRelativePose(poseGraphDrift,startPose);

% Add estimated pose information
for i = lidarFrames(2):lidarFrames(end)
    addRelativePose(poseGraphDrift,relativePose(i,:));
end
driftPosition = poseGraphDrift.nodes;

% First pose of pose graph
firstPose = driftPosition(2,:);
% Final pose of pose graph
finalPose = driftPosition(end,:);
startNodeId = 2;
endNodeId = poseGraphDrift.NumNodes;

% Calculate end pose
if groundEndPositionFlag == 1
    % Use initialized end position
    endOrientation = eul2quat(deg2rad(endOrientationAngle));
    endPose = [endPosition endOrientation];
else
    % Align position by taking average of estimated poses
    endPosition = normGlobalPose(end,1:3);
    endOrientationAngle = [atan2d(median(normGlobalPose(end-3:end,2)),median(normGlobalPose(end-3:end,1))) 0 0];
    endOrientation = eul2quat(deg2rad(endOrientationAngle));
    endPose = [endPosition endOrientation];
end
relativeEndPose = helperComputeRelativePose(firstPose,endPose);

% Error in measurement of end pose
error = 1;

% Add end pose to graph
addRelativePose(poseGraphDrift,relativeEndPose,[error 0 0 0 0 0 error 0 0 0 0 1 0 0 0 1 0 0 1 0 1],startNodeId);
endPoseID = poseGraphDrift.NumNodes;

% Apply loop closure
addRelativePose(poseGraphDrift,[0 0 0 1 0 0 0],[1 0 0 0 0 0 1 0 0 0 0 1 0 0 0 1 0 0 1 0 1],endPoseID,endNodeId);
Visualize the pose graph with the estimated trajectory.
figure
scatter3(gpsXENU,gpsYENU,gpsZENU)
hold on
show(poseGraphDrift);
hold off
axis([-78 48 -146 15 -5 5])
view(0,90)
title("Pose Graph with Estimated Trajectory")
xlabel("East (m)")
ylabel("North (m)")
zlabel("Up (m)")
Visualize the difference between the initialized and estimated end poses.
figure
scatter3(gpsXENU,gpsYENU,gpsZENU)
hold on
show(poseGraphDrift);
hold off
axis([-24.65 -18.65 -113.47 -105.52 -5.00 5.00])
view(0,90)
title("Difference in Final Pose")
xlabel("East (m)")
ylabel("North (m)")
zlabel("Up (m)")
Optimize the pose graph to correct the drift.
% Optimize the pose graph using the loop closure
poseGraphDriftOptimized = optimizePoseGraph(poseGraphDrift,MaxIterations=1000);
Visualize the optimized pose graph with drift correction.
figure
scatter3(gpsXENU,gpsYENU,gpsZENU)
hold on
show(poseGraphDriftOptimized);
hold off
axis([-24.93 -18.93 -113.17 -105.22 -5.00 5.00])
view(0,90)
title("Optimized Pose Graph")
xlabel("East (m)")
ylabel("North (m)")
zlabel("Up (m)")
% Extract the drift-corrected poses from the optimized pose graph
optimizedPose = poseGraphDriftOptimized.nodes(startNodeId:endNodeId);
Import and visualize the road network with the corrected and drift-affected trajectories.
scenario = drivingScenario;

% Add roads to the scenario using OSM network information
% Specify bounds for the area of the road network to import
mapParameters = getMapROI(gps.Latitude,gps.Longitude);

% Fetch the OpenStreetMap XML
if ~isfile("drive_map.osm")
    % Save OSM data into a file
    websave("drive_map.osm",mapParameters.osmUrl,weboptions(ContentType="xml"));
end

% Import the OSM road network into the driving scenario
roadNetwork(scenario,OpenStreetMap="drive_map.osm");

plot(scenario)
title("Imported Road Network")
xlabel("East (m)")
ylabel("North (m)")
zlabel("Up (m)")
Align the reference location with the scenario.
% Mean value of latitude and longitude
mapRef = [(mapParameters.minLat + mapParameters.maxLat)/2,(mapParameters.minLon + mapParameters.maxLon)/2];
[mapBiasX,mapBiasY,mapBiasZ] = latlon2local(mapRef(1,1),mapRef(1,2),0,referenceLocation);

% Align drift-affected ego vehicle position with the map frame
mapVehicleDriftPosition(:,1) = normGlobalPose(:,1) - mapBiasX;
mapVehicleDriftPosition(:,2) = normGlobalPose(:,2) - mapBiasY;
mapVehicleDriftPosition(:,3) = normGlobalPose(:,3);
mapVehicleDriftPosition(:,4:7) = normGlobalPose(:,4:7);

% Align corrected ego vehicle position with the map frame
mapVehiclePosition(:,1) = optimizedPose(:,1) - mapBiasX;
mapVehiclePosition(:,2) = optimizedPose(:,2) - mapBiasY;
mapVehiclePosition(:,3) = optimizedPose(:,3);
mapVehiclePosition(:,4:7) = optimizedPose(:,4:7);
Visualize the scenario with the drift-corrected ego position.
plot(scenario)
hold on
scatterPlot1 = scatter3(gpsXENU-mapBiasX,gpsYENU-mapBiasY,gpsZENU);
scatterPlot2 = scatter3(mapVehicleDriftPosition(:,1),mapVehicleDriftPosition(:,2),mapVehicleDriftPosition(:,3));
scatterPlot3 = scatter3(mapVehiclePosition(:,1),mapVehiclePosition(:,2),mapVehiclePosition(:,3));
hold off
axis([-67 206 498 771 -10 10])
legend([scatterPlot1 scatterPlot2 scatterPlot3],["GPS Position","Fused Position","Drift Corrected Position"])
title("Scenario with Drift Corrected Ego Position")
xlabel("East (m)")
ylabel("North (m)")
zlabel("Up (m)")
Scenario Composition
Create a virtual scenario using the localized trajectory and road network obtained from OpenStreetMap.
To compose a scenario, compute the ego speed and yaw angles using the drift-corrected trajectory.
% Calculate ego vehicle speed
timeOfArrivalEgo = seconds(lidarTable.Time(lidarFrames));
diffXPosition = diff(mapVehiclePosition(:,1));
diffYPosition = diff(mapVehiclePosition(:,2));
diffTimeOfArrival = diff(timeOfArrivalEgo);
euclideanDistance = sqrt(diffXPosition.^2 + diffYPosition.^2);
speedEgoVehicle = euclideanDistance./diffTimeOfArrival;
speedEgoVehicle(end+1) = speedEgoVehicle(end);

% Calculate yaw angle
mapVehicleAngle = quat2eul(mapVehiclePosition(:,4:7));
yawEgoVehicle = rad2deg(mapVehicleAngle(:,1));

% Reduce ego vehicle and map elevation to the ground
updatedScenario = drivingScenario;
[roadTaken,roadData] = helperGetRoadsInEgoPath(scenario,mapVehiclePosition(:,1:3));
roadCenter = roadData.RoadCenters;
roadCenter(:,3) = zeros(size(roadCenter,1),1);
road(updatedScenario,roadCenter,roadData.RoadWidth,Lanes=roadData.LanesSpecifications);
mapVehiclePosition(:,3) = zeros(size(mapVehiclePosition,1),1);

% Add the ego vehicle and the drift-corrected trajectory to the scenario
egoVehicle = vehicle(updatedScenario,ClassID=1,Name="EgoVehicle", ...
    Position=mapVehiclePosition(1,1:3),Yaw=yawEgoVehicle(1,1), ...
    Mesh=driving.scenario.carMesh,Wheelbase=2);

% Create the ego trajectory
trajectory(egoVehicle,mapVehiclePosition(:,1:3),speedEgoVehicle,Yaw=yawEgoVehicle);
Simulate and Visualize Generated Scenario
Plot the scenario and the corresponding chase plot. Run the simulation to visualize the generated driving scenario. Use camTimeBias to synchronize the camera frames with the scenario simulation time.
% Plot scenario
plot(updatedScenario,waypoints="on")

fig.chasePlot = figure;
set(fig.chasePlot,Position=[680 60 640 600])
fig.hPanel.chasePlot = uipanel(fig.chasePlot, ...
    Title="Chase View Plot",Position=[0 0 1 1]);
fig.hPlot.chasePlot = axes(fig.hPanel.chasePlot);
chasePlot(egoVehicle,Parent=fig.hPlot.chasePlot,ViewHeight=1.5,ViewYaw=-5,Meshes="on")

% Center camera images
fig.camera = figure;
set(fig.camera,Position=[30 60 640 600])
fig.hPanel.camera = uipanel(fig.camera, ...
    Title="Camera Image Plot",Position=[0 0 1 1]);
fig.hPlot.camera = axes(fig.hPanel.camera);

updatedScenario.SampleTime = 0.1;
updatedScenario.StopTime = 5;
cameraTime = seconds(centerCamera.Time);
cameraData = centerCamera.Images;
camTimeBias = 0.04;
camTime = cameraTime - cameraTime(1) - camTimeBias;

while advance(updatedScenario)
    camIdx = find(camTime<=updatedScenario.SimulationTime,1,"last");
    image(cameraData{camIdx,1},Parent=fig.hPlot.camera);
    drawnow
    pause(0.1)
end
Construct Map
For visual analysis, construct a map with the estimated poses and lidar data by using the pcalign function.
% Initialize variable to store orientation matrices
mapOrientation = rigid3d.empty;
for row = 1:size(optimizedPose,1)
    mapOrientation(row,1) = rigid3d((quat2rotm(optimizedPose(row,4:7)))',(optimizedPose(row,1:3)));
end

% Construct the map using the orientation matrices and lidar data
pointCloudMap = pcalign(lidarDataAlign(1:end,:),mapOrientation,0.5);

% Transform the map point cloud to correct the bias
tform = rigid3d(rotZ(0)',[-mapBiasX -mapBiasY 0]);
Visualize the map with the generated scenario.
plot(scenario)
hold on
scatterPlot1 = scatter3(gpsXENU-mapBiasX,gpsYENU-mapBiasY,gpsZENU);
scatterPlot2 = scatter3(optimizedPose(:,1)-mapBiasX,optimizedPose(:,2)-mapBiasY,optimizedPose(:,3));
pcshow(pctransform(pointCloudMap,tform))
hold off
axis([100 200 500 700 -10 10])
view(0,90)
lgnd = legend([scatterPlot1 scatterPlot2],["GPS Position","Drift Corrected Position"]);
set(lgnd,Color="White")
title("Generated Scenario")
xlabel("East (m)")
ylabel("North (m)")
zlabel("Up (m)")
See Also
Functions
getMapROI | roadprops | selectActorRoads | insfilterAsync (Sensor Fusion and Tracking Toolbox)
Objects
drivingScenario | poseGraph3D (Navigation Toolbox)