
Generate Scenario from Actor Track List and GPS Data

This example shows how to generate a scenario containing actor trajectories by using data extracted from a Global Positioning System (GPS) and an actor track list.

Generating scenarios from recorded sensor data enables you to create scenarios that mimic real-world actor behaviors, such as a front vehicle cut-in for adaptive cruise control (ACC). Scenarios generated from real-world sensor data can improve the test coverage of automated driving systems because, unlike manually created scenarios, they are scalable and less prone to human error. This example shows how to automatically create a scenario from recorded sensor data.

The workflow consists of these steps:

  • Smooth GPS data and format actor track list

  • Reconstruct ego vehicle trajectories

  • Extract ego vehicle driven roads from map

  • Extract non-ego actor properties from track list

  • Build scenario with roads, ego vehicle and non-ego actors

Load Sensor Data

Download a ZIP file containing a subset of sensor data from the PandaSet data set, and then unzip the file. This file contains GPS data, an actor track list, and camera information. In this example, you use the camera data for visual validation of the generated scenario.

dataFolder = tempdir;
dataFilename = "PandasetSensorData.zip";
url = "https://ssd.mathworks.com/supportfiles/driving/data/"+dataFilename;
filePath = fullfile(dataFolder,dataFilename);
if ~isfile(filePath)
    websave(filePath,url)
end
unzip(filePath,dataFolder)
dataset = fullfile(dataFolder,"PandasetSensorData");
data = load(fullfile(dataset,"sensorData.mat"));

Load the GPS data into the workspace.

gpsData = data.GPSData;

gpsData is a table with these columns:

  • timeStamp — Time, in seconds, at which the GPS data was collected.

  • latitude — Latitude coordinate value of the ego vehicle. Units are in degrees.

  • longitude — Longitude coordinate value of the ego vehicle. Units are in degrees.

  • altitude — Altitude coordinate value of the ego vehicle. Units are in meters.

Display the first five entries of gpsData.

gpsData(1:5,:)
ans=5×4 table
    timeStamp     latitude    longitude    altitude
    __________    ________    _________    ________

    1.5576e+09     37.374      -122.06      42.858 
    1.5576e+09     37.374      -122.06      42.858 
    1.5576e+09     37.374      -122.06      42.854 
    1.5576e+09     37.374      -122.06      42.849 
    1.5576e+09     37.374      -122.06      42.848 

Load the actor track list data. Alternatively, you can generate an actor track list by processing raw camera or lidar sensor data. For more information on how to generate an actor track list from camera data, see the Extract Vehicle Track List from Recorded Camera Data for Scenario Generation example. For more information on generating a track list from lidar data, see the Extract Vehicle Track List from Recorded Lidar Data for Scenario Generation example.

tracklist = data.ActorTracklist;

tracklist is a table with these columns:

  • timeStamp — Time, in seconds, for each actor track.

  • TrackerInfo — Track information for the non-ego actors.

Each row of the TrackerInfo column is an M-by-1 structure array with these fields, where M is the number of tracks at that timestamp. A hypothetical sketch of how to assemble such a structure array follows this list.

  • TrackID — Track ID of the actor.

  • ClassID — Classification ID of the actor.

  • Position — Position of the actor with respect to the ego vehicle. Units are in meters.

  • Dimension — Dimensions of the actor in the form [length width height]. Units are in meters. This field is optional.

  • Yaw — Yaw angle of the actor with respect to the ego vehicle. Units are in degrees. This field is optional.
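
If you build a track list from your own sensor data, you can assemble this structure yourself. The following minimal sketch is hypothetical; the timestamp, IDs, positions, dimensions, and yaw angles are illustrative values, not data from the PandaSet recording.

% Hypothetical sketch: assemble one row of a custom track list with two actors.
trackInfo = struct( ...
    'TrackID',{1;2}, ...
    'ClassID',{1;1}, ...                             % for example, 1 for vehicles
    'Position',{[20 -1.5 0];[35 3.2 0]}, ...         % [x y z] in the ego frame, in meters
    'Dimension',{[4.7 1.9 1.4];[4.5 1.8 1.5]}, ...   % [length width height], in meters
    'Yaw',{0;-2});                                   % in degrees, relative to the ego vehicle
customTracklist = table(0.1,{trackInfo},VariableNames=["timeStamp","TrackerInfo"]);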

Display the first five entries of tracklist.

tracklist(1:5,:)
ans=5×2 table
    timeStamp     TrackerInfo 
    __________    ____________

    1.5576e+09    {6×1 struct}
    1.5576e+09    {6×1 struct}
    1.5576e+09    {6×1 struct}
    1.5576e+09    {6×1 struct}
    1.5576e+09    {6×1 struct}

Display the TrackerInfo value for the first timestamp.

tracklist.TrackerInfo{1,1}
ans=6×1 struct array with fields:
    Position
    Yaw
    TrackID
    Dimension
    ClassID

Load the camera data recorded from a forward-facing monocular camera mounted on the ego vehicle.

cameraData = data.CameraData;

cameraData is a table with these columns:

  • timeStamp — Time, in seconds, at which the image data was captured.

  • fileName — Filenames of the images in the data set.

The images are located in the Camera folder in the dataset directory. Create a table that contains the file paths of these images for each timestamp by using the helperUpdateTable function.

imageFolder = "Camera";
cameraData  = helperUpdateTable(cameraData,dataset,imageFolder);
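
The helperUpdateTable function ships with the example and is not listed here. Conceptually, it prepends the data set folder and image subfolder to each relative filename, similar to this sketch. This is an assumption about its behavior, not the shipped implementation, and the filenames shown are hypothetical.

% Sketch (assumption): build full image paths from relative filenames.
% The shipped helperUpdateTable function may differ in details.
relativeNames = ["0001.jpg";"0002.jpg"];          % hypothetical filenames
fullPaths = fullfile(dataset,imageFolder,relativeNames);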

Display the first five entries of cameraData.

cameraData(1:5,:)

Remove the data variable from the workspace.

clear data

Crop and Preprocess Sensor Data

Crop the GPS data, actor track list, and camera data to the GPS timestamp range by using the helperCropData function.

startTime = gpsData.timeStamp(1);
endTime = gpsData.timeStamp(end);

% Pack all the tables in a cell array. 
recordedData = {gpsData,tracklist,cameraData};

% Crop the data.
recordedData = helperCropData(recordedData,startTime,endTime);
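
The helperCropData function is also provided with the example. Its effect is assumed to be equivalent to keeping only the rows of each table whose timestamps fall within the GPS time range, as in this sketch; the shipped helper may differ in details.

% Sketch (assumption): keep only rows with timestamps in [startTime, endTime].
cropToRange = @(T) T(T.timeStamp >= startTime & T.timeStamp <= endTime,:);
croppedSketch = cellfun(cropToRange,{gpsData,tracklist,cameraData},UniformOutput=false);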

The timestamp values of the recorded data set are in the POSIX® format, which Scenario Builder for Automated Driving Toolbox™ supports. Use the helperNormTimeInSecs function to normalize the timestamps by specifying these arguments:

  • scale — Scale by which to convert the timestamp. Because the recorded timestamps are already in seconds, specify this argument as 1.

  • offset — Offset of the simulation start time. Specify the start time as the first timestamp in gpsData.

scale = 1;
offset = startTime;
recordedData =  helperNormTimeInSecs(recordedData,offset,scale);
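
The normalization performed by helperNormTimeInSecs is assumed to amount to subtracting the offset and applying the scale factor. This small, self-contained sketch uses hypothetical POSIX timestamps; the shipped helper may differ in details.

% Sketch (assumption): normalize hypothetical POSIX timestamps to simulation time.
posixTimes = 1.5576e+09 + [0; 0.1; 0.2];    % hypothetical raw timestamps, in seconds
simTimes = (posixTimes - posixTimes(1))*1;  % offset = first timestamp, scale = 1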

Extract the GPS data, actor track list, and camera data with updated timestamp values from recordedData.

gpsData = recordedData{1,1}; 
tracklist = recordedData{1,2};
cameraData = recordedData{1,3};

Remove recordedData from the workspace.

clear recordedData

Extract Map Roads Using GPS Data

Create a geographic player using a geoplayer object and display the full route using GPS data.

zoomLevel = 16;
center = mean([gpsData.latitude gpsData.longitude]);
player = geoplayer(center(1),center(2),zoomLevel);
plotRoute(player,gpsData.latitude,gpsData.longitude)

Obtain geographic bounding box coordinates from the GPS data by using the getMapROI function.

mapStruct = getMapROI(gpsData.latitude,gpsData.longitude);

The map file required for importing roads of the specified area is downloaded from the OpenStreetMap® (OSM) website. OpenStreetMap provides access to worldwide, crowd-sourced map data. The data is licensed under the Open Data Commons Open Database License (ODbL). For more information on the ODbL, see the Open Data Commons Open Database License site.

url = mapStruct.osmUrl;
filename = "drive_map.osm";
websave(filename,url,weboptions(ContentType="xml"));

Extract road properties and the geographic reference coordinates by using the roadprops function. You use the reference coordinates to convert the GPS data to local coordinates, which you then use to identify the ego roads.

[roadData,geoReference] = roadprops("OpenStreetMap",filename);

Select Ego Roads from Road Network

Convert geographic GPS coordinates to local east-north-up (ENU) coordinates by using the latlon2local function. The transformed coordinates define the trajectory waypoints of the ego vehicle. Units are in meters.

[xEast, yNorth, zUp] = latlon2local(gpsData.latitude,gpsData.longitude,gpsData.altitude,geoReference);
waypoints = [xEast,yNorth,zUp];

Raw GPS data often contains noise. Smooth the GPS waypoints by using the smoothdata function.

window = round(size(waypoints,1)*0.2);
waypoints = smoothdata(waypoints,"rloess",window);

If your GPS data also suffers from drift in position and orientation, then you must improve your ego vehicle localization to generate an accurate ego trajectory. For more information, see the Improve Ego Vehicle Localization example.

Create the ego trajectory from the waypoints and their corresponding times of arrival by using the waypointTrajectory (Sensor Fusion and Tracking Toolbox) System object™. For this example, assume that no tracked vehicle leaves the ground at any of the waypoints, and therefore set all altitude values to 0. You must set the ReferenceFrame property of this System object to "ENU" because Scenario Builder for Automated Driving Toolbox™ supports only the ENU format for local coordinate data.

waypoints = double([waypoints(:,1:2) zeros(size(zUp))]);
egoTrajectory = waypointTrajectory(waypoints,gpsData.timeStamp,ReferenceFrame="ENU");

Extract the road properties for the roads on which the ego vehicle is traveling by using the selectActorRoads function.

egoRoadData = selectActorRoads(roadData,egoTrajectory.Waypoints);

Display the first five entries of the egoRoadData table.

egoRoadData(1:5,:)
ans=5×10 table
    RoadID    JunctionID          RoadName            RoadCenters     RoadWidth     BankAngle        Heading          Lanes        LeftBoundary     RightBoundary
    ______    __________    _____________________    _____________    _________    ____________    ____________    ____________    _____________    _____________

      42          0         "West El Camino Real"    { 2×3 double}      3.75       {2×1 double}    {2×1 double}    1×1 lanespec    {22×3 double}    {22×3 double}
      43          0         "West El Camino Real"    { 2×3 double}      3.75       {2×1 double}    {2×1 double}    1×1 lanespec    {13×3 double}    {13×3 double}
      44          0         "West El Camino Real"    { 2×3 double}      3.75       {2×1 double}    {2×1 double}    1×1 lanespec    {10×3 double}    {10×3 double}
      45          0         "West El Camino Real"    {24×3 double}      3.75       {3×1 double}    {3×1 double}    1×1 lanespec    {27×3 double}    {27×3 double}
      48          0         "West El Camino Real"    { 2×3 double}      3.75       {2×1 double}    {2×1 double}    1×1 lanespec    {22×3 double}    {22×3 double}

Visualize the selected roads by using the helperPlotRoads function. Notice that the selected ego roads do not have lane information. Define a crop window of the form [x y width height] to crop, zoom in, and display the map. The x and y elements are the coordinates of the top-left corner of the crop window.

cropWindow = [-30 -30 60 60];
helperPlotRoads(egoRoadData,cropWindow);

Extract Non-Ego Actor Properties

Visualize the actor track list and camera images by using the birdsEyePlot and helperPlotActors functions.

% Initialize the figure with bird's eye plot.
currentFigure = figure(Visible="on",Position=[0 0 1400 600]);
hPlot = axes(uipanel(currentFigure,Position=[0 0 0.5 1],Title="Non-Ego Actors"));
bep = birdsEyePlot(XLim=[0 70],YLim=[-35 35],Parent=hPlot);
camPlot = axes(uipanel(currentFigure,Position=[0.5 0 0.5 1],Title="Camera View"));
helperPlotActors(bep,camPlot,tracklist,cameraData)

Extract actor properties such as entry time, exit time, and dimension from the track list data by using the actorprops function. The function uses extracted ego trajectory information to return the non-ego actor properties in the world frame.

nonEgoActorInfo = actorprops(tracklist,egoTrajectory);

Display the first five entries of nonEgoActorInfo.

nonEgoActorInfo(1:5,:)
ans=5×10 table
    Age                    TrackID                     ClassID           Dimension           EntryTime    ExitTime             Mesh               Waypoints           Speed              Yaw      
    ___    ________________________________________    _______    _______________________    _________    ________    ______________________    ______________    ______________    ______________

     10    {'a6ebfc4a-1adb-4920-af12-f7ac56602785'}       1       2.037    5.273    1.825        0        0.89983     1×1 extendedObjectMesh    { 10×3 double}    { 10×1 double}    { 10×1 double}
    400    {'609aed9d-6681-4e61-83a4-d1e263723107'}       1       1.911    4.672    1.527        0           39.9     1×1 extendedObjectMesh    {400×3 double}    {400×1 double}    {400×1 double}
     10    {'1823b4a0-d5fa-41c0-9335-98ca77c933b6'}       1       2.043    4.537     1.87        0        0.89983     1×1 extendedObjectMesh    { 10×3 double}    { 10×1 double}    { 10×1 double}
    139    {'4cd79810-9938-47ba-a3a9-e81221024549'}       1       2.199    4.827    1.968        0         13.799     1×1 extendedObjectMesh    {139×3 double}    {139×1 double}    {139×1 double}
    400    {'64bf8e46-c5ba-47af-b6fe-40aef7c1aa71'}       1       1.981    4.974     1.58        0           39.9     1×1 extendedObjectMesh    {400×3 double}    {400×1 double}    {400×1 double}

Simulate Scenario with Ego and Non-Ego Actors

Define parameters to create a driving scenario. Specify endTime as the final timestamp in the GPS data, and calculate sampleTime as the minimum difference between two consecutive GPS timestamps.

endTime = gpsData.timeStamp(end);
sampleTime = min(diff(gpsData.timeStamp));

Create a driving scenario by using the drivingScenario object. Specify the SampleTime and StopTime properties of the driving scenario.

scenario = drivingScenario(SampleTime=sampleTime,StopTime=endTime);

Roads extracted from OpenStreetMap do not contain lane information. Create reference lane specifications by using the lanespec function. Specify three lanes with an approximate width of four meters per lane, based on a visual inspection of the camera data. Update egoRoadData with the lane information.

lanes = lanespec(3,Width=4);
egoRoadData.Lanes(:,1) = lanes;

Add roads with the updated lane specifications by using the helperAddRoads function. This helper function adds roads to the driving scenario by using the road function.

scenario = helperAddRoads(scenario,egoRoadData);
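
The helperAddRoads function is provided with the example. It is assumed to iterate over the rows of egoRoadData and call the road function with the road centers and lane specification, roughly as in this sketch. The sketch uses a separate scratch scenario so that it does not modify the scenario built above, and the shipped helper may differ in details.

% Sketch (assumption): add each extracted road to a scratch scenario.
scenarioSketch = drivingScenario;
for k = 1:height(egoRoadData)
    road(scenarioSketch,egoRoadData.RoadCenters{k},Lanes=egoRoadData.Lanes(k));
end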

Add the ego vehicle waypoints to the scenario by using the helperAddEgo function.

scenario = helperAddEgo(scenario,egoTrajectory);

Add the non-ego actors and their trajectories to the scenario by using the helperAddNonEgo function.

scenario = helperAddNonEgo(scenario,nonEgoActorInfo);

Visualize the generated scenario and compare it with the recorded camera data by using the helperViewScenario function.

currentFigure = figure(Name="Generated Scenario",Position=[0 0 700 500]);
helperViewScenario(currentFigure,scenario,cameraData)
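
If you prefer not to use the helper, you can get a basic animation of the generated scenario with the built-in plot and advance functions. This minimal sketch shows only the scenario, without the side-by-side camera comparison provided by helperViewScenario.

% Minimal alternative (sketch): animate the generated scenario without the helper.
plot(scenario)
while advance(scenario)
    pause(scenario.SampleTime)
end
restart(scenario)   % reset the simulation so the scenario can be reused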

Export Scenario to ASAM OpenSCENARIO

Export the generated scenario to the ASAM OpenSCENARIO 1.0 file format by using the export function.

fileName = "example_scenario.xosc";
export(scenario,"OpenSCENARIO",fileName);

You can view the exported scenario in external simulators, such as esmini.
