Scenario Generation and Variation
In automated driving applications, scenario generation is the process of building virtual scenarios from real-world vehicle data recorded from global positioning system (GPS), inertial measurement unit (IMU), camera, and lidar sensors. Automated Driving Toolbox™ provides functions and tools to automate the scenario generation process. You can preprocess sensor data, extract road information, localize actors, and obtain actor trajectories to create an accurate digital twin of a real-world scenario. Simulate the generated scenario to test your automated driving algorithms against real-world data.
To generate scenarios from recorded sensor data, download the Scenario Builder for Automated Driving Toolbox support package from the Add-On Explorer. For more information on downloading add-ons, see Get and Manage Add-Ons.
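For example, this minimal sketch shows the shape of a generated scenario: a drivingScenario object with a road and an ego vehicle that follows localized waypoints, which you can step through to test an algorithm. The road centers, waypoints, and speeds are placeholder values standing in for preprocessed and localized sensor data.

```matlab
% Minimal sketch: recreate a recorded ego trajectory as a driving scenario.
% The values below are placeholders for preprocessed GPS/IMU data.
scenario = drivingScenario;
roadCenters = [0 0; 50 0; 100 10];           % coarse road geometry (m)
road(scenario,roadCenters,'Lanes',lanespec(2));

egoVehicle = vehicle(scenario,'ClassID',1);
egoWaypoints = [5 -2; 45 -2; 95 8];          % localized ego positions (m)
egoSpeeds = [10 12 10];                      % speed at each waypoint (m/s)
smoothTrajectory(egoVehicle,egoWaypoints,egoSpeeds);

% Step through the scenario to test an algorithm against the replayed data.
while advance(scenario)
    % Insert your automated driving algorithm under test here.
end
```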
Scenario variation is the process of generating multiple variants from a seed scenario that is either created manually or generated from recorded sensor data. You can vary the scene parameters, actor parameters, or event parameters of a seed scenario, and apply your own constraints, to generate new scenarios. Use these scenario variations for safety assessment of automated driving applications such as autonomous emergency braking (AEB), lane keep assist (LKA), and adaptive cruise control (ACC).
To generate scenario variations, download the Scenario Variant Generator for Automated Driving Toolbox support package from the Add-On Explorer. For more information on downloading add-ons, see Get and Manage Add-Ons.
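For example, this minimal sketch illustrates the variation idea using only core Automated Driving Toolbox objects: it rebuilds a simple seed scenario in a loop and sweeps the lead-vehicle length to produce several variants. The road geometry, trajectories, and candidate lengths are illustrative placeholders; the support package automates and generalizes this workflow.

```matlab
% Minimal sketch: sweep an actor parameter (lead-vehicle length) across
% several variants of a simple seed scenario. Values are placeholders.
leadLengths = [4.2 5.5 12.0];                % candidate vehicle lengths (m)
variants = cell(size(leadLengths));
for k = 1:numel(leadLengths)
    scenario = drivingScenario;              % rebuild the seed scenario
    road(scenario,[0 0; 200 0],'Lanes',lanespec(2));

    ego = vehicle(scenario,'ClassID',1);
    smoothTrajectory(ego,[10 -2; 190 -2],[20 20]);

    lead = vehicle(scenario,'ClassID',1,'Length',leadLengths(k));
    smoothTrajectory(lead,[40 -2; 190 -2],[15 15]);

    variants{k} = scenario;                  % store the variant for simulation
end
```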
Topics
Scenario Generation
- Overview of Scenario Generation from Recorded Sensor Data
  Learn the basics of generating scenarios from recorded sensor data.
- Preprocess Lane Detections for Scenario Generation
  Format lane detection data to update lane specifications for scenario generation.
- Smooth GPS Waypoints for Ego Localization
  Create jitter-limited ego trajectory by smoothing GPS and IMU sensor data (see the sketch after this list).
- Ego Vehicle Localization Using GPS and IMU Fusion for Scenario Generation
  Localize ego vehicle by fusing GPS and IMU sensor data to generate virtual driving scenario.
- Ego Localization Using Lane Detections and HD Map for Scenario Generation
  Perform lane-level localization of ego vehicle using lane detections, HD map data, and GPS data.
- Extract Lane Information from Recorded Camera Data for Scene Generation
  Extract lane information from raw camera data to generate ASAM OpenDRIVE® scene or RoadRunner scene.
- Extract Vehicle Track List from Recorded Camera Data for Scenario Generation
  Extract actor track list from raw camera data for scenario generation.
- Extract Vehicle Track List from Recorded Lidar Data for Scenario Generation
  Extract actor track list from recorded lidar data using pretrained vehicle detection model and JPDA tracker.
- Generate Road Scene Using Lanes from Labeled Recorded Data
  Generate road scene with lanes from labeled camera images and raw lidar data.
- Fuse Prerecorded Lidar and Camera Data to Generate Vehicle Track List for Scenario Generation
  Generate scenario by fusing and smoothing tracked lidar data and camera data.
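The GPS smoothing step referenced in the list above can be sketched with core functions: convert recorded latitude-longitude samples to local Cartesian coordinates with latlon2local, then smooth them before building the ego trajectory. The latitude, longitude, and altitude values shown are placeholders for recorded GPS data.

```matlab
% Minimal sketch: convert raw GPS samples to local coordinates and smooth
% them to obtain jitter-limited ego waypoints. Values are placeholders.
lat = [42.3001 42.3003 42.3006 42.3010];     % recorded latitudes (deg)
lon = [-71.3501 -71.3499 -71.3496 -71.3492]; % recorded longitudes (deg)
alt = zeros(size(lat));                      % altitudes (m)
origin = [lat(1) lon(1) alt(1)];             % local origin at first sample

[xEast,yNorth] = latlon2local(lat,lon,alt,origin);

% Reduce GPS jitter with a moving-average filter before building the trajectory.
egoWaypoints = [smoothdata(xEast(:),'movmean',3), ...
                smoothdata(yNorth(:),'movmean',3)];
```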
Scenario Variation
- Overview of Scenario Variant Generation
  Learn the basics of generating scenario variations from a seed scenario.
- Generate Scenario Variants by Modifying Actor Dimensions
  Generate scenario variants from seed scenario by modifying actor dimensions.
- Translocate Collision from Seed Scenario to Target Scene
  Translocate collision from RoadRunner seed scenario to target scene.