Choose a Sensor for 3D Simulation

You can use the 3D simulation environment in Automated Driving Toolbox™ to obtain high-fidelity sensor data. This environment is rendered using the Unreal Engine® from Epic Games®.

The following entries summarize the sensor blocks that you can simulate in this environment. Each entry describes the sensor model and its parameters, and lists sample visualizations of the sensor output along with the examples that produce them.

Simulation 3D Camera

Description:

  • Camera with a lens, based on the ideal pinhole camera model. See What Is Camera Calibration? (Computer Vision Toolbox).

  • Includes parameters for image size, focal length, distortion, and skew (see the MATLAB sketch after this entry).

  • Includes options to output ground truth for depth estimation and semantic segmentation.

Sample visualizations and examples:

  • Camera image using a Video Viewer block. See Design of Lane Marker Detector in 3D Simulation Environment.

  • Depth map using a To Video Display block. See Visualize Depth and Semantic Segmentation Data in 3D Environment.

  • Semantic segmentation map using a To Video Display block. See Visualize Depth and Semantic Segmentation Data in 3D Environment.
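The block parameters map onto the standard pinhole model used by Computer Vision Toolbox. The following MATLAB sketch shows that mapping using the cameraIntrinsics object; the focal length, principal point, and image size values are placeholder assumptions, not defaults of the Simulation 3D Camera block.

    % Relate pinhole camera parameters to a Computer Vision Toolbox
    % cameraIntrinsics object. All numeric values are illustrative assumptions.
    focalLength    = [1100 1100];   % [fx fy] in pixels
    principalPoint = [640 360];     % [cx cy] in pixels
    imageSize      = [720 1280];    % [rows cols]

    intrinsics = cameraIntrinsics(focalLength,principalPoint,imageSize, ...
        'RadialDistortion',[0 0],'TangentialDistortion',[0 0],'Skew',0);

    disp(intrinsics)   % inspect the resulting intrinsic parameters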

Simulation 3D Fisheye Camera

Description:

  • Fisheye camera that can be described using the Scaramuzza camera model. See Fisheye Calibration Basics (Computer Vision Toolbox).

  • Includes parameters for distortion center, image size, and mapping coefficients (see the MATLAB sketch after this entry).

Sample visualization and example:

  • Camera image using a Video Viewer block. See Simulate a Simple Driving Scenario and Sensor in 3D Environment.
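For reference, the same kinds of fisheye parameters can be represented in MATLAB with the Computer Vision Toolbox fisheyeIntrinsics object. The mapping coefficients, image size, and distortion center below are placeholder assumptions, not values taken from the block.

    % Build Scaramuzza-model fisheye intrinsics from illustrative parameters.
    mappingCoefficients = [880 0 0 0];   % [a0 a2 a3 a4], assumed values
    imageSize           = [720 1280];    % [rows cols]
    distortionCenter    = [640 360];     % [cx cy] in pixels

    intrinsics = fisheyeIntrinsics(mappingCoefficients,imageSize,distortionCenter);

    % With intrinsics like these, a captured fisheye image can be undistorted:
    % undistortedImage = undistortFisheyeImage(fisheyeImage,intrinsics);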

Simulation 3D Lidar

Description:

  • Scanning lidar sensor model.

  • Includes parameters for detection range, resolution, and fields of view.

Sample visualization and example:

  • Point cloud data using pcplayer within a MATLAB Function block, as in the sketch after this entry. See Simulate Lidar Sensor Perception Algorithm.
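A minimal MATLAB sketch of the pcplayer streaming pattern is shown below; the axis limits and the synthetic points are assumptions, and in practice the points would come from the lidar block output.

    % Stream point clouds to a pcplayer window. The random points stand in for
    % the lidar block output; axis limits are illustrative assumptions.
    player = pcplayer([-60 60],[-60 60],[-2 10]);   % x, y, z limits in meters

    for k = 1:100
        if ~isOpen(player)
            break
        end
        xyz = [120*rand(2000,2)-60, 12*rand(2000,1)-2];   % placeholder [x y z] points
        view(player,pointCloud(xyz))
    end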

Simulation 3D Probabilistic Radar

Description:

  • Probabilistic radar model that returns a list of detections.

  • Includes parameters for radar accuracy, radar bias, detection probability, and detection reporting.

Sample visualization and example:

  • Radar detections using the Bird's-Eye Scope (a MATLAB-side alternative is sketched after this entry). See Simulate Radar Sensors in 3D Environment.
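The Bird's-Eye Scope is a Simulink tool, but radar detection positions logged to the MATLAB workspace can also be plotted with birdsEyePlot, as in this sketch. The detection positions are assumed sample values, not output of the radar block.

    % Plot radar detection positions on a bird's-eye plot. Positions are
    % assumed sample values in ego vehicle coordinates (meters).
    bep = birdsEyePlot('XLim',[0 90],'YLim',[-35 35]);
    plotter = detectionPlotter(bep,'DisplayName','Radar detections');

    positions = [30  2; 45 -4; 60  1];   % [x y] per detection
    plotDetection(plotter,positions)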
