Automated Driving Toolbox™ provides blocks for visualizing sensors in a 3D simulation environment that uses the Unreal Engine® from Epic Games®. This model simulates a simple driving scenario in a prebuilt 3D scene and captures data from the scene using a fisheye camera sensor. Use this model to learn the basics of configuring and simulating scenes, vehicles, and sensors. For more background on the 3D simulation environment, see 3D Simulation for Automated Driving.
The model consists of these main components:
Scene — A Simulation 3D Scene Configuration block configures the scene in which you simulate.
Vehicles — Two Simulation 3D Vehicle with Ground Following blocks configure the vehicles within the scene and specify their trajectories.
Sensor — A Simulation 3D Fisheye Camera block configures the mounting position and parameters of the fisheye camera used to capture simulation data. A Video Viewer block visualizes the simulation output of this sensor.
In the Simulation 3D Scene Configuration block, the Scene description parameter determines the scene where the simulation takes place. This model uses the Large Parking Lot scene, but you can choose among several prebuilt scenes. To explore a scene, you can open the 2D image corresponding to the 3D scene.
data = load('sim3d_SpatialReferences.mat');
spatialRef = data.spatialReference.LargeParkingLot;
figure
imshow('sim3d_LargeParkingLot.jpg',spatialRef)
set(gca,'YDir','normal')
To learn how to explore other scenes, see the corresponding scene reference pages.
The Scene view parameter of this block determines the view from which the Unreal Engine window displays the scene. In this block, Scene view is set to EgoVehicle, which is the name of the ego vehicle (the vehicle with the sensor) in this scenario. During simulation, the Unreal Engine window displays the scene from behind the ego vehicle. You can also change the scene view to the other vehicle. To display the scene from the root of the scene (the scene origin), select Scene Origin.
The Simulation 3D Vehicle with Ground Following blocks model the vehicles in the scenario.
The Ego Vehicle block specifies the vehicle to which the fisheye camera sensor is mounted. This vehicle is modeled as a red hatchback.
The Target Vehicle block is the vehicle from which the sensor captures data. This vehicle is modeled as a green SUV.
During simulation, both vehicles travel straight in the parking lot for 50 meters. The target vehicle is 10 meters directly in front of the ego vehicle.
The X, Y, and Yaw input ports control the trajectories of these vehicles. X and Y are in the world coordinates of the scene, which are in meters. Yaw is the orientation angle of the vehicle and is in degrees.
The ego vehicle travels from a position of (45,0) to (45,50), oriented 90 degrees counterclockwise from the X-axis of the world coordinate system. To model this trajectory, the input port values are as follows:
X is a constant value of 45.
Y is a multiple of the simulation time. A Digital Clock block outputs the simulation time every 0.1 second for 5 seconds, which is the stop time of the simulation. These simulation times are then multiplied by 10 to produce Y values of [0 1 2 3 ... 50], that is, an advance of 1 meter per sample for a total of 50 meters.
Yaw is a constant value of 90.
The target vehicle has the same X and Yaw values as the ego vehicle. The Y value of the target vehicle is always 10 meters more than the Y value of the ego vehicle.
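As a sanity check outside Simulink, you can sketch this trajectory arithmetic in MATLAB. The variable names here are illustrative and are not part of the model:

```matlab
% Simulation times, sampled every 0.1 second up to the 5-second stop time
t = 0:0.1:5;

% Ego vehicle: X and Yaw are constant, Y advances 10 meters per second
egoX   = 45*ones(size(t));
egoY   = 10*t;                 % [0 1 2 ... 50], 1 meter per sample
egoYaw = 90*ones(size(t));

% Target vehicle: same X and Yaw, always 10 meters ahead of the ego vehicle in Y
targetY = egoY + 10;
```

The last sample gives egoY(end) of 50 meters and targetY(end) of 60 meters, matching the 10-meter gap described above.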
In both vehicles, the Initial position [X, Y, Z] (m) and Initial rotation [Roll, Pitch, Yaw] (deg) parameters reflect the initial [X, Y, Z] and [Roll, Pitch, Yaw] values of the vehicles at the beginning of simulation.
To create more realistic trajectories, you can obtain waypoints from a scene interactively and specify these waypoints as inputs to the Simulation 3D Vehicle with Ground Following blocks. See Select Waypoints for 3D Simulation.
The Simulation 3D Fisheye Camera block models the sensor used in the scenario. Open this block and inspect its parameters.
The Mounting tab contains parameters that determine the mounting location of the sensor. The fisheye camera sensor is mounted to the center of the roof of the ego vehicle.
The Parameters tab contains the intrinsic camera parameters of a fisheye camera. These parameters are set to their default values.
The Ground Truth tab contains a parameter for outputting the location and orientation of the sensor in meters and radians. In this model, the block outputs these values so you can see how they change during simulation.
The block outputs images captured from the simulation. During simulation, the Video Viewer block displays these images.
Simulate the model. When the simulation begins, it can take a few seconds for the visualization engine to initialize, especially when you are running it for the first time. The AutoVrtlEnv window shows a view of the scene in the 3D environment.
The Video Viewer block shows the output of the fisheye camera.
To change the view of the scene during simulation, use the numbers 1–9 on the numeric keypad.
For a bird's-eye view of the scene, press 0.
After simulating the model, try modifying the intrinsic camera parameters and observe the effects on the simulation. You can also change the type of sensor block. For example, try replacing the Simulation 3D Fisheye Camera block with a Simulation 3D Camera block. For more details on the available sensor blocks, see Choose a Sensor for 3D Simulation.