How 3D Simulation for Automated Driving Works

Automated Driving Toolbox™ provides a co-simulation framework that you can use to model driving algorithms in Simulink® and visualize their performance in a 3D environment. This 3D simulation environment uses the Unreal Engine® by Epic Games®.

Understanding how this simulation environment works can help you troubleshoot issues and customize your models.

Communication with 3D Simulation Environment

When you use Automated Driving Toolbox to run your algorithms, Simulink co-simulates those algorithms with the visualization engine.

In the Simulink environment, Automated Driving Toolbox:

  • Configures the 3D visualization environment, specifically the ray tracing, scene capture from cameras, and initial object positions

  • Determines the next position of the objects by using feedback from the 3D simulation environment

The diagram summarizes the communication between Simulink and the visualization engine.
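
Conceptually, each simulation step follows a send-receive pattern: Simulink computes the next object poses and sends them to the visualization engine, and the engine returns feedback that drives the next step. This pseudocode sketch illustrates the pattern; the function names are placeholders for illustration, not an actual Automated Driving Toolbox API.

    % Conceptual co-simulation loop (placeholder function names).
    for k = 1:numSteps
        % Simulink determines the next object poses from the driving
        % algorithm and the feedback from the previous step.
        poses = drivingAlgorithm(feedback);

        % The visualization engine updates the scene with the new poses
        % and returns feedback, such as rendered sensor views.
        feedback = visualizationEngine(poses);
    end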

Block Execution Order

During simulation, the 3D simulation blocks follow a specific execution order:

  1. The Simulation 3D Vehicle with Ground Following blocks initialize the vehicles and send their X, Y, and Yaw signal data to the Simulation 3D Scene Configuration block.

  2. The Simulation 3D Scene Configuration block receives the vehicle data and sends it to the sensor blocks.

  3. The sensor blocks receive the vehicle data and use it to accurately locate and visualize the vehicles.

The Priority property of the blocks controls this execution order. To access this property for any block, right-click the block, select Properties, and click the General tab. By default, Simulation 3D Vehicle with Ground Following blocks have a priority of -1, Simulation 3D Scene Configuration blocks have a priority of 0, and sensor blocks have a priority of 1.
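
You can also inspect or set the Priority property programmatically by using the get_param and set_param functions. In this sketch, the model name and block paths are placeholders for blocks in your own model.

    % Inspect and set block priorities ('myModel' and the block paths
    % are placeholders for your own model).
    get_param('myModel/Simulation 3D Scene Configuration','Priority')

    set_param('myModel/Simulation 3D Vehicle with Ground Following', ...
        'Priority','-1')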

The diagram shows this execution order.

If your sensors are not detecting vehicles in the scene, it is possible that the 3D simulation blocks are executing out of order. Try updating the execution order and simulating again. For more details on execution order, see Control and Display the Execution Order (Simulink).

Also make sure that all 3D simulation blocks are located in the same subsystem. Even if the blocks have the correct Priority settings, they can still execute out of order if they are located in different subsystems.
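
To check both the priorities and the subsystem locations at once, you can list the 3D simulation blocks along with their Priority and Parent parameters. This sketch assumes a loaded model named myModel and matches the blocks by name.

    % List 3D simulation blocks with their priorities and parent
    % subsystems ('myModel' is a placeholder model name).
    blocks = find_system('myModel','Regexp','on','Name','Simulation 3D');
    for k = 1:numel(blocks)
        fprintf('%-55s  priority: %-3s  parent: %s\n', blocks{k}, ...
            get_param(blocks{k},'Priority'), get_param(blocks{k},'Parent'));
    end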

Coordinate Systems

Scenes in the 3D simulation environment use the right-handed Cartesian world coordinate system defined in ISO 8855. In this coordinate system, when looking in the positive X-axis direction, the positive Y-axis points left. The positive Z-axis points up from the ground.
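
A quick way to confirm the handedness at the MATLAB® command line: in a right-handed system, the cross product of the X and Y unit vectors yields the Z unit vector.

    % In the right-handed ISO 8855 world frame, X (forward) crossed
    % with Y (left) points along Z (up).
    cross([1 0 0],[0 1 0])   % returns [0 0 1]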

Sensors are mounted on vehicles relative to the vehicle coordinate system. In this system, the positive X-axis points forward from the vehicle, the positive Y-axis points left, and the positive Z-axis points up from the ground. The vehicle origin is on the ground, below the longitudinal and lateral center of the vehicle.
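
Because the vehicle blocks report pose as X, Y, and Yaw, a point defined in vehicle coordinates, such as a sensor mounting location, maps to world coordinates through a planar rotation and translation. A minimal sketch with example values:

    % Map a sensor mounting point from vehicle to world coordinates.
    % The numeric values are examples only.
    vehXY  = [10; 5];         % vehicle origin in the world frame (m)
    yaw    = deg2rad(30);     % vehicle yaw angle (rad)
    sensXY = [1.5; 0];        % sensor offset in the vehicle frame (m)

    R = [cos(yaw) -sin(yaw); sin(yaw) cos(yaw)];   % rotation about Z
    worldXY = vehXY + R*sensXY     % sensor position in the world frame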

These coordinate systems differ from the one used in the Unreal® Editor. The Unreal Editor uses a left-handed Cartesian world coordinate system in which the positive Y-axis points right. The positive Z-axis still points up.
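
For example, to express an ISO 8855 world point in Unreal Editor coordinates, negate the Y coordinate to switch handedness and scale by 100, because the Unreal Editor works in centimeters rather than meters. A minimal sketch, assuming the default Unreal Editor units:

    % Convert a point from the ISO 8855 world frame (m, Y points left)
    % to the Unreal Editor frame (cm, Y points right). Negating Y flips
    % the handedness; the factor of 100 converts meters to centimeters.
    pIso = [10; 5; 1];                            % ISO 8855 point (m)
    pUnreal = 100*[pIso(1); -pIso(2); pIso(3)]    % Unreal Editor point (cm)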

For more details on the coordinate systems used for 3D simulation, see Coordinate Systems for 3D Simulation in Automated Driving Toolbox.

Related Topics