Automate Virtual Assembly Line with Two Robotic Workcells
This example shows the simulation of an automated assembly line to demonstrate virtual commissioning applications. The assembly line is based on Smart4i, a modular industrial framework created by ITQ GmbH. The system consists of four components: two robotic workcells connected by a shuttle track and a conveyor belt. One of the two robots places cups onto the shuttles, while the other robot places balls in the cups. The conveyor then delivers the cups to a container. This simulation uses Stateflow® for system control and demonstrates how you can use Unreal Engine™ to simulate a complete virtual commissioning application in Simulink®.
System Overview
The core environment has five components that form the assembly line:
Robot 1 (Comau) — The first robot is a Comau Racer V3. This robot picks up the cups and places them in the shuttles.
Robot 2 (Mitsubishi) — The second robot is a Mitsubishi RV-4F. The camera sensor in the workcell of this robot detects the balls, and then this robot picks up the balls and places each of them in a separate cup.
Shuttle Track & Shuttles — The shuttle track moves four shuttles to Robot 1, then to Robot 2, and then to the conveyor belt before returning to Robot 1.
Conveyor Belt — The conveyor belt carries the cups away from the track and into a container.
Static Machine Frame — The remaining non-moving model parts comprise the static machine frame, which serves as the base for the assembly. It includes the two workcells housing the robots, as well as the base for the center assembly.
This system is created using CAD data provided by ITQ GmbH. For your convenience, we imported the CAD files into MATLAB®, preprocessed them using Simulink® 3D Animation™ functions, and then saved the results to a MAT file for reuse.
During normal execution, Robot 1 picks up cups from a tray in the Robot 1 workcell and places them in a shuttle that is waiting for a cup. Placing the cup in the shuttle triggers the filled shuttle to move along the shuttle track to Robot 2 and creates an instance of the ball in the Robot 2 workcell. The location of this ball in the Robot 2 workcell is random. The camera sensor in the Robot 2 workcell detects the position of the ball so that Robot 2 can pick it up. Once the shuttle containing the cup stops at the Robot 2 workcell, Robot 2 picks up the ball and places it inside the cup. The shuttle then moves to the conveyor belt, which has an unloading station where the shuttle releases the cup onto the conveyor belt. From there, the conveyor belt transports the cup to its end, where the cup falls into the container. After releasing the cup onto the conveyor belt, the shuttle moves back to the Robot 1 workcell to wait for another cup.
Model Overview
This example uses the assembly line CAD data, robot RBT files, and other supporting files. Download and unzip the data set.
downloadFolder = matlab.internal.examples.downloadSupportFile("R2023a/sl3d/examples/","smart4i_data.zip");
unzip(downloadFolder);
Open the model to view the block diagram:
open_system("smart4i_model.slx");
The model has four main parts labeled:
Scene Creation & Configuration
System Control Logic
Sensor Input
Actor Motion Control
Scene Creation & Configuration
This section has two main functions. First, the Simulation 3D Scene Configuration block defines the baseline Unreal Engine™ environment that the model connects to.
Second, a Simulation 3D Actor block named Prepare World sets up the environment that the scene configuration block defines. To do this, the actor calls the setup script, smart4i_setupworld.m, which loads and configures the scene. In this example, the scene was prepared using these tools:
Simulink 3D Animation tools import the workcells, shuttles, and a static machine frame from CAD files into MATLAB. For more information about importing CAD files, see Use CAD Models with the Simulink 3D Animation Product (Simulink 3D Animation).
The load (Simulink 3D Animation) function of the sim3d.Actor (Simulink 3D Animation) object imports the two robots from their URDF files, as sketched after this list.
The sim3d.Actor (Simulink 3D Animation) and sim3d.World (Simulink 3D Animation) objects create the conveyor belt.
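For reference, this is a minimal sketch of how a robot could be imported into a sim3d world from a URDF file. The actor name and URDF file name are hypothetical; the setup script smart4i_setupworld.m is the authoritative version of this workflow.
% Minimal sketch (assumed workflow): create a world and an actor, then
% load robot geometry from a URDF file with the sim3d.Actor load method.
world = sim3d.World();                             % empty scene
robot1Actor = sim3d.Actor(ActorName="Robot1");     % actor name is assumed
load(robot1Actor,"comau_racer_v3.urdf");           % URDF file name is hypothetical
add(world,robot1Actor);                            % add the robot actor to the world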
System Control Logic
The Stateflow chart in the System Control Logic section determines the main system behavior.
Inside the Stateflow chart, there are six main sections, listed from bottom to top:
Shuttle Control Logic — The charts in this section control the motion of each of the four shuttles. The charts output three-dimensional vectors containing the specified shuttle XY-position and rotation about the Z-axis.
Robot Control Logic — The charts in this section control the motion of the two robot manipulators. The charts output three-dimensional vectors containing the end-effector translations. The chart for Robot 2 also controls the creation of new ball instances for Robot 2 to pick up.
Cup Instance Logic — This chart controls the destruction of old cups after the conveyor belt drops the cups into the container.
Cup & Ball Detection — This section contains two Simulink functions, findCupLoc and findBallLoc, which provide the initial pose of a cup and a ball, respectively. The robots use these poses to define a configuration in which they can pick up the cup or ball. In a real assembly line, while Robot 1 selects which cup to pick up, the ball may roll in the workspace. Because this would change the position of the ball, the camera in the Robot 2 workcell detects the position of the ball in the workcell.
Cup & Ball Actor Utilities — This section contains blocks and functions that enforce event-based behaviors, such as picking up and releasing a cup or ball. These subsystems reference the MATLAB functions in the example directory to manipulate the scene.
Trajectory Generation — This section contains the genPath2 function, which generates a trajectory for a robot given two end-effector poses. The function relies on the trapveltraj function from the Robotics System Toolbox™ to ensure that the resulting trajectory follows a typical point-to-point profile for a manipulator, as sketched after this list.
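The genPath2 function ships with the example files and is not reproduced here. As a rough, hypothetical illustration, this sketch shows how trapveltraj can produce a point-to-point profile between two end-effector translations; the waypoint values and sample count are made up.
% Hypothetical sketch: interpolate between two end-effector translations
% with a trapezoidal velocity profile (Robotics System Toolbox).
startXYZ = [0.4; 0.0; 0.3];                 % example start translation (m)
goalXYZ  = [0.2; 0.3; 0.1];                 % example goal translation (m)
numSamples = 50;                            % number of samples along the path
% Each row of the waypoint matrix is one Cartesian coordinate; trapveltraj
% interpolates every row independently.
[xyz,xyzVel] = trapveltraj([startXYZ goalXYZ],numSamples);
plot3(xyz(1,:),xyz(2,:),xyz(3,:))           % visualize the resulting path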
Note that while the bulk of the system logic is controlled by the first two sections, Shuttle Control Logic and Robot Control Logic, all the pieces are required for the system to function properly. For example, in the Robot Control Logic section, the Robot 2 Stateflow chart interfaces with several subsystems and the trajectory generation function to control the overall motion.
Sensor Input
The Sensor Input section contains the camera, modeled as a Simulation 3D Camera Get block.
The initial ball position in the Robot 2 workcell is nominally the same each time, but includes a small random offset to reflect the real-life variability caused by the ball rolling. The findBallLoc triggered subsystem takes the image data from the camera to detect the ball.
In this subsystem, the Deep Learning Object Detector block from the Deep Learning Toolbox™ takes the image data and outputs bounding boxes. The bounding boxes provide information about the size and location of the ball in the image. The findBallXY MATLAB function converts the bounding boxes to the XY-positions of the ball and then returns those positions to the Stateflow chart for Robot 2. The detector was pretrained in MATLAB by creating a video of the ball at random locations and using the Video Labeler app from the Computer Vision Toolbox™ to create a labeled data set and train a yolov2ObjectDetector object. While simpler approaches would suffice for detecting a red ball, this approach can scale to more complex inputs.
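For reference, this hypothetical sketch shows how a pretrained yolov2ObjectDetector can be applied to a camera frame at the command line; the MAT file name, variable names, and pixel-to-workcell mapping are assumptions, not part of the shipped example.
% Hypothetical sketch: detect the ball in a camera image and convert the
% strongest bounding box to a pixel-space center position.
pretrained = load("ballDetector.mat");                % MAT file name is assumed
detector = pretrained.detector;                       % yolov2ObjectDetector object
[bboxes,scores] = detect(detector,cameraImage);       % cameraImage: RGB frame from the camera block
[~,idx] = max(scores);                                % keep the most confident detection
bboxCenterPx = bboxes(idx,1:2) + bboxes(idx,3:4)/2;   % [x y] center in pixels
% A calibration from image pixels to workcell coordinates (not shown here)
% would then map bboxCenterPx to the ball XY-position used by the chart.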
Actor Motion Control
To move the objects in the scene, the system must translate motion commands into the desired actor behavior. This section defines the motion of the two robots, the shuttles, and the existence and initial position of the ball.
Robot Actors
The Unreal Engine environment defines the robots as a tree hierarchy of actors, so each robot body corresponds to an actor in the Unreal Engine world. The system uses the Robot1 and Robot2 subsystems to control the motion of Robot 1 and Robot 2, respectively. In each case, the subsystem receives a specified end-effector translation and positions the robot actors so that the robot reaches the specified pose. This diagram shows the contents of each subsystem:
These steps rely on kinematic computations on the robot, which you can perform with blocks from the Robotics System Toolbox. To do this, you must have a kinematic model of the robot in MATLAB as a rigidBodyTree object. During scene creation, the system imports the rigid body trees for Robot 1 and Robot 2 from their original URDF files into the base MATLAB workspace.
show(Robot1);
title("Robot 1 (Comau Racer V3)");
Inspect each robot to see the bodies it contains. These bodies correspond to the actors in the scene. In the Unreal Engine scene, each actor pose is defined relative to the parent body.
showdetails(Robot1)
--------------------
Robot: (6 bodies)

 Idx    Body Name   Joint Name   Joint Type    Parent Name(Idx)   Children Name(s)
 ---    ---------   ----------   ----------    ----------------   ----------------
   1       part_1      joint_1     revolute       base_link(0)     part_2(2)
   2       part_2      joint_2     revolute          part_1(1)     part_3(3)
   3       part_3      joint_3     revolute          part_2(2)     part_4(4)
   4       part_4      joint_4     revolute          part_3(3)     part_5(5)
   5       part_5      joint_5     revolute          part_4(4)     tool(6)
   6         tool      joint_6     revolute          part_5(5)
--------------------
Given the robot models, it is possible to translate an end-effector pose to actor motion in two main steps, as specified by the two areas in the model:
Compute Robot Configuration from Target Pose — The Simulink model combines the input translation and a desired orientation into one signal and inputs that signal as a pose to an Inverse Kinematics block. This block converts a desired end-effector pose to a set of joint angles, known as the joint configuration.
Specify the Hierarchical Actor Poses — Since each robot body pose is specified relative to a parent robot body, you must convert the set of joint angles into six relative poses that relate each body to its corresponding parent body. The model uses six Get Transform blocks to find the relative poses between the specified body pairs, as sketched below.
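This hypothetical command-line sketch mirrors the two steps using Robotics System Toolbox functions; the target translation, orientation handling, and weight values are made up, and the body names follow the showdetails output above.
% Hypothetical sketch: solve inverse kinematics for a target pose, then
% compute each body pose relative to its parent (mirroring Get Transform).
ik = inverseKinematics(RigidBodyTree=Robot1);
weights = [0.25 0.25 0.25 1 1 1];                  % orientation and position weights
targetPose = trvec2tform([0.4 0.1 0.3]);           % example end-effector pose
config = ik("tool",targetPose,weights,homeConfiguration(Robot1));
bodyNames   = ["part_1" "part_2" "part_3" "part_4" "part_5" "tool"];
parentNames = ["base_link" "part_1" "part_2" "part_3" "part_4" "part_5"];
relPoses = zeros(4,4,numel(bodyNames));
for k = 1:numel(bodyNames)
    % Pose of body k expressed in the frame of its parent body
    relPoses(:,:,k) = getTransform(Robot1,config,bodyNames(k),parentNames(k));
end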
Note that the gripper actions do not happen here. Instead, cups and balls are picked up by reparenting them to the end effector, which effectively adds the picked part to the robot hierarchy. The supervisory logic in the Stateflow chart raises the events that trigger the Simulink functions that call the attach and detach functions.
Ball Instance
When the Stateflow chart triggers the creation of a new ball, the Simulation 3D Actor block named Ball Instance creates the ball for Robot 2 to pick up. The block adds a small random offset to the ball position to simulate the real-world variability caused by the ball rolling, but the detection subsystem ensures that the control logic gets the exact ball position.
Shuttles
The shuttles are moved by setting their translation and orientation in space. The Stateflow chart outputs the XY-position and rotation about the Z-axis of each shuttle. The Shuttles subsystem combines that information with constant values for the remaining pose components and then outputs the resulting set of poses to the appropriate actors in the Unreal Engine world.
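Conceptually, the subsystem assembles a full pose from the partial chart output. This hypothetical MATLAB sketch uses made-up values, an assumed constant shuttle height, and an assumed rotation convention:
% Hypothetical sketch: combine the chart output with constants to form a
% complete translation and rotation for one shuttle actor.
shuttleXY  = [1.2 0.4];                % XY-position from the Stateflow chart (example)
shuttleYaw = pi/2;                     % rotation about the Z-axis from the chart (example)
shuttleZ   = 0.05;                     % constant height on the track (assumed)
translation = [shuttleXY shuttleZ];    % [x y z] sent to the shuttle actor
rotation    = [0 0 shuttleYaw];        % [roll pitch yaw], roll and pitch held constant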
Simulate the Model
Open the model and click Run to start the simulation:
open_system("smart4i_model.slx");
Because the scene contains a large number of actors, the simulation can take around 30-45 seconds to start in the 3D viewer.
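Alternatively, you can run the simulation programmatically from the MATLAB command line:
simOut = sim("smart4i_model");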