LEGO EV3 Object Tracking System Using Pixy2 Vision Sensor and PID Controller

This example shows how to improve the efficiency of an object tracking system using a Proportional Integral Derivative (PID) controller. The object tracking system has been built using the LEGO® MINDSTORMS® EV3 hardware and Pixy2 vision sensor.

Introduction

An object tracking system follows the movement of an object using a vision sensor.

In this example, an object tracking system is built using an EV3 two-wheeled robot and a Pixy2 vision sensor. The robot tracks the trained moving object and aligns itself with the object while maintaining a threshold distance from it. The PID controller improves the response time and accuracy of the EV3 motors, resulting in smoother and more accurate object tracking. An Android® application is used to tune the PID controller parameters.
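Although the controller in this example is implemented with Simulink blocks, the discrete PID update it performs can be sketched in a few lines of Python. This is an illustrative sketch only; the class and gain names are hypothetical and are not taken from the model.

```python
class PID:
    """Minimal discrete PID controller sketch (names and gains are illustrative)."""

    def __init__(self, kp, ki, kd, sample_time):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.dt = sample_time          # e.g., 0.04 s, matching the sensor sample time
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        # Accumulate the integral term and approximate the derivative
        # with a backward difference over one sample period.
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

With a purely proportional gain, the output is simply the gain times the error; the integral term accumulates residual error over time, which is how the I of Alignment and I of Proximity parameters remove steady-state offset.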

In this example, you will perform these tasks:

  • Set up a network connection between the Android device and the EV3 robot.

  • Configure and run a Simulink® model for the EV3 robot to send and receive TCP/IP packets from the Android device to tune the PID controller parameters.

You will have to deploy an application on an Android device, such as a phone or tablet, to tune the PID controller parameters. For more information on how to deploy the application and tune the PID controller parameters, see the Tune PID Controller Parameters Using Android Application for LEGO EV3 Object Tracking System example.

Dependency

Implementing the LEGO EV3 object tracking system using the PID controller requires successful deployment of these Simulink models on their respective hardware.

        Simulink Model Name                |    Deployed Hardware    |                           Purpose
----------------------------------------------------------------------------------------------------------------------------------------------------
1. ev3_android_pixy2_tracking_pidcontrol   |   LEGO MINDSTORMS EV3   |   Implement and deploy the object tracking system on EV3 robot
2. androidEv3PIDTuner                      |        Android          |   Tune PID controller parameters through the Android application dashboard

Prerequisites

Complete the Getting Started with LEGO MINDSTORMS EV3 Hardware and Communicating with LEGO MINDSTORMS EV3 Hardware example.

Note: This example supports Pixy2 LEGO firmware version 3.0.14.

Required Hardware

  • LEGO MINDSTORMS EV3 brick

  • Pixy2 vision sensor

  • Two LEGO MINDSTORMS EV3 medium motors

  • Micro USB cable

  • Pixy2 LEGO adaptor

Task 1: Set up EV3 Robot

  1. Build a two-wheeled EV3 robot with a Pixy2 vision sensor mounted on it.

  2. Set up a connection between the EV3 robot and your host machine. For details on how to set up the connection, see Task 2 in the Getting Started with LEGO MINDSTORMS EV3 Hardware example.

Task 2: Set up Pixy2 Vision Sensor Through PixyMon Utility

PixyMon is an application that acts as an interface to help you view what the Pixy2 vision sensor sees as either unprocessed or processed video.

  1. Configure I2C Address: Open the PixyMon utility and navigate to File > Configure > Pixy Parameters (saved on Pixy) > Interface. Set the I2C address to a value between 0x54 and 0x57.

  2. Configure Sensor to Detect Objects: You can use the PixyMon utility to configure the sensor to detect up to seven uniquely colored objects. For more information on how to train the sensor for Color Signature 1, refer to Train Pixy2.

  3. Configure Line Tracking Mode: Open the PixyMon utility and navigate to File > Configure > Pixy Parameters (saved on Pixy) > Expert.

  4. Select Delayed turn to enable intersection detection.

Task 3: Configure Model and Calibrate Parameters

This support package provides a preconfigured EV3 Simulink model that you can deploy on the EV3 brick for object tracking.

Open the ev3_android_pixy2_tracking_pidcontrol Simulink model.

Configure the following parameters in the Simulink model:

1. Sensor Input: Refer to this section to configure the Pixy2 Vision Sensor block.

  • Set the EV3 brick input port parameter such that it matches the input port connected to the Pixy2 vision sensor.

  • Set the I2C Address parameter to the same I2C address as configured in the PixyMon utility.

  • For the purpose of this example, keep the Tracking Mode parameter set to its default mode of Color Signature. You can use other tracking options like Color Code Signature, Color and Color Code Signature (Mixed), and Line.

  • For the purpose of this example, keep the Color Signature parameter set to 1. You can use ALL to track all the color signatures.

  • For the purpose of this example, set the Sample time parameter to 0.04 seconds, so that the block reads values from the Pixy2 vision sensor at this interval. You can set the sample time to match your requirements.

2. Control Input: Refer to this section to establish a network connection between the LEGO EV3 Simulink model and the Android Simulink model, and tune the PID controller parameters through the graphical user interface (GUI) on the Android application.

The Android device and the EV3 robot communicate with each other over the TCP/IP network, where the Android device acts as a client (the one that sends data) and the EV3 robot acts as a server (the one that receives data).

Configure all the TCP/IP Receive blocks consistently for the PID controller parameters.

  • Set the Connection mode parameter to its default value Server.

  • Set the Local IP port to match the port value of the Android device. The default starting port number is 25000.

  • For the purpose of this example, set the Data type parameter to double. You can select the data type that is to be received from the remote host.

  • For the purpose of this example, set the Data size parameter to 1. You can set the data size that is to be received from the remote host.

  • For the purpose of this example, set the Sample time parameter to 0.1 seconds, so that the block receives data from the Android device at this interval. You can set the sample time to match your requirements.
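For illustration, the behavior of one TCP/IP Receive block configured as above (Server mode, one double per read) can be approximated in Python. This is a hedged sketch, not the block's actual implementation; the function name is hypothetical and the little-endian byte order is an assumption.

```python
import socket
import struct

def receive_pid_parameter(port=25000):
    """Listen on the given port and return one double (8 bytes) sent by the
    Android client, mirroring a TCP/IP Receive block with Connection mode
    'Server', Data type 'double', and Data size 1."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind(("", port))
        server.listen(1)
        conn, _ = server.accept()      # wait for the Android client to connect
        with conn:
            payload = conn.recv(8)     # one double = 8 bytes
            (value,) = struct.unpack("<d", payload)  # byte order is an assumption
            return value
```

Each PID parameter uses its own port (starting at the default 25000), which is why every TCP/IP Receive block in the model must be configured consistently with its counterpart TCP/IP Transmit block on the Android side.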

3. Algorithm: Refer to this section to configure the Object Tracking subsystem. This subsystem receives inputs from the Sensor Input and Control Input areas. It eventually controls the amount of power delivered to the EV3 motors based on the calculated PID controller parameters.

The PID controller parameters are used in the Object Tracking subsystem. The values for these parameters are preconfigured in the TCP/IP Transmit blocks of the Android Simulink model.

  • P of Alignment: This parameter provides proportional control for the lateral movement of the EV3 robot.

  • P of Proximity: This parameter provides proportional control for adjusting the EV3 robot distance from the tracked object.

  • I of Alignment: This parameter provides integral control for the lateral movement of the EV3 robot.

  • I of Proximity: This parameter provides integral control for adjusting the EV3 robot distance from the tracked object.

  • Pixy2 Bottom Threshold: This parameter controls (slows down or speeds up) the EV3 motor movement. The value of this parameter decides the maximum object distance allowed from the EV3 robot. This value is an input to the PID for Proximity Control block.

  • Pixy2 X Frame Reference: This parameter adjusts the frame of view for the vision sensor. The default alignment is center. The central viewpoint for the sensor shifts to the left as the slider moves toward the left and vice versa. The xy-coordinates, width, and height of the object are computed using the frame of view of the sensor. This value is an input to the PID for Alignment Control block.
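The two control loops described above can be sketched as follows. This is a hypothetical Python rendering of the Object Tracking subsystem's logic, not the actual block diagram: the error definitions, state handling, and motor mixing are illustrative assumptions.

```python
def motor_commands(obj_x, obj_y, frame_ref, bottom_threshold,
                   p_align, i_align, p_prox, i_prox, state, dt=0.04):
    """Sketch of two PI loops mixed into left/right motor power.

    Alignment uses the object's horizontal offset from the X frame
    reference; proximity uses the object's vertical position relative
    to the bottom threshold. All names here are hypothetical.
    """
    align_err = frame_ref - obj_x          # lateral offset from the reference
    prox_err = bottom_threshold - obj_y    # distance error from the threshold
    state["ia"] += align_err * dt          # integral of the alignment error
    state["ip"] += prox_err * dt           # integral of the proximity error
    turn = p_align * align_err + i_align * state["ia"]
    drive = p_prox * prox_err + i_prox * state["ip"]
    # Differential drive: the turn term steers, the drive term sets speed.
    left = drive - turn
    right = drive + turn
    return left, right
```

A positive alignment error turns the robot toward the object by powering the wheels unequally, while the proximity term drives both wheels together until the object reaches the threshold distance.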

4. Actuators: Refer to this section to control the EV3 motors connected to the ports.

Select the EV3 brick output port connected to each of the LEGO motors, labeled in the Motors subsystem as Left Motor and Right Motor. For the purpose of this example, output port D is connected to the left motor and output port A to the right motor.

Task 4: Run Model on EV3 Robot

1. On the Hardware tab of the ev3_android_pixy2_tracking_pidcontrol Simulink model, click Monitor & Tune to run the model on the LEGO MINDSTORMS EV3 hardware.

2. Place the colored object anywhere in the field of view of the Pixy2 vision sensor. Ensure that the EV3 robot follows the object and corrects its trajectory whenever it loses track of the object.

3. Open the androidEv3PIDTuner Simulink model.

4. Configure and deploy the androidEv3PIDTuner Simulink model on your Android device. The androidEv3PIDTuner application is installed on the device. For more information, refer to Task 2 and Task 3 of the Tune PID Controller Parameters Using Android Application for LEGO EV3 Object Tracking System example.

5. Open the application to view the GUI. This interface includes knobs and sliders to fine-tune the PID controller parameters. For more information on the application, refer to the androidEv3PIDTuner Application section of the Tune PID Controller Parameters Using Android Application for LEGO EV3 Object Tracking System example.

See Also

Track Colored Objects Using Pixy2 Vision Sensor