This example shows how to acquire real-time images from a webcam, process the images using fixed-point blob analysis, and determine world coordinates to score a laser pistol target.
The technology featured in this example is used in a wide range of applications, such as estimating distances to objects in front of a car, and medical image analysis of cells. Key features of this example include:
Fixed-point blob analysis for collecting measurements
Real-time image acquisition
Camera calibration to determine world coordinates of image points by mapping pixel locations to locations in real-world units
Correction of images for lens distortion to ensure accuracy of collected measurements in world units
All code for this example is stored in the examples folder. To edit the code, copy the +LaserTargetExample folder to a writable location.
Image Acquisition Toolbox™ enables you to acquire images and video from cameras and frame grabbers directly into MATLAB® and Simulink®. Using the Image Acquisition Toolbox Support Package for GigE Vision® Hardware or the MATLAB® Support Package for USB Webcams, set up a camera to acquire the real-time images to perform the analysis.
For more information on setting up the camera, see Device Connection (Image Acquisition Toolbox).
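As a minimal sketch of acquiring a frame with the MATLAB Support Package for USB Webcams (the property values and camera selection here are assumptions; adjust them for your hardware):

```matlab
% Connect to the first available webcam (requires the
% MATLAB Support Package for USB Webcams).
cam = webcam;

% Acquire a single frame and display it.
img = snapshot(cam);
imshow(img)

% Release the camera when finished.
clear cam
```

With the Image Acquisition Toolbox Support Package for GigE Vision Hardware, the gigecam function plays the analogous role to webcam.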
Use the following commands to create a target to print for use in the exercise. The code generates a PostScript file that you can open and print double-sided, with the target on one side and the checkerboard for camera calibration on the other side.
distance = 10;       % meters
offset_mm = 0;       % mm
print_target = true;
LaserTargetExample.make_target_airpistol10m(distance, ...
    offset_mm, print_target)
You can find pre-made targets in the
Set up the camera so that it faces the checkerboard side of the target. The shooter faces the target. You can keep the target and camera in fixed positions by mounting them on a board.
Camera calibration is the process of estimating the parameters of the lens and the image sensor. You use these parameters to measure objects captured by the camera. Use the Camera Calibrator app to detect the checkerboard pattern on the back of the target and remove any distortion. Determine the threshold of ambient light on the target. You may need to adjust the camera settings or the lighting so that the image is not saturated. Use the pointsToWorld function to determine the world coordinates of the image points.
For more information, see What Is Camera Calibration? (Computer Vision Toolbox).
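As a sketch of the undistort-and-map workflow, assuming cameraParams has been exported from the Camera Calibrator app and that rawImage, imagePoints, and worldPoints are already available from your checkerboard detection (all variable names here are illustrative):

```matlab
% Remove lens distortion using parameters estimated by the
% Camera Calibrator app.
undistortedImage = undistortImage(rawImage, cameraParams);

% Estimate the camera extrinsics from the detected checkerboard
% corners and the known world coordinates of the pattern.
[R, t] = extrinsics(imagePoints, worldPoints, cameraParams);

% Map a pixel location [u, v] to world coordinates on the
% target plane.
worldXY = pointsToWorld(cameraParams, R, t, [u, v]);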
The algorithm scores the shots by detecting the bright light of the laser pistol. While shooting, get a frame and detect if there is a bright spot. If there is a bright spot over the specified threshold, process that frame.
Use blob analysis to find the center of the bright spot, and translate the location from pixel coordinates to world coordinates. The blob analysis is done in fixed point because the image is stored as an 8-bit signed integer. After finding the center of the bright spot in world coordinates, calculate its distance from the bullseye at the origin and assign a point value to the shot.
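The detection-and-scoring step described above might be sketched as follows. This is not the example's exact implementation: the brightness threshold, the scoring rule, and the calibration variables (cameraParams, R, t, assumed available from your calibration workflow) are placeholders, and regionprops stands in for the example's fixed-point blob analysis:

```matlab
frame = snapshot(cam);                 % grab the next frame
gray = rgb2gray(frame);

brightnessThreshold = 250;             % assumed threshold
if max(gray(:)) > brightnessThreshold
    % Find the centroid of the bright spot with blob analysis.
    bw = gray > brightnessThreshold;
    stats = regionprops(bw, 'Centroid');
    pixelXY = stats(1).Centroid;

    % Map the pixel location to world coordinates using the
    % calibration results.
    worldXY = pointsToWorld(cameraParams, R, t, pixelXY);

    % Score by distance from the bullseye at the origin.
    r = hypot(worldXY(1), worldXY(2));
    score = max(0, 10 - floor(r/8));   % assumed 8 mm rings
end
```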
Add the example code to the path.
Start the simulation by executing the run script.
The script prompts you to select the source to use for the simulation:

(1) gigecam
(2) webcam
(3) simulation
Enter the number of the source type:

Enter 3 to watch a simulation of a previously recorded session. There are eight previously recorded sessions available. Enter a number (1 through 8) to begin the simulation.
(1) saved_shots_20170627T201451
(2) saved_shots_20170627T201814
(3) saved_shots_20170702T153245
(4) saved_shots_20170702T153418
(5) saved_shots_20170702T162503
(6) saved_shots_20170702T162625
(7) saved_shots_20170702T162743
(8) saved_shots_20170702T162908
Enter number of file from list:
Entering 1 or 2 prompts you to set up a GigE Vision camera or a webcam. The example then prompts you to enter the distance from the shooter to the target (in meters) and the name of the shooter.
To set up the example using your own camera, use the Camera Calibrator app to detect the checkerboard on the back of the target, and remove distortion. Save the calibration variables in a MAT-file. The calibration variables for the GigE Vision camera and a webcam are saved in the following MAT-files.
Edit one of the following files, substituting the settings with values appropriate for your camera.
Each time you shoot, the hits are recorded in a file named ShotDatabase.csv. You can load the data into a table using the readtable function and visualize it. For example, after shooting, which populates the ShotDatabase.csv file, the following code plots the center of a group of many shots.
T = readtable('ShotDatabase.csv');
LaserTargetExample.make_target_airpistol10m;
LaserTargetExample.plot_shot_points(T.X, T.Y);
ax = gca;
line(mean(T.X)*[1,1], ax.YLim);
line(ax.XLim, mean(T.Y)*[1,1]);
grid on;
Each time you shoot, the video frames in which shots were detected are stored in MAT-files in a folder named simulation_recordings. You can load these files and explore the raw data from the shots. In each file, the frames variable contains the first frame, which was used for calibration, plus ten frames for each detected shot. The first frame in each run of ten is the frame in which a shot was detected; you can see your hand movement in the subsequent frames. You can make a short animation of the data using the following code.
d = dir(fullfile('simulation_recordings','*.mat'));
record = load(fullfile(d(1).folder, d(1).name));
t = LaserTargetExample.SerialDateNumber_to_seconds(...
    record.times);
t = t - t(1);
figure
for k = 1:size(record.frames, 3)
    imshow(record.frames(:,:,k), ...
        'InitialMagnification','fit');
    title(sprintf('Time since beginning of round: %.3f seconds',...
        t(k)))
    drawnow
end