Get pictures from real-time video
Hello, I'm writing a real-time facial expression recognition program. I want to take a picture every second and run the PCA (eigenface) algorithm on it to classify the emotion, and the whole process has to be automatic. How can you help me, please? This is what I did, but I still have to capture the picture manually:
for frameIdx = 1:1800   %while ~isDone(videoFileReader)
    % Get the next frame.
    videoFrame = step(videoFileReader);

    % Track the points. Note that some points may be lost.
    [points, isFound] = step(pointTracker, videoFrame);
    visiblePoints = points(isFound, :);
    oldInliers = oldPoints(isFound, :);

    if size(visiblePoints, 1) >= 2 % need at least 2 points
        if newperson == 1
            matchPic = imcropPolygon(bboxPolygon, videoFrame); % custom helper
            matchPic = cutPic(matchPic);                       % custom helper
            imwrite(matchPic, 'test2.bmp', 'bmp');
            % Load the database at the beginning.
            number = libCheck(load_database(3), imread('test2.bmp'));
            if number < 11
                % disp('::::::::happy::::::::::')
            elseif number < 21
                % disp('::::::::::angry:::::::::')
            elseif number < 31
                % disp('::::::: Surprised :::::::')
            elseif number < 41
                % disp('::::::: normal :::::::')
                newperson = 0;
            end
        end

        % Estimate the geometric transformation between the old points
        % and the new points and eliminate outliers.
        [xform, oldInliers, visiblePoints] = estimateGeometricTransform(...
            oldInliers, visiblePoints, 'similarity', 'MaxDistance', 4);

        % Apply the transformation to the bounding box.
        [bboxPolygon(1:2:end), bboxPolygon(2:2:end)] ...
            = transformPointsForward(xform, bboxPolygon(1:2:end), bboxPolygon(2:2:end));

        % Insert a bounding box around the object being tracked.
        videoFrame = insertShape(videoFrame, 'Polygon', bboxPolygon);

        % Display tracked points.
        videoFrame = insertMarker(videoFrame, visiblePoints, '+', ...
            'Color', 'red');

        % Reset the points.
        oldPoints = visiblePoints;
        setPoints(pointTracker, oldPoints);
    else
        % If the current face has disappeared, we need to build another
        % point tracker object.
        release(pointTracker);

        % Re-detect the face first, so the feature points are detected
        % inside the new bounding box rather than a stale one.
        bbox = step(faceDetector, videoFrame);
        while size(bbox, 1) < 1
            videoFrame = step(videoFileReader);
            bbox = step(faceDetector, videoFrame);
            step(videoPlayer, videoFrame);
        end

        % Build another point tracker object.
        pointTracker = vision.PointTracker('MaxBidirectionalError', 2);

        % Detect feature points in the face region.
        points = detectMinEigenFeatures(rgb2gray(videoFrame), 'ROI', bbox(1, :));
        points = points.Location;
        initialize(pointTracker, points, videoFrame);
        oldPoints = points;

        % Convert the first box to a polygon.
        % This is needed to be able to visualize the rotation of the object.
        x = bbox(1, 1); y = bbox(1, 2); w = bbox(1, 3); h = bbox(1, 4);
        bboxPolygon = [x, y, x+w, y, x+w, y+h, x, y+h];

        % Draw the returned bounding box around the detected face.
        videoFrame = insertShape(videoFrame, 'Polygon', bboxPolygon);
        newperson = 1;
    end

    % Display the annotated video frame using the video player object.
    step(videoPlayer, videoFrame);
    axes(handles.axes4)
    imshow(videoFrame);
end
Answers (1)
Image Analyst
on 21 Apr 2017
You're not doing real-time video capture - you're just reading a video that was already saved to disk. You need to open a live video source and call the getsnapshot() function.
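A minimal sketch of that approach, assuming the Image Acquisition Toolbox is installed and that the 'winvideo' adaptor with device ID 1 matches your webcam (run imaqhwinfo to find the right values for your system); the call to libCheck(load_database(3), ...) is the asker's own helper, kept here as a placeholder:

```matlab
% Open the live camera. Adaptor name and device ID are assumptions -
% check imaqhwinfo for what is available on your machine.
vid = videoinput('winvideo', 1);
set(vid, 'ReturnedColorSpace', 'rgb');

for k = 1:60                      % e.g. capture for one minute
    frame = getsnapshot(vid);     % grab one live frame from the camera
    imwrite(frame, 'test2.bmp', 'bmp');
    % Run the existing PCA/eigenface classifier on the snapshot.
    number = libCheck(load_database(3), imread('test2.bmp'));
    pause(1);                     % wait one second before the next picture
end

delete(vid);                      % release the camera when done
```

This replaces the videoFileReader in your loop with a live source, so the picture is taken automatically every second instead of manually.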