
Monocular Visual Odometry - How to get the relative coordinates of a frame with respect to the previous frame

Hi,
I have a question about recovering the position of a robot from one frame to the next, using images taken by a camera.
My process is simple: I take an image at each frame, detect features (corners + blobs), and match them from one frame to the next, roughly like the sketch below.
Finally, I estimate the essential matrix from these matched points, and from it my rotation matrix and translation vector.
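In case it helps, here is a simplified version of my feature step (SURF only for brevity; in practice I mix corner and blob detectors, and I_prev / I_curr are just my names for the two grayscale frames):

% Detect and match features between the previous and current frames.
pts_p = detectSURFFeatures(I_prev);
pts_a = detectSURFFeatures(I_curr);
[f_p, vpts_p] = extractFeatures(I_prev, pts_p);
[f_a, vpts_a] = extractFeatures(I_curr, pts_a);
pairs = matchFeatures(f_p, f_a, 'Unique', true);
location_l_p = vpts_p(pairs(:, 1)).Location;   % points in the previous frame
location_l_a = vpts_a(pairs(:, 2)).Location;   % points in the current frame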
Then it does not seem to work, but I don't really know why. Here is the pose part of my code:
% Estimate the essential matrix between the previous and current
% left-image feature locations.
[E, epipolar] = estimateEssentialMatrix(location_l_p, location_l_a, stereo_params.CameraParameters2);

% Keep only the epipolar inliers: matches that satisfy the epipolar
% constraint. This also avoids matches near the borders; normally I
% should not need this step.
location_l_p = location_l_p(epipolar, :);
location_l_a = location_l_a(epipolar, :);

% Recover the pose of the current camera relative to the previous
% left image, i.e. the rotation matrix and translation vector,
% from the essential matrix.
[orientation, location] = relativeCameraPose(E, stereo_params.CameraParameters2, location_l_p, location_l_a);

% Convert the relative camera pose to extrinsics, then accumulate.
[actual_R, actual_t] = cameraPoseToExtrinsics(orientation, location);
R = R * actual_R;
t = t + actual_t * R;
What I want is to return, at each frame, the position [x, y] in my coordinate system, computed from the previous coordinates and the current R (rotation matrix) and t (translation vector).
Can you help me?
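For what it's worth, based on the rigid-transform composition used in MATLAB's structure-from-motion examples (poses composed as relPose.T * prevPose.T, with row vectors), I suspect the accumulation should use the pose returned by relativeCameraPose directly, rather than the extrinsics, and should apply the previous orientation when updating t. Here is a minimal sketch of what I mean (my reading of the convention, not verified):

% Sketch (assumption): accumulate the camera pose directly, using
% MATLAB's row-vector convention. R is the orientation of the current
% camera in the world frame, t its 1x3 location.
t = t + location * R;   % use the PREVIOUS orientation to move t
R = orientation * R;    % then update the orientation
xy = t(1:2);            % the [x, y] position for this frame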
PS: I verified the selected features; I applied several filtering steps and they are good (from 100 features down to about 20: circle validation, uniform selection, epipolar constraint, ...).
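One more thing I am unsure about: since this is monocular, relativeCameraPose returns the translation only up to scale (as far as I understand, location is a unit vector), so the accumulated [x, y] track would be in arbitrary units. If a per-frame metric scale were available from another sensor, I imagine the update would look something like this (s is hypothetical here):

% Hypothetical: s is a per-frame metric scale from an external source
% (e.g. wheel odometry). Without it, a monocular trajectory is only
% defined up to scale.
t = t + s * (location * R);
R = orientation * R;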

Answers (0)

This question is closed.
