How can I recognize/detect the needle/hand in the images below?

Hi, I am very new to image processing and am struggling to detect the needle in the images below. Can anybody help me segment the needle? Keep in mind that it moves like a speedometer needle. We could use multiple frames to find the needle, but I don't know how to do it.
Many thanks in advance for your time.

Accepted Answer

ImageAnalyst will certainly provide a fancy solution (like a bwGetBlobAngle from his FEX ;-)), but while waiting for him to hit the forum, here is one way to do it "by hand" (I did this during a short break, so nothing is optimized).
=== VERSION 07/31, 2 =======================================================
Reading another post by ImageAnalyst, I realized that I should learn more about NORMXCORR2, but here is a version based on a 2D convolution that matches the "200" on the indicator. I chose the "200" because it should never be hidden by the needle. From the center of the match, we get the center of rotation of the needle by adding a fixed offset. I generated a shifted third image to test the approach. I attached the shifted image, the code, and a template file (for the match) to this answer, below.
clear ; close all ; clc ;

display = true ;   % True for debug, false for processing a large
                   % number of files.

% - Parameters associated with the needle.
originAngle  = 4 ;     % Used to compensate for the image tilt.
needleLength = 400 ;

% - Parameters associated with the curve around the needle center.
curveRadius = 200 ;
kernelSize  = 30 ;     % Kernel size for 1D conv along curve.

% - Parameters associated with the window that must frame the "200"
%   on the indicator.
w200Center   = [1000, 1860] ;   % Approx. center of window "200".
w200OffsetNC = [290, 180] ;     % Approx. offset to needle center.
w200Width    = 800 ;
w200Height   = 600 ;
w200MatFile  = 'w200_threshold08_template.mat' ;

% - Load/build '200' template for 2D convolution. Prepare for convolution
%   purposes: remap 0 to -1 and 1 to 1.
loaded       = load( w200MatFile ) ;
w200Template = -1 + 2 * rot90( double( loaded.thresh08_200 ), 2 ) ;

% - Get listing of JPEG files.
D_jpg  = dir( '*.jpg' ) ;
nFiles = numel( D_jpg ) ;

% - Prealloc(s), initialize.
needleAngles = zeros( nFiles, 1 ) ;
w200RowBnd   = round( w200Center(1) + 0.5*[-w200Height, w200Height] ) ;
w200ColBnd   = round( w200Center(2) + 0.5*[-w200Width, w200Width] ) ;

for fId = 1 : nFiles
    % - Initialize angle with NaN (= failed).
    needleAngles(fId) = NaN ;
    % - Read and "flatten" image.
    I_flat = sum( imread( D_jpg(fId).name ), 3 ) ;
    tic ;
    % - Extract window "200", threshold @ 80% for comparison with template.
    w200 = I_flat(w200RowBnd(1):w200RowBnd(2), w200ColBnd(1):w200ColBnd(2)) ;
    w200 = w200 > 0.8 * max( w200(:) ) ;
    % - Match template in window.
    match  = conv2( double( w200 ), w200Template, 'same' ) ;
    [r, c] = find( match == max( match(:) ) ) ;
    if numel( r ) > 1
        fprintf( '%-20s: failed to match "200" template.\n', D_jpg(fId).name ) ;
        if ~display
            continue ;   % Loop back, leaving NaN angle value.
        end
    end
    matchCenter(1) = r(1) ;
    matchCenter(2) = c(1) ;
    % - Compute needle center.
    needleCenter = [w200RowBnd(1) + matchCenter(1) + w200OffsetNC(1), ...
                    w200ColBnd(1) + matchCenter(2) + w200OffsetNC(2)] ;
    % - Build curve around indicator center, match needle origin and
    %   rotation direction.
    curveRowIds = round( needleCenter(1) - curveRadius * cos( 0:0.02:2*pi )) ;
    curveColIds = round( needleCenter(2) + curveRadius * sin( 0:0.02:2*pi )) ;
    % - Extract "signal" or sample along curve, perform 1D convolution.
    %   Manage circularity by tripling the sample and working on the middle
    %   part (crappy, must be reworked).
    curveSample = I_flat( sub2ind( size(I_flat), curveRowIds, curveColIds )) ;
    sampleConv  = conv( repmat( curveSample, 1, 3 ), ...
                        ones( 1, kernelSize )/kernelSize, 'same' ) ;
    nSamples   = numel( curveSample ) ;
    sampleConv = sampleConv(nSamples + (1:nSamples)) ;
    needleAngles(fId) = -originAngle + ...
        360/nSamples * find( sampleConv == max(sampleConv), 1 ) ;
    fprintf( '%-20s: processing time = %.2fs.\n', D_jpg(fId).name, toc ) ;
    if display
        figure() ;
        set( gcf, 'Units', 'normalized', 'Position', [0.1,0.1,0.8,0.8] ) ;
        % - Plot image, window "200", curve, center, needle.
        subplot( 2, 2, 1 ) ; hold on ;
        imshow( I_flat/max(I_flat(:)) ) ;
        plot( needleCenter(2), needleCenter(1), 'xb', curveColIds, curveRowIds, 'r' ) ;
        plot( [reshape( [w200ColBnd; w200ColBnd], 1, [] ), w200ColBnd(1)], ...
              [w200RowBnd, fliplr(w200RowBnd), w200RowBnd(1)], 'g' ) ;
        needleTip = [needleCenter(1) - cosd( needleAngles(fId)+originAngle ) * needleLength, ...
                     needleCenter(2) + sind( needleAngles(fId)+originAngle ) * needleLength] ;
        plot( [needleCenter(2), needleTip(2)], [needleCenter(1), needleTip(1)], ...
              'b', 'LineWidth', 3 ) ;
        % - Plot window "200" content and match center.
        subplot( 2, 2, 2 ) ; hold on ;
        imshow( w200 ) ;
        plot( matchCenter(2), matchCenter(1), 'rx', 'LineWidth', 3, 'MarkerSize', 10 ) ;
        % - Plot "signal" along curve, convolution, angle.
        subplot( 2, 2, 3 ) ; hold on ; grid on ;
        title( D_jpg(fId).name ) ;
        angleSample = 0:360/(nSamples-1):360 ;
        plot( angleSample, curveSample, 'b', angleSample, sampleConv, 'r' ) ;
        line( [needleAngles(fId) needleAngles(fId)], ylim, 'Color', [0, 1, 0] ) ;
        % - Text: angle, etc.
        subplot( 2, 2, 4 ) ; hold on ;
        set( gca, 'Visible', 'off' ) ;
        text( 0.1, 0.5, sprintf( 'Angle = %.1f°', needleAngles(fId)), 'FontSize', 20 ) ;
    end
end
With that we get:
And the 3rd, shifted image:
Note: I moved my first two answers (now irrelevant) to the end of my first comment below, for reference.

20 Comments

It’s not regularly graduated. The airspeed indicator here shows TAS (True Air Speed), that is IAS (Indicated Air Speed) corrected for air density (a function of ambient air temperature and pressure) known as ‘density altitude’. IAS is the difference between the pitot-tube (ram) air pressure and ambient (static port) air pressure.
A good discussion is in the FAA Pilot's Handbook of Aeronautical Knowledge, 'Flight Instruments' chapter.
Many thanks, Cedric Wannaz. I think this is what I was looking for. I will try it and let you know.
My pleasure, but please wait a day or so before accepting my answer, and see if ImageAnalyst sees the thread, because he would certainly direct you towards a much more correct/solid way of doing this.
=== VERSION 31/07, 1 ======================================================
This is a new version that tries to find the center of the indicator by matching a template of the area that contains the "200" (which should never be hidden by the needle), using a 2D convolution with the template that is in the attached MAT-file. From there, barring major catastrophes, the center of the indicator should be at a fixed offset ;-)
clear ; close all ; clc ;

detectCenter = true ;
kerSize      = 30 ;
centerApprox = [2080, 1290] ;
threshOffset = [180, 290] ;
indicBoxDim  = 1500 ;
curveRadius  = 200 ;
thresh08file = 'thresh08_200.mat' ;

% - Load/build '200' template for 2D convolution.
loaded       = load( thresh08file ) ;
thresh08_200 = -1 + 2 * rot90( double( loaded.thresh08_200 ), 2 ) ;

% - Get dir listing.
D_jpg    = dir( '*.jpg' ) ;
nFiles   = numel( D_jpg ) ;
angleDeg = zeros( nFiles, 1 ) ;

for fId = 1 : nFiles
    I_flat = sum( imread( D_jpg(fId).name ), 3 ) ;
    if detectCenter
        % - Extract square chunk and threshold.
        xBox  = round( centerApprox(1) + 0.5 * [-indicBoxDim, indicBoxDim] ) ;
        yBox  = round( centerApprox(2) + 0.5 * [-indicBoxDim, indicBoxDim] ) ;
        chunk = I_flat(yBox(1):yBox(2), xBox(1):xBox(2)) ;
        chunk = chunk > 0.8 * max( chunk(:) ) ;
        % - Find position of the 200 (which should never be hidden by the needle).
        match = conv2( double( chunk ), thresh08_200, 'same' ) ;
        [chunkCenter(2), chunkCenter(1)] = find( match == max( match(:) ), 1 ) ;
        % - Compute indicator center.
        center = [xBox(1) + chunkCenter(1) + threshOffset(1), ...
                  yBox(1) + chunkCenter(2) + threshOffset(2)] ;
    else
        center = centerApprox ;
    end
    % - Build curve around indicator center.
    xCurve = round( center(1) + curveRadius * cos( 0:0.02:2*pi )) ;
    yCurve = round( center(2) + curveRadius * sin( 0:0.02:2*pi )) ;
    % - Extract "signal" along curve, perform 1D convolution.
    curvSample = I_flat( sub2ind( size(I_flat), yCurve, xCurve )) ;
    convSample = conv( repmat( curvSample, 1, 3 ), ones(1, kerSize)/kerSize, 'same' ) ;
    nSamples   = numel( curvSample ) ;
    convSample = convSample(nSamples + (1:nSamples)) ;
    angleDeg(fId) = 360/nSamples * find( convSample == max(convSample), 1 ) ;
    if true   % *** For debugging visually; set to false otherwise.
        figure() ;
        set( gcf, 'Units', 'normalized', 'Position', [0.1,0.1,0.8,0.8] ) ;
        % - Plot image, curve, center.
        subplot( 2, 2, 1 ) ; hold on ;
        imshow( I_flat/max(I_flat(:)) ) ;
        plot( center(1), center(2), 'xb', xCurve, yCurve, 'xr' ) ;
        if detectCenter
            plot( [xBox, fliplr(xBox), xBox(1)], ...
                  [reshape( [yBox; yBox], 1, [] ), yBox(1)], 'g' ) ;
        end
        % - Plot "signal" along curve, convolution, angle.
        subplot( 2, 2, 3 ) ; hold on ; grid on ;
        title( D_jpg(fId).name ) ;
        angleSample = 0:360/(nSamples-1):360 ;
        plot( angleSample, curvSample, 'b', angleSample, convSample, 'r' ) ;
        line( [angleDeg(fId) angleDeg(fId)], ylim, 'Color', [0, 1, 0] ) ;
        text( 180, max(ylim), sprintf( 'Angle = %.1fdeg', angleDeg(fId)), ...
              'HorizontalAlignment', 'center', 'VerticalAlignment', 'top', ...
              'FontSize', 20 ) ;
        % - Plot chunk around indicator, center of matched 200, estimate of
        %   indicator center.
        subplot( 2, 2, 2 ) ; hold on ;
        imshow( chunk ) ;
        plot( chunkCenter(1), chunkCenter(2), 'rx', 'LineWidth', 3, 'MarkerSize', 10 ) ;
        plot( chunkCenter(1)+threshOffset(1), chunkCenter(2)+threshOffset(2), ...
              'bx', 'LineWidth', 3, 'MarkerSize', 10 ) ;
    end
end
I made a 3rd image with a large offset/vibration to test (attached). With that, we get:
And the 3rd with the shift:
Note that it's a bit of a mess because I just used my coffee break, and that the chunk should be centered on the approximate position of the '200', not the approximate position of the indicator center. It is also slower than the previous approach because of the 2D convolution; centering on the '200' would allow reducing the size of the chunk and speeding up the process, but my break is over ;-)
=== First, crappy version ==================================================
clear ; close all ; clc ;

kerSize = 30 ;
center  = [2080, 1290] ;
radius  = 200 ;
xCircle = round( center(1) + radius * cos( 0:0.02:2*pi )) ;
yCircle = round( center(2) + radius * sin( 0:0.02:2*pi )) ;

D_jpg    = dir( '*.jpg' ) ;
nFiles   = numel( D_jpg ) ;
angleDeg = zeros( nFiles, 1 ) ;

for fId = 1 : nFiles
    I_flat     = sum( imread( D_jpg(fId).name ), 3 ) ;
    curvSample = I_flat( sub2ind( size(I_flat), yCircle, xCircle )) ;
    convSample = conv( repmat( curvSample, 1, 3 ), ones(1, kerSize)/kerSize, 'same' ) ;
    nSamples   = numel( curvSample ) ;
    convSample = convSample(nSamples + (1:nSamples)) ;
    angleDeg(fId) = 360/nSamples * find( convSample == max(convSample), 1 ) ;
    if true   % *** For debugging visually; set to false otherwise.
        figure() ;
        set( gcf, 'Units', 'normalized', 'Position', [0.1,0.1,0.8,0.8] ) ;
        subplot( 1, 2, 1 ) ; hold on ;
        imshow( I_flat/max(I_flat(:)) ) ;
        plot( center(1), center(2), 'xb', xCircle, yCircle, 'xr' ) ;
        subplot( 1, 2, 2 ) ; hold on ; grid on ;
        title( D_jpg(fId).name ) ;
        angleSample = 0:360/(nSamples-1):360 ;
        plot( angleSample, curvSample, 'b', angleSample, convSample, 'r' ) ;
        line( [angleDeg(fId) angleDeg(fId)], ylim, 'Color', [0, 1, 0] ) ;
        text( 180, max(ylim), sprintf( 'Angle = %.1fdeg', angleDeg(fId)), ...
              'HorizontalAlignment', 'center', 'VerticalAlignment', 'top', ...
              'FontSize', 20 ) ;
    end
end
With that you get:
and
>> angleDeg
angleDeg =
276.5714
358.8571
Then you still have to convert the angle into whatever the needle represents (noting that the graduation doesn't seem to be regular (?)).
Notes:
  • This assumes that the location and resolution of the camera do not vary. It simply creates a circular curve around a fixed center (no recognition here), and spots the angle where the convolution of values in the array along the curve hits a max.
  • The angle starts at 0 "along the East axis" and then is measured by rotating clockwise.
OK, I hope this will work. I have to make it general for kernel radius selection. What if there is a light reflection (bright spot) around the center of the image? I think this spot will introduce some ambiguity?
Post this image. One way would be to just have a "template" image of the gauge with the white needle erased. Then register the images (align them) and subtract. Then call regionprops() and ask for the orientation. Then you can have a formula where you convert the angle in degrees into the airspeed.
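A minimal sketch of this register-subtract-regionprops idea could look like the following (file names are hypothetical; assumes the Image Processing Toolbox, and uses phase correlation via IMREGCORR as one possible registration method):

```matlab
% Hedged sketch of the register/subtract/regionprops idea.
% 'gauge_template.png' (needle erased) and 'frame.jpg' are hypothetical names.
template = rgb2gray( imread( 'gauge_template.png' ) ) ;
frame    = rgb2gray( imread( 'frame.jpg' ) ) ;

% Align the frame to the template (phase correlation handles translation
% and small rotations).
tform   = imregcorr( frame, template, 'rigid' ) ;
aligned = imwarp( frame, tform, 'OutputView', imref2d( size(template) )) ;

% Subtracting the needle-free template leaves (mostly) the needle.
diffImg = imabsdiff( aligned, template ) ;
bw      = imbinarize( diffImg ) ;
bw      = bwareafilt( bw, 1 ) ;      % keep the largest blob (the needle)

% Orientation of the needle blob, in degrees.
stats       = regionprops( bw, 'Orientation' ) ;
needleAngle = stats.Orientation ;
```

The angle would then go through the same angle-to-speed conversion as in the answers above.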
But image registration takes a long processing time. Also, it produces extra white pixels in case of vibration.
I had not seen this comment about the bright spot. What we are doing here is to create a circular path around what looks to be the rotation center of the needle. The only requirement is that the needle is brighter than the background over the circular path. Just see how it performs in special cases, with the display enabled, and you'll see if you need to adjust the size of the kernel for the convolution.
The ‘bright spot’ would most likely be a reflection off the plastic faceplate of the airspeed indicator. One solution would be to put a correctly-positioned polarising lens on the camera. This would significantly reduce or eliminate it.
Yes image registration will take longer. But if your camera is mounted on some kind of arm in a plane and is vibrating and the light is changing, then you have registration problems and lighting problems. Sure you can ignore them but you may either not be able to find the needle at all or may estimate that it's pointing to a different speed than it actually is. It just depends on how good your images are - how stable they are both spatially and radiometrically. Star's idea about using a polarizer to cut down on faceplate specular reflections is a good one. You say that you have vibrations so the center of the gauge moves - it's not always at the same pixel location each time - that's why you need to register it. Or if you don't register it then you at least need to use some algorithm where you identify the location of the center of the gauge (pivot point of the needle).
Generated a 3rd version, with better notation and indexing, which matches the "200" in the indicator to determine the position of the rotation center of the needle.
Dear Cedric
Many thanks for the time you spent finding a solution to my problem. As I said before, I am very new to the DIP area, and I am struggling to understand what you did in the new approach. For me, your previous approach was perfect for fixed-position cameras, as it starts from the center. The only challenge was that, in case of a tiny vibration, the center moves.
In the new version, as far as I understand, we crop a window around the '200' and recognize the '200' pattern by template matching against a stored '200' template. Then we locate the center of the '200' and measure the needle's rotation center w.r.t. the center of the recognized '200' template. Am I right?
It means this version will work better in the case of vibrations/shifted images, because it measures the center of the needle/indicator w.r.t. a reference point (the '200'). Please correct me if I have misunderstood what you have done.
Also, what if sunlight/reflected light hides the '200' (a very rare chance that a reflection hides exactly the '200')? Also, the previous approach is generic for all kinds of devices/meters/speedometers, whereas here we have to change the '200' template and its position for each device shape/reading, etc.
Kind regards, Wasiq
Well, the first thing to do is to make sure it works under perfect conditions. I assume Cedric's code does that. Then you have to adapt the code to handle imperfect situations. One way to handle those might be to simply say you can't determine the speed. If a huge specular reflection totally obscures the needle and you can't determine the direction it's pointing, then you should either say that it's indeterminable or else go with the last known speed, as long as you have one from within a reasonable time in the recent past. This assumes the speed would not change drastically over, say, a few seconds.
Now if you have vibration, then the image will be shifted, and possibly rotated. Assuming the image is not corrupted by a reflection at the same time, you can register the image to a reference location. If that takes too long, you can maybe look for a quicker way, like to threshold and find centroids of known big white blobs, then shift the image as needed.
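A quick sketch of that centroid-based shift idea might look like this (the reference centroid and the 80% threshold are illustrative assumptions; assumes the Image Processing Toolbox):

```matlab
% Hedged sketch: estimate the frame shift from the centroid of a large,
% bright, known blob instead of performing a full registration.
refCentroid = [1290, 2080] ;   % [x, y] centroid measured once on a reference frame

bw    = I_flat > 0.8 * max( I_flat(:) ) ;   % bright pixels
bw    = bwareafilt( bw, 1 ) ;               % keep the biggest white blob
stats = regionprops( bw, 'Centroid' ) ;

shift = stats.Centroid - refCentroid ;      % [dx, dy] of this frame
I_reg = imtranslate( I_flat, -shift ) ;     % undo the shift
```

This only compensates translation; if the vibration also rotates the image noticeably, a proper registration would still be needed.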
Dear Wasiq,
You are right on all points. The rationale was that we can match patterns using CONV2 or NORMXCORR2, so when you mentioned vibrations I thought that a good option would be to match something that is distinct enough and that cannot be hidden by the needle. The "200" was a good option, so I grabbed a template for this purpose, and we compute the center of the needle relative to the center of the matched "200". This should make the approach robust to vibrations, even if some images are a bit blurred, because the convolution is still likely to max out at the center of a blurry 200.
One thing that you could improve, though, is the detection of failures. I do a simple test when I check whether the max is unique, but this is technically very weak (I put it in as a placeholder) and you could implement much more powerful tests (e.g. a unique peak, or that the pixels within a small radius of the estimated needle center are all white enough, or that the average luminosity is below a "reflection-based" threshold). The crucial point is the "continue" in the IF statement, which loops back and leaves a NaN in the vector of angles (instead of a wrong estimate). A good option for you, if there are sun reflections, would be, as mentioned by ImageAnalyst, to state that the angle cannot be determined in these cases and to leave NaNs in the vector of angles.
Then you could interpolate between NaNs for example, as the TAS is likely to vary smoothly at the time scale of video frames.
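Filling the NaN entries by interpolation could be as simple as this sketch (plain MATLAB; the angle values are made up for illustration):

```matlab
% Interpolate over NaN entries (failed frames), assuming the angle varies
% smoothly from frame to frame.
needleAngles = [270; NaN; 276; NaN; NaN; 282] ;   % example with failures

frameIds = (1 : numel( needleAngles )).' ;
bad      = isnan( needleAngles ) ;
needleAngles(bad) = interp1( frameIds(~bad), needleAngles(~bad), ...
                             frameIds(bad), 'linear' ) ;
% needleAngles is now [270; 273; 276; 278; 280; 282]
```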
Cheers,
Cedric
Dear Cedric, thanks for your reply. Yes, I can do a lot of things now to make it more robust. As mentioned by ImageAnalyst and you, I now have different ways to deal with the challenge of the needle center being surrounded by a white reflection, which will be a very rare case. However, I am a little worried about the time complexity of your 2nd version: the convolution (and even cross-correlation) takes much longer due to the window size and the size of the '200' template image. Can we somehow improve the execution time? Still, I am very thankful for your time and help, and I am going to accept this answer, as it is the best, most generic way to resolve this. I had already tried almost all the other available methods, including features, SURF features, subtraction plus features, Hough transforms, and much more. Another approach in my mind is to look for connected white pixels, starting in a window around the estimated center of the image, whose width continuously decreases, but I don't know how to do this.
Kind regards, Wasiq
What can be done might depend on how much movement we can expect. So just how much will it vibrate? Just a few pixels, or by like a quarter of the image?
Another option would be to start by halving the resolution, keeping only every other row and column. That has to be done only once for the template, and then once per image, but it is very fast as it involves no complex computation (it is just a basic indexing operation):
I_flat = I_flat(1:2:end, 1:2:end) ;
If you do that, you have to recompute all coordinates and parameters that refer to rows and columns, but that is a one-shot operation which is pretty easy to perform.
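The rescaling of those row/column parameters could be sketched as follows (a factor of 2 is assumed; variable names are those of the version above):

```matlab
% Hedged sketch: halve the resolution of image and template, and rescale
% every parameter expressed in pixels by the same factor.
scale = 2 ;

w200Template = w200Template(1:scale:end, 1:scale:end) ;   % once, before the loop
I_flat       = I_flat(1:scale:end, 1:scale:end) ;         % once per image

w200Center   = round( w200Center   / scale ) ;
w200OffsetNC = round( w200OffsetNC / scale ) ;
curveRadius  = round( curveRadius  / scale ) ;
needleLength = round( needleLength / scale ) ;
```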
In any case, if I were you, I would start by cropping the image to a zone that contains only the indicator (just large enough to account for vibrations).
Also, as ImageAnalyst mentions, if the amplitude of the vibrations is large, then the window "200" must be pretty large to guarantee that it captures the 200 (this is why I define it as a 600x800 array, so it works with the super-large vibration/shift that I generated in the 3rd image). On the other hand, if the amplitude is only a few pixels, you can reduce the size of this window quite a bit (as you can see with the green box, it is very large with respect to the 200).
Last but not least, you should run the code from the profiler, to see what is really time consuming. Loading the image may become the limiting factor at one point. For that, type
profile viewer
in the command window, and the name of the script in the field labeled "Run this code". Then press [Start profiling], and you will get a full report.
I won't have time to build an example, but now that I see the original pictures again, I think that you could isolate the green part of the indicator and find a best estimate for the center of the arc of circle that it defines. This may be the fastest option by far.
OK, thanks for your great efforts and help, Cedric. Also, thanks to ImageAnalyst for his suggestions.
My pleasure! If you go for the green band approach, one way to get the center without performing a fit, is to define 3 windows that capture respectively the leftmost, the middle, and the topmost regions of the green band. Then you compute the center of mass of true pixels (the detection of the green band is likely to produce an array of logicals) in the three regions, and you get 3 points that characterize the arc of circle. The computation of the center of this circle based on these 3 points is direct (no fit of a model or anything).
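That three-point circle-center computation is direct indeed; one way to write it (plain linear algebra, intersecting two perpendicular bisectors; p1, p2, p3 stand for the three [x, y] centroids) is:

```matlab
% Center of the circle through three points p1, p2, p3 (each [x, y]).
% Example points lie on the unit circle, so the center should be [0, 0].
p1 = [0, 1] ; p2 = [1, 0] ; p3 = [0, -1] ;

% From |c - p1|^2 = |c - p2|^2 = |c - p3|^2, the quadratic terms cancel,
% leaving a 2x2 linear system for the center c.
A = 2 * [p2 - p1 ; p3 - p1] ;
b = [sum( p2.^2 ) - sum( p1.^2 ) ; sum( p3.^2 ) - sum( p1.^2 )] ;
center = (A \ b).' ;   % -> [0, 0] for the example points
```

The three points must not be collinear, otherwise A is singular; with windows at the leftmost, middle, and topmost parts of the green band, that should never happen.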
Yes, but I am satisfied with your previous approach, as it is more generic across all types of dials. Also, under different light/reflection dynamics, green-colour detection/filtering may fail: in yellow light, for example, the green colour may change, and similarly under very bright light, and so on. Also, I need to apply this simultaneously to dials of different colours and kinds, so I am happy with the previous approach. I will let you know if I need your help at any point. Thanks again.
