What does this line do?

Kamil Kacer on 20 Nov 2020
Answered: Walter Roberson on 21 Nov 2020
d{i}(end+1:max(numOfTrainSamples)) = inf;
Can anyone tell me what this does? d goes through 3 iterations, and numOfTrainSamples is equal to 1.
It is supposed to do something, but when I run it, it doesn't do anything.

Accepted Answer

Walter Roberson on 21 Nov 2020
The code goes through all of the F cell entries and finds the number of columns in each, recording the number of columns in numOfTrainSamples(i) at each step. It then takes the maximum of those counts. As it goes through and computes the distances, it extends each calculated distance vector out to that maximum number of samples, padding with infinities.
The reason it does this is to be able to compare the different F entries against each other, by producing distance arrays that are all the same size.
By the way, the way it initializes the arrays is incorrect, but in a way that does not matter. The initialization of d{i} with inf values is not needed as long as d is initialized to be a cell with the correct number of entries.
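As an illustration, here is a minimal sketch of that padding, with made-up sample counts (5, 3 and 7) rather than the real data:
numOfTrainSamples = [5 3 7];                      % hypothetical samples per class
d = cell(numel(numOfTrainSamples), 1);
for i = 1:numel(numOfTrainSamples)
    d{i} = sort(rand(numOfTrainSamples(i), 1));   % stand-in sorted distances
    d{i}(end+1:max(numOfTrainSamples)) = inf;     % pad out to the common length
end
cellfun(@numel, d)                                % every d{i} is now length 7
The padding matters in the counting loop later on: curArray(i) = d{i}(kAll(i)+1) can then never index past the end of a class's distance vector, and as long as k does not exceed the total number of training samples, an inf entry always loses to a finite distance and is never picked as a nearest neighbour.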

More Answers (1)

John D'Errico on 20 Nov 2020
It does nothing on its own. A single line of code has little meaning, out of context.
What does it do? It sets some elements of the vector stored in one cell of a cell array to inf. More precisely, it APPENDS infs to the end of that vector, enough to extend it to a length of max(numOfTrainSamples).
Why it does that is what really matters.
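For example (made-up numbers):
d = {[0.2; 0.5; 0.9]};                    % hypothetical sorted distances for one class
numOfTrainSamples = 5;                    % pretend the largest class has 5 samples
d{1}(end+1:max(numOfTrainSamples)) = inf;
d{1}                                      % 0.2  0.5  0.9  Inf  Inf
Note that when d{i} already has max(numOfTrainSamples) elements, end+1:max(numOfTrainSamples) is an empty range and the line changes nothing, which is presumably why it appeared to do nothing in your run with numOfTrainSamples equal to 1.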
  1 Comment
Kamil Kacer on 20 Nov 2020
This is the code, but I don't understand why that line is there.
function [Ps, winnerClass] = classifyKNN_D_Multi(F, testSample, k, NORMALIZE, useL1distance )
% function [Ps, winnerClass] = classifyKNN_D_Multi(F, testSample, k, NORMALIZE, useL1distance);
%
% This function is used for classifying an unknown sample using the kNN
% algorithm, in its multi-class form.
%
% ARGUMENTS:
% - F: a CELL array that contains the feature values for each class. I.e.,
%      F{1} is a matrix of size numOfDimensions x numOfSamples FOR THE FIRST
%      CLASS, etc.
%
% - testSample: the input sample to be classified
% - k: the kNN parameter
% - NORMALIZE: use class priors to weight results
% - useL1distance: use L1 instead of L2 distance
%
% RETURNS:
% - Ps: an array that contains the classification probabilities for each class
% - winnerClass: the label of the winner class

%%error(nargchk(4,5,nargin))
switch nargin
    case 4
        useL1distance = '1'; % euclidean distance if 4 variables included
    case 5
        useL1distance = '0'; % mahalanobis distance if 5 variables included
    otherwise
        disp('error')
end

numOfClasses = length(F);
if (size(testSample, 2)==1)
    testSample = testSample';
end

% initialization of distance vectors:
numOfDims = zeros( 1, numOfClasses );
numOfTrainSamples = zeros( 1, numOfClasses );
d = cell(numOfClasses,1);
% d{i} is a vector, whose elements represent the distance of the testing
% sample from all the samples of the i-th class
testSample(isnan(testSample)) = 0.0;
for i=1:numOfClasses
    [ numOfDims(i), numOfTrainSamples(i) ] = size( F{i} );
    d{i} = inf*ones(max(numOfTrainSamples), 1); % we fill it with inf values
    F{i}(isnan(F{i})) = 0.0;
end

if (length(testSample)>1)
    for i=1:numOfClasses % for each class:
        if (numOfTrainSamples(i)>0)
            if ( useL1distance == 0)
                % d{i} = sum( abs(repmat(testSample, [numOfTrainSamples(i) 1]) - F{i}'),2); % L1
                d{i} = pdist2(F{i}.', testSample, 'euclidean');
            else
                %[size(repmat(testSample, [numOfTrainSamples(i) 1])) size(F{i}')]
                %sum(sum(isnan(F{i}')))
                d{i} = sum( ((repmat(testSample, [numOfTrainSamples(i) 1]) - F{i}').^2 ),2); % L2
                d{i} = pdist2(F{i}.', testSample, 'mahalanobis');
            end
            d{i} = sort(d{i});
            d{i}(end+1:max(numOfTrainSamples)) = inf;
        else
            d{i} = inf;
        end
    end
else % single dimension (NO SUM required!!!)
    for i=1:numOfClasses
        if (numOfTrainSamples(i)>0)
            d{i} = (abs(repmat(testSample, [numOfTrainSamples(i) 1]) - F{i}')');
            d{i} = sort(d{i});
            d{i}(end+1:max(numOfTrainSamples)) = inf;
        else
            d{i} = inf;
        end
    end
end

kAll = zeros(numOfClasses, 1);
for j=1:k
    curArray = zeros(numOfClasses, 1);
    for i=1:numOfClasses
        curArray(i) = d{i}(kAll(i)+1);
    end
    [MIN, IMIN] = min(curArray);
    kAll(IMIN) = kAll(IMIN) + 1;
end

if ( NORMALIZE == 0 )
    Ps = (kAll ./ k);
else
    Ps = kAll ./ numOfTrainSamples';
    Ps = Ps / sum(Ps);
end
[MAX, IMAX] = max(Ps);
winnerClass = IMAX;
