Doing something simply dumb :(( Trying to get a meshgrid to read my third class category, but it's not working.

So I'm using machine learning to classify my training and testing data, where X1 is my first feature, X2 is my second feature, and y is the category each point belongs to. This is really simple, but I'm having trouble with it.
In my code one category is green (where y = 1), another is blue (where y = 2), and the last is red (where y = 3). I can plot the green and blue data points over my mesh grid, but not the red ones, and I think these lines of code are the problem:
%%%%%%% this is an example of my y: %%%%%
% y = [ones(60,1); 2*ones(60,1); 3*ones(60,1)];
Y = data.y;
Y = ismember(training_data.y,Y);
scatter(training_data.X1(Y==1),training_data.X2(Y==1),'o' , 'MarkerEdgeColor', 'black', 'MarkerFaceColor', 'green');
Y = ismember(training_data.y,Y);
scatter(training_data.X1(Y==2),training_data.X2(Y==2) , 'o' , 'MarkerEdgeColor', 'black', 'MarkerFaceColor', 'blue');
Y = ismember(training_data.y,Y);
scatter(training_data.X1(Y==3),training_data.X2(Y==3) , 'o' , 'MarkerEdgeColor', 'black', 'MarkerFaceColor', 'red');
xlabel('X1');
This is the code for only 2 categories, and it runs with no issues.
labels = data.y;
Y = ismember(training_data.y,labels(1));
scatter(training_data.X1(Y),training_data.X2(Y), 'o' , 'MarkerEdgeColor', 'black', 'MarkerFaceColor', 'green');
scatter(training_data.X1(~Y),training_data.X2(~Y) , 'o' , 'MarkerEdgeColor', 'black', 'MarkerFaceColor', 'red');
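One thing worth noting about the difference between these two snippets: `ismember` returns a logical vector (only 0s and 1s), so after `Y = ismember(training_data.y, Y)` the comparisons `Y==2` and `Y==3` can never be true. A minimal sketch of the effect (the variable values here are just for illustration):

```matlab
% ismember returns logicals, not the original class labels
y = [1; 2; 3; 2; 1];
Y = ismember(y, [1 2 3]);   % Y = [1; 1; 1; 1; 1], class logical
any(Y == 2)                 % false: a logical value is never equal to 2
any(Y == 3)                 % false: so classes 2 and 3 are never plotted
% Comparing the labels directly works:
any(y == 2)                 % true
```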
Here is the full code:
%Read in the data and structure a table and rename the variables for
%simplicity
data = table(x(:,1), x(:,2), y);
data.Properties.VariableNames = {'X1','X2','y'};
classifier_name = {'Naive Bayes','Decision Tree','Discriminant Analysis','Nearest Neighbor'};
classifier{1} = fitcnb(data,'y~X1+X2');
%fitcsvm only supports one- or two-class learning, so it is skipped here
% classifier{2} = fitcsvm(x,y,'KernelFunction','polynomial','PolynomialOrder',6);
classifier{2} = fitctree(data,'y~X1+X2');
classifier{3} = fitcdiscr(data,'y~X1+X2');
% classifier{4} = fitcknn(x,y);
k = 5;
classifier{4} = fitcknn(data,'y~X1+X2','NumNeighbors',k,'Distance','euclidean');
% %%%%%%%%%%%%%%%%%%%%%% Loop through each Machine learning classifier,
% then specify accuracy
for i = 1:numel(classifier)
cv = cvpartition(classifier{i}.NumObservations,'HoldOut',0.2); %split into 20% testing and 80% training
cross_validated_model = crossval(classifier{i},'cvpartition',cv);
%Create Predictions for training and test sets
Predictions_test = predict(cross_validated_model.Trained{1},data(test(cv),1:end-1));
Predictions_train = predict(cross_validated_model.Trained{1},data(training(cv),1:end-1));
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%% Create Predictions %%%%
Predict = predict(cross_validated_model.Trained{1},data(test(cv),1:end-1));
%%%% Draw Confusion Matrix %%%% this sets up an accuracy based matrix
Result = confusionmat(cross_validated_model.Y(test(cv)),Predict);
%%% Accuracy %%%% calculates the accuracy based on the confusion matrix
display(classifier_name{i})
Testing_accuracy = sum(diag(Result))/sum(Result(:)) %correct predictions over all predictions, for any number of classes
%%%% Create Predictions %%%%
Predict = predict(cross_validated_model.Trained{1},data(training(cv),1:end-1));
%%%% Draw Confusion Matrix %%%% this sets up an accuracy based matrix
Result = confusionmat(cross_validated_model.Y(training(cv)),Predict);
%%% Accuracy %%%% calculates the accuracy based on the confusion matrix
display(classifier_name{i})
Training_accuracy = sum(diag(Result))/sum(Result(:)) %correct predictions over all predictions, for any number of classes
labels = data.y;
X1_range = min(data.X1(training(cv)))-1:1000:max(data.X1(training(cv)))+1;
X2_range = min(data.X2(training(cv)))-1:1000:max(data.X2(training(cv)))+1;
[xx1, xx2] = meshgrid(X1_range,X2_range);
XGrid = [xx1(:) xx2(:)];
predictions_meshgrid = predict(cross_validated_model.Trained{1},XGrid);
%figure
subplot(2,2,i);
gscatter(xx1(:), xx2(:), predictions_meshgrid,'gbr');
hold on
training_data = data(training(cv),:);
Y = data.y;
Y = ismember(training_data.y,labels(1));
scatter(training_data.X1(Y),training_data.X2(Y),'o' , 'MarkerEdgeColor', 'black', 'MarkerFaceColor', 'green');
Y = ismember(training_data.y,labels(2));
scatter(training_data.X1(~Y),training_data.X2(~Y) , 'o' , 'MarkerEdgeColor', 'black', 'MarkerFaceColor', 'blue');
Y = ismember(training_data.y,labels(3));
scatter(training_data.X1(Y==3),training_data.X2(Y==3) , 'o' , 'MarkerEdgeColor', 'black', 'MarkerFaceColor', 'red');
xlabel('X1');
ylabel('X2');
title(classifier_name);
legend off, axis tight
% legend(labels,'Location',[0.45,0.01,0.45,0.05],'Orientation','Horizontal');
hold off
% subplot(2,2,i);
% gscatter(xx1(:), xx2(:), predictedhealth,'gbr','osd');
title(classifier_name{i})
legend off, axis tight
end
Pretty please help!

Answers (1)

yanqi liu
yanqi liu on 30 Dec 2021
Yes sir, maybe use
index1 = find(Y==1);
index2 = find(Y==2);
index3 = find(Y==3);
to generate an index vector for each class, and then split the data up for plotting.
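Building on that suggestion, here is a sketch of how the plotting section could look using direct logical indexing on the label column (assuming `training_data` has columns `X1`, `X2`, and `y` as in the question):

```matlab
% Index each class directly from the label column (no ismember needed)
Y = training_data.y;                      % numeric class labels: 1, 2, 3
colors = {'green', 'blue', 'red'};
hold on
for c = 1:3
    idx = (Y == c);                       % logical index for class c
    scatter(training_data.X1(idx), training_data.X2(idx), 'o', ...
        'MarkerEdgeColor', 'black', 'MarkerFaceColor', colors{c});
end
hold off
```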
