I'm having trouble with convolution1dLayer

layers = [
    featureInputLayer(24)
    convolution1dLayer(5, 32, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling1dLayer(2, 'Stride', 2)
    convolution1dLayer(5, 64, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling1dLayer(2, 'Stride', 2)
    convolution1dLayer(5, 128, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling1dLayer(2, 'Stride', 2)
    dropoutLayer(0.5)
    fullyConnectedLayer(5)
    softmaxLayer
    classificationLayer];
options = trainingOptions('adam', ...
    'MaxEpochs', 20, ...
    'MiniBatchSize', 128, ...
    'ValidationData', {XVal, YVal}, ...
    'ValidationFrequency', 50, ...
    'Shuffle', 'every-epoch', ...
    'Verbose', false, ...
    'Plots', 'training-progress');
% XTrain: 5000x24, YTrain: 5000x1, XTest: 5000x24, YTest: 5000x1
net = trainNetwork(XTrain, YTrain, layers, options);
% Compute the accuracy
YPred = classify(net, XTest);
accuracy = sum(YPred == YTest) / numel(YTest);
fprintf('Accuracy: %0.2f%%\n', 100*accuracy);
I have a dataset of 15300 records with 24 features (size 15300x24). The output consists of 5 classes (15300x1). I am trying to classify it with a CNN. When I define the layers as above, I get the following error:
Caused by:
Layer 2: Input data must have one spatial dimension only, one temporal dimension only, or one of each.
Instead, it has 0 spatial dimensions and 0 temporal dimensions.
I haven't been able to solve it.
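One quick check is to run the layer array through analyzeNetwork before training: featureInputLayer emits data with only channel and batch dimensions, so convolution1dLayer has no spatial or temporal dimension to slide its filters over. A minimal sketch, assuming the layers variable defined above:
% Inspect the layer array before calling trainNetwork; the analyzer reports
% each layer's activation size/format and flags the convolution1dLayer that
% receives feature data with no spatial or temporal dimension.
analyzeNetwork(layers)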
  2 Comments
nagihan yagmur on 26 Mar 2023
Edited: Walter Roberson on 26 Mar 2023
clc;
clear all;
load veri_seti.mat
X = MyData.Inp;
Y = categorical(MyData.Out);
numClasses = 5;
layers = [ featureInputLayer(24,'Name','inputs')
convolution1dLayer(128,3,'Stride',2)
reluLayer() maxPooling1dLayer(2,'Stride',2)
batchNormalizationLayer()
convolution1dLayer(64,3,'Stride',1)
reluLayer()
maxPooling1dLayer(2,'Stride',2)
batchNormalizationLayer()
dropoutLayer(0.2)
convolution1dLayer(32,3,'Stride',1)
reluLayer()
batchNormalizationLayer()
convolution1dLayer(16,3,'Stride',1)
reluLayer()
batchNormalizationLayer()
dropoutLayer(0.2)
convolution1dLayer(8,3,'Stride',1)
reluLayer()
maxPooling1dLayer(2,'Stride',2)
globalMaxPooling1dLayer()
dropoutLayer(0.2)
batchNormalizationLayer()
fullyConnectedLayer(1024)
fullyConnectedLayer(1024)
softmaxLayer()
classificationLayer()];
% Learning rate and other hyperparameters
miniBatchSize = 128;
maxEpochs = 30;
initialLearningRate = 0.001;
learnRateDropFactor = 0.1;
learnRateDropPeriod = 10;
% Training options object
options = trainingOptions('adam', ...
'MiniBatchSize', miniBatchSize, ...
'MaxEpochs', maxEpochs, ...
'InitialLearnRate', initialLearningRate, ...
'LearnRateSchedule', 'piecewise', ...
'LearnRateDropFactor', learnRateDropFactor, ...
'LearnRateDropPeriod', learnRateDropPeriod, ...
'Shuffle', 'every-epoch', ...
'Verbose', false, ...
'Plots', 'training-progress');
Error using vertcat
Dimensions of arrays being concatenated are not consistent.
Error in example (line 9)
layers = [ featureInputLayer(24,'Name','inputs')
I keep going back and forth between these two errors.


Accepted Answer

Matt J on 4 Apr 2023
Edited: Matt J on 6 Apr 2023
Tech Support has suggested two workarounds to me. The simplest, in my opinion, is to recast the training as a 2-D image classification problem in which one of the image dimensions is a singleton. That means using an imageInputLayer and converting the convolution and pooling layers to their 2-D forms, with the corresponding dimension of each filter and pooling window set to 1.
load veri_seti
XTrain = reshape(MyData.Inp',24,1,1,[]); %Dimensions: 24x1x1xBatch
YTrain = reshape( categorical(MyData.Out),[],1); %Dimensions: Batchx1
layers = [ imageInputLayer([24,1],'Name','inputs') %<---Use imageInputLayer
    convolution2dLayer([5,1], 32, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer([2,1], 'Stride', [2,1])
    convolution2dLayer([5,1], 64, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer([2,1], 'Stride', [2,1])
    convolution2dLayer([5,1], 128, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer([2,1], 'Stride', [2,1])
    dropoutLayer(0.5)
    flattenLayer
    fullyConnectedLayer(5)
    softmaxLayer
    classificationLayer];
%analyzeNetwork(layers);
options = trainingOptions('adam', ...
    'MaxEpochs', 3, ...
    'MiniBatchSize', 128, ...
    'Verbose', false, ...
    'Plots', 'training-progress', ...
    'ExecutionEnvironment', 'cpu');
net = trainNetwork(XTrain, YTrain(:), layers,options);
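To reproduce the accuracy computation from the original post, the test inputs need the same reshape before classify. A minimal sketch, assuming XTest is N-by-24 and YTest is an N-by-1 categorical:
% Reshape the test features into the same 24x1x1xN image layout used for training.
XTestImg = reshape(XTest', 24, 1, 1, []);
YPred = classify(net, XTestImg);
accuracy = sum(YPred == YTest) / numel(YTest);
fprintf('Accuracy: %0.2f%%\n', 100*accuracy);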
  3 Comments
Matt J on 6 Apr 2023
I'm glad, but please click Accept on the answer to indicate that it worked.


More Answers (1)

Walter Roberson on 26 Mar 2023
Moved: Walter Roberson on 26 Mar 2023
layers = [ featureInputLayer(24,'Name','inputs')
convolution1dLayer(128,3,'Stride',2)
reluLayer() maxPooling1dLayer(2,'Stride',2)
Notice you have two layers on the same line.
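For reference, a minimal sketch of the corrected concatenation, with each layer on its own row so vertcat succeeds (this only removes the concatenation error; the featureInputLayer/convolution1dLayer dimension mismatch from the original question remains):
% One layer per row of the array, so every row is 1x1 and vertical concatenation is consistent.
layers = [ featureInputLayer(24,'Name','inputs')
           convolution1dLayer(128,3,'Stride',2)
           reluLayer()
           maxPooling1dLayer(2,'Stride',2) ];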
  3 Comments
Walter Roberson on 27 Mar 2023
Please show
whos -file veri_seti.mat
whos X Y


Release: R2022a