Implement flatten layer in CNN

How can I implement a flatten layer in a CNN, i.e. transform the 2D feature maps output by a convolutional layer into a 1D vector?

5 Comments

MATLAB R2019a introduced the flatten layer. Which version are you using?
https://www.mathworks.com/help/deeplearning/ref/nnet.cnn.layer.flattenlayer.html
I am using MATLAB R2019a, but the available flatten layer is intended for sequence data. I do not know whether it can be used to transform 2D feature maps into a 1D vector.
You should be able to put a fully connected layer after a convolutional layer. This will essentially flatten as well.
I was thinking about this, but it confuses me: since it is just a fully connected layer, I understand it as connecting each entry of a feature map to every neuron in the fully connected layer. I still cannot picture how a fully connected layer flattens a feature map!
Thanks for your help!
Hi friend, I ran into the same problem. I tried the flatten layer, but I found it is meant for sequence data. Have you solved this, or do you have any ideas? I didn't know how to do it, so I just switched to PyTorch instead :(


Answers (1)

I understand from the comments that you're facing issues visualizing how a fully connected (FC) layer flattens a feature map. In a typical CNN architecture, convolutions are used to extract features from the input data. After the convolutional layers, the feature map is flattened and passed to fully connected layers for further processing.
For example, if your feature map after the last convolutional layer looks like this:
[ [a, b],
[c, d] ]
After flattening, it will look like this:
[a, b, c, d]
Consider the below code:
% Define the layers of the network
layers = [
    imageInputLayer([28 28 1], 'Name', 'input') % Input layer for 28x28 grayscale images
    convolution2dLayer(3, 8, 'Padding', 'same', 'Name', 'conv1') % 3x3 convolution with 8 filters
    batchNormalizationLayer('Name', 'batchnorm1')
    reluLayer('Name', 'relu1')
    maxPooling2dLayer(2, 'Stride', 2, 'Name', 'maxpool1') % Max pooling layer
    convolution2dLayer(3, 16, 'Padding', 'same', 'Name', 'conv2') % Another convolution layer
    batchNormalizationLayer('Name', 'batchnorm2')
    reluLayer('Name', 'relu2')
    maxPooling2dLayer(2, 'Stride', 2, 'Name', 'maxpool2') % Another max pooling layer
    flattenLayer('Name', 'flatten') % Flatten layer to convert 2D feature maps to a 1D vector
    % fullyConnectedLayer(10, 'Name', 'fc') % Fully connected layer with 10 neurons
    % softmaxLayer('Name', 'softmax') % Softmax layer for classification
    % classificationLayer('Name', 'output') % Classification layer
];
% Analyze the network
analyzeNetwork(layers);
After running this code, analyzeNetwork shows that the flatten layer outputs a 784x1 vector (the two stride-2 pooling layers reduce 28x28 to 7x7, and 7 x 7 x 16 filters = 784).
But if you do not use a flatten layer and place an FC layer directly after the convolutional stack instead, MATLAB automatically flattens the output in the background (the FC weights are 10 x 784) and uses it for further processing.
Each entry of this implicit flattened vector is connected to every neuron in the FC layer. Therefore, there is no need to explicitly add a flatten layer unless you have a specific reason to do so.
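As a sketch of this alternative (assuming the same 28x28 grayscale input as the example above), the explicit flatten layer can be dropped and a fully connected layer placed directly after the last pooling layer; analyzeNetwork will then report the implicit 784-feature flattening in the FC layer's weights:

```matlab
% Variant without an explicit flatten layer: the fully connected layer
% implicitly flattens the 7x7x16 feature maps into a 784-element vector.
layersFC = [
    imageInputLayer([28 28 1], 'Name', 'input')
    convolution2dLayer(3, 8, 'Padding', 'same', 'Name', 'conv1')
    reluLayer('Name', 'relu1')
    maxPooling2dLayer(2, 'Stride', 2, 'Name', 'maxpool1')
    convolution2dLayer(3, 16, 'Padding', 'same', 'Name', 'conv2')
    reluLayer('Name', 'relu2')
    maxPooling2dLayer(2, 'Stride', 2, 'Name', 'maxpool2')
    fullyConnectedLayer(10, 'Name', 'fc') % weights are 10 x 784
    softmaxLayer('Name', 'softmax')
    classificationLayer('Name', 'output')
];
analyzeNetwork(layersFC);
```

This requires the Deep Learning Toolbox, like the example above.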
Refer to the flattenLayer documentation for further details: https://www.mathworks.com/help/deeplearning/ref/nnet.cnn.layer.flattenlayer.html
If you want to transform a 2D feature map into a 1D vector manually, you can use the reshape function like this:
% Example 2D feature map
featureMap2D = rand(4, 4); % A 4x4 matrix as an example
% Convert the 2D feature map to a 1D vector
featureMap1D = reshape(featureMap2D, [], 1);
% Display the original and reshaped feature map
disp('Original 2D Feature Map:');
disp(featureMap2D);
disp('Reshaped 1D Feature Map:');
disp(featureMap1D);
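One caveat worth noting (an addition, not from the answer above): MATLAB stores arrays in column-major order, so reshape reads the matrix column by column. If you want the row-by-row order shown in the [a, b, c, d] example, transpose before reshaping:

```matlab
% reshape follows MATLAB's column-major storage order
A = [1 2; 3 4];
colMajor = reshape(A, [], 1);  % reads columns first: [1; 3; 2; 4]
rowMajor = reshape(A', [], 1); % transpose first for row order: [1; 2; 3; 4]
```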
Refer to the reshape documentation to learn more about this function: https://www.mathworks.com/help/matlab/ref/reshape.html
I hope this helps you get on the right track.

Asked: 5 Sep 2020
Answered: 23 Sep 2024
