I'd like to define my own layer with backward function

Hi, as mentioned in the title, I'd like to define a custom layer that reshapes a [1, 1, h*w*n] input to an [h, w, n] output.
I've implemented it as below:
classdef reshapeLayer < nnet.layer.Layer
    properties
        NumFilters
        Height
        Width
    end
    methods
        function self = reshapeLayer(name, height, width, numFilters)
            self.Name = name;
            self.Height = height;
            self.Width = width;
            self.NumFilters = numFilters;
        end
        function Z = predict(self, X)
            numFilters = self.NumFilters;
            height = self.Height;
            width = self.Width;
            numSamples = size(X,4);
            Z = cast(zeros([height, width, numFilters, numSamples], class(X)), 'like', X);
            for n = 1:numSamples
                Z(:,:,:,n) = reshape(X(:,:,:,n), height, width, numFilters);
            end
        end
        function [Z, memory] = forward(self, X)
            numFilters = self.NumFilters;
            height = self.Height;
            width = self.Width;
            numSamples = size(X,4);
            Z = cast(zeros([height, width, numFilters, numSamples], class(X)), 'like', X);
            for n = 1:numSamples
                Z(:,:,:,n) = reshape(X(:,:,:,n), height, width, numFilters);
            end
            memory = [];
        end
        function dLdX = backward(self, X, ~, dLdZ, ~)
            numFilters = self.NumFilters;
            height = self.Height;
            width = self.Width;
            numSamples = size(X,4);
            dLdX = cast(zeros(size(X)), 'like', dLdZ);
            for n = 1:numSamples
                dLdX(:,:,:,n) = reshape(dLdZ(:,:,:,n), 1, 1, height * width * numFilters);
            end
        end
    end
end
This layer passes the validation test:
height = 8;
width = 8;
numFilters = 32;
batchSize = 16;
inputSize = [1, 1, height * width * numFilters];
layer = reshapeLayer('reshape1', height, width, numFilters);
checkLayer(layer, inputSize, 'ObservationDimension', 4);
However, when I put this layer in the middle of a set of layers and build a dlnetwork from them,
layer = [
    imageInputLayer(inputSize, 'Normalization', 'none', 'Name', 'input')
    reshapeLayer('reshape1', height, width, numFilters)
    convolution2dLayer(3, numFilters, 'Padding', 'same', 'Name', 'conv1')
    reluLayer('Name', 'act1')
    convolution2dLayer(3, numFilters, 'Padding', 'same', 'Name', 'conv2')
    reluLayer('Name', 'act2')
    ];
net = dlnetwork(layerGraph(layer));
I get the following error:
>> Layer 'reshape1': Custom layers with backward
functions are not supported.
Isn't it possible to define a custom layer with a predefined backward function? I'm trying to do this for compatibility with a Keras model, so I also have to make another custom layer, an NN-upsampling layer. Could anyone help me?

4 Comments

Supported Layers
The dlnetwork function supports the layers listed below and custom layers without a custom backward function.
Because the predict function only uses functions that support dlarray objects, defining the backward function is optional. For a list of functions that support dlarray objects, see List of Functions with dlarray Support.
If the layer forward functions fully support dlarray objects, then the layer is GPU compatible. Otherwise, to be GPU compatible, the layer functions must support inputs and return outputs of type gpuArray.
I think if you use only the functions listed there in the predict and forward functions, you don't need to define the backward function.
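For example, here is a sketch of the layer with the backward function removed (keeping the same constructor arguments as in the question). Because reshape supports dlarray, dlnetwork can derive the gradient automatically, and the per-sample loop collapses to a single vectorized call:

```matlab
classdef reshapeLayer < nnet.layer.Layer
    % Reshapes a [1, 1, H*W*C] input to [H, W, C] per observation.
    % No backward method: reshape supports dlarray, so automatic
    % differentiation handles the gradient.
    properties
        NumFilters
        Height
        Width
    end
    methods
        function self = reshapeLayer(name, height, width, numFilters)
            self.Name = name;
            self.Height = height;
            self.Width = width;
            self.NumFilters = numFilters;
        end
        function Z = predict(self, X)
            % One vectorized reshape over the whole batch.
            Z = reshape(X, self.Height, self.Width, ...
                self.NumFilters, size(X, 4));
        end
    end
end
```

With no custom backward defined, forward is also optional; predict alone is enough for both training and inference here.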
This code
Z = cast(zeros([height, width, numFilters, numSamples], class(X)), 'like', X);
for n = 1:numSamples
Z(:,:,:,n) = reshape(X(:,:,:,n), height, width, numFilters);
end
is both the same as, and far slower than, the single line:
Z = reshape(X, height, width, numFilters, numSamples);
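To see why the two are equivalent, a quick check (using the sizes from the question; the variable names here are just for illustration): MATLAB's reshape preserves column-major linear order, and the observation dimension is last, so reshaping the whole array matches reshaping each sample in a loop.

```matlab
% Vectorized reshape vs. per-sample loop: same result.
X = rand(1, 1, 8*8*32, 16);
Zloop = zeros(8, 8, 32, 16);
for n = 1:16
    Zloop(:,:,:,n) = reshape(X(:,:,:,n), 8, 8, 32);
end
Zvec = reshape(X, 8, 8, 32, 16);
isequal(Zloop, Zvec)   % returns logical 1 (true)
```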
Thank you both so much. It's solved now.
But another issue came up after it.
Could you check that for me?


Answers (0)

Release: R2019b
Asked: 2 Feb 2020
Commented: 2 Feb 2020
