MATLAB Answers

Slow checkLayer-function for simple custom reshape layer

Asked by Julius Åkesson on 22 Jul 2019
Latest activity Commented on by Valentin Steininger on 9 Aug 2019
I'm trying to create a simple layer to use directly after my imageInputLayer in a neural network. The purpose of the layer is to channel-split the image input. The input is a three-channel image: the first channel contains grayscale image data, the middle channel is a matrix of zeros, and the third channel contains a few parameters that are later extracted into a 1x11x1 vector using a max-pooling layer. This lets me use both image and parameter inputs in my neural network. The code for the custom layer is as follows:
classdef splitLayer < nnet.layer.Layer
    properties
        inputSize;
        outputSize;
    end
    methods
        function layer = splitLayer(Name)
            % Initialize name, input/output sizes and number of outputs
            layer.Name = Name;
            layer.inputSize = [256 256 3];
            layer.outputSize = [256 256 1];
            layer.NumOutputs = 2;
        end
        function [Z1, Z2] = predict(layer, X)
            numObservations = size(X, 4);
            % Split channels and return as separate outputs
            Z1 = zeros([layer.outputSize numObservations], 'like', X);
            Z2 = zeros([layer.outputSize numObservations], 'like', X);
            for i = 1:numObservations   % loop over each observation
                Z1(:, :, :, i) = X(:, :, 1, i);
                Z2(:, :, :, i) = X(:, :, 3, i);
            end
        end
        function [Z1, Z2, memory] = forward(layer, X)
            numObservations = size(X, 4);
            % Split channels and return as separate outputs
            Z1 = zeros([layer.outputSize numObservations], 'like', X);
            Z2 = zeros([layer.outputSize numObservations], 'like', X);
            for i = 1:numObservations   % loop over each observation
                Z1(:, :, :, i) = X(:, :, 1, i);
                Z2(:, :, :, i) = X(:, :, 3, i);
            end
            memory = numObservations;
        end
        function [dLdX] = backward(layer, ~, ~, ~, dLdZ1, dLdZ2, memory)
            numObservations = size(dLdZ1, 4);
            is = layer.inputSize;
            dLdX = zeros(is(1), is(2), 3, numObservations, 'like', dLdZ1);
            zeromat = zeros(is(1), is(2), 'like', dLdZ1);
            % For each observation, concatenate the separated channels
            % and pass the gradient backwards
            for i = 1:numObservations
                dLdX(:, :, :, i) = cat(3, dLdZ1(:, :, :, i), zeromat, dLdZ2(:, :, :, i));
            end
        end
    end
end
However, when validating my custom layer with checkLayer, the function is very slow, and some debugging showed that computing the derivatives in the backward function is the bottleneck.
Since this is only my second custom layer, I'm not sure where the problem lies. Could it be the use of the cat function? If so, how can I implement the same functionality in a way that runs faster on a GPU?
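(For what it's worth, one way I could imagine avoiding the per-observation loop entirely is colon indexing, which selects all observations in a single operation and also works on gpuArray inputs. A minimal sketch of a vectorized predict, assuming the same layer as above:)

```matlab
function [Z1, Z2] = predict(layer, X)
    % Select channels 1 and 3 for every observation at once.
    % No preallocation or loop is needed, and the result keeps
    % the class (and GPU residency) of X.
    Z1 = X(:, :, 1, :);
    Z2 = X(:, :, 3, :);
end
```

(forward could be vectorized the same way, and backward could likewise concatenate the full gradient arrays in one call, e.g. `dLdX = cat(3, dLdZ1, zeros(size(dLdZ1), 'like', dLdZ1), dLdZ2);`.)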

  1 Comment

Hi
I also tried to create such a splitData layer, with 4 outputs. When I run my code, I get the error: "Incorrect number of output arguments for 'predict' in Layer splitDataLayer. Expected to have 1, but instead it has 4."
Then I ran it with your code to see what happens, and it throws the same error although the NumOutputs property has been set properly. So this might be a release issue.
May I ask which release you are using for that layer?


1 Answer

Answer by Divya Gaddipati on 5 Aug 2019

When using the checkLayer function with large input sizes, the gradient checks take a long time to run. To speed up the tests, specify smaller valid input and output sizes (for example, [24 24 3] or [5 5 3]).
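(As an illustration, assuming the hard-coded sizes in the constructor are reduced to match, the check could then be run on the smaller size like this; the layer name and sizes are placeholders:)

```matlab
% Construct the layer (with inputSize/outputSize adjusted to [24 24 3]/[24 24 1])
layer = splitLayer('split');

% Run the layer checks on the small input; the observation
% dimension of the input data is the 4th dimension.
checkLayer(layer, [24 24 3], 'ObservationDimension', 4);
```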

