reluLayer

Rectified Linear Unit (ReLU) layer

Description

A ReLU layer performs a threshold operation on each element of the input, where any value less than zero is set to zero.

This operation is equivalent to

$$f(x) = \begin{cases} x, & x \ge 0 \\ 0, & x < 0 \end{cases}$$
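As a minimal sketch (an illustration of the operation itself, not part of the `reluLayer` API), the same elementwise thresholding can be reproduced with MATLAB's `max` function:

```matlab
% Apply ReLU elementwise: values below zero become zero,
% nonnegative values pass through unchanged.
x = [-2 -0.5 0 1.5 3];
y = max(x, 0)   % y = [0 0 0 1.5 3]
```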

Creation

Syntax

`layer = reluLayer`
`layer = reluLayer('Name',Name)`

Description

`layer = reluLayer` creates a ReLU layer.


`layer = reluLayer('Name',Name)` creates a ReLU layer and sets the optional `Name` property using a name-value pair. For example, `reluLayer('Name','relu1')` creates a ReLU layer with the name `'relu1'`. Enclose the property name in single quotes.

Properties


`Name`: Layer name, specified as a character vector or a string scalar. To include a layer in a layer graph, you must specify a nonempty unique layer name. If you train a series network with the layer and `Name` is set to `''`, then the software automatically assigns a name to the layer at training time.

Data Types: `char` | `string`

`NumInputs`: Number of inputs of the layer. This layer accepts a single input only.

Data Types: `double`

`InputNames`: Input names of the layer. This layer accepts a single input only.

Data Types: `cell`

`NumOutputs`: Number of outputs of the layer. This layer has a single output only.

Data Types: `double`

`OutputNames`: Output names of the layer. This layer has a single output only.

Data Types: `cell`
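As a quick check (a sketch; the default input name `'in'` and output name `'out'` are assumed here for a freshly created layer), these properties can be inspected at the command line:

```matlab
% Create a default ReLU layer and inspect its I/O properties.
layer = reluLayer;
layer.NumInputs    % 1
layer.InputNames   % assumed default: {'in'}
layer.NumOutputs   % 1
layer.OutputNames  % assumed default: {'out'}
```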

Examples


Create a ReLU layer with the name `'relu1'`.

`layer = reluLayer('Name','relu1')`
```
layer = 
  ReLULayer with properties:

    Name: 'relu1'
```

Include a ReLU layer in a `Layer` array.

```matlab
layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(5,20)
    reluLayer
    maxPooling2dLayer(2,'Stride',2)
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer]
```
```
layers = 
  7x1 Layer array with layers:

     1   ''   Image Input             28x28x1 images with 'zerocenter' normalization
     2   ''   Convolution             20 5x5 convolutions with stride [1 1] and padding [0 0 0 0]
     3   ''   ReLU                    ReLU
     4   ''   Max Pooling             2x2 max pooling with stride [2 2] and padding [0 0 0 0]
     5   ''   Fully Connected         10 fully connected layer
     6   ''   Softmax                 softmax
     7   ''   Classification Output   crossentropyex
```


Extended Capabilities

GPU Code Generation
Generate CUDA® code for NVIDIA® GPUs using GPU Coder™.

Introduced in R2016a