softplusLayer
Description
A softplus layer applies the softplus activation function Y = log(1 + e^X), which ensures that the output is always positive. This activation function is a smooth, continuous version of reluLayer. You can incorporate this layer into the deep neural networks you define for actors in reinforcement learning agents. This layer is useful for creating continuous Gaussian policy deep neural networks, for which the standard deviation output must be positive.
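The formula Y = log(1 + e^X) can be checked numerically outside MATLAB; the short Python sketch below (an illustration, not part of this toolbox) shows that the output stays positive for any input and approaches the ReLU function for large positive X:

```python
import math

def softplus(x):
    # Softplus: log(1 + e^x), a smooth, always-positive version of ReLU.
    # The max/log1p rearrangement keeps the computation stable for large |x|.
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))

# Output is strictly positive, even for very negative inputs.
print(softplus(-10.0) > 0)                 # True
# For large positive x, softplus(x) approaches x, just like ReLU.
print(abs(softplus(20.0) - 20.0) < 1e-6)   # True
```

This positivity is what makes the layer suitable for producing the standard deviation output of a Gaussian policy network.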
Creation
Description
sLayer = softplusLayer creates a softplus layer with default property values.

sLayer = softplusLayer(Name,Value) sets properties using name-value pairs. For example, softplusLayer('Name','softlayer') creates a softplus layer and assigns the name 'softlayer'.
Properties
Examples
Extended Capabilities
Version History
Introduced in R2020a