# softmax

Apply softmax activation to channel dimension

## Syntax

```
dlY = softmax(dlX)
dlY = softmax(dlX,'DataFormat',FMT)
```

## Description

The softmax activation operation applies the softmax function to the channel dimension of the input data.

The softmax function normalizes the value of the input data across the channel dimension such that it sums to one. You can regard the output of the softmax function as a probability distribution.
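The per-observation computation can be sketched in plain Python (an illustrative reimplementation, not the toolbox code), treating the input as a channels-by-observations matrix and normalizing each column with the usual max-subtraction trick for numerical stability:

```python
import math

def softmax_channel(x):
    """Softmax over the channel (first) dimension of a
    channels-by-observations matrix. Illustrative sketch only;
    the function name is hypothetical, not a toolbox API."""
    n_channels = len(x)
    n_obs = len(x[0])
    out = [[0.0] * n_obs for _ in range(n_channels)]
    for j in range(n_obs):
        # Extract one observation (one column) across all channels.
        col = [x[i][j] for i in range(n_channels)]
        # Subtract the maximum before exponentiating to avoid overflow;
        # this does not change the result of the normalization.
        m = max(col)
        exps = [math.exp(v - m) for v in col]
        s = sum(exps)
        for i in range(n_channels):
            out[i][j] = exps[i] / s
    return out
```

Each column of the result lies in the open interval (0, 1) and sums to 1, matching the behavior described above.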

Note

This function applies the softmax operation to `dlarray` data. If you want to apply softmax within a `layerGraph` object or `Layer` array, use `softmaxLayer`.

`dlY = softmax(dlX)` computes the softmax activation of the input `dlX` by applying the softmax transfer function to the channel dimension of the input data. All values in `dlY` are between `0` and `1` and sum to `1` over the channel dimension. The input `dlX` is a formatted `dlarray` with dimension labels. The output `dlY` is a formatted `dlarray` with the same dimension labels as `dlX`.

`dlY = softmax(dlX,'DataFormat',FMT)` also specifies the dimension format `FMT` when `dlX` is not a formatted `dlarray`. The output `dlY` is an unformatted `dlarray` with the same dimension order as `dlX`.

## Examples


Use the `softmax` function to set all values in the input data to values between `0` and `1` that sum to `1` over all channels.

Create the input classification data as two observations of random variables. The data can be in any of 10 categories.

```
numCategories = 10;
observations = 2;

X = rand(numCategories,observations);
dlX = dlarray(X,'CB');
```

Compute the `softmax` activation.

```
dlY = softmax(dlX)
totalProb = sum(dlY,1)
```

```
dlY =

  10(C) x 2(B) dlarray

    0.1151    0.0578
    0.1261    0.1303
    0.0579    0.1285
    0.1270    0.0802
    0.0959    0.1099
    0.0562    0.0569
    0.0673    0.0753
    0.0880    0.1233
    0.1328    0.1090
    0.1337    0.1288

totalProb =

  1(C) x 2(B) dlarray

    1.0000    1.0000
```

All values in `dlY` range between `0` and `1`. The values over all channels sum to `1` for each observation.

## Input Arguments

### `dlX` — Input data

Input data, specified as a `dlarray` with or without dimension labels. When `dlX` is not a formatted `dlarray`, you must specify the dimension label format using `'DataFormat',FMT`.

`dlX` must contain a `'C'` channel dimension.

Data Types: `single` | `double`

### `FMT` — Dimension order of unformatted data

Dimension order of unformatted input data, specified as the comma-separated pair consisting of `'DataFormat'` and a character array or string `FMT` that provides a label for each dimension of the data. Each character in `FMT` must be one of the following:

• `'S'` — Spatial

• `'C'` — Channel

• `'B'` — Batch (for example, samples and observations)

• `'T'` — Time (for example, sequences)

• `'U'` — Unspecified

You can specify multiple dimensions labeled `'S'` or `'U'`. You can use the labels `'C'`, `'B'`, and `'T'` at most once.

You must specify `'DataFormat',FMT` when the input data `dlX` is not a formatted `dlarray`.

Example: `'DataFormat','SSCB'`

Data Types: `char` | `string`
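The labeling rules above can be sketched as a small validator (a hypothetical helper for illustration, not part of the toolbox, written in Python rather than MATLAB):

```python
def validate_data_format(fmt):
    """Check a 'DataFormat' string against the documented rules:
    every character must be one of S, C, B, T, U; the labels
    'C', 'B', and 'T' may appear at most once each, while 'S' and
    'U' may repeat. Hypothetical helper, illustrative only."""
    fmt = fmt.upper()
    if any(ch not in "SCBTU" for ch in fmt):
        return False
    # 'C', 'B', 'T' are each allowed at most once.
    return all(fmt.count(ch) <= 1 for ch in "CBT")
```

For example, `'SSCB'` is valid (two spatial dimensions are allowed), while a format repeating `'C'` or containing an unknown label is not.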

## Output Arguments

### `dlY` — Softmax activations

Softmax activations, returned as a `dlarray`. All values in `dlY` are between `0` and `1`. The output `dlY` has the same underlying data type as the input `dlX`.

If the input data `dlX` is a formatted `dlarray`, `dlY` has the same dimension labels as `dlX`. If the input data is not a formatted `dlarray`, `dlY` is an unformatted `dlarray` with the same dimension order as the input data.

## More About

### Softmax Activation

The `softmax` function normalizes the input across the channel dimension, such that it sums to one. For more information, see the definition of Softmax Layer on the `softmaxLayer` reference page.
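For the channel values \(x_1,\dots,x_K\) of a single observation, this is the standard softmax definition:

```latex
\mathrm{softmax}(x)_i = \frac{e^{x_i}}{\sum_{j=1}^{K} e^{x_j}}, \qquad i = 1,\dots,K
```

Each output lies between 0 and 1, and the outputs sum to 1 over the channel dimension.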

## Extended Capabilities

### GPU Code Generation

Generate CUDA® code for NVIDIA® GPUs using GPU Coder™.

Introduced in R2019b