
reluLayer

Create a Rectified Linear Unit (ReLU) layer

Syntax

layer = reluLayer()
layer = reluLayer(Name,Value)

Description

layer = reluLayer() returns a rectified linear unit (ReLU) layer. The layer performs a threshold operation on each element of the input, where any value less than zero is set to zero, that is,

$$ f(x) = \begin{cases} x, & x \ge 0 \\ 0, & x < 0. \end{cases} $$

The ReLU layer does not change the size of its input.
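
As an illustration of the thresholding, here is a minimal sketch using plain MATLAB arithmetic rather than the layer object itself; the operation applied to each element is equivalent to max(x,0). The values of x are illustrative only.

x = [-2 -0.5 0 0.5 2];
y = max(x,0)   % y = [0 0 0 0.5 2]; negative entries are set to zero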


layer = reluLayer(Name,Value) returns a ReLU layer, with the additional option specified by the Name,Value pair argument.

Examples


Create a rectified linear unit layer with the name 'relu1'.

layer = reluLayer('Name','relu1');

Input Arguments


Name-Value Pair Arguments

Specify an optional comma-separated pair of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside single quotes (' ').

Example: 'Name','relu1' specifies the name of the layer as relu1.


Name for the layer, specified as the comma-separated pair consisting of 'Name' and a character vector.

Data Types: char

Output Arguments


Rectified linear unit layer, returned as a ReLULayer object.

For information on concatenating layers to construct convolutional neural network architecture, see Layer.
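
For example, a minimal sketch of placing a ReLU layer in a layer array for a small convolutional network. The surrounding layers (imageInputLayer, convolution2dLayer, fullyConnectedLayer, softmaxLayer, classificationLayer) are part of the same toolbox; the input size [28 28 1], filter settings, and class count are illustrative assumptions, not values from this page.

layers = [ ...
    imageInputLayer([28 28 1])    % 28-by-28 grayscale input (illustrative size)
    convolution2dLayer(5,20)      % 20 filters of size 5-by-5 (illustrative)
    reluLayer('Name','relu1')     % elementwise ReLU; output size equals input size
    fullyConnectedLayer(10)       % 10 classes (illustrative)
    softmaxLayer
    classificationLayer];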


See Also

Introduced in R2016a
