reluLayer
Create a Rectified Linear Unit (ReLU) layer
Syntax
layer = reluLayer()
layer = reluLayer(Name,Value)
Description
layer = reluLayer() returns a rectified linear unit (ReLU) layer. The layer performs a threshold operation on each element of the input, where any value less than zero is set to zero, i.e.,

    f(x) = x for x >= 0, and f(x) = 0 for x < 0.

The ReLU layer does not change the size of its input.

layer = reluLayer(Name,Value) returns a ReLU layer, with the additional option specified by the Name,Value pair argument.
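For illustration, the threshold operation is equivalent to the following elementwise computation (a minimal sketch, not the layer's internal implementation):

    x = [-1 2; 3 -4];
    y = max(0, x)   % returns [0 2; 3 0]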
Examples
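Create a ReLU layer and include it in a layer array (a sketch using standard Neural Network Toolbox layer functions; the layer name 'relu1' and the surrounding network architecture are illustrative):

    % Create a named ReLU layer
    layer = reluLayer('Name','relu1');

    % Use the ReLU layer in a simple convolutional network
    layers = [ ...
        imageInputLayer([28 28 1])
        convolution2dLayer(5,20)
        reluLayer('Name','relu1')
        fullyConnectedLayer(10)
        softmaxLayer
        classificationLayer];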
Input Arguments
Output Arguments
References
[1] Nair, V., and G. E. Hinton. "Rectified Linear Units Improve Restricted Boltzmann Machines." In Proc. 27th International Conference on Machine Learning, 2010.
See Also
Introduced in R2016a