leakyReluLayer
Leaky Rectified Linear Unit (ReLU) layer
Description
A leaky ReLU layer performs a threshold operation, where any input value less than zero is multiplied by a fixed scalar.
This operation is equivalent to:

f(x) = x for x ≥ 0, and f(x) = scale * x for x < 0,

where scale is the fixed scalar multiplier applied to negative inputs.
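The thresholding described above can be sketched in a few lines. This is an illustrative NumPy version of the leaky ReLU operation itself, not the MATLAB `leakyReluLayer` API; the function name `leaky_relu` and the default `scale` value are assumptions for the example.

```python
import numpy as np

def leaky_relu(x, scale=0.01):
    """Leaky ReLU: pass non-negative values through unchanged;
    multiply negative values by the fixed scalar `scale`."""
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0, x, scale * x)

# Negative inputs are scaled; non-negative inputs are unchanged.
print(leaky_relu([-2.0, -0.5, 0.0, 3.0], scale=0.1))
```

Unlike a standard ReLU, the negative slope keeps a small gradient flowing for inputs below zero, which is the motivation given in the reference below.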
Creation
Properties
Examples
References
[1] Maas, Andrew L., Awni Y. Hannun, and Andrew Y. Ng. "Rectifier nonlinearities improve neural network acoustic models." In Proc. ICML, vol. 30, no. 1, 2013.