clippedReluLayer
Clipped Rectified Linear Unit (ReLU) layer
Description
A clipped ReLU layer performs a threshold operation, where any input value less than zero is set to zero and any value above the clipping ceiling is set to that clipping ceiling.
This operation is equivalent to:

f(x) = 0,        x < 0
f(x) = x,        0 ≤ x < ceiling
f(x) = ceiling,  x ≥ ceiling
This clipping prevents the output from becoming too large.
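The thresholding above can be sketched outside of MATLAB as well. The following is a minimal NumPy illustration (not the toolbox implementation); the function name `clipped_relu` and the example ceiling value of 10 are chosen for illustration only:

```python
import numpy as np

def clipped_relu(x, ceiling):
    # Set values below zero to zero, then cap values at the ceiling.
    return np.minimum(np.maximum(x, 0.0), ceiling)

x = np.array([-2.0, 0.5, 3.0, 12.0])
y = clipped_relu(x, 10.0)
print(y)  # [ 0.   0.5  3.  10. ]
```

Negative inputs are zeroed as in a standard ReLU, while the ceiling bounds the activation, which is why clipping keeps the output from growing without limit.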
Creation
Properties
Examples
References
[1] Hannun, Awni, Carl Case, Jared Casper, Bryan Catanzaro, Greg Diamos, Erich Elsen, Ryan Prenger, et al. "Deep Speech: Scaling Up End-to-End Speech Recognition." Preprint, submitted December 17, 2014. http://arxiv.org/abs/1412.5567