
clippedReluLayer

Clipped Rectified Linear Unit (ReLU) layer

Description

A clipped ReLU layer performs a threshold operation, where any input value less than zero is set to zero and any value above the clipping ceiling is set to that clipping ceiling.

This operation is equivalent to:

f(x) = { 0,        x < 0
       { x,        0 ≤ x < ceiling
       { ceiling,  x ≥ ceiling

This clipping prevents the output from becoming too large.
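For illustration only, the piecewise operation above can be sketched outside MATLAB as a small NumPy function (the function name `clipped_relu` and the example ceiling of 10 are assumptions for this sketch, not part of the toolbox API):

```python
import numpy as np

def clipped_relu(x, ceiling):
    # Zero out negative inputs, pass values in [0, ceiling) through
    # unchanged, and cap everything at or above the ceiling.
    return np.minimum(np.maximum(x, 0.0), ceiling)

x = np.array([-2.0, 0.0, 5.0, 10.0, 15.0])
print(clipped_relu(x, 10.0))  # negatives become 0; values above 10 become 10
```

The same behavior is what `clippedReluLayer(10)` applies elementwise to its input during the forward pass.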

Creation

Description

layer = clippedReluLayer(ceiling) returns a clipped ReLU layer with the clipping ceiling equal to ceiling.


layer = clippedReluLayer(ceiling,'Name',Name) sets the optional Name property.

Properties


Clipped ReLU

Ceiling — Ceiling for input clipping, specified as a positive scalar.

Example: 10

Layer

Name — Layer name, specified as a character vector or a string scalar. For Layer array input, the trainNetwork, assembleNetwork, layerGraph, and dlnetwork functions automatically assign names to layers with Name set to ''.

Data Types: char | string

This property is read-only.

NumInputs — Number of inputs of the layer. This layer accepts a single input only.

Data Types: double

This property is read-only.

InputNames — Input names of the layer. This layer accepts a single input only.

Data Types: cell

This property is read-only.

NumOutputs — Number of outputs of the layer. This layer has a single output only.

Data Types: double

This property is read-only.

OutputNames — Output names of the layer. This layer has a single output only.

Data Types: cell

Examples


Create a clipped ReLU layer with the name 'clip1' and the clipping ceiling equal to 10.

layer = clippedReluLayer(10,'Name','clip1')

layer = 
  ClippedReLULayer with properties:

       Name: 'clip1'

   Hyperparameters
    Ceiling: 10

Include a clipped ReLU layer in a Layer array.

layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(5,20)
    clippedReluLayer(10)
    maxPooling2dLayer(2,'Stride',2)
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer]

layers = 
  7x1 Layer array with layers:

     1   ''   Image Input             28x28x1 images with 'zerocenter' normalization
     2   ''   Convolution             20 5x5 convolutions with stride [1 1] and padding [0 0 0 0]
     3   ''   Clipped ReLU            Clipped ReLU with ceiling 10
     4   ''   Max Pooling             2x2 max pooling with stride [2 2] and padding [0 0 0 0]
     5   ''   Fully Connected         10 fully connected layer
     6   ''   Softmax                 softmax
     7   ''   Classification Output   crossentropyex


References

[1] Hannun, Awni, Carl Case, Jared Casper, Bryan Catanzaro, Greg Diamos, Erich Elsen, Ryan Prenger, et al. "Deep Speech: Scaling Up End-to-End Speech Recognition." Preprint, submitted December 17, 2014. http://arxiv.org/abs/1412.5567.

Extended Capabilities

C/C++ Code Generation
Generate C and C++ code using MATLAB® Coder™.

GPU Code Generation
Generate CUDA® code for NVIDIA® GPUs using GPU Coder™.

Introduced in R2017b