eluLayer

Exponential linear unit (ELU) layer

Description

An ELU activation layer performs the identity operation on positive inputs and an exponential nonlinearity on negative inputs.

The layer performs the following operation:

f(x) = { x,                x ≥ 0
       { α(exp(x) − 1),    x < 0

The default value of α is 1. Specify a value of α for the layer by setting the Alpha property.
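As a cross-check of the piecewise definition above, the operation can be sketched in a few lines. This is a plain Python illustration of the math only, not MATLAB code and not the layer's actual implementation:

```python
import math

def elu(x, alpha=1.0):
    """ELU: identity for x >= 0, alpha*(exp(x) - 1) for x < 0."""
    return x if x >= 0 else alpha * (math.exp(x) - 1.0)

# Positive inputs pass through unchanged.
print(elu(2.0))    # 2.0
# Negative inputs saturate smoothly toward -alpha.
print(elu(-1.0))   # alpha*(exp(-1) - 1), about -0.632
```

Note that the two branches agree at x = 0 (both give 0), so the function is continuous.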

Creation

Description

layer = eluLayer creates an ELU layer.

layer = eluLayer(alpha) creates an ELU layer and specifies the Alpha property.

layer = eluLayer(___,'Name',Name) additionally sets the optional Name property using any of the previous syntaxes. For example, eluLayer('Name','elu1') creates an ELU layer with the name 'elu1'.

Properties

ELU

Nonlinearity parameter α, specified as a numeric scalar. The minimum output of the ELU layer equals -α, and the slope at negative inputs approaching 0 is α.
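Both facts about α follow directly from the formula: as x → −∞, α(exp(x) − 1) → −α, and the derivative α·exp(x) on the negative branch tends to α as x → 0 from below. A quick numerical check, sketched in plain Python (not MATLAB) for illustration:

```python
import math

def elu(x, alpha=1.0):
    # ELU: identity for x >= 0, alpha*(exp(x) - 1) for x < 0.
    return x if x >= 0 else alpha * (math.exp(x) - 1.0)

alpha = 2.0
# Deeply negative inputs saturate toward -alpha.
print(elu(-50.0, alpha))   # very close to -2.0

# Finite-difference slope just below 0 approaches alpha.
h = 1e-6
slope = (elu(0.0, alpha) - elu(-h, alpha)) / h
print(round(slope, 3))     # about 2.0
```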

Layer

Layer name, specified as a character vector or a string scalar. For Layer array input, the trainNetwork, assembleNetwork, layerGraph, and dlnetwork functions automatically assign names to layers with the name ''.

Data Types:char|string

This property is read-only.

Number of inputs of the layer. This layer accepts a single input only.

Data Types:double

This property is read-only.

Input names of the layer. This layer accepts a single input only.

Data Types:cell

This property is read-only.

Number of outputs of the layer. This layer has a single output only.

Data Types:double

This property is read-only.

Output names of the layer. This layer has a single output only.

Data Types:cell

Examples

Create an exponential linear unit (ELU) layer with the name 'elu1' and the default value of 1 for the nonlinearity parameter Alpha.

layer = eluLayer('Name','elu1')
layer = 
  ELULayer with properties:

     Name: 'elu1'
    Alpha: 1

   Learnable Parameters
   No properties.

   State Parameters
   No properties.

Include an ELU layer in a Layer array.

layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,16)
    batchNormalizationLayer
    eluLayer
    maxPooling2dLayer(2,'Stride',2)
    convolution2dLayer(3,32)
    batchNormalizationLayer
    eluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer]

layers = 
  11x1 Layer array with layers:

     1   ''   Image Input             28x28x1 images with 'zerocenter' normalization
     2   ''   Convolution             16 3x3 convolutions with stride [1 1] and padding [0 0 0 0]
     3   ''   Batch Normalization     Batch normalization
     4   ''   ELU                     ELU with Alpha 1
     5   ''   Max Pooling             2x2 max pooling with stride [2 2] and padding [0 0 0 0]
     6   ''   Convolution             32 3x3 convolutions with stride [1 1] and padding [0 0 0 0]
     7   ''   Batch Normalization     Batch normalization
     8   ''   ELU                     ELU with Alpha 1
     9   ''   Fully Connected         10 fully connected layer
    10   ''   Softmax                 softmax
    11   ''   Classification Output   crossentropyex

References

[1] Clevert, Djork-Arné, Thomas Unterthiner, and Sepp Hochreiter. "Fast and accurate deep network learning by exponential linear units (ELUs)." arXiv preprint arXiv:1511.07289 (2015).

Extended Capabilities

C/C++ Code Generation
Generate C and C++ code using MATLAB® Coder™.

GPU Code Generation
Generate CUDA® code for NVIDIA® GPUs using GPU Coder™.

Version History

Introduced in R2019a