regressionLayer

Create a regression output layer

Description

A regression layer computes the half-mean-squared-error loss for regression tasks.

layer = regressionLayer returns a regression output layer for a neural network as a RegressionOutputLayer object.

Predict responses of a trained regression network using predict. Normalizing the responses often helps stabilize and speed up training of neural networks for regression. For more information, see Train Convolutional Neural Network for Regression.

layer = regressionLayer(Name,Value) sets the optional Name and ResponseNames properties using name-value pairs. For example, regressionLayer('Name','output') creates a regression layer with the name 'output'. Enclose each property name in single quotes.

Examples

Create a regression output layer with the name 'routput'.

layer = regressionLayer('Name','routput')
layer =
  RegressionOutputLayer with properties:

             Name: 'routput'
    ResponseNames: {}

   Hyperparameters
     LossFunction: 'mean-squared-error'

The default loss function for regression is mean-squared-error.

Include a regression output layer in a Layer array.

layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(12,25)
    reluLayer
    fullyConnectedLayer(1)
    regressionLayer]
layers =
  5x1 Layer array with layers:

     1   ''   Image Input         28x28x1 images with 'zerocenter' normalization
     2   ''   Convolution         25 12x12 convolutions with stride [1 1] and padding [0 0 0 0]
     3   ''   ReLU                ReLU
     4   ''   Fully Connected     1 fully connected layer
     5   ''   Regression Output   mean-squared-error

Input Arguments

Name-Value Arguments

Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

Example: regressionLayer('Name','output') creates a regression layer with the name 'output'.

Layer name, specified as a character vector or a string scalar. For Layer array input, the trainNetwork, assembleNetwork, layerGraph, and dlnetwork functions automatically assign names to layers with name ''.

Data Types: char | string

Names of the responses, specified as a cell array of character vectors or a string array. At training time, the software automatically sets the response names according to the training data. The default is {}.

Data Types: cell

Output Arguments

Regression output layer, returned as a RegressionOutputLayer object.

More About

Regression Output Layer

A regression layer computes the half-mean-squared-error loss for regression tasks. For typical regression problems, a regression layer must follow the final fully connected layer.

For a single observation, the mean-squared-error is given by:

$$\mathrm{MSE} = \frac{\sum_{i=1}^{R} (t_i - y_i)^2}{R},$$

where $R$ is the number of responses, $t_i$ is the target output, and $y_i$ is the network's prediction for response $i$.
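As a quick numeric sketch (illustrative, not part of the MATLAB documentation), the per-observation MSE formula can be checked with a few lines of NumPy; the values in t and y are made up for the example.

```python
import numpy as np

# Illustrative targets and predictions for R = 3 responses.
t = np.array([1.0, 2.0, 3.0])   # target outputs t_i
y = np.array([1.5, 1.0, 2.0])   # network predictions y_i

R = t.size
mse = np.sum((t - y) ** 2) / R  # MSE = sum_i (t_i - y_i)^2 / R
print(mse)                      # (0.25 + 1.0 + 1.0) / 3 = 0.75
```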

For image and sequence-to-one regression networks, the loss function of the regression layer is the half-mean-squared-error of the predicted responses, not normalized by $R$:

$$\mathrm{loss} = \frac{1}{2} \sum_{i=1}^{R} (t_i - y_i)^2.$$

For image-to-image regression networks, the loss function of the regression layer is the half-mean-squared-error of the predicted responses for each pixel, not normalized by $R$:

$$\mathrm{loss} = \frac{1}{2} \sum_{p=1}^{HWC} (t_p - y_p)^2,$$

where $H$, $W$, and $C$ denote the height, width, and number of channels of the output respectively, and $p$ indexes into each element (pixel) of $t$ and $y$ linearly.

For sequence-to-sequence regression networks, the loss function of the regression layer is the half-mean-squared-error of the predicted responses for each time step, not normalized by $R$:

$$\mathrm{loss} = \frac{1}{2S} \sum_{i=1}^{S} \sum_{j=1}^{R} (t_{ij} - y_{ij})^2,$$

where $S$ is the sequence length.

When training, the software calculates the mean loss over the observations in the mini-batch.
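The sequence-to-sequence loss and the mini-batch averaging described above can be sketched in NumPy (an illustrative re-implementation, not the Deep Learning Toolbox code; the batch values are made up for the example):

```python
import numpy as np

def seq_loss(t, y):
    """Half-MSE for one sequence: t and y have shape (S, R),
    i.e. loss = (1 / 2S) * sum over time steps and responses of (t - y)^2."""
    S = t.shape[0]
    return np.sum((t - y) ** 2) / (2 * S)

# Illustrative mini-batch of 2 observations, S = 2 time steps, R = 2 responses.
t_batch = np.array([[[1.0, 0.0], [0.0, 1.0]],
                    [[2.0, 2.0], [2.0, 2.0]]])
y_batch = np.array([[[0.0, 0.0], [0.0, 0.0]],
                    [[2.0, 2.0], [2.0, 2.0]]])

losses = [seq_loss(t, y) for t, y in zip(t_batch, y_batch)]
mini_batch_loss = np.mean(losses)  # mean loss over the observations
print(losses, mini_batch_loss)     # [0.5, 0.0] 0.25
```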

Extended Capabilities

C/C++ Code Generation
Generate C and C++ code using MATLAB® Coder™.

GPU Code Generation
Generate CUDA® code for NVIDIA® GPUs using GPU Coder™.

Version History

Introduced in R2017a