
dlgradient

Compute gradients for custom training loops using automatic differentiation

Description

Use dlgradient to compute derivatives using automatic differentiation for custom training loops.

Tip

For most deep learning tasks, you can use a pretrained network and adapt it to your own data. For an example showing how to use transfer learning to retrain a convolutional neural network to classify a new set of images, see Train Deep Learning Network to Classify New Images. Alternatively, you can create and train networks from scratch using layerGraph objects with the trainNetwork and trainingOptions functions.

If the trainingOptions function does not provide the training options that you need for your task, then you can create a custom training loop using automatic differentiation. To learn more, see Define Deep Learning Network for Custom Training Loops.


[dydx1,...,dydxk] = dlgradient(y,x1,...,xk) returns the gradients of y with respect to the variables x1 through xk.

Call dlgradient from inside a function passed to dlfeval. See Compute Gradient Using Automatic Differentiation and Use Automatic Differentiation In Deep Learning Toolbox.
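For example, the following is a minimal sketch of such a function for a custom training loop; the function name modelGradients, the dlnetwork variable net, and the choice of crossentropy loss are illustrative assumptions, not part of the dlgradient API:

function [loss,gradients] = modelGradients(net,X,T)
% Forward pass through the dlnetwork.
Y = forward(net,X);

% The value to differentiate must be a real scalar.
loss = crossentropy(Y,T);

% Gradients with respect to the learnable parameters table.
gradients = dlgradient(loss,net.Learnables);
end

Inside the training loop, evaluate the function with [loss,gradients] = dlfeval(@modelGradients,net,X,T).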

[dydx1,...,dydxk] = dlgradient(y,x1,...,xk,Name,Value) returns the gradients and specifies additional options using one or more name-value pairs. For example, dydx = dlgradient(y,x,'RetainData',true) causes the gradient to retain intermediate values for reuse in subsequent dlgradient calls. This syntax can save time, but uses more memory. For more information, see Tips.

Examples


Rosenbrock's function is a standard test function for optimization. The rosenbrock.m helper function computes the function value and uses automatic differentiation to compute its gradient.

type rosenbrock.m

function [y,dydx] = rosenbrock(x)

y = 100*(x(2) - x(1).^2).^2 + (1 - x(1)).^2;
dydx = dlgradient(y,x);

end

To evaluate Rosenbrock's function and its gradient at the point [–1,2], create a dlarray of the point and then call dlfeval on the function handle @rosenbrock.

x0 = dlarray([-1,2]);
[fval,gradval] = dlfeval(@rosenbrock,x0)
fval =
  1x1 dlarray
   104

gradval =
  1x2 dlarray
   396   200

Alternatively, define Rosenbrock's function as a function of two inputs, x1 and x2.

type rosenbrock2.m

function [y,dydx1,dydx2] = rosenbrock2(x1,x2)

y = 100*(x2 - x1.^2).^2 + (1 - x1).^2;
[dydx1,dydx2] = dlgradient(y,x1,x2);

end

Call dlfeval to evaluate rosenbrock2 on two dlarray arguments representing the inputs –1 and 2.

x1 = dlarray(-1);
x2 = dlarray(2);
[fval,dydx1,dydx2] = dlfeval(@rosenbrock2,x1,x2)
fval =
  1x1 dlarray
   104

dydx1 =
  1x1 dlarray
   396

dydx2 =
  1x1 dlarray
   200

Plot the gradient of Rosenbrock's function for several points in the unit square. First, initialize the arrays representing the evaluation points and the output of the function.

[X1,X2] = meshgrid(linspace(0,1,10));
X1 = dlarray(X1(:));
X2 = dlarray(X2(:));
Y = dlarray(zeros(size(X1)));
DYDX1 = Y;
DYDX2 = Y;

Evaluate the function in a loop. Plot the result using quiver.

for i = 1:length(X1)
    [Y(i),DYDX1(i),DYDX2(i)] = dlfeval(@rosenbrock2,X1(i),X2(i));
end
quiver(extractdata(X1),extractdata(X2),extractdata(DYDX1),extractdata(DYDX2))
xlabel('x1')
ylabel('x2')

(Figure: quiver plot of the gradient of Rosenbrock's function over the unit square.)

Use dlgradient and dlfeval to compute the value and gradient of a function that involves complex numbers. You can compute complex gradients, or restrict the gradients to real numbers only.

Define the function complexFun, listed at the end of this example. This function implements the following complex formula:

f(x) = (2 + 3i)x

Define the function gradFun, listed at the end of this example. This function calls complexFun and uses dlgradient to calculate the gradient of the result with respect to the input. For automatic differentiation, the value to differentiate — i.e., the value of the function calculated from the input — must be a real scalar, so the function takes the sum of the real part of the result before calculating the gradient. The function returns the real part of the function value and the gradient, which can be complex.

Define the sample points over the complex plane between –2 and 2 and between –2i and 2i, and convert to dlarray.

functionRes = linspace(-2,2,100);
x = functionRes + 1i*functionRes.';
x = dlarray(x);

Calculate the function value and gradient at each sample point.

[y,grad] = dlfeval(@gradFun,x);
y = extractdata(y);

Define the sample points at which to display the gradient.

gradientRes = linspace(-2,2,11);
xGrad = gradientRes + 1i*gradientRes.';

Extract the gradient values at these sample points.

[~,gradPlot] = dlfeval(@gradFun,dlarray(xGrad));
gradPlot = extractdata(gradPlot);

Plot the results. Use imagesc to show the value of the function over the complex plane. Use quiver to show the direction and magnitude of the gradient.

imagesc([-2,2],[-2,2],y);
axis xy
colorbar
hold on
quiver(real(xGrad),imag(xGrad),real(gradPlot),imag(gradPlot),"k");
xlabel("Real")
ylabel("Imaginary")
title("Real Value and Gradient","Re$(f(x)) = $ Re$((2+3i)x)$","interpreter","latex")

The gradient of the function is the same across the entire complex plane. Extract the value of the gradient calculated by automatic differentiation.

grad(1,1)
ans =
  1x1 dlarray
   2.0000 - 3.0000i

By inspection, the complex derivative of the function has the value

df(x)/dx = 2 + 3i

However, the function Re(f(x)) is not analytic, and therefore no complex derivative is defined. For automatic differentiation in MATLAB, the value to differentiate must always be real, and therefore the function can never be complex analytic. Instead, the derivative is computed such that the returned gradient points in the direction of steepest ascent, as seen in the plot. This is done by interpreting the function Re(f(x)): C → R as a function Re(f(xR + i·xI)): R × R → R, where x = xR + i·xI with xR and xI real. For this example, Re(f(x)) = 2·xR − 3·xI, so the steepest-ascent direction (2,−3) is returned as the complex number 2 − 3i, matching the value extracted above.

function y = complexFun(x)
y = (2+3i)*x;
end

function [y,grad] = gradFun(x)
y = complexFun(x);
y = real(y);
grad = dlgradient(sum(y,"all"),x);
end

Input Arguments


y — Variable to differentiate, specified as a scalar dlarray object. For differentiation, y must be a traced function of dlarray inputs (see Traced dlarray) and must consist of supported functions for dlarray (see List of Functions with dlarray Support).

Variable to differentiate must be real even when the name-value option 'AllowComplex' is set to true.

Example: 100*(x(2) - x(1).^2).^2 + (1 - x(1)).^2

Example: relu(X)

Data Types: single | double | logical

x1,...,xk — Variable in the function, specified as a dlarray object, a cell array, structure, or table containing dlarray objects, or any combination of such arguments recursively. For example, an argument can be a cell array containing a cell array that contains a structure containing dlarray objects.

If you specify x1,...,xk as a table, the table must contain the following variables:

  • Layer — Layer name, specified as a string scalar.

  • Parameter — Parameter name, specified as a string scalar.

  • Value — Value of parameter, specified as a cell array containing a dlarray.

Example: dlarray([1 2;3 4])

Data Types: single | double | logical | struct | cell
Complex Number Support: Yes
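For instance, here is a hedged sketch of differentiating with respect to parameters stored in a structure; the function name modelLoss, the field names W and b, and the least-squares loss are illustrative assumptions:

function [loss,gradients] = modelLoss(parameters,X,T)
% Simple linear model; parameters is a structure of dlarray fields.
Y = parameters.W*X + parameters.b;
loss = sum((Y - T).^2,"all");

% The gradients output is a structure with the same fields as parameters.
gradients = dlgradient(loss,parameters);
end

Evaluate the function with dlfeval, for example [loss,gradients] = dlfeval(@modelLoss,parameters,X,T), where parameters.W and parameters.b are dlarray objects.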

Name-Value Arguments

Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

Example: dydx = dlgradient(y,x,'RetainData',true) causes the gradient to retain intermediate values for reuse in subsequent dlgradient calls.

'RetainData' — Flag to retain trace data during the function call, specified as false or true. When this argument is false, a dlarray discards the derivative trace immediately after computing a derivative. When this argument is true, a dlarray retains the derivative trace until the end of the dlfeval function call that evaluates the dlgradient. The true setting is useful only when the dlfeval call contains more than one dlgradient call. The true setting causes the software to use more memory, but can save time when multiple dlgradient calls use at least part of the same trace.

When 'EnableHigherDerivatives' is true, then intermediate values are retained and the 'RetainData' option has no effect.

Example: dydx = dlgradient(y,x,'RetainData',true)

Data Types: logical
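For instance, here is a minimal sketch of one dlfeval call containing two dlgradient calls that share part of a trace; the function name twoGradients and the formulas are illustrative:

function [dy1dx,dy2dx] = twoGradients(x)
% The intermediate value u is traced once and shared by both outputs.
u = sin(x);
y1 = sum(u.^2,"all");
y2 = sum(exp(u),"all");

% Retain the trace so that the second dlgradient call can reuse it.
dy1dx = dlgradient(y1,x,'RetainData',true);
dy2dx = dlgradient(y2,x);
end

Evaluate with [g1,g2] = dlfeval(@twoGradients,dlarray([1 2 3])).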

'EnableHigherDerivatives' — Flag to enable higher-order derivatives, specified as one of the following:

  • true — Enable higher-order derivatives. Trace the backward pass so that the returned gradients can be used in further computations for subsequent calls to the dlgradient function. If 'EnableHigherDerivatives' is true, then intermediate values are retained and the 'RetainData' option has no effect.

  • false — Disable higher-order derivatives. Do not trace the backward pass. Use this option when you need to compute first-order derivatives only, as this is usually quicker and requires less memory.

When using the dlgradient function inside an AcceleratedFunction object, the default value is true. Otherwise, the default value is false.

For examples showing how to train models that require calculating higher-order derivatives, see:

Data Types: logical
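For instance, here is a minimal sketch that computes a second derivative by differentiating a traced first derivative; the function name secondDerivative is illustrative:

function [dydx,d2ydx2] = secondDerivative(x)
y = sum(x.^3,"all");

% Trace the backward pass so that dydx itself can be differentiated.
dydx = dlgradient(y,x,'EnableHigherDerivatives',true);

% Differentiate the first derivative to obtain the second derivative.
d2ydx2 = dlgradient(sum(dydx,"all"),x);
end

For a scalar input, [g,h] = dlfeval(@secondDerivative,dlarray(2)) returns g = 12 (that is, 3x^2 at x = 2) and h = 12 (that is, 6x at x = 2).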

'AllowComplex' — Flag to allow complex variables in function and complex gradients, specified as one of the following:

  • true — Allow complex variables in function and complex gradients. Variables in the function can be specified as complex numbers. Gradients can be complex even if all variables are real. Variable to differentiate must be real.

  • false — Do not allow complex variables and gradients. Variable to differentiate and any variables in the function must be real numbers. Gradients are always real. Intermediate values can still be complex.

Variable to differentiate must be real even when the name-value option 'AllowComplex' is set to true.

Data Types: logical
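For instance, here is a minimal sketch that accepts a complex input, reusing the formula from the complex-gradient example above; the function name complexInputGrad is illustrative:

function grad = complexInputGrad(x)
% The value to differentiate must be real, so sum the real part.
y = real((2+3i)*x);
grad = dlgradient(sum(y,"all"),x,'AllowComplex',true);
end

Calling dlfeval(@complexInputGrad,dlarray(1+2i)) returns 2.0000 - 3.0000i, consistent with the Wirtinger convention described in Tips.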

Output Arguments


dydx1,...,dydxk — Gradient, returned as a dlarray object, or a cell array, structure, or table containing dlarray objects, or any combination of such arguments recursively. The size and data type of dydx1,...,dydxk are the same as those of the associated input variable x1,...,xk.

Limitations

  • The dlgradient function does not support calculating higher-order derivatives when using dlnetwork objects containing custom layers with a custom backward function.

  • The dlgradient function does not support calculating higher-order derivatives when using dlnetwork objects containing the following layers:

    • gruLayer

    • lstmLayer

    • bilstmLayer

  • The dlgradient function does not support calculating higher-order derivatives that depend on the following functions:

    • gru

    • lstm

    • embed

    • prod

    • interp1

More About


Traced dlarray

During the computation of a function, a dlarray internally records the steps taken in a trace, enabling reverse mode automatic differentiation. The trace occurs within a dlfeval call. See Automatic Differentiation Background.
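For example, in the following minimal sketch the gradient computation succeeds inside dlfeval, but calling the same function handle directly fails because no trace exists outside the dlfeval call:

f = @(x) dlgradient(sum(x.^2,"all"),x);
x = dlarray([1 2 3]);

dydx = dlfeval(f,x)   % returns 2*x, computed from the recorded trace
% f(x)                % errors: x is not traced outside a dlfeval call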

Tips

  • A dlgradient call must be inside a function. To obtain a numeric value of a gradient, you must evaluate the function using dlfeval, and the argument to the function must be a dlarray. See Use Automatic Differentiation In Deep Learning Toolbox.

  • To enable the correct evaluation of gradients, the y argument must use only supported functions for dlarray. See List of Functions with dlarray Support.

  • If you set the 'RetainData' name-value pair argument to true, the software preserves tracing for the duration of the dlfeval function call instead of erasing the trace immediately after the derivative computation. This preservation can cause a subsequent dlgradient call within the same dlfeval call to be executed faster, but uses more memory. For example, in training an adversarial network, the 'RetainData' setting is useful because the two networks share data and functions during training. See Train Generative Adversarial Network (GAN).

  • When you need to calculate first-order derivatives only, ensure that the 'EnableHigherDerivatives' option is false, as this is usually quicker and requires less memory.

  • Complex gradients are calculated using the Wirtinger derivative. The gradient is defined in the direction of increase of the real part of the function to differentiate. This is because the variable to differentiate — for example, the loss — must be real, even if the function is complex.


Version History

Introduced in R2019b