
resubLoss

Resubstitution regression loss

    Description


    L = resubLoss(Mdl) returns the regression loss by resubstitution (L), or the in-sample regression loss, for the trained regression model Mdl using the training data stored in Mdl.X and the corresponding responses stored in Mdl.Y.

    The interpretation of L depends on the loss function ('LossFun') and weighting scheme (Mdl.W). In general, better models yield smaller loss values. The default 'LossFun' value is 'mse' (mean squared error).


    L = resubLoss(Mdl,Name,Value) specifies additional options using one or more name-value arguments. For example, 'IncludeInteractions',false specifies to exclude interaction terms from a generalized additive model Mdl.
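
    For instance, the two syntaxes can be combined in a short workflow. The following is a minimal sketch (not part of the original documentation); it uses the patients data set with an illustrative choice of predictors:

    % Train a GAM with all interaction terms, then compute the in-sample loss
    % with the default 'mse' and again with interaction terms excluded.
    load patients
    Mdl = fitrgam([Age Diastolic Weight],Systolic,'Interactions','all');
    mseDefault = resubLoss(Mdl)                              % default 'mse' loss
    mseLinear  = resubLoss(Mdl,'IncludeInteractions',false)  % linear terms only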

    Examples


    Train a generalized additive model (GAM), then calculate the resubstitution loss using the mean squared error (MSE).

    Load the patients data set.

    load patients

    Create a table that contains the predictor variables (Age,Diastolic,Smoker,Weight,Gender,SelfAssessedHealthStatus) and the response variable (Systolic).

    tbl = table(Age,Diastolic,Smoker,Weight,Gender,SelfAssessedHealthStatus,Systolic);

    Train a univariate GAM that contains the linear terms for the predictors in tbl.

    Mdl = fitrgam(tbl,"Systolic")
    Mdl = 
      RegressionGAM
                 PredictorNames: {1x6 cell}
                   ResponseName: 'Systolic'
          CategoricalPredictors: [3 5 6]
              ResponseTransform: 'none'
                      Intercept: 122.7800
         IsStandardDeviationFit: 0
                NumObservations: 100

      Properties, Methods

    Mdl is a RegressionGAM model object.

    Calculate the resubstitution loss using the mean squared error (MSE).

    L = resubLoss(Mdl)
    L = 4.1957
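
    As a quick sanity check (an illustrative sketch, assuming the default uniform observation weights), the default loss matches the mean squared difference between the stored responses and the resubstitution predictions:

    yhat = resubPredict(Mdl);                    % in-sample (resubstitution) predictions
    mseManual = mean((tbl.Systolic - yhat).^2)   % should match L up to rounding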

    Load the sample data and store it in a table.

    load fisheriris
    tbl = table(meas(:,1),meas(:,2),meas(:,3),meas(:,4),species,...
        'VariableNames',{'meas1','meas2','meas3','meas4','species'});

    Fit a GPR model using the first measurement as the response and the other variables as the predictors.

    mdl = fitrgp(tbl,'meas1');

    Predict the responses using the trained model.

    ypred = predict(mdl,tbl);

    Compute the mean absolute error.

    n = height(tbl);
    y = tbl.meas1;
    fun = @(y,ypred,w) sum(abs(y-ypred))/n;
    L = resubLoss(mdl,'lossfun',fun)
    L = 0.2345
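
    The function handle also receives the observation weights as its third input, so a weighted version of the same loss is a small change. A minimal sketch (with the default uniform weights, this returns the same value):

    wfun = @(y,ypred,w) sum(w.*abs(y-ypred))/sum(w);   % weighted mean absolute error
    L_weighted = resubLoss(mdl,'LossFun',wfun)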

    Train a generalized additive model (GAM) that contains both linear and interaction terms for predictors, and estimate the regression loss (mean squared error, MSE) with and without interaction terms for the training data and test data. Specify whether to include interaction terms when estimating the regression loss.

    Load the carbig data set, which contains measurements of cars made in the 1970s and early 1980s.

    load carbig

    Specify Acceleration, Displacement, Horsepower, and Weight as the predictor variables (X) and MPG as the response variable (Y).

    X = [Acceleration,Displacement,Horsepower,Weight];
    Y = MPG;

    Partition the data set into two sets: one containing training data, and the other containing new, unobserved test data. Reserve 10 observations for the new test data set.

    rng('default') % For reproducibility
    n = size(X,1);
    newInds = randsample(n,10);
    inds = ~ismember(1:n,newInds);
    XNew = X(newInds,:);
    YNew = Y(newInds);

    Train a generalized additive model that contains all the available linear and interaction terms in X.

    Mdl = fitrgam(X(inds,:),Y(inds),'Interactions','all');

    Mdl is a RegressionGAM model object.

    Compute the resubstitution MSEs (that is, the in-sample MSEs) both with and without interaction terms in Mdl. To exclude interaction terms, specify 'IncludeInteractions',false.

    resubl = resubLoss(Mdl)
    resubl = 0.0292
    resubl_nointeraction = resubLoss(Mdl,'IncludeInteractions',false)
    resubl_nointeraction = 4.7330

    Compute the regression MSEs both with and without interaction terms for the test data set. Use a memory-efficient model object for the computation.

    CMdl = compact(Mdl);

    CMdl is a CompactRegressionGAM model object.

    l = loss(CMdl,XNew,YNew)
    l = 12.8604
    l_nointeraction = loss(CMdl,XNew,YNew,'IncludeInteractions',false)
    l_nointeraction = 15.6741

    Including interaction terms achieves a smaller error for both the training data set and the test data set.

    Input Arguments


    Regression machine learning model, specified as a full regression model object, as given in the following table of supported models.

    Model Regression Model Object
    Gaussian process regression model RegressionGP
    Generalized additive model (GAM) RegressionGAM
    Neural network model RegressionNeuralNetwork

    Name-Value Arguments

    Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

    Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

    Example: resubLoss(Mdl,'IncludeInteractions',false) excludes interaction terms from the generalized additive model Mdl.

    Flag to include interaction terms of the model, specified as true or false. This argument is valid only for a generalized additive model. That is, you can specify this argument only when Mdl is RegressionGAM.

    The default value is true if Mdl contains interaction terms. The value must be false if the model does not contain interaction terms.

    Example: 'IncludeInteractions',false

    Data Types: logical

    Loss function, specified as 'mse' or a function handle.

    • 'mse' — Weighted mean squared error.

    • Function handle — To specify a custom loss function, use a function handle (see the sketch after this argument description). The function must have this form:

      lossval = lossfun(Y,YFit,W)

      • The output argument lossval is a floating-point scalar.

      • You specify the function name (lossfun).

      • Y is a length n numeric vector of observed responses, where n is the number of observations in Tbl or X.

      • YFit is a length n numeric vector of corresponding predicted responses.

      • W is an n-by-1 numeric vector of observation weights.

    Example: 'LossFun',@lossfun

    Data Types: char | string | function_handle
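
    For example, a custom loss function with this signature might compute the weighted mean absolute error. This is an illustrative sketch; the function name maeLoss is hypothetical:

    function lossval = maeLoss(Y,YFit,W)
    % Weighted mean absolute error, matching the 'LossFun' signature above
    lossval = sum(W.*abs(Y - YFit))/sum(W);
    end

    Save the function on the path (or at the end of a script) and pass it as resubLoss(Mdl,'LossFun',@maeLoss).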

    More About


    Weighted Mean Squared Error

    The weighted mean squared error measures the predictive inaccuracy of regression models. When you compare the same type of loss among many models, a lower error indicates a better predictive model.

    The weighted mean squared error is calculated as follows:

    $$\mathrm{mse} = \frac{\sum_{j=1}^{n} w_j \bigl(f(x_j) - y_j\bigr)^2}{\sum_{j=1}^{n} w_j},$$

    where:

    • n is the number of rows of data.

    • xj is the jth row of data.

    • yj is the true response to xj.

    • f(xj) is the response prediction of the model Mdl to xj.

    • w is the vector of observation weights.
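
    In code, the same quantity can be computed directly from a model's predictions. The following is an illustrative sketch, assuming a trained model Mdl, training predictors X, responses Y, and an observation weight vector w:

    f = predict(Mdl,X);                        % model predictions f(x_j)
    mseWeighted = sum(w.*(f - Y).^2)/sum(w)    % weighted mean squared error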

    Algorithms

    resubLoss computes the regression loss according to the corresponding loss function of the object (Mdl). For a model-specific description, see the loss function reference pages in the following table.

    Model Regression Model Object (Mdl) loss Object Function
    Gaussian process regression model RegressionGP loss
    Generalized additive model RegressionGAM loss
    Neural network model RegressionNeuralNetwork loss

    Alternative Functionality

    To compute the response loss for new predictor data, use the corresponding loss function of the object (Mdl).
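
    In other words, for the supported models, resubLoss(Mdl) gives roughly the same result as calling loss on the stored training data. A minimal sketch, assuming Mdl was trained on matrix predictors with the default uniform weights so that Mdl.X and Mdl.Y hold the training data:

    L_insample = resubLoss(Mdl);         % in-sample loss
    L_check    = loss(Mdl,Mdl.X,Mdl.Y)   % same data, default weights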

    Version History

    Introduced in R2021a

    See Also