
edge

Classification edge

Syntax

E = edge(ens,tbl,ResponseVarName)
E = edge(ens,tbl,Y)
E = edge(ens,X,Y)
E = edge(___,Name,Value)

Description

E = edge(ens,tbl,ResponseVarName) returns the classification edge for ens with data tbl and classification tbl.ResponseVarName.

E = edge(ens,tbl,Y) returns the classification edge for ens with data tbl and classification Y.

E = edge(ens,X,Y) returns the classification edge for ens with data X and classification Y.

E = edge(___,Name,Value) computes the edge with additional options specified by one or more Name,Value pair arguments, using any of the previous syntaxes.

Note

If the predictor data X or the predictor variables in tbl contain any missing values, the edge function can return NaN. For more details, see edge can return NaN for predictor data with missing values.
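For instance, a minimal sketch of the Name,Value syntax (assuming an ensemble ens trained on a numeric matrix X with class labels Y; the option values are illustrative):

% Hypothetical call: cumulative edge over the first 50 weak learners
E = edge(ens,X,Y,'mode','cumulative','learners',1:50);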

Input Arguments

ens

A classification ensemble constructed with fitcensemble, or a compact classification ensemble constructed with compact.

tbl

Sample data, specified as a table. Each row of tbl corresponds to one observation, and each column corresponds to one predictor variable. tbl must contain all of the predictors used to train the model. Multicolumn variables and cell arrays other than cell arrays of character vectors are not allowed.

If you trained ens using sample data contained in a table, then the input data for this method must also be in a table.

ResponseVarName

Response variable name, specified as the name of a variable in tbl.

You must specify ResponseVarName as a character vector or string scalar. For example, if the response variable Y is stored as tbl.Y, then specify it as 'Y'. Otherwise, the software treats all columns of tbl, including Y, as predictors when training the model.
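As a minimal sketch (assuming ens was trained on the table tbl with the response stored in tbl.Y):

E = edge(ens,tbl,'Y')   % edge on the table data, using the response variable name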

X

A matrix where each row represents one observation and each column represents a predictor. The number of columns in X must equal the number of predictors in ens.

If you trained ens using sample data contained in a matrix, then the input data for this method must also be in a matrix.

Y

Class labels of observations in tbl or X. Y should be of the same type as the classification used to train ens, and its number of elements should equal the number of rows of tbl or X.
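For example, a sketch of the matrix syntax (Xnew and Ynew are hypothetical holdout predictors and labels with the same predictor columns and label type used to train ens):

E = edge(ens,Xnew,Ynew)   % edge of the ensemble on holdout data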

Name-Value Arguments

Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

learners

Indices of weak learners in the ensemble ranging from 1 to ens.NumTrained. edge uses only these learners for calculating the edge.

Default: 1:NumTrained
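A minimal sketch (assuming ens contains at least 10 trained weak learners):

E = edge(ens,X,Y,'learners',1:10)   % edge computed from the first 10 learners only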

mode

Meaning of the output E:

  • 'ensemble' - E is a scalar value, the edge for the entire ensemble.

  • 'individual' - E is a vector with one element per trained learner.

  • 'cumulative' - E is a vector in which element J is obtained by using learners 1:J from the input list of learners.

Default: 'ensemble'
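For example, a sketch of the 'cumulative' mode (the plot labels are illustrative):

Ecum = edge(ens,X,Y,'mode','cumulative');   % element J uses learners 1:J
plot(Ecum)
xlabel('Number of weak learners')
ylabel('Cumulative edge')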

UseObsForLearner

A logical matrix of size N-by-T, where:

  • N is the number of rows of X.

  • T is the number of weak learners in ens.

When UseObsForLearner(i,j) is true, learner j is used in predicting the class of row i of X.

Default: true(N,T)
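As a hedged sketch, the following builds a logical matrix that scores every observation with only the first half of the weak learners:

N = size(X,1);
T = ens.NumTrained;
useObs = false(N,T);
useObs(:,1:floor(T/2)) = true;   % use only the first half of the learners
E = edge(ens,X,Y,'UseObsForLearner',useObs);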

UseParallel

Indication to perform inference in parallel, specified as false (compute serially) or true (compute in parallel). Parallel computation requires Parallel Computing Toolbox™. Parallel inference can be faster than serial inference, especially for large datasets. Parallel computation is supported only for tree learners.

Default: false
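A minimal sketch (assumes Parallel Computing Toolbox is installed and ens uses tree learners):

E = edge(ens,X,Y,'UseParallel',true)   % evaluate the edge in parallel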

weights

Observation weights, a numeric vector of length size(X,1). If you supply weights, edge computes the weighted classification edge.

Default: ones(size(X,1),1)
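For instance, a sketch that doubles the weight of the last 50 observations (the weighting scheme is illustrative):

w = ones(size(X,1),1);
w(end-49:end) = 2;               % emphasize the last 50 observations
E = edge(ens,X,Y,'weights',w);   % weighted classification edge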

Output Arguments

E

The classification edge, a vector or scalar depending on the setting of the mode name-value pair. The classification edge is the weighted average classification margin.
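As a sketch of this relationship, with the default unit observation weights the ensemble edge should equal the mean of the per-observation margins returned by the margin method:

m = margin(ens,X,Y);   % per-observation classification margins
E = edge(ens,X,Y);     % scalar ensemble edge ('mode' defaults to 'ensemble')
% With default weights, E is expected to match mean(m) up to rounding.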

Examples


Find the classification edge for some of the data used to train a boosted ensemble classifier.

Load the ionosphere data set.

load ionosphere

Train an ensemble of 100 boosted classification trees using AdaBoostM1.

t = templateTree('MaxNumSplits',1); % Weak learner template tree object
ens = fitcensemble(X,Y,'Method','AdaBoostM1','Learners',t);

Find the classification edge for the last few rows.

E = edge(ens,X(end-10:end,:),Y(end-10:end))
E = 8.3310

More About


Extended Capabilities

Version History


See Also
