
pentropy

Spectral entropy of signal

Description


se = pentropy(xt) returns the spectral entropy of single-variable, single-column timetable xt as the timetable se. pentropy computes the spectrogram of xt using the default options of pspectrum.


se = pentropy(x,sampx) returns the spectral entropy of vector x, sampled at rate or time interval sampx, as a vector.


se = pentropy(p,fp,tp) returns the spectral entropy using the power spectrogram p, along with spectrogram frequency and time vectors fp and tp.

Use this syntax when you want to customize the options for pspectrum, rather than accept the default pspectrum options that pentropy applies.
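For illustration, a minimal sketch of this workflow, assuming a signal vector x sampled at fs hertz; the pspectrum options shown are purely illustrative:

% Sketch: customize the spectrogram with pspectrum, then pass it to pentropy.
% x and fs are assumed to exist; the spectrogram options are illustrative only.
[p,fp,tp] = pspectrum(x,fs,"spectrogram",FrequencyResolution=50,Leakage=0.9);
se = pentropy(p,fp,tp);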


se = pentropy(___,Name=Value) specifies additional properties using name-value arguments. Options include instantaneous or whole-signal entropy, scaling by white noise entropy, frequency limits, and time limits. You can use Name=Value with any of the input arguments in previous syntaxes.


[se,t] = pentropy(___) returns the spectral entropy se along with the time vector or timetable t. If se is a timetable, then t is equal to the row times of timetable se. This syntax does not apply if Instantaneous is set to false.

pentropy(___) with no output arguments plots the spectral entropy against time. If Instantaneous is set to false, the function outputs the scalar value of the spectral entropy.

Examples


Plot the spectral entropy of a signal expressed as a timetable and as a time series.

Generate a random series with normal distribution (white noise).

xn = randn(1000,1);

Create time vector t and convert it to duration vector tdur. Combine tdur and xn in a timetable.

fs = 10;
ts = 1/fs;
t = 0.1:ts:100;
tdur = seconds(t);
xt = timetable(tdur',xn);

Plot the spectral entropy of the timetable xt.

pentropy(xt)
title('Spectral Entropy of White Noise Signal Timetable')

Figure: Spectral Entropy of White Noise Signal Timetable (spectral entropy versus time in minutes).

Plot the spectral entropy of the signal, using time-point vector t and the form that returns se and the associated time te. Match the x-axis units and grid to the pentropy-generated plots for comparison.

[se,te] = pentropy(xn,t');
te_min = te/60;
plot(te_min,se)
title('Spectral Entropy of White Noise Signal Vector')
xlabel('Time (mins)')
ylabel('Spectral Entropy')
grid on

Figure: Spectral Entropy of White Noise Signal Vector (spectral entropy versus time in minutes).

Both yield the same result.

The second input argument for pentropy can represent either frequency or time. The software interprets it according to the data type of the argument. Plot the spectral entropy of the signal, using the sample rate scalar fs instead of the time vector t.

pentropy(xn,fs)
title('Spectral Entropy of White Noise Signal Vector using Sample Rate')

Figure: Spectral Entropy of White Noise Signal Vector using Sample Rate (spectral entropy versus time in minutes).

This plot matches the previous plots.

Plot the spectral entropy of a speech signal and compare it to the original signal. Visualize the spectral entropy on a color map by first creating a power spectrogram, and then taking the spectral entropy of frequency bins within the bandwidth of speech.

Load the data x, which contains a two-channel recording of the word "Hello" embedded in low-level white noise. x consists of two columns representing the two channels. Use only the first channel.

Define the sample rate and the time vector. Augment the first channel ofxwith white noise to achieve a signal-to-noise ratio of about 5 to 1.

load Hellox
fs = 44100;
t = 1/fs*(0:length(x)-1);
x1 = x(:,1) + 0.01*randn(length(x),1);

Find the spectral entropy. Visualize the data for the original signal and for the spectral entropy.

[se,te] = pentropy(x1,fs);
subplot(2,1,1)
plot(t,x1)
ylabel("Speech Signal")
xlabel("Time")
subplot(2,1,2)
plot(te,se)
ylabel("Spectral Entropy")
xlabel("Time")

Figure: speech signal (top) and its spectral entropy (bottom), both plotted against time.

The spectral entropy drops when "Hello" is spoken. This is because the signal spectrum has changed from almost a constant (white noise) to the distribution of a human voice. The human-voice distribution contains more information and has lower spectral entropy.

Compute the power spectrogram p of the original signal, returning the frequency vector fp and time vector tp as well. For this case, specifying a frequency resolution of 20 Hz provides acceptable clarity in the result.

[p,fp,tp] = pspectrum(x1,fs,"spectrogram",FrequencyResolution=20);

The frequency vector of the power spectrogram goes to 22,050 Hz, but the range of interest with respect to speech is limited to the telephony bandwidth of 300–3400 Hz. Divide the data into five frequency bins by defining start and end points, and compute the spectral entropy for each bin.

flow = [300 628 1064 1634 2394];
fup = [627 1060 1633 2393 3400];
se2 = zeros(length(flow),size(p,2));
for i = 1:length(flow)
    se2(i,:) = pentropy(p,fp,tp,FrequencyLimits=[flow(i) fup(i)]);
end

Visualize the data in a color map that shows ascending frequency bins, and compare with the original signal.

figure
subplot(2,1,1)
plot(t,x1)
xlabel("Time (seconds)")
ylabel("Speech Signal")
subplot(2,1,2)
imagesc(tp,[],flip(se2))   % Flip se2 so its plot corresponds to the ascending frequency bins.
h = colorbar(gca,"NorthOutside");
ylabel(h,"Spectral Entropy")
yticks(1:5)
set(gca,YTickLabel=num2str((5:-1:1).'))   % Label the ticks for the ascending bins.
xlabel("Time (seconds)")
ylabel("Frequency Bin")

Figure: speech signal (top) and a color map of the spectral entropy in the five ascending frequency bins (bottom), both plotted against time in seconds.

Create a signal that combines white noise with a segment that consists of a sine wave. Use spectral entropy to detect the existence and position of the sine wave.

Generate and plot the signal, which contains three segments. The middle segment contains the sine wave along with white noise. The other two segments are pure white noise.

fs = 100;
t = 0:1/fs:10;
sin_wave = 2*sin(2*pi*20*t') + randn(length(t),1);
x = [randn(1000,1); sin_wave; randn(1000,1)];
t3 = 0:1/fs:30;
plot(t3,x)
title("Sine Wave in White Noise")

Figure: Sine Wave in White Noise (signal amplitude versus time).

Plot the spectral entropy.

pentropy(x,fs)
title("Spectral Entropy of Sine Wave in White Noise")

Figure: Spectral Entropy of Sine Wave in White Noise (spectral entropy versus time in seconds).

The plot clearly differentiates the segment with the sine wave from the white-noise segments. This is because the sine wave contains information. Pure white noise has the highest spectral entropy.

The default for pentropy is to return or plot the instantaneous spectral entropy for each time point, as the previous plot displays. You can also distill the spectral entropy information into a single number that represents the entire signal by setting Instantaneous to false. Use the form that returns the spectral entropy value if you want to directly use the result in other calculations. Otherwise, pentropy returns the spectral entropy in ans.

se = pentropy(x,fs,Instantaneous=false)
se = 0.9033

A single number characterizes the spectral entropy, and therefore the information content, of the signal. You can use this number to efficiently compare this signal with other signals.

Input Arguments


Signal timetable from which pentropy returns the spectral entropy se, specified as a timetable that contains a single variable with a single column. xt must contain increasing, finite row times. If the xt timetable has missing or duplicate time points, you can fix it using the tips in Clean Timetable with Missing, Duplicate, or Nonuniform Times. xt can be nonuniformly sampled, with the pspectrum constraint that the median time interval and the mean time interval must obey:

$$\frac{1}{100} < \frac{\text{Median time interval}}{\text{Mean time interval}} < 100.$$

For an example, see Plot Spectral Entropy of Signal.
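As a quick check of this constraint, here is a minimal sketch, assuming t holds the row times of xt as a numeric or duration vector:

% Sketch: verify the nonuniform-sampling constraint for a time vector t.
dt = diff(t);                    % intervals between consecutive samples
ratio = median(dt)/mean(dt);     % must lie between 1/100 and 100
ok = (ratio > 1/100) && (ratio < 100)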

Time-series signal from which pentropy returns the spectral entropy se, specified as a vector.

Sample rate or sample time, specified as one of the following:

  • Positive numeric scalar — Sample rate in hertz

  • duration scalar — Time interval between consecutive samples of x

  • Vector, duration array, or datetime array — Time instant or duration corresponding to each element of x

When sampx represents a time vector, time samples can be nonuniform, with the pspectrum constraint that the median time interval and the mean time interval must obey:

$$\frac{1}{100} < \frac{\text{Median time interval}}{\text{Mean time interval}} < 100.$$

For an example, see Plot Spectral Entropy of Signal.

Power spectrogram or spectrum of a signal, specified as a matrix (spectrogram) or a column vector (spectrum). If you specify p, then pentropy uses p rather than generating its own spectrogram or power spectrogram. fp and tp, which provide the frequency and time information, must accompany p. Each element of p at the ith row and jth column represents the signal power at the frequency bin centered at fp(i) and the time instant tp(j).

For an example, see Plot Spectral Entropy of Speech Signal.

Frequencies for the spectrogram or power spectrogram p when p is supplied explicitly to pentropy, specified as a vector in hertz. The length of fp must equal the number of rows in p.

Time information for the power spectrogram or spectrum p when p is supplied explicitly to pentropy, specified as one of the following:

  • Vector of time points, whose data type can be numeric, duration, or datetime. The length of vector tp must equal the number of columns in p.

  • duration scalar that represents the time interval in p. The scalar form of tp can be used only when p is a power spectrogram matrix (see the sketch after this list).

  • For the special case where p is a column vector (power spectrum), tp can be a numeric, duration, or datetime scalar representing the time point of the spectrum.

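As an illustration of the scalar form, a hedged sketch that assumes a uniformly spaced spectrogram of a signal x1 sampled at fs hertz:

% Sketch: pass the spectrogram time step to pentropy as a duration scalar tp.
[p,fp,tpv] = pspectrum(x1,fs,"spectrogram");
dt = seconds(tpv(2)-tpv(1));    % uniform time step of the spectrogram
se = pentropy(p,fp,dt);         % scalar tp, valid only for a spectrogram matrix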

Name-Value Arguments

Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

Example: "Instantaneous",false,"FrequencyLimits",[25 50] computes the scalar spectral entropy representing the portion of the signal ranging from 25 Hz to 50 Hz.

Instantaneous time series option, specified as a logical.

  • If Instantaneous is true, then pentropy returns the instantaneous spectral entropy as a time-series vector.

  • If Instantaneous is false, then pentropy returns the spectral entropy value of the whole signal or spectrum as a scalar.

For an example, see Use Spectral Entropy to Detect Sine Wave in White Noise.

Scale by white noise option, specified as a logical. Scaling by white noise (that is, by log2 n, where n is the number of frequency points) is equivalent to the normalization described in Spectral Entropy. It allows you to directly compare signals of different lengths. The sketch after the following list compares the two settings on a white-noise signal.

  • If Scaled is true, then pentropy returns the spectral entropy scaled by the spectral entropy of the corresponding white noise.

  • If Scaled is false, then pentropy does not scale the spectral entropy.
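A minimal sketch comparing the two settings on a white-noise signal (the variable names are illustrative); because white noise has a nearly flat spectrum, the scaled value is close to 1:

% Sketch: whole-signal spectral entropy with and without white-noise scaling.
xw = randn(2000,1);
fs = 1000;
seScaled = pentropy(xw,fs,Instantaneous=false,Scaled=true)      % close to 1 for white noise
seUnscaled = pentropy(xw,fs,Instantaneous=false,Scaled=false)   % unnormalized Shannon entropy in bits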

Frequency limits to use, specified as a two-element vector containing lower and upper bounds f1 and f2 in hertz. The default is [0 sampfreq/2], where sampfreq is the sample rate in hertz that pentropy derives from sampx.

This specification allows you to exclude a band of data at either end of the spectral range.

For an example, see Plot Spectral Entropy of Speech Signal.

Time limits, specified as a two-element vector containing lower and upper bounds t1 and t2 in the same units as the sample time provided in sampx, and of one of the following data types:

  • Numeric or duration when sampx is numeric or duration

  • Numeric, duration, or datetime when sampx is datetime

This specification allows you to extract a time segment of data from the full timespan.
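For example, a minimal sketch that restricts the computation to an assumed 2 s to 8 s window of a signal vector x sampled at fs hertz:

% Sketch: compute instantaneous spectral entropy only for the segment 2 s <= t <= 8 s.
[se,te] = pentropy(x,fs,TimeLimits=[2 8]);
plot(te,se)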

Output Arguments


Spectral entropy, returned as a timetable if the input signal is the timetable xt, and as a double vector if the input signal is the time series x.

Time values associated with se, returned in the same form as the time in se. This argument does not apply if Instantaneous is set to false.

For an example, see Plot Spectral Entropy of Signal.

More About


Spectral Entropy

The spectral entropy (SE) of a signal is a measure of its spectral power distribution. The concept is based on the Shannon entropy, or information entropy, in information theory. SE treats the signal's normalized power distribution in the frequency domain as a probability distribution and calculates its Shannon entropy. The Shannon entropy in this context is the spectral entropy of the signal. This property can be useful for feature extraction in fault detection and diagnosis [2], [1]. SE is also widely used as a feature in speech recognition [3] and biomedical signal processing [4].

The equations for spectral entropy arise from the equations for the power spectrum and probability distribution of a signal. For a signal x(n), the power spectrum is S(m) = |X(m)|^2, where X(m) is the discrete Fourier transform of x(n). The probability distribution P(m) is then:

$$P(m) = \frac{S(m)}{\sum_i S(i)}$$

The spectral entropy H follows as:

$$H = -\sum_{m=1}^{N} P(m)\,\log_2 P(m)$$

Normalizing:

$$H_n = \frac{-\sum_{m=1}^{N} P(m)\,\log_2 P(m)}{\log_2 N},$$

where N is the total number of frequency points. The denominator, log2 N, represents the maximal spectral entropy of white noise, which is uniformly distributed in the frequency domain.
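The following minimal sketch evaluates these definitions directly for a signal vector; it is illustrative only and is not the pentropy implementation, which computes the spectrum with pspectrum:

% Sketch: normalized spectral entropy of a vector computed from the definitions above.
x = randn(1024,1);                    % example signal (white noise)
S = abs(fft(x)).^2;                   % power spectrum S(m) = |X(m)|^2
S = S(1:floor(end/2)+1);              % keep the one-sided spectrum
P = S/sum(S);                         % probability distribution P(m)
N = numel(P);
Hn = -sum(P.*log2(P + eps))/log2(N)   % normalized spectral entropy, near 1 for white noise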

If a time-frequency power spectrogram S(t,f) is known, then the probability distribution becomes:

$$P(m) = \frac{\sum_t S(t,m)}{\sum_f \sum_t S(t,f)}$$

Spectral entropy is still:

$$H = -\sum_{m=1}^{N} P(m)\,\log_2 P(m)$$

To compute the instantaneous spectral entropy given a time-frequency power spectrogram S(t,f), the probability distribution at time t is:

$$P(t,m) = \frac{S(t,m)}{\sum_f S(t,f)}$$

Then the spectral entropy at time t is:

$$H(t) = -\sum_{m=1}^{N} P(t,m)\,\log_2 P(t,m)$$
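A minimal sketch of this computation, assuming p is a power spectrogram matrix (rows correspond to frequencies, columns to times), such as one returned by pspectrum:

% Sketch: instantaneous spectral entropy H(t) from a power spectrogram p.
P = p./sum(p,1);                % P(t,m): normalize each time slice (column) of p
Ht = -sum(P.*log2(P + eps),1);  % unscaled H(t), one value per spectrogram column
% Dividing Ht by log2(size(p,1)) gives the white-noise-scaled version.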

References

[1] Pan, Y. N., J. Chen, and X. L. Li. "Spectral Entropy: A Complementary Index for Rolling Element Bearing Performance Degradation Assessment." Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science. Vol. 223, Issue 5, 2009, pp. 1223–1231.

[2] Sharma, V., and A. Parey. "A Review of Gear Fault Diagnosis Using Various Condition Indicators." Procedia Engineering. Vol. 144, 2016, pp. 253–263.

[3] Shen, J., J. Hung, and L. Lee. "Robust Entropy-Based Endpoint Detection for Speech Recognition in Noisy Environments." ICSLP. Vol. 98, November 1998.

[4] Vakkuri, A., A. Yli-Hankala, P. Talja, S. Mustola, H. Tolvanen-Laakso, T. Sampson, and H. Viertiö-Oja. "Time-Frequency Balanced Spectral Entropy as a Measure of Anesthetic Drug Effect in Central Nervous System during Sevoflurane, Propofol, and Thiopental Anesthesia." Acta Anaesthesiologica Scandinavica. Vol. 48, Number 2, 2004, pp. 145–153.

Extended Capabilities

Version History

Introduced in R2018a