
Compare smoothed signal to the input signal

I smooth a series of data points using the algorithm described here: http://www.scipy.org/Cookbook/SignalSmooth .

How could I compare the smoothed signal with the input signal afterward? I'm hoping I could get a scalar describing how "close" the output is from the input. Is there any standard way to do this? Some term I could look for?

I have no idea what to even look for. Thanks!


I used the normalized root-mean-square deviation. That gives me a number between 0 and 1. The bigger the number, the farther apart the two data series are; 0 means a perfect match between the signal and the smoothed signal.
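A minimal sketch of this measure, assuming range-based normalization (one common convention; normalizing by the mean is another). If the smoothed series stays within the range of the original, the result falls in [0, 1]:

```python
import numpy as np

def nrmsd(original, smoothed):
    """Normalized root-mean-square deviation between two equal-length series.

    RMSD is divided by the range of the original series, so identical
    series score 0 and larger scores mean a worse match.
    """
    original = np.asarray(original, dtype=float)
    smoothed = np.asarray(smoothed, dtype=float)
    rmsd = np.sqrt(np.mean((original - smoothed) ** 2))
    return rmsd / (original.max() - original.min())
```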


Discrete correlation is a way to detect a known waveform in a noisy background. Just find the correlation between two signals. Discrete correlation is simply a vector dot product:

# cross-correlation at lags 0..N-1, treating out-of-range
# samples of x2 as zero so the indices stay in bounds
N = len(x1)
y = [sum(x1[i] * x2[i + n] for i in range(N - n)) for n in range(N)]

in pure Python, or:

y = xcorr(x1,x2);

in Matlab, or:

y = correlate(x1, x2)

in Python + SciPy (scipy.signal.correlate).

Correlation is a very sensitive measure of similarity of two signals. It is maximized when the two signals are similar in frequency content and are in phase with each other.
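To reduce the correlation to a single scalar describing closeness, one option is the normalized (Pearson) correlation at zero lag, which lands in [-1, 1] with 1 meaning the two series move together perfectly. A sketch with a hypothetical noisy sine (the `similarity` helper and the test signal are illustrative, not from the original answer):

```python
import numpy as np

def similarity(x1, x2):
    """Normalized zero-lag correlation (Pearson) of two equal-length series."""
    x1 = (x1 - x1.mean()) / x1.std()
    x2 = (x2 - x2.mean()) / x2.std()
    return np.dot(x1, x2) / x1.size

# Hypothetical demo data: a sine wave plus noise vs. the clean sine.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 200)
clean = np.sin(t)
noisy = clean + 0.2 * rng.standard_normal(t.size)

print(similarity(noisy, clean))  # close to 1 for mild noise
```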


Assuming you smoothed the signal to remove noise, the most natural figure of merit would be the SNR (signal-to-noise ratio).

So something like:

mean((smoothed[n] - original[n])^2) / mean((smoothed[n])^2)

which is the noise-to-signal power ratio (the reciprocal of the SNR), so smaller is better. The above assumes the average of the signal is ~0.
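A direct translation of that formula, assuming the removed component (smoothed minus original) is the noise and the signal is roughly zero-mean:

```python
import numpy as np

def noise_to_signal(original, smoothed):
    """Noise-to-signal power ratio for a roughly zero-mean signal.

    Treats (smoothed - original) as the noise the smoother removed.
    Smaller is better; the reciprocal is the SNR.
    """
    original = np.asarray(original, dtype=float)
    smoothed = np.asarray(smoothed, dtype=float)
    noise_power = np.mean((smoothed - original) ** 2)
    signal_power = np.mean(smoothed ** 2)
    return noise_power / signal_power
```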

