MATLAB Answers

Interpolating Uncertainties in a time series

Asked by Nick Hitt on 6 Feb 2019
Latest activity Edited by Nick Hitt on 6 Feb 2019
I have a time series formatted as shown below:
y1 x1 +/-z1
y2 x2 +/-z2
y3 x3 +/-z3
y4 x4 +/-z4
and I have been trying to linearly interpolate it so that the resulting data looks like this:
y1 x1
y1.5 x1.5
y2 x2
y2.5 x2.5
I have been able to do that using interp1, so I do not have a problem creating the interpolated dataset with x and y. However, I am unsure how to interpolate the uncertainties. Would I just use interp1 again to linearly interpolate the uncertainties corresponding to x1.5 and x2.5? The uncertainty is only in the x values and not the y values, in case that helps.
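For concreteness, here is a minimal sketch of what I have so far (the variable names and values below are made up for illustration, not my actual data):

% y = time points (no uncertainty), x = data values, z = +/- uncertainty on x
y = [1; 2; 3; 4];              % original time points
x = [10.2; 11.5; 11.1; 12.4];  % measured values
z = [0.3; 0.4; 0.2; 0.5];      % +/- uncertainty on each x value

yq = (1:0.5:4).';                    % new time points: y1, y1.5, y2, ...
xq = interp1(y, x, yq, 'linear');    % interpolated data values

% Is simply repeating the same call on z the right way to get the
% uncertainties at the new time points?
zq = interp1(y, z, yq, 'linear');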
Any help would be appreciated!
Nick

  0 Comments


0 Answers