
Any ideas for making this exponential decay function match the actual data better?

87 views (last 30 days)
Hi all,
I have been trying to fit some weight data that was collected over time to an exponential decay function. I have come up with the following:
f(t) = An + (A0 - An) * exp(-t / tau)
Where: An = final data point measured, A0 = initial data point measured, and tau = a time constant that I am seeking to fit/solve for.
I am using the following code to set up the decay and solve for tau. Data is the collected weight data and t is the times at which the data was collected:
A0 = data(1);                                               % initial measured weight
An = data(end);                                             % final (plateau) measured weight
funlist = {@(tau, t) An + (A0 - An) .* exp(-t / tau), 1};   % define model; tau is the only nonlinear parameter
tau = fminspleas(funlist, 1, time, data);                   % estimate tau, starting from a guess of 1
fn_data = @(t) funlist{1}(tau, t);                          % fitted curve as a function of t
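A minimal sketch of the comparison plot (assuming time and data are vectors of equal length, as above):
plot(time, data, 'o', time, fn_data(time), '-')   % measured points vs. fitted curve
legend('measured data', 'fitted decay')
xlabel('time'), ylabel('weight')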
When I go to plot the data points to compare it with the fit, I notice the decay function does not do well at later time points where there is a greater plateau (see attached). I am wondering if anyone has any code or functions suggestions to help improve the fit of the curve? Any suggestions are welcome. Thank you very much!
  3 Comments
Kaya on 29 Jul 2024 at 15:32
Thank you - see attached for x (time) and y (data).
I didn't mean to submit my question through the File Exchange; that was a mistake. I can repost elsewhere.
I do have access to the Curve Fitting Toolbox and tried to use it to write a custom function, but was having trouble getting the toolbox to recognize the constants, An and A0.
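(For reference, one way to make the Curve Fitting Toolbox treat An and A0 as fixed constants is to declare them as 'problem' parameters. This is only a sketch, assuming time and data are the vectors from the attachment:)
ft = fittype('An + (A0 - An)*exp(-t/tau)', ...
    'independent', 't', 'problem', {'A0', 'An'});        % A0 and An are held fixed
f = fit(time(:), data(:), ft, ...
    'problem', {data(1), data(end)}, 'StartPoint', 1);   % StartPoint is the initial guess for tau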
the cyclist on 29 Jul 2024 at 15:37
You misunderstood my comment about the File Exchange. You did not "submit" via the FEX. I was just wondering if there was a reason you used an FEX function to do the fit, rather than something from a MathWorks toolbox.
I see that that particular FEX submission is authored by John D'Errico, though, so you can safely presume that it is rock-solid.


Answers (2)

John D'Errico on 30 Jul 2024 at 10:55
Edited: John D'Errico on 30 Jul 2024 at 11:47
(I spent so much time writing this as a comment, I might as well make it an answer.)
@Image Analyst asks a valid question. I have given talks about the differences between the various types of modeling. For example, there are physical models, where you have a very good model for a process, often based on a differential equation that describes that process. A good example here is diffusion, where a simple model describes how heat flows, how molecules diffuse through media, etc.
And sometimes you have a good metaphor for a process. The example I have always liked is product sales, where you can describe the sales of a product in terms of models from epidemiology. Here you will see that disease transmission is like how different people learn about and then buy your product, whether that product is a washing machine, a camera, etc. I call these metaphorical models, because the model is not truly a model of your process, but forms a good approximation to it.
Next, in the talk as I have given it, are things like spline models. In fact, splines are one form of metaphorical model, since a spline can itself be viewed as a model of a thin flexible beam, then forced to pass through a set of points.
Finally, we see the purely empirical models. Polynomials are often thrown into this bin of course. But also, the many forms of nonlinear models, chosen simply because they can take on the shape you see in some curve.
The problem arises when someone sees a curve like the one we see here, decides it looks vaguely like an exponential decay, and concludes it must therefore be one. In fact, @Kaya has provided no rationale for the choice of an exponential decay, and certainly not for the particular one chosen, where they then try to implicitly force the curve through two points. (The scheme chosen does not in fact do that, but that is another matter.)
Now, a good thing about having a physical model for a process is that you can then use that model to predict the behavior of that process beyond the support of your data. I'd even call this a great thing. It allows you to extrapolate with some degree of confidence, and to interpolate happily. It can even help you draw conclusions about your process: you can interpret rate parameters in terms of things you understand about these physical models. To some extent, metaphorical models can also be used with some degree of success, to the extent the metaphor applies.
Again, the example I have used here in my teaching is that of Romeo, describing Juliet in the balcony scene. I'm sure you all recall this line:
"what light through yonder window breaks? It is the East, and Juliet the sun. Arise, fair sun, and kill the envious moon, Who is already sick and pale with grief..." W. Shakespeare, Romeo & Juliet
Here Romeo describes Juliet as the sun, then compares her to the moon. She shines brightly, but there is a limit to the metaphor. We cannot carry the metaphor to extremes. So we do not expect Juliet to have a surface temperature comparable to the sun. Metals will not immediately melt when in contact with her skin.
The point in this, is we need to understand the limits of a metaphor, as well as a metaphorical model for some process.
I've now gone far too deeply into the philosophy of the modeling process, but the point is this: in my consulting career, whenever I have seen a situation like this one, where someone wants to fit a curve to data but has no argument for their choice of model, it means they are in the realm of empirical modeling. If you choose some form of model based merely on the shape of the curve you see, then you are doing so only to find a function that fits your data. And if that is the case, then a decaying exponential, a spline, or even a polynomial is as good as anything else, as long as it has the proper shape. So the question you, @Kaya, need to answer is: do you have some valid reason for your choice of model?
For example, IF your only goal is to fit the data well, then you could just do this:
load time.mat                    % time vector from the attached data
load data.mat                    % measured weights (variable: smoothed_visine)
plot(time,smoothed_visine,'-')
Now I can just use a low order polynomial model to fit that data quite well. I'll scale time to make things well-behaved for the fit.
P4 = fit(time'/1000,smoothed_visine','poly4')
P4 =
     Linear model Poly4:
     P4(x) = p1*x^4 + p2*x^3 + p3*x^2 + p4*x + p5
     Coefficients (with 95% confidence bounds):
       p1 =  -7.406e-05  (-7.588e-05, -7.223e-05)
       p2 =   0.0004789  (0.0004591, 0.0004987)
       p3 =    0.002775  (0.002704, 0.002846)
       p4 =    -0.03114  (-0.03123, -0.03104)
       p5 =      0.1358  (0.1358, 0.1358)
plot(P4,time/1000,smoothed_visine)
As you should see, this curve does fit the data very nicely, far better than an exponential model can be forced to do. The model is a purely empirical one. As a polynomial, it should NEVER be used to extrapolate at all, else you would get random crapola. But if your goal is to merely have a nice, smooth approximation to the data, valid over the support of your data, I'm not sure you will do too much better.
One thing I can do, is now that we have a nicely fitting model of your process, we can look at the derivative of that process. If a good model is truly a decaying exponential, then the first derivative will ALSO be a decaying exponential!
fplot(@(x) differentiate(P4,x),[0,5.5])
Warning: Function behaves unexpectedly on array inputs. To improve performance, properly vectorize your function to return an output with the same size and shape as the input arguments.
Since, in fact, the result does not look at all like a decaying exponential, I must repeat what I have said before in this thread: your data does NOT merit use of that model. You MIGHT be able to justify a sum of 2 or more decaying exponentials, i.e., a competing-processes model, where we might see several species of "stuff", all decaying at different rates. But when I tried that, even that model failed to perform as well as the simple 4th degree polynomial I show above.
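(For concreteness, a sketch of what such a two-exponential fit could look like with the Curve Fitting Toolbox; the starting values below are rough, hypothetical guesses and would need tuning for the actual data:)
% Sum of two decaying exponentials plus an offset. fittype orders the
% coefficients alphabetically: a1, a2, c, tau1, tau2.
ft2 = fittype('a1*exp(-x/tau1) + a2*exp(-x/tau2) + c', 'independent', 'x');
E2 = fit(time(:)/1000, smoothed_visine(:), ft2, ...
    'StartPoint', [0.05, 0.05, 0.03, 0.5, 3], ...   % rough guesses for [a1 a2 c tau1 tau2]
    'Lower', [0, 0, 0, 0, 0]);                      % keep amplitudes, offset, and time constants nonnegative
plot(E2, time/1000, smoothed_visine)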

Image Analyst on 30 Jul 2024 at 19:41
Edited: Image Analyst on 30 Jul 2024 at 19:46
@Kaya, John brings up some good points, and I hope you read them carefully.
From your other comment it sounds like the data is the weight of a contact lens over time as fluid evaporates off of it. As I understand it, you would like a single number that relates to the amount of "bend" in the curve. Some samples might get lighter linearly, like a ramp, while others lose weight rapidly and then maybe level off, so there is a pronounced "corner" in the curve. Another shape might be one that stays fairly flat for a long time and then drops off a cliff.
I think that there are some sigmoid functions that may be able to handle all of those situations (from bending one way, to fairly flat, to bending the other way) with one equation with one or maybe two parameters to describe the shape of the curve.
See https://en.wikipedia.org/wiki/Sigmoid_function for a list of some of these functions.
What we'd like you to do is to upload some data and plotted screenshots of several of your experimental data sets so we can see the shape of the different curve samples. In particular, pick one that evaporates slowly, one that has a fairly straight curve, and one at the other extreme that evaporates quickly.
As an example, I'm attaching a demo where I use fitnlm to fit data to the "Michaelis-Menten function" for chemical rate description (it's one kind of sigmoid curve). Here's what it shows, though of course once it's adapted to use your data the curve would bend in the other direction.
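(A minimal sketch of that kind of fitnlm call using the Michaelis-Menten form; the variable names x and y and the starting guesses are placeholders rather than the attached demo:)
% Michaelis-Menten model: y = Vmax*x ./ (Km + x), with b = [Vmax, Km]
mmModel = @(b, x) b(1) .* x ./ (b(2) + x);
beta0 = [max(y), median(x)];              % rough starting guesses for [Vmax, Km]
mdl = fitnlm(x, y, mmModel, beta0);       % requires the Statistics and Machine Learning Toolbox
plot(x, y, 'o', x, predict(mdl, x), '-')
legend('data', 'Michaelis-Menten fit')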
If you do this, you might be able to intuitively understand one or two parameters describing the bend rather than 5 or 6 for a polynomial. It might not fit as well as a polynomial but the advantage of having only one or two parameters rather than a bunch of them might outweigh that.
Another option, a purely empirical one, is to just use the actual data and not try to model it at all. Doing it that way, you could pick arbitrary thresholds, like how much time does it take to dry from the starting weight down to 150% of the dry weight? Or, in other words, just look at how long it takes to lose 10%, 50%, and 90% of the added fluid. Very empirical, no modeling, but this kind of thing is used very often to describe oddly shaped curves, for example the MTF (Modulation Transfer Function) of an optical system, which describes the spatial frequencies the system can resolve.
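(A sketch of that empirical calculation, assuming vectors t (time) and w (weight), with w decreasing monotonically from the wet weight to the dry weight; noisy data would need smoothing first so the loss fraction is monotonic:)
lostFraction = (w(1) - w) ./ (w(1) - w(end));   % 0 at the start, 1 when fully dry
targets = [0.10, 0.50, 0.90];                   % 10%, 50%, and 90% of the added fluid lost
tAtLoss = interp1(lostFraction, t, targets);    % interpolated times at those loss levels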
