update guess in fsolve
I want to solve f(x) = 0 using x0 as a guess. I can use fsolve. Let's assume that given the guess the exit flag is not 1 (equation not solved). What would be the most straightforward way to update my initial guess?
2 Comments
Walter Roberson
on 16 Feb 2025
It is normal for the exit flag to be positive (1 through 4), unless you have set options to make most tolerances extremely small.
Luigi
on 16 Feb 2025
Answers (2)
Star Strider
on 16 Feb 2025
1 vote
I am not certain what you are actually asking.
If possible (if an equation can be plotted), I usually begin by plotting it to get an idea of its characteristics. I then use an initial estimate based on what I learn from the plot.
There are a number of appropriate functions in the Global Optimization Toolbox (my favourite being ga) that will themselves iterate the initial guess until they find the best one. This works even if an equation cannot be plotted.
2 Comments
Luigi
on 16 Feb 2025
Star Strider
on 16 Feb 2025
My plotting approach will not work with n equations and n unknowns.
Genetic algorithms are not difficult to write, however writing and debugging a complete version involving crossovers and mutations can be time-consuming.
I do not see a reason to update the fsolve initial guess. I would just use the optimset function to increase the maximum numbers of iterations and function evaluations, to allow fsolve to converge to an acceptable result.
Perhaps:
opts = optimset('MaxIter',1000, 'MaxFunEvals',50000);
and then pass that as the third argument to fsolve. Change those as necessary for your problem.
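A minimal end-to-end sketch of that, assuming a hypothetical two-equation system (not the one from the question) and checking the exit flag after the call:

```matlab
% Hypothetical system: x(1)^2 + x(2)^2 - 5 = 0 and x(1)*x(2) - 2 = 0
F = @(x) [x(1)^2 + x(2)^2 - 5; x(1)*x(2) - 2];

opts = optimset('MaxIter', 1000, 'MaxFunEvals', 50000, 'Display', 'off');
x0 = [1; 1];                                  % initial guess
[x, fval, exitflag] = fsolve(F, x0, opts);

if exitflag > 0
    fprintf('Converged: x = [%g, %g]\n', x(1), x(2));
else
    fprintf('fsolve did not converge (exitflag = %d)\n', exitflag);
end
```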
John D'Errico
on 16 Feb 2025
1 vote
Um, no. You cannot always find a way to "update" your initial guess for fsolve. One can even prove this to be true.
You can change your initial guess. Simplest is to use a multi-start scheme, which will work even without the Global Optimization Toolbox. There is nothing magic in multi-start. Just generate a random start point, and see if you get a solution you like. If not, keep trying random points inside the domain of interest until you are happy -- well, until you get lucky.
However, there is no simple scheme (or any complicated scheme, either) to infer a better start point for an optimization problem, even if you try to base it on a previous set of iterations. Again, we can prove that must fail. So no. There is no straightforward way to do what you want.
Unfortunately, from what you say, this is a system of nonlinear equations. And that means they can be arbitrarily nasty. They can have large domains where no matter what the start point in that domain, fsolve will always converge to a non-solution, and even if a solution does exist, the set of start points that will converge to the solution you want will be arbitrarily small.
So I'm sorry, but for what you want, there is no magic you can use. At best, you want to use global optimization tools, which themselves are not guaranteed to converge to a happy place. (But they are more likely to do so, on difficult problems.) Lacking that, just use a home-brewed multi-start method, calling fsolve repeatedly with random start points.
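A sketch of such a home-brewed multi-start loop; the system F, the search box, and the number of tries are all assumptions chosen for illustration:

```matlab
% Multi-start: call fsolve from random points in a box of interest until
% one converges to a genuine root (exitflag > 0 and small residual).
F  = @(x) [x(1)^2 + x(2)^2 - 5; x(1)*x(2) - 2];   % example system
lb = [-10; -10];                  % lower corner of the search box
ub = [ 10;  10];                  % upper corner of the search box
maxTries = 50;
opts = optimset('Display', 'off');

xBest = [];
for k = 1:maxTries
    x0 = lb + (ub - lb) .* rand(size(lb));        % random start point
    [x, ~, exitflag] = fsolve(F, x0, opts);
    if exitflag > 0 && norm(F(x)) < 1e-8          % accept real roots only
        xBest = x;
        break
    end
end
```

The residual check matters: fsolve can report a positive exit flag after stalling at a local minimum of the squared residual that is not a root, so do not trust the exit flag alone.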
5 Comments
Walter Roberson
on 16 Feb 2025
For example, the non-linear equations could include a term
-1/((x-C1)^N + C2)
As long as N is even, the quantities involved are all real-valued, and C2 is positive, this term cannot generate a singularity (if x == C1 exactly, then (x-C1)^N would be 0, but you then add the positive C2 to get a positive denominator.)
The graph of this kind of function is a well. The higher the N, the narrower the well. For high enough N and small enough C2, this approaches a delta function, but it is continuous -- just potentially rather narrow. At x==C1 it reduces to -1/C2 and for C2=10^(-P) that would be -10^(P) -- so it can be a pretty deep well.
Now the thing about such wells is that outside of the edges of the well, they can easily be fairly flat, with no significant contribution to the overall function. If you do not happen to start your search very close to the well, the search is unlikely to find the well.
The ultimate non-continuous version of this is -delta(x-C1)/C2 -- a term which is 0 except at x == C1 exactly, where it is -1/C2. Such terms are impossible to find through searches, except searches that just happen to test at x == C1 exactly.
So... in general, if you have non-linear terms, the "catchment basin" of an individual solution might be quite small.
The -1/((x-C1)^N + C2) form is a variation of a Gaussian term, by the way, so it is not so uncommon...
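A quick plot makes the point; the particular C1, C2, and N values below are my own toy choices:

```matlab
% Toy well term f(x) = -1/((x - C1)^N + C2); higher N and smaller C2
% make the well narrower and deeper.
C1 = 3; C2 = 1e-3;
x  = linspace(0, 6, 2001);
figure; hold on
for N = [2 6 12]
    plot(x, -1 ./ ((x - C1).^N + C2), 'DisplayName', sprintf('N = %d', N));
end
legend show; xlabel('x'); ylabel('f(x)')
% Outside the well the curve is nearly flat, so a gradient-based search
% started far from x = C1 has essentially no slope to follow toward it.
```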
Walter Roberson
on 16 Feb 2025
fsolve() cannot handle stochastic equations. Neither can any of the optimization routines, with the exception of patternsearch from the Global Optimization Toolbox; https://www.mathworks.com/help/gads/noisy-objective-function.html
Torsten
on 17 Feb 2025
If the function does not vary much from step to step, use the solution of the i-th call to "fsolve" as initial guess for the (i+1)th call.
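A sketch of that warm-start idea for a family of systems F(x, p) solved over a slowly varying parameter; the example family and the parameter sweep are illustrative, not from the question:

```matlab
% Solve F(x, p) = 0 for a sweep of parameter values p, feeding each
% solution back in as the initial guess for the next call (warm start).
F = @(x, p) [x(1)^2 + x(2)^2 - p; x(1)*x(2) - 2];   % example family
pValues = linspace(5, 8, 31);                       % slowly varying p
opts = optimset('Display', 'off');

x = [1; 2];                        % guess for the first parameter value
sols = zeros(2, numel(pValues));
for i = 1:numel(pValues)
    [x, ~, exitflag] = fsolve(@(x) F(x, pValues(i)), x, opts);
    if exitflag <= 0
        warning('No convergence at p = %g', pValues(i));
    end
    sols(:, i) = x;                % x also seeds the next iteration
end
```

This is essentially a simple continuation method: as long as the solution moves only slightly between steps, each warm start lands inside the basin of attraction of the next root.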