How does the Parameter Estimator work?
I understand that it is trying to adjust parameters to reduce error between the model and validation data.
How does it decide which parameters to adjust, going from one iteration to the next? I am curious because it's able to reduce the error significantly in only 7 iterations while varying 7 parameters.
The help documentation doesn't provide much information on the inner workings.
Answers (1)
Ishit on 22 Jun 2023 (edited 22 Jun 2023)
To minimize the loss function in parameter estimation, we typically use an optimization algorithm that iteratively updates the model parameters in the direction of steepest descent of the loss function. The general steps are:
- Define the loss function: a measure of the error between the model output and the measured data.
- Choose an optimization algorithm: convex loss functions are preferred where possible, since any local minimum is also a global minimum.
- Calculate the gradient of the loss function: the gradient points in the direction in which the loss increases the most, so it indicates how each parameter should be adjusted given the current state of the model.
- Update the model parameters: the algorithm moves the parameters in the direction of the negative gradient, by multiplying the gradient by a learning rate and subtracting the result from the current parameter values.
- Repeat until convergence or until a maximum number of iterations N is reached.
In summary, the optimization algorithm iteratively updates the model parameters so as to minimize the loss function with respect to those parameters, which in turn yields a model that accurately predicts the observed data.
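As a rough illustration of these steps, here is a minimal gradient-descent sketch in MATLAB that fits the two parameters of a made-up model a*exp(-b*x) to synthetic data by minimizing a sum-of-squared-errors loss. The model, data, learning rate and stopping tolerance are all hypothetical, and the actual Parameter Estimator app may rely on more sophisticated solvers (for example, nonlinear least squares) rather than plain gradient descent.
% Minimal gradient-descent sketch (hypothetical model, data and settings)
x = linspace(0, 5, 50)';                   % input samples
y = 2.0*exp(-0.8*x) + 0.02*randn(50, 1);   % "measured" data, true parameters a = 2, b = 0.8
theta = [1; 0.3];    % initial guess for [a; b] in the model a*exp(-b*x)
lr    = 0.005;       % learning rate (step size)
for iter = 1:5000
    a = theta(1);  b = theta(2);
    r = a*exp(-b*x) - y;                        % residuals: model output minus data
    % gradient of the sum-of-squared-errors loss L = sum(r.^2)
    g = [ 2*sum(r .* exp(-b*x));                % dL/da
         -2*a*sum(r .* x .* exp(-b*x)) ];       % dL/db
    theta = theta - lr*g;                       % step along the negative gradient
    if norm(g) < 1e-8                           % stop once the gradient is ~zero
        break
    end
end
disp(theta')   % estimated [a b], close to [2.0 0.8]
Each pass through the loop corresponds to one iteration of the procedure above: evaluate the residuals, compute the gradient of the loss, and step the parameters against it.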