resume
Syntax
newAggregateResults = resume(AggregateResults)
newAggregateResults = resume(AggregateResults,Name=Value)
Description
newAggregateResults = resume(AggregateResults) resumes a set of hyperparameter optimization problems for an additional default number of iterations, using the same fitting function and optimization settings used to create the AggregateBayesianOptimization object AggregateResults. When you resume the optimization problems, the resume function returns a new AggregateBayesianOptimization object, but does not return any model objects.
newAggregateResults = resume(AggregateResults,Name=Value) specifies additional options using one or more name-value arguments. For example, you can specify which optimization problems in AggregateResults to resume.
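A minimal call looks like the following sketch. It assumes hpoResults is an AggregateBayesianOptimization object previously returned by a fitting function such as fitctree:

```matlab
% Resume every optimization problem in hpoResults with the same
% settings, for the default number of additional evaluations
newResults = resume(hpoResults);

% Resume only the second problem, with 20 additional evaluations
newResults2 = resume(hpoResults,Results=2,MaxObjectiveEvaluations=20);
```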
Examples
Load the ovarian cancer data set.
load ovariancancer.mat
Create a HyperparameterOptimizationOptions object with the following specifications for two hyperparameter optimization problems:
- Use kfoldLoss as the constraint, with the bounds [0, 0.1] for the first problem and [0, 0.15] for the second problem.
- Perform a maximum of 10 objective evaluations for each optimization problem.
- Use the "expected-improvement-plus" acquisition function (for reproducibility).
- Suppress the display of plots.
hpoOptions = hyperparameterOptimizationOptions(ConstraintType="loss", ...
    ConstraintBounds=[0.1; 0.15],MaxObjectiveEvaluations=10, ...
    AcquisitionFunctionName="expected-improvement-plus",ShowPlots=false);
Call the fitctree function to train a binary decision tree classification model and optimize the MinLeafSize hyperparameter for each optimization problem, using the options and constraints in hpoOptions. Because ConstraintType is 'loss', the software uses the size of the compact version of the model object as the objective.
rng(0,"twister"); % For reproducibility
[Mdl,hpoResults] = fitctree(obs,grp,OptimizeHyperparameters="MinLeafSize", ...
    HyperparameterOptimizationOptions=hpoOptions);
|=====================================================================================================|
| |
|Objective : "CompactModelSize (bytes)" |
|Constraint : "kfoldLoss" |
|Constraint Bounds : [0 0.1] |
| |
|=====================================================================================================|
| Iter | Eval | Objective | Objective | BestSoFar | BestSoFar | Constraint1 | MinLeafSize |
| | result | | runtime | (observed) | (estim.) | violation | |
|=====================================================================================================|
| 1 | Infeas | 40027 | 66.373 | NaN | 40027 | 0.34 | 91 |
| 2 | Infeas | 44103 | 0.60747 | NaN | 40230 | 0.0713 | 1 |
| 3 | Infeas | 40707 | 0.35853 | NaN | 40027 | 0.025 | 22 |
| 4 | Infeas | 42063 | 0.43956 | NaN | 40027 | 0.062 | 6 |
| 5 | Infeas | 40027 | 0.3385 | NaN | 40019 | 0.025 | 79 |
| 6 | Infeas | 42063 | 0.30684 | NaN | 39996 | 0.0759 | 10 |
| 7 | Infeas | 44103 | 0.30746 | NaN | 40055 | 0.0574 | 3 |
| 8 | Infeas | 42063 | 0.32436 | NaN | 40056 | 0.0805 | 7 |
| 9 | Infeas | 42063 | 0.30914 | NaN | 40046 | 0.0759 | 10 |
| 10 | Infeas | 40707 | 0.24694 | NaN | 40040 | 0.062 | 21 |
__________________________________________________________
Optimization completed.
MaxObjectiveEvaluations of 10 reached.
Total function evaluations: 10
Total elapsed time: 85.1906 seconds
Total objective function evaluation time: 69.6117
No feasible points were found.
|=====================================================================================================|
| |
|Objective : "CompactModelSize (bytes)" |
|Constraint : "kfoldLoss" |
|Constraint Bounds : [0 0.15] |
| |
|=====================================================================================================|
| Iter | Eval | Objective | Objective | BestSoFar | BestSoFar | Constraint1 | MinLeafSize |
| | result | | runtime | (observed) | (estim.) | violation | |
|=====================================================================================================|
| 1 | Best | 40027 | 0.25714 | 40027 | 40027 | -0.0343 | 38 |
| 2 | Infeas | 44103 | 0.34661 | 40027 | 40247 | 0.012 | 1 |
| 3 | Accept | 40027 | 0.17625 | 40027 | 40027 | -0.00649 | 85 |
| 4 | Infeas | 40707 | 0.40139 | 40027 | 40026 | 0.0259 | 14 |
| 5 | Accept | 40027 | 0.28058 | 40027 | 40009 | -0.0343 | 57 |
| 6 | Accept | 40027 | 0.22424 | 40027 | 40023 | -0.0343 | 55 |
| 7 | Accept | 40027 | 0.27688 | 40027 | 40023 | -0.0343 | 46 |
| 8 | Infeas | 40027 | 0.1058 | NaN | 40033 | 0.29 | 89 |
| 9 | Accept | 40027 | 0.19001 | NaN | 40033 | -0.0111 | 80 |
| 10 | Accept | 40707 | 0.34801 | NaN | 40031 | -0.0204 | 19 |
__________________________________________________________
Optimization completed.
MaxObjectiveEvaluations of 10 reached.
Total function evaluations: 10
Total elapsed time: 4.3909 seconds
Total objective function evaluation time: 2.6069
No feasible points were found.
Display a summary of the optimization results.
summary(hpoResults)
Objective: CompactModelSize (bytes)
Constraint: kfoldLoss
MinObjective ConstraintAtMinObjective ConstraintBounds ConstraintBoundsAreSatisfied Feasible LearnerAtMinObjective
____________ ________________________ ________________ ____________________________ ________ _____________________
Result_1 40027 0.12499 0 0.1 false false "ClassificationTree"
Result_2 40027 0.11573 0 0.15 true false "ClassificationTree"
The final attained models in the optimization problems are infeasible. The ConstraintAtMinObjective value of the final attained model in the second optimization problem (0.11573) satisfies the constraint bounds, but the model is infeasible because the attained point is outside the optimization confidence bounds.
Resume the optimization problems for an additional 30 iterations (the default number of additional iterations). Store the results in the AggregateBayesianOptimization object newResults.
newResults=resume(hpoResults);
|=====================================================================================================|
| |
|Objective : "CompactModelSize (bytes)" |
|Constraint : "kfoldLoss" |
|Constraint Bounds : [0 0.1] |
| |
|=====================================================================================================|
| Iter | Eval | Objective | Objective | BestSoFar | BestSoFar | Constraint1 | MinLeafSize |
| | result | | runtime | (observed) | (estim.) | violation | |
|=====================================================================================================|
| 11 | Infeas | 40027 | 0.43061 | NaN | 40028 | 0.025 | 36 |
| 12 | Infeas | 42743 | 0.41494 | NaN | 40044 | 0.0435 | 5 |
| 13 | Infeas | 40707 | 0.27159 | NaN | 40040 | 0.062 | 21 |
| 14 | Infeas | 40707 | 0.28638 | NaN | 40022 | 0.0852 | 13 |
| 15 | Infeas | 40027 | 0.25927 | NaN | 40017 | 0.025 | 28 |
| 16 | Infeas | 40707 | 0.25519 | NaN | 40016 | 0.025 | 24 |
| 17 | Infeas | 44103 | 0.32862 | NaN | 40045 | 0.0574 | 3 |
| 18 | Infeas | 44103 | 0.34216 | NaN | 40058 | 0.0574 | 3 |
| 19 | Infeas | 44103 | 0.38356 | NaN | 40061 | 0.0713 | 1 |
| 20 | Infeas | 44103 | 0.34886 | NaN | 40047 | 0.0667 | 2 |
|=====================================================================================================|
| Iter | Eval | Objective | Objective | BestSoFar | BestSoFar | Constraint1 | MinLeafSize |
| | result | | runtime | (observed) | (estim.) | violation | |
|=====================================================================================================|
| 21 | Infeas | 42063 | 0.32492 | NaN | 40053 | 0.0805 | 7 |
| 22 | Infeas | 40707 | 0.28227 | NaN | 40053 | 0.0481 | 19 |
| 23 | Infeas | 40707 | 0.27995 | NaN | 40048 | 0.025 | 32 |
| 24 | Infeas | 40707 | 0.25968 | NaN | 40054 | 0.025 | 33 |
| 25 | Infeas | 40707 | 0.23445 | NaN | 40063 | 0.025 | 35 |
| 26 | Infeas | 40027 | 0.27243 | NaN | 40065 | 0.025 | 30 |
| 27 | Infeas | 40027 | 0.24756 | NaN | 40052 | 0.025 | 40 |
| 28 | Infeas | 40027 | 0.23647 | NaN | 40042 | 0.025 | 40 |
| 29 | Infeas | 40707 | 0.27462 | NaN | 40040 | 0.025 | 31 |
| 30 | Infeas | 40027 | 0.17799 | NaN | 40036 | 0.025 | 75 |
| 31 | Infeas | 40707 | 0.2949 | NaN | 40036 | 0.062 | 17 |
| 32 | Infeas | 40707 | 0.19485 | NaN | 40055 | 0.025 | 45 |
| 33 | Infeas | 40027 | 0.18133 | NaN | 40043 | 0.025 | 51 |
| 34 | Infeas | 40707 | 0.28066 | NaN | 40042 | 0.062 | 16 |
| 35 | Infeas | 40027 | 0.17038 | NaN | 40030 | 0.025 | 57 |
| 36 | Infeas | 43423 | 0.37602 | NaN | 40029 | 0.0528 | 4 |
| 37 | Infeas | 40027 | 0.16661 | NaN | 40020 | 0.025 | 63 |
| 38 | Infeas | 42063 | 0.35504 | NaN | 40019 | 0.0898 | 8 |
| 39 | Infeas | 40027 | 0.19436 | NaN | 40017 | 0.025 | 48 |
| 40 | Infeas | 40707 | 0.23338 | NaN | 40016 | 0.025 | 23 |
__________________________________________________________
Optimization completed.
MaxObjectiveEvaluations of 40 reached.
Total function evaluations: 40
Total elapsed time: 98.7197 seconds
Total objective function evaluation time: 77.9708
No feasible points were found.
|=====================================================================================================|
| |
|Objective : "CompactModelSize (bytes)" |
|Constraint : "kfoldLoss" |
|Constraint Bounds : [0 0.15] |
| |
|=====================================================================================================|
| Iter | Eval | Objective | Objective | BestSoFar | BestSoFar | Constraint1 | MinLeafSize |
| | result | | runtime | (observed) | (estim.) | violation | |
|=====================================================================================================|
| 11 | Infeas | 42063 | 0.34939 | NaN | 40078 | 0.00277 | 8 |
| 12 | Infeas | 43423 | 0.40105 | NaN | 40082 | 0.012 | 4 |
| 13 | Infeas | 44103 | 0.43972 | NaN | 40084 | 0.012 | 1 |
| 14 | Accept | 40707 | 0.28792 | NaN | 40031 | -0.0204 | 20 |
| 15 | Infeas | 44103 | 0.37072 | NaN | 40111 | 0.0213 | 2 |
| 16 | Accept | 40707 | 0.32508 | NaN | 40032 | -0.00186 | 17 |
| 17 | Infeas | 40707 | 0.36003 | NaN | 40028 | 0.0167 | 11 |
| 18 | Infeas | 40707 | 0.30457 | NaN | 40028 | 0.0167 | 12 |
| 19 | Infeas | 44103 | 0.45343 | NaN | 40028 | 0.0213 | 2 |
| 20 | Infeas | 42063 | 0.38948 | NaN | 40101 | 0.00277 | 10 |
|=====================================================================================================|
| Iter | Eval | Objective | Objective | BestSoFar | BestSoFar | Constraint1 | MinLeafSize |
| | result | | runtime | (observed) | (estim.) | violation | |
|=====================================================================================================|
| 21 | Infeas | 40027 | 0.1309 | NaN | 40103 | 0.29 | 101 |
| 22 | Accept | 40027 | 0.22526 | NaN | 40106 | -0.0343 | 55 |
| 23 | Accept | 40027 | 0.30185 | NaN | 40108 | -0.0343 | 55 |
| 24 | Accept | 40027 | 0.20494 | 40027 | 40018 | -0.0343 | 55 |
| 25 | Accept | 40027 | 0.23858 | 40027 | 40011 | -0.0343 | 38 |
| 26 | Accept | 40027 | 0.1978 | 40027 | 40010 | -0.0343 | 46 |
| 27 | Infeas | 40027 | 0.084622 | NaN | 40095 | 0.29 | 108 |
| 28 | Accept | 40027 | 0.15048 | NaN | 40097 | -0.0343 | 55 |
| 29 | Accept | 40027 | 0.14118 | 40027 | 40016 | -0.0343 | 55 |
| 30 | Accept | 40027 | 0.18496 | 40027 | 40017 | -0.0343 | 76 |
| 31 | Accept | 40027 | 0.15109 | 40027 | 40015 | -0.0296 | 75 |
| 32 | Accept | 40027 | 0.28697 | 40027 | 40012 | -0.0343 | 38 |
| 33 | Accept | 40027 | 0.19405 | 40027 | 40011 | -0.0343 | 46 |
| 34 | Accept | 40027 | 0.23407 | 40027 | 40010 | -0.0343 | 30 |
| 35 | Accept | 40027 | 0.2241 | 40027 | 40009 | -0.0343 | 30 |
| 36 | Accept | 40027 | 0.22822 | 40027 | 40008 | -0.0343 | 30 |
| 37 | Accept | 40027 | 0.23876 | 40027 | 40007 | -0.0343 | 38 |
| 38 | Accept | 40027 | 0.12949 | 40027 | 40008 | -0.0296 | 75 |
| 39 | Accept | 40027 | 0.14375 | 40027 | 40008 | -0.00649 | 85 |
| 40 | Accept | 40027 | 0.22908 | 40027 | 40008 | -0.0343 | 30 |
__________________________________________________________
Optimization completed.
MaxObjectiveEvaluations of 40 reached.
Total function evaluations: 40
Total elapsed time: 19.3236 seconds
Total objective function evaluation time: 10.2084
Best observed feasible point:
MinLeafSize
___________
38
Observed objective function value = 40027
Estimated objective function value = 40034.0182
Function evaluation time = 0.25714
Observed constraint violations =[ -0.034265 ]
Best estimated feasible point (according to models):
MinLeafSize
___________
55
Estimated objective function value = 40007.5995
Estimated function evaluation time = 0.19489
Estimated constraint violations =[ -0.034265 ]
By default, the software resumes each optimization problem by calling fitctree with the same hyperparameter optimization options as before, with the exception of MaxObjectiveEvaluations, whose value is increased by 30. The resume function returns a new AggregateBayesianOptimization object, but does not alter the original final attained models in Mdl.
Display a summary of the new results.
summary(newResults)
Objective: CompactModelSize (bytes)
Constraint: kfoldLoss
MinObjective ConstraintAtMinObjective ConstraintBounds ConstraintBoundsAreSatisfied Feasible LearnerAtMinObjective
____________ ________________________ ________________ ____________________________ ________ _____________________
Result_1 40027 0.12499 0 0.1 false false "ClassificationTree"
Result_2 40027 0.11573 0 0.15 true true "ClassificationTree"
The summary indicates that the original final attained model in the second optimization problem is feasible with the new attained hyperparameter value.
Display the new attained hyperparameter value for the second optimization problem.
newResults.HyperparameterOptimizationResults{2}.XAtMinObjective
ans=table
MinLeafSize
___________
38
Train a new cross-validated binary decision tree classification model with the new attained hyperparameter value.
optimizedMdl=fitctree(obs,grp,MinLeafSize=38);
Display the compact size of the new model in bytes.
learnersize(optimizedMdl)
ans = 40027
Calculate the classification loss for the cross-validated model.
kfoldLoss(crossval(optimizedMdl))
ans = single
0.1389
The loss lies within the constraint bounds of [0, 0.15].
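You can confirm this check programmatically. This sketch uses the loss value computed above and the constraint bounds specified for the second problem:

```matlab
% Verify that the loss satisfies the constraint bounds [0, 0.15]
lossValue = 0.1389;   % kfoldLoss value computed above
bounds = [0 0.15];    % constraint bounds for the second problem
satisfied = (lossValue >= bounds(1)) && (lossValue <= bounds(2))
```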
This example shows how to resume a set of hyperparameter optimization problems with modified variables. The example uses the gprdata2 data that ships with your software.
Load the data.
load('gprdata2.mat')
The data has one predictor variable and a continuous response. The data is simulated.
Create a structure that contains the following nondefault settings for the hyperparameter optimization problems:
- Use the compact model size as a constraint, with bounds of [0, 10000] bytes for the first problem and [0, 20000] bytes for the second problem.
- Use the "expected-improvement-plus" acquisition function (for reproducibility).
- Perform a maximum of 10 objective function evaluations for each optimization problem.
- Do not display any plots.
hyperopts = struct(AcquisitionFunctionName="expected-improvement-plus", ...
    ConstraintType="size", ConstraintBounds=[10000; 20000], ...
    MaxObjectiveEvaluations=10, ShowPlots=false);
For each hyperparameter optimization problem, train a GPR model and optimize the Sigma hyperparameter within the range [0 0.01] using the specified optimization settings. Use a squared exponential kernel function with default kernel parameters.
rng(0,'twister'); % For reproducibility
[Mdl,hpoResults] = fitrgp(x,y,KernelFunction="squaredexponential", ...
    OptimizeHyperparameters=optimizableVariable(Sigma=[0 0.01]), ...
    HyperparameterOptimizationOptions=hyperopts);
|=====================================================================================================|
| |
|Objective : "kfoldLoss" |
|Constraint : "CompactModelSize (bytes)" |
|Constraint Bounds : [0 10000] |
| |
|=====================================================================================================|
| Iter | Eval | Objective: | Objective | BestSoFar | BestSoFar | Constraint1 | Sigma |
| | result | log(1+loss) | runtime | (observed) | (estim.) | violation | |
|=====================================================================================================|
| 1 | Infeas | 0.29871 | 3.9912 | NaN | 0.29871 | 6.01e+03 | 0.0021539 |
| 2 | Infeas | 0.29856 | 3.2881 | NaN | 0.29864 | 6.01e+03 | 0.0057638 |
| 3 | Infeas | 0.29823 | 3.6449 | NaN | 0.29871 | 6.01e+03 | 0.0098837 |
| 4 | Infeas | 1.0384 | 3.2868 | NaN | 0.37861 | 6.01e+03 | 6.9219e-05 |
| 5 | Infeas | 0.29873 | 3.811 | NaN | 0.44652 | 6.01e+03 | 4.3883e-05 |
| 6 | Infeas | 0.29859 | 3.2586 | NaN | 0.42187 | 6.01e+03 | 0.0052323 |
| 7 | Infeas | 0.29839 | 3.0369 | NaN | 0.40423 | 6.01e+03 | 0.0082039 |
| 8 | Infeas | 0.29845 | 4.4232 | NaN | 0.39101 | 6.01e+03 | 0.0074221 |
| 9 | Infeas | 0.29823 | 4.3312 | NaN | 0.3807 | 6.01e+03 | 0.0098713 |
| 10 | Infeas | 1.5061 | 3.7571 | NaN | 0.49324 | 6.01e+03 | 0.0043666 |
__________________________________________________________
Optimization completed.
MaxObjectiveEvaluations of 10 reached.
Total function evaluations: 10
Total elapsed time: 39.9401 seconds
Total objective function evaluation time: 36.8291
No feasible points were found.
|=====================================================================================================|
| |
|Objective : "kfoldLoss" |
|Constraint : "CompactModelSize (bytes)" |
|Constraint Bounds : [0 20000] |
| |
|=====================================================================================================|
| Iter | Eval | Objective: | Objective | BestSoFar | BestSoFar | Constraint1 | Sigma |
| | result | log(1+loss) | runtime | (observed) | (estim.) | violation | |
|=====================================================================================================|
| 1 | Best | 0.3047 | 5.3322 | 0.3047 | 0.3047 | -3.99e+03 | 9.8148e-05 |
| 2 | Best | 0.30457 | 3.6673 | 0.30457 | 0.30457 | -3.99e+03 | 0.0054459 |
| 3 | Accept | 0.30466 | 3.4158 | 0.30457 | 0.30464 | -3.99e+03 | 0.0029412 |
| 4 | Accept | 1.1452 | 2.9128 | 0.30457 | 0.30462 | -3.99e+03 | 0.0089683 |
| 5 | Accept | 0.30462 | 3.7472 | 0.30457 | 0.30448 | -3.99e+03 | 0.0043144 |
| 6 | Accept | 0.30469 | 3.5671 | 0.30457 | 0.30432 | -3.99e+03 | 0.001374 |
| 7 | Accept | 0.30459 | 2.7226 | 0.30457 | 0.3032 | -3.99e+03 | 0.0049664 |
| 8 | Accept | 0.3047 | 3.0385 | 0.30457 | 0.30312 | -3.99e+03 | 0.00066593 |
| 9 | Accept | 1.1453 | 2.6288 | 0.30457 | 0.30459 | -3.99e+03 | 0.002184 |
| 10 | Accept | 1.8455 | 2.6282 | 0.30457 | 0.62685 | -3.99e+03 | 0.0010548 |
__________________________________________________________
Optimization completed.
MaxObjectiveEvaluations of 10 reached.
Total function evaluations: 10
Total elapsed time: 35.6301 seconds
Total objective function evaluation time: 33.6603
Best observed feasible point:
Sigma
_________
0.0054459
Observed objective function value = 0.30457
Estimated objective function value = 0.62685
Function evaluation time = 3.6673
Observed constraint violations =[ -3990.500000 ]
Best estimated feasible point (according to models):
Sigma
_________
0.0049664
Estimated objective function value = 0.62685
Estimated function evaluation time = 3.2901
Estimated constraint violations =[ -3990.500000 ]
Display a summary of the optimization results.
summary(hpoResults)
Objective: kfoldLoss
Constraint: CompactModelSize (bytes)
MinObjective ConstraintAtMinObjective ConstraintBounds ConstraintBoundsAreSatisfied Feasible LearnerAtMinObjective
____________ ________________________ ________________ ____________________________ ________ _____________________
Result_1 0.29823 16010 0 10000 false false "RegressionGP"
Result_2 0.30459 16010 0 20000 true true "RegressionGP"
The final model in the first optimization problem is infeasible because its compact size lies outside the constraint bounds. The second optimization problem has a minimum objective value of 0.30459 and is feasible. Display the properties of its final model.
details(Mdl{2})
RegressionGP with properties:
IsActiveSetVector: [501×1 logical]
LogLikelihood: -726.5793
ActiveSetHistory: []
BCDInformation: []
Y: [501×1 double]
X: [501×1 double]
RowsUsed: []
W: [501×1 double]
ModelParameters: [1×1 classreg.learning.modelparams.GPParams]
NumObservations: 501
BinEdges: {}
HyperparameterOptimizationResults: [1×1 BayesianOptimization]
PredictorNames: {'x1'}
CategoricalPredictors: []
ResponseName: 'Y'
ExpandedPredictorNames: {'x1'}
ResponseTransform: 'none'
KernelFunction: 'SquaredExponential'
KernelInformation: [1×1 struct]
BasisFunction: 'Constant'
Beta: 7.9887
Sigma: 0.0050
PredictorLocation: []
PredictorScale: []
Alpha: [501×1 double]
ActiveSetVectors: [501×1 double]
FitMethod: 'Exact'
PredictMethod: 'Exact'
ActiveSetMethod: 'Random'
ActiveSetSize: 501
Methods, Superclasses
The final attained model in the second optimization problem has a Sigma value of 0.0050.
Resume the hyperparameter optimization of the second problem, and widen the range of the Sigma variable so that its value can lie between 0 and 1. By default, the software resumes the optimization problem by calling fitrgp with the same hyperparameter optimization options as before, except that the value of MaxObjectiveEvaluations increases by 30. The resume function returns the new AggregateBayesianOptimization object newResults, and does not alter the original final attained models in Mdl.
newResults=resume(hpoResults, Results=2, ...
    VariableDescriptions={optimizableVariable(Sigma=[0 1])});
|=====================================================================================================|
| |
|Objective : "kfoldLoss" |
|Constraint : "CompactModelSize (bytes)" |
|Constraint Bounds : [0 20000] |
| |
|=====================================================================================================|
| Iter | Eval | Objective: | Objective | BestSoFar | BestSoFar | Constraint1 | Sigma |
| | result | log(1+loss) | runtime | (observed) | (estim.) | violation | |
|=====================================================================================================|
| 11 | Best | 0.21409 | 2.3345 | 0.21409 | 0.58932 | -3.99e+03 | 0.42134 |
| 12 | Accept | 0.41916 | 1.2331 | 0.21409 | 0.57514 | -3.99e+03 | 0.78937 |
| 13 | Accept | 0.42224 | 1.4673 | 0.21409 | 0.56338 | -3.99e+03 | 0.92588 |
| 14 | Accept | 0.41878 | 1.4059 | 0.21409 | 0.55305 | -3.99e+03 | 0.57528 |
| 15 | Accept | 0.29419 | 2.8614 | 0.21409 | 0.5358 | -3.99e+03 | 0.044621 |
| 16 | Best | 0.037932 | 2.1616 | 0.037932 | 0.50468 | -3.99e+03 | 0.40628 |
| 17 | Best | 0.037843 | 2.1971 | 0.037843 | 0.47722 | -3.99e+03 | 0.35807 |
| 18 | Best | 0.037799 | 1.7658 | 0.037799 | 0.37994 | -3.99e+03 | 0.29877 |
| 19 | Accept | 0.037959 | 1.9942 | 0.037799 | 0.32644 | -3.99e+03 | 0.17425 |
| 20 | Accept | 0.41884 | 1.216 | 0.037799 | 0.33267 | -3.99e+03 | 0.68206 |
|=====================================================================================================|
| Iter | Eval | Objective: | Objective | BestSoFar | BestSoFar | Constraint1 | Sigma |
| | result | log(1+loss) | runtime | (observed) | (estim.) | violation | |
|=====================================================================================================|
| 21 | Accept | 0.42224 | 1.6611 | 0.037799 | 0.33802 | -3.99e+03 | 0.99998 |
| 22 | Accept | 0.038774 | 1.6969 | 0.037799 | 0.31481 | -3.99e+03 | 0.11558 |
| 23 | Accept | 0.037833 | 1.8724 | 0.037799 | 0.26368 | -3.99e+03 | 0.23659 |
| 24 | Accept | 0.41775 | 1.3582 | 0.037799 | 0.27335 | -3.99e+03 | 0.50074 |
| 25 | Accept | 0.42108 | 1.6707 | 0.037799 | 0.27591 | -3.99e+03 | 0.85672 |
| 26 | Accept | 0.42008 | 1.3765 | 0.037799 | 0.27774 | -3.99e+03 | 0.62946 |
| 27 | Accept | 0.41894 | 1.4447 | 0.037799 | 0.27836 | -3.99e+03 | 0.7363 |
| 28 | Accept | 0.04287 | 1.9588 | 0.037799 | 0.27548 | -3.99e+03 | 0.085874 |
| 29 | Accept | 0.037806 | 2.4609 | 0.037799 | 0.21134 | -3.99e+03 | 0.26767 |
| 30 | Accept | 0.037884 | 2.2163 | 0.037799 | 0.17295 | -3.99e+03 | 0.20505 |
| 31 | Accept | 0.037812 | 2.0515 | 0.037799 | 0.14563 | -3.99e+03 | 0.32836 |
| 32 | Accept | 0.038065 | 1.7828 | 0.037799 | 0.12981 | -3.99e+03 | 0.14468 |
| 33 | Accept | 0.037881 | 1.875 | 0.037799 | 0.11592 | -3.99e+03 | 0.38218 |
| 34 | Accept | 0.42191 | 1.2999 | 0.037799 | 0.11426 | -3.99e+03 | 0.96471 |
| 35 | Accept | 0.28801 | 1.8456 | 0.037799 | 0.11272 | -3.99e+03 | 0.45958 |
| 36 | Accept | 0.039077 | 2.241 | 0.037799 | 0.1082 | -3.99e+03 | 0.10158 |
| 37 | Accept | 0.41781 | 2.1452 | 0.037799 | 0.10593 | -3.99e+03 | 0.53893 |
| 38 | Accept | 0.42012 | 2.6652 | 0.037799 | 0.10308 | -3.99e+03 | 0.82349 |
| 39 | Accept | 0.42202 | 4.2087 | 0.037799 | 0.099898 | -3.99e+03 | 0.89141 |
| 40 | Accept | 0.037856 | 10.716 | 0.037799 | 0.087188 | -3.99e+03 | 0.22074 |
__________________________________________________________
Optimization completed.
MaxObjectiveEvaluations of 40 reached.
Total function evaluations: 40
Total elapsed time: 115.2313 seconds
Total objective function evaluation time: 100.8449
Best observed feasible point:
Sigma
_______
0.29877
Observed objective function value = 0.037799
Estimated objective function value = 0.09704
Function evaluation time = 1.7658
Observed constraint violations =[ -3990.500000 ]
Best estimated feasible point (according to models):
Sigma
_______
0.22074
Estimated objective function value = 0.087188
Estimated function evaluation time = 2.4261
Estimated constraint violations =[ -3990.500000 ]
The new minimum objective and Sigma values are 0.037799 and 0.29877, respectively.
Input Arguments
AggregateResults — Aggregate optimization results
Aggregate optimization results, specified as an AggregateBayesianOptimization object.
Name-Value Arguments
Specify optional pairs of arguments as
Name1=Value1,...,NameN=ValueN, where Name is
the argument name and Value is the corresponding value.
Name-value arguments must appear after other arguments, but the order of the
pairs does not matter.
Example: resume(AggregateResults,Results=[1 3]) resumes the optimization of the first and third optimization problems in AggregateResults.
Note
The MaxTime and MaxObjectiveEvaluations name-value arguments specify additional time or objective evaluations, above the numbers stored in AggregateResults. For example, the default number of additional evaluations is 30, on top of the number specified originally.
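For instance, given that the examples above create AggregateResults with MaxObjectiveEvaluations=10, the following sketch runs 20 more evaluations per problem, for a total of 30:

```matlab
% The 20 evaluations are added on top of those already stored
% in AggregateResults
newResults = resume(AggregateResults,MaxObjectiveEvaluations=20);
```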
Optimization problems in AggregateResults to resume,
specified as "all", a numeric vector of positive integers
containing indices in the range [1,N], or a logical vector of
length N, where
N=numel(AggregateResults.HyperparameterOptimizationResults).
Example: Results=[1 3]
Data Types: single | double | logical
Variables to modify, specified as a P-by-1
cell array, where P must be equal to 1 or a
value that depends on the contents of Results.
| Contents of Results | Value of P |
|---|---|
| 'all' | numel(AggregateResults.HyperparameterOptimizationResults) |
| Numeric values | numel(Results) |
| Logical values | sum(Results) |
Each cell of VariableDescriptions contains a K-by-1 or 1-by-K array of optimizableVariable objects, where K is the number of optimizable variables in AggregateResults. The software applies the
contents of each cell to the
AggregateResults.HyperparameterOptimizationResults property of
the corresponding optimization problem with the index specified in
Results. If P=1, then the
software applies the cell contents to all optimization problems with the indices
specified in Results.
You can modify only the following properties of a variable in an optimization:
- Range of real or integer variables. For example:
  xvar = optimizableVariable(x=[-10,10]);
  % Modify the range:
  xvar.Range = [1,5];
- Type between "integer" and "real". For example:
  xvar.Type = "integer";
- Transform of real or integer variables between "log" and "none". For example:
  xvar.Transform = "log";
Data Types: cell
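For example, this sketch widens the range of a variable named Sigma in the first optimization problem only (the variable name is illustrative; use a name that exists in your results):

```matlab
% Widen the range of Sigma, then resume problem 1 with the
% modified variable description
sigmaVar = optimizableVariable(Sigma=[0 1]);
newResults = resume(AggregateResults,Results=1, ...
    VariableDescriptions={sigmaVar});
```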
Maximum number of objective function evaluations, specified as a
P-by-1 array of positive integers or
[]. The value of P must be equal to
1 or a value that depends on the contents of Results.
| Contents of Results | Value of P |
|---|---|
| 'all' | numel(AggregateResults.HyperparameterOptimizationResults) |
| Numeric values | numel(Results) |
| Logical values | sum(Results) |
If P=1, the software applies the value of MaxObjectiveEvaluations to all optimization problems in AggregateResults with the indices specified in Results. Otherwise, the software applies each element of MaxObjectiveEvaluations to the corresponding optimization problem with the index specified in Results.
If MaxObjectiveEvaluations is [], the default value depends on the fitting function and optimizer used to create AggregateResults. For more information, see the HyperparameterOptimizationOptions name-value argument description on the documentation pages of the individual fitting functions.
Example: MaxObjectiveEvaluations=40
Data Types: single | double
Time limit for the optimization, specified as [] or a
P-by-1 array. The array must contain
nonnegative integers or Inf. The value of P must
be equal to 1 or a value that depends on the contents of Results.
| Contents of Results | Value of P |
|---|---|
| 'all' | numel(AggregateResults.HyperparameterOptimizationResults) |
| Numeric values | numel(Results) |
| Logical values | sum(Results) |
If P=1, then the software applies the value of MaxTime to all optimization problems in AggregateResults with the indices specified in Results. Otherwise, the software applies each value of MaxTime to the corresponding optimization problem with the index specified in Results.
If MaxTime is [], the default value is
Inf.
Example: MaxTime=[30 60]
Data Types: single | double
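For example, this sketch grants 30 additional seconds to the first problem and 60 to the third, using a P-by-1 array that matches the problems selected in Results:

```matlab
% Additional time budget per resumed problem, in seconds
newResults = resume(AggregateResults,Results=[1 3], ...
    MaxTime=[30; 60]);
```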
Output Arguments
newAggregateResults — Optimization results
Optimization results, returned as an AggregateBayesianOptimization object.
Version History
Introduced in R2024b