# aic

Akaike Information Criterion for estimated model

## Syntax

`am = aic(model)`

`am = aic(model1,model2,...)`

## Description

`am = aic(model)` returns a scalar Akaike's Information Criterion (AIC) value for the estimated `model`.

`am = aic(model1,model2,...)` returns a row vector containing AIC values for the estimated models `model1,model2,...`.

## Arguments

`model`

Name of an `idtf`, `idgrey`, `idpoly`, `idproc`, `idss`, `idnlarx`, `idnlhw`, or `idnlgrey` model object.

## More About

### Akaike's Information Criterion (AIC)

Akaike's Information Criterion (AIC) provides a measure of model quality by simulating the situation where the model is tested on a different data set. After computing several different models, you can compare them using this criterion. According to Akaike's theory, the most accurate model has the smallest AIC.

Note: If you use the same data set for both model estimation and validation, the fit always improves as you increase the model order and, therefore, the flexibility of the model structure.

Akaike's Information Criterion (AIC) is defined by the following equation:

$AIC = \log V + \frac{2d}{N}$

where V is the loss function, d is the number of estimated parameters, and N is the number of values in the estimation data set.

The loss function V is defined by the following equation:

$V = \det\left(\frac{1}{N}\sum_{t=1}^{N} \varepsilon\left(t,\hat{\theta}_N\right)\varepsilon\left(t,\hat{\theta}_N\right)^{T}\right)$

where $\hat{\theta}_N$ represents the estimated parameters.
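The two equations above can be computed directly from a matrix of prediction-error residuals. The following is an illustrative sketch, not the toolbox implementation; `residuals`, `d`, and the sample data are hypothetical names chosen for this example:

```python
import numpy as np

def loss_function(residuals):
    """V = det((1/N) * sum_t eps(t) eps(t)^T), for residuals of shape (N, ny)."""
    N = residuals.shape[0]
    return np.linalg.det(residuals.T @ residuals / N)

def aic_value(residuals, d):
    """AIC = log V + 2d/N, with d estimated parameters and N data samples."""
    N = residuals.shape[0]
    return np.log(loss_function(residuals)) + 2 * d / N

# Hypothetical residuals from a single-output model with d = 4 parameters
rng = np.random.default_rng(0)
eps = rng.standard_normal((200, 1))
print(aic_value(eps, d=4))
```

For a single-output model the determinant reduces to the mean squared prediction error, so `loss_function` covers both the scalar and multi-output cases.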

For $d \ll N$:

$AIC=\mathrm{log}\left(V\left(1+\frac{2d}{N}\right)\right)$
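A quick numeric check, with hypothetical values for V, d, and N, shows why this form agrees with the definition above: since $\log(1+x) \approx x$ for small $x$, the two expressions differ only at second order in $2d/N$:

```python
import numpy as np

V, d, N = 0.5, 4, 1000  # hypothetical loss value, parameter count, sample count
exact = np.log(V) + 2 * d / N          # AIC by its definition
approx = np.log(V * (1 + 2 * d / N))   # form valid for d << N
print(exact, approx)
```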

Note: AIC is approximately equal to log(FPE).
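Taking FPE to be Akaike's Final Prediction Error, $FPE = V\,(1 + d/N)/(1 - d/N)$ as given in Ljung (1999) (that formula is not stated on this page, so treat it as an assumption here), the approximation in the note can be checked numerically:

```python
import numpy as np

V, d, N = 0.5, 4, 1000  # hypothetical loss value, parameter count, sample count
aic = np.log(V) + 2 * d / N
fpe = V * (1 + d / N) / (1 - d / N)  # assumed FPE definition (Ljung, 1999)
print(aic, np.log(fpe))  # nearly identical when d << N
```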

## References

Ljung, L. *System Identification: Theory for the User*. 2nd ed. Upper Saddle River, NJ: Prentice-Hall PTR, 1999. See the sections on the statistical framework for parameter estimation, the maximum likelihood method, and comparing model structures.
