Neural Net Time Series
(To be removed) Solve nonlinear time series problems using dynamic neural networks
The Neural Net Time Series app will be removed in a future release. For more information, see Transition Legacy Neural Network Code to dlnetwork Workflows.
For advice on updating your code, see Version History.
Description
The Neural Net Time Series app lets you create, visualize, and train dynamic neural networks to solve three different kinds of nonlinear time series problems.
Using this app, you can:
Create three types of neural networks: nonlinear autoregressive with external input (NARX) networks, nonlinear autoregressive (NAR) networks, and nonlinear input-output networks.
Import data from a file or the MATLAB® workspace, or use one of the example data sets.
Split data into training, validation, and test sets.
Define and train a neural network.
Evaluate network performance using mean squared error and regression analysis.
Analyze results using visualization plots, such as autocorrelation plots or a histogram of errors.
Generate MATLAB scripts to reproduce results and customize the training process.
Generate functions suitable for deployment with MATLAB Compiler™ and MATLAB Coder™ tools, and export to Simulink® for use with Simulink Coder.
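The generated scripts follow the standard command-line workflow for dynamic networks. As a rough sketch of that workflow for a NARX problem (using the `simplenarx_dataset` example data set shipped with the toolbox; the delay ranges and hidden layer size here are illustrative choices, not app defaults):

```matlab
% Sketch of the command-line equivalent of the app workflow for a NARX problem.
[X,T] = simplenarx_dataset;          % example input and target time series (cell arrays)

net = narxnet(1:2,1:2,10);           % NARX network: input/feedback delays 1:2, 10 hidden neurons

% Split the data into training, validation, and test sets (70/15/15).
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;

% Shift the time series to fill the network's tapped delay lines.
[Xs,Xi,Ai,Ts] = preparets(net,X,{},T);

net = train(net,Xs,Ts,Xi,Ai);        % train with the default algorithm
Y = net(Xs,Xi,Ai);                   % simulate the trained network
perf = perform(net,Ts,Y)             % mean squared error
```

For NAR problems, `narnet` plays the role of `narxnet`; for nonlinear input-output problems, `timedelaynet` does.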
Tip
To interactively build and train deep neural networks for time series tasks, use the Time Series Modeler app. For more information, see Get Started with Time Series Forecasting.
Open the Neural Net Time Series App
Before R2026a: MATLAB Toolstrip: On the Apps tab, under Machine Learning and Deep Learning, click the app icon.
MATLAB command prompt: Enter `ntstool`.
Algorithms
The Neural Net Time Series app provides built-in training algorithms that you can use to train your neural network.
| Training Algorithm | Description |
|---|---|
| Levenberg-Marquardt | Updates the weight and bias values according to Levenberg-Marquardt optimization. Levenberg-Marquardt training is often the fastest training algorithm, although it requires more memory than other techniques. To implement this algorithm, the Neural Net Time Series app uses the `trainlm` function. |
| Bayesian regularization | Updates the weight and bias values according to Levenberg-Marquardt optimization, then minimizes a combination of squared errors and weights to determine the combination that produces a network that generalizes well. This algorithm typically takes longer but is good at generalizing to noisy or small data sets. To implement this algorithm, the Neural Net Time Series app uses the `trainbr` function. |
| Scaled conjugate gradient backpropagation | Updates the weight and bias values according to the scaled conjugate gradient method. For large problems, scaled conjugate gradient is recommended because it uses gradient calculations, which are more memory efficient than the Jacobian calculations that Levenberg-Marquardt and Bayesian regularization use. To implement this algorithm, the Neural Net Time Series app uses the `trainscg` function. |
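At the command line, the training algorithm is selected by setting the network's `trainFcn` property before training. A minimal sketch for a NAR problem (using the `simplenar_dataset` example data set; the delay range and hidden layer size are illustrative choices):

```matlab
% Select a training algorithm by setting the network's trainFcn property.
T = simplenar_dataset;               % example target series for a NAR problem

net = narnet(1:2,10);                % NAR network: feedback delays 1:2, 10 hidden neurons
net.trainFcn = 'trainbr';            % Bayesian regularization ('trainlm' and 'trainscg' also apply)

% Shift the series to fill the delay lines, then train.
[Xs,Xi,Ai,Ts] = preparets(net,{},{},T);
net = train(net,Xs,Ts,Xi,Ai);
```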
Version History
See Also
Time Series Modeler | Deep Network Designer | Deep Network Quantizer | Experiment Manager | Classification Learner (Statistics and Machine Learning Toolbox) | Regression Learner (Statistics and Machine Learning Toolbox) | fitrnet (Statistics and Machine Learning Toolbox) | fitcnet (Statistics and Machine Learning Toolbox) | trainnet | trainingOptions | dlnetwork
