This MATLAB script implements a neural network to predict a target variable (range) from a dataset containing four input features (weights). It normalizes the inputs and outputs, defines a feedforward network, trains it, and evaluates the predictions, but the error is too high. Can someone help me improve it?
clc
clear
close all
% Load the dataset from a CSV file with headers
data = readtable('C:\Users\PMLS\Downloads\output_file.csv');
% Convert the table to a numeric array
dataArray = table2array(data);
% Normalize inputs and outputs using mapminmax
X = mapminmax(dataArray(:, 1:4)', 0, 1); % Normalize input features to [0, 1]
Y = mapminmax(dataArray(:, 5)', 0, 1); % Normalize target variable to [0, 1]
% Set display format to long for better precision
format long;
%rng(0); % Set random seed for reproducibility
% Create a feedforward network with two hidden layers of 70 and 40 neurons
net = feedforwardnet([70,40]);
% Change activation functions
net.layers{1}.transferFcn = 'poslin'; % ReLU
net.layers{2}.transferFcn = 'logsig'; % sigmoid
% Set the training function to Resilient Backpropagation
net.trainFcn = 'trainrp';
% Split the data into contiguous blocks for training and testing
net.divideFcn = 'divideblock';
net.divideParam.trainRatio = 0.8; % 80% training data
net.divideParam.valRatio = 0; % no validation split here (see the answer below)
net.divideParam.testRatio = 0.2; % 20% testing data
% Custom training parameters
net.trainParam.epochs = 8000;
net.trainParam.goal = 0.0001;
net.trainParam.min_grad = 1e-6;
net.trainParam.max_fail = 2000;
net.trainParam.lr = 0.001; % Learning rate (trainrp adapts its own step sizes, so this has little effect)
% Note: trainrp has no momentum or batch-size parameters ('mc' belongs to
% traingdm/traingdx, and train() always uses full-batch updates)
% Add regularization
net.performParam.regularization = 0.01; % Example L2 regularization
% Train the neural network
[net, tr] = train(net, X, Y);
% Predict and save results
predicted_ranges = net(X);
outputTable = array2table([X' Y' predicted_ranges'], ...
'VariableNames', {'w1', 'w2', 'w3', 'wpl', 'Actual_Range', 'Predicted_Range'});
writetable(outputTable, 'predicted_ranges_with_trainrp.xlsx');
% Calculate performance metrics
Y = Y'; % Transpose to match predicted output dimensions
predicted_ranges = predicted_ranges';
MAE = mean(abs(Y - predicted_ranges)); % Mean Absolute Error
MSE = mean((Y - predicted_ranges).^2); % Mean Squared Error
RMSE = sqrt(MSE); % Root Mean Square Error
R_squared = 1 - sum((Y - predicted_ranges).^2) / sum((Y - mean(Y)).^2); % R-squared
% Display the results in the command window
disp('Performance Metrics:');
disp(['Mean Absolute Error (MAE): ', num2str(MAE)]);
disp(['Mean Squared Error (MSE): ', num2str(MSE)]);
disp(['Root Mean Square Error (RMSE): ', num2str(RMSE)]);
disp(['R-squared (R^2): ', num2str(R_squared)]);
% Plot performance
plotperform(tr);
view(net);
% Plot the actual and predicted values as lines
figure;
plot(Y, 'b-', 'LineWidth', 1.5); % Plot actual values as a blue line
hold on;
plot(predicted_ranges, 'r--', 'LineWidth', 1.5); % Plot predicted values as a red dashed line
xlabel('Sample');
ylabel('Normalized Range');
legend('Actual Range', 'Predicted Range');
title('Comparison of Actual and Predicted Ranges');
grid on;
Answers (1)
Shantanu Dixit
on 30 Oct 2024 at 6:39
Hi Farrukh,
You can experiment with the techniques below to improve your model's generalization and performance:
- Experiment with normalization: try standardizing the inputs with 'zscore' in addition to scaling with 'mapminmax'. 'zscore' centers each feature on its mean with unit standard deviation, which is particularly useful when the features have different scales (see the sketch just after this list).
- Incorporate a validation set: adding a validation split lets you monitor performance during training and guard against overfitting, e.g. 70% training, 10% validation, and 20% testing (see the code below).
- Hyperparameter tuning: vary the number of neurons and layers, and adjust training parameters such as the learning rate, epochs, and regularization. You can also compare training functions like 'traingd', 'traingdm', and 'trainlm' (see the tuning sketch at the end).
- Data quantity and quality: model performance depends heavily on the quality and quantity of data. If the dataset is small, adding data points can help the model learn and generalize better; also make sure the data is representative of the problem space and includes diverse examples that capture the relevant patterns.
- Monitor performance on the validation set: predicting on the validation data is a useful way to evaluate how well the model generalizes during training.
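For the standardization suggestion, a minimal sketch, assuming the same dataArray layout as in your script (columns 1 to 4 as inputs, column 5 as the target); 'zscore' is part of the Statistics and Machine Learning Toolbox:
% Standardize each column to zero mean and unit standard deviation
[Xz, muX, sigmaX] = zscore(dataArray(:, 1:4));
[Yz, muY, sigmaY] = zscore(dataArray(:, 5));
X = Xz'; % the network expects one column per sample
Y = Yz';
% To convert predictions back to the original units later:
% predicted_original = net(X) * sigmaY + muY;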
% Reserve a validation split so that tr.valInd is populated
net.divideFcn = 'dividerand';
net.divideParam.trainRatio = 0.7;
net.divideParam.valRatio = 0.1;
net.divideParam.testRatio = 0.2;
[net, tr] = train(net, X, Y); % X and Y are the normalized arrays (one column per sample)
%% extract validation data indices
valIndices = tr.valInd;
X_val = X(:, valIndices);
Y_val = Y(valIndices);
%% predict on validation and measure the error
predicted_val_ranges = net(X_val);
val_MSE = mean((Y_val - predicted_val_ranges).^2);
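And for the tuning suggestion, one simple pattern is to loop over a few candidate configurations and keep the network with the lowest validation MSE. A minimal sketch follows; the layer sizes and training functions below are illustrative choices, not tuned values:
% Compare a few architectures and training functions by validation MSE
hiddenLayerSizes = {10, 20, [40 20]}; % candidate hidden-layer configurations
trainFcns = {'trainrp', 'trainlm'}; % candidate training functions
bestValMSE = Inf;
for i = 1:numel(hiddenLayerSizes)
    for j = 1:numel(trainFcns)
        candidate = feedforwardnet(hiddenLayerSizes{i}, trainFcns{j});
        candidate.divideFcn = 'dividerand';
        candidate.divideParam.trainRatio = 0.7;
        candidate.divideParam.valRatio = 0.1;
        candidate.divideParam.testRatio = 0.2;
        candidate.trainParam.showWindow = false; % keep the loop quiet
        [candidate, trc] = train(candidate, X, Y);
        pred = candidate(X(:, trc.valInd));
        valMSE = mean((Y(trc.valInd) - pred).^2);
        if valMSE < bestValMSE % keep the best candidate so far
            bestValMSE = valMSE;
            bestNet = candidate;
        end
    end
end
fprintf('Best validation MSE: %g\n', bestValMSE);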
I hope this helps in getting a better idea of how to improve your model's generalization and performance! For more details, refer to the MathWorks documentation on the different neural network training methods.