Neural Network Loss Function: Mean (absolute) Cubic Error
Hello,
for my neural network it is very important that the error range is not large, i.e. a higher mean error is preferable to a larger error range.
That's why I'd like to implement a different loss function. My network has a regressionLayer output, which computes the loss based on the mean squared error. To give larger errors more weight, I'd like to change that to a mean cubic error.
The standard loss function of the regression layer is the mean squared error, 1/R * sum_{i=1..R} (t_i - y_i)^2, and I'd like to make a small change to 1/R * sum_{i=1..R} (t_i - y_i)^3, or alternatively 1/R * sum_{i=1..R} |t_i - y_i|^3.
Is there a reasonably simple way to do that?
Thank you for your help in advance,
Best regards
Answers (1)
Torsten
on 21 Mar 2022
You want the error to be negative if t_i < y_i?
That won't work: the loss function should always be non-negative.
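For the absolute variant |t_i - y_i|^3, which stays non-negative, here is a minimal sketch of a custom regression output layer, following the Deep Learning Toolbox pattern of subclassing nnet.layer.RegressionLayer and overriding forwardLoss. The class and layer names are made up for illustration:

classdef maceRegressionLayer < nnet.layer.RegressionLayer
    % Custom output layer computing the mean absolute cubic error.

    methods
        function layer = maceRegressionLayer(name)
            % Set the layer name and description.
            layer.Name = name;
            layer.Description = 'Mean absolute cubic error';
        end

        function loss = forwardLoss(layer, Y, T)
            % Y are the predictions, T the targets.
            % Cubing the absolute differences penalizes large errors
            % more strongly than squaring them, while the loss
            % remains non-negative.
            cubicError = abs(Y - T).^3;

            % Average over all responses and observations.
            loss = sum(cubicError, 'all') / numel(Y);
        end
    end
end

The layer would then replace regressionLayer at the end of the layer array, e.g. layers = [ ... ; fullyConnectedLayer(1); maceRegressionLayer('mace')]. On recent releases the backward pass can be derived automatically as long as forwardLoss only uses operations that support automatic differentiation; on older releases a backwardLoss method has to be added as well.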