Reduce learning rate after a certain number of epochs
Hi, I have a question about reducing the learning rate (or using another technique) to improve accuracy while training a deep learning model.
Suppose the loss does not improve for a defined number of epochs (for example, 8). Is there a way to reduce the learning rate at that point?
Alternatively, if the accuracy (or loss) fails to improve after a certain number of epochs, what other approaches could help improve accuracy?
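The post does not name a framework, so as a hedged sketch of the idea, here is a plain-Python helper (the function name `reduce_lr_on_plateau` and its parameters are hypothetical, not from any library) that checks whether the loss has plateaued for `patience` epochs and, if so, scales the learning rate down by `factor`:

```python
def reduce_lr_on_plateau(lr, loss_history, patience=8, factor=0.5, min_lr=1e-6):
    """Return a reduced learning rate if the loss has not improved
    during the last `patience` epochs; otherwise return lr unchanged.

    loss_history: list of per-epoch loss values, oldest first.
    """
    if len(loss_history) <= patience:
        # Not enough epochs yet to judge a plateau.
        return lr
    best_before = min(loss_history[:-patience])   # best loss before the window
    recent_best = min(loss_history[-patience:])   # best loss inside the window
    if recent_best >= best_before:
        # No improvement in the last `patience` epochs: decay the rate,
        # but never below min_lr.
        return max(lr * factor, min_lr)
    return lr


# Example: loss stuck at 0.5 for the last 8 epochs -> rate is halved.
lr = reduce_lr_on_plateau(0.01, [1.0] + [0.5] * 9, patience=8, factor=0.5)
print(lr)  # 0.005
```

For reference, mainstream frameworks ship this behavior as a built-in schedule, e.g. Keras's `ReduceLROnPlateau` callback or PyTorch's `torch.optim.lr_scheduler.ReduceLROnPlateau`, which also track a "best so far" metric and apply a decay factor after a patience window.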
Thank you all for your time and consideration.