how to reduce memory cost for transfer learning with semantic segmentation

I am trying to use transfer learning with semantic segmentation to classify 960x720x3 images. I am using MATLAB on my Surface with 8 GB RAM and a built-in GPU. When I try to run the code, I get a message that the memory is not sufficient. To resolve the issue I thought about using PCA on my pixel data, but I only found a comparable example that used parallel computing.
Therefore I would like to ask if there are any other solutions to reduce the memory cost of my code, or if there is any way to get access to more memory via MATLAB (perhaps something like cloud computing)?

Answers (1)

Srivardhan Gadila on 15 Apr 2021
Try reducing the mini-batch size using the 'MiniBatchSize' option of trainingOptions.
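A minimal sketch of what that could look like, assuming an SGDM solver and placeholder names (pximds for the pixel-label training data and lgraph for the network) that you would replace with your own variables:

```matlab
% Reduce peak GPU memory by lowering 'MiniBatchSize'.
% Large 960x720x3 inputs may need a batch size of 4, 2, or even 1.
opts = trainingOptions('sgdm', ...
    'MiniBatchSize', 2, ...          % smaller batches use less memory per iteration
    'InitialLearnRate', 1e-3, ...
    'MaxEpochs', 10, ...
    'Plots', 'training-progress');

% net = trainNetwork(pximds, lgraph, opts);  % pximds/lgraph are your data/network
```

Note that a smaller mini-batch size gives noisier gradient estimates, so you may also need to lower the learning rate.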
If reducing the mini-batch size does not work, then try using a smaller network, reducing the number of layers, or reducing the number of parameters or filters in the layers.
If the GPU memory is still insufficient, you can train the network on the CPU by using the 'ExecutionEnvironment' option of trainingOptions.
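For example, a sketch of forcing CPU training (slower, but limited only by system RAM rather than GPU memory):

```matlab
% Fall back to CPU training when the GPU runs out of memory.
opts = trainingOptions('sgdm', ...
    'MiniBatchSize', 2, ...
    'ExecutionEnvironment', 'cpu');  % alternatives include 'auto' (default) and 'gpu'
```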
You can refer to Deep Learning in Parallel and in the Cloud and Deep Learning in the Cloud for information related to cloud computing.
