Using std2 with 3D Array on GPU
I have a very large 3D image array (512 x 512 x 28000) uint16.
I am trying to compute a standard-deviation projection along the third dimension, producing a 512 x 512 uint16 array. My current code is below:
vol = randi(intmax('uint16'), 512, 512, 28000, 'uint16'); % placeholder array
stdVol = std(double(vol), [], 3);
stdVol = uint16(stdVol);
However due to the size of array, I get an out of memory error.
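One way around the out-of-memory error (independent of the GPU) is to avoid ever holding a `double` copy of the full volume: accumulate the running sum and sum of squares slab by slab, then form the standard deviation from those two accumulators. A minimal sketch, assuming `vol` is the 512 x 512 x 28000 uint16 volume; `slabSize` is a tuning parameter I've introduced for illustration:

```matlab
% Streaming std along dim 3: only one slab is in double at a time.
[r, c, n] = size(vol);
sumX  = zeros(r, c);            % running sum
sumX2 = zeros(r, c);            % running sum of squares
slabSize = 500;                 % tune to available RAM
for k = 1:slabSize:n
    idx   = k:min(k + slabSize - 1, n);
    slab  = double(vol(:, :, idx));
    sumX  = sumX  + sum(slab, 3);
    sumX2 = sumX2 + sum(slab.^2, 3);
end
mu     = sumX / n;
% Sample standard deviation (normalised by n-1, matching std's default)
stdVol = uint16(sqrt((sumX2 - n*mu.^2) / (n - 1)));
```

In double precision the sum-of-squares accumulator is safe here (uint16 values squared and summed over 28000 slices stay well within double's exactly-representable integer range).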
To address this I tried using tall arrays:
volTall = tall(vol);
stdVolTall = std(double(volTall), 0, 3);
stdVol = uint16(stdVolTall);
gather(stdVol);
But in this case I get the following error:
Error using tall/double Requested 7523532800x1 (56.1GB) array exceeds maximum array size preference. Creation of arrays greater than this limit may take a long time and cause MATLAB to become unresponsive. See array size limit or preference panel for more information. Learn more about errors encountered during GATHER.
I would like to use std2 (or an equivalent) on the GPU to work around this error; any help would be greatly appreciated.
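Note that `std2` reduces a whole 2D image to a single scalar, so the operation wanted here is still `std(..., [], 3)`; the memory problem, not the function choice, is the obstacle. The same streaming accumulation can be moved onto the GPU with Parallel Computing Toolbox, keeping only one slab on the device at a time. A sketch under those assumptions (`slabSize` is again a hypothetical tuning parameter):

```matlab
% Streaming std along dim 3 on the GPU: each slab is transferred,
% reduced, and discarded, so device memory holds one slab at most.
[r, c, n] = size(vol);
sumX  = zeros(r, c, 'gpuArray');
sumX2 = zeros(r, c, 'gpuArray');
slabSize = 200;                 % tune to available GPU memory
for k = 1:slabSize:n
    idx   = k:min(k + slabSize - 1, n);
    slab  = double(gpuArray(vol(:, :, idx)));
    sumX  = sumX  + sum(slab, 3);
    sumX2 = sumX2 + sum(slab.^2, 3);
end
mu     = sumX / n;
stdVol = uint16(gather(sqrt((sumX2 - n*mu.^2) / (n - 1))));
```

Whether this beats the CPU version depends on PCIe transfer time versus the per-slab reduction cost, so it is worth timing both.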