How to read extremely large HDF5 data and resolve out-of-memory issues?
I need to read several datasets from a 2 TB HDF5 file for further computation.
If I simply code it as follows,
variable1 = h5read('path to .h5 file', 'path to dataset');
it would require ~500 GB of array memory.
Is there a good way to solve this problem?
Thanks!
Answers (1)
ROSEMARIE MURRAY
on 3 May 2022
You could use a fileDatastore with h5read as the read function, which lets you specify how much to read at a time (see the sketch below).
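Here is a minimal sketch of that approach, assuming a 2-D dataset and the 'partialfile' read mode (R2020a or later) so each read() call returns one block instead of the whole file. The file name, dataset path, and block size are placeholders; adjust them to your data:

fds = fileDatastore('path_to_file.h5', ...
    'ReadFcn', @readBlock, 'ReadMode', 'partialfile');
while hasdata(fds)
    block = read(fds);   % one manageable block at a time
    % ... process block, keep only the reduced results ...
end

function [data, state, done] = readBlock(filename, state)
% Read the next block of rows from one dataset using h5read's
% start/count arguments (placeholder dataset path and block size).
dsname    = '/path/to/dataset';
blockRows = 100000;              % rows per read; tune to fit memory
if isempty(state)                % first call for this file
    info  = h5info(filename, dsname);
    state = struct('next', 1, 'dims', info.Dataspace.Size);
end
nRows = min(blockRows, state.dims(1) - state.next + 1);
data  = h5read(filename, dsname, ...
               [state.next 1], [nRows state.dims(2)]);
state.next = state.next + nRows;
done  = state.next > state.dims(1);
end

If you don't need the datastore machinery, the same start/count arguments to h5read work just as well in a plain for-loop.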
Or you could try this datastore: https://www.mathworks.com/matlabcentral/fileexchange/64919-hdf5-custom-file-datastore-for-timeseries-in-matlab