Training a convolutional neural network with MatConvNet using an HDF5 file
Hi,
I have a large dataset (~1 GB, hopefully expanding to ~100 GB) stored hierarchically in an HDF5 file which I'd like to use with neural networks, specifically the MatConvNet package with MexConv3D (and R2016b). Ideally I'd like to avoid loading the entire file into memory, so is there a way of achieving something similar to the 'matfile' command with HDF5? Alternatively, is there a way to do something along the lines of (in pseudocode):
% For all images in the file:
for k = 1:numImages                    % numImages = number of images stored
    image = h5read('file.h5', '/Path/to/dataentry', [1 1 k], [H W 1]);  % load in one image
    train_neural_net(image)            % do the bit of training needed just on this one image
    clear image                        % wipe the image from memory, keeping just the net
end
Many thanks in advance
2 Comments
per isakson
on 26 Oct 2017
- "something similar to the 'matfile' command with hdf5" No.
- "Alternatively, is there a way to do something along the lines of (in pseudocode):" Yes, that's straightforward.
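For anyone looking for the concrete version of that loop: MATLAB's built-in h5info and h5read support partial reads via start/count arguments, so only one image ever sits in memory. A minimal sketch, assuming a dataset '/images' of size [H W N] (the file name, dataset path, layout, and train_neural_net are placeholders; adjust them to your file and training code):

fname = 'data.h5';                     % hypothetical file name
info  = h5info(fname, '/images');      % query dimensions without reading any data
sz    = info.Dataspace.Size;           % e.g. [H W N]
for k = 1:sz(3)
    img = h5read(fname, '/images', [1 1 k], [sz(1) sz(2) 1]);  % read one slice from disk
    net = train_neural_net(net, img);  % placeholder for your MatConvNet training step
    clear img                          % only the net persists between iterations
end

h5read only reads the requested hyperslab from disk, so memory use stays at one image regardless of how large the file grows.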
sally
on 14 Jan 2018
An unrelated comment to the answer, but a question please: have you used the MexConv3D package as a CNN for 3D input images? If so, how do you train this algorithm on a large dataset of 3D images, and how do you update the weights?
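The same partial-read approach extends to 3D volumes and mini-batch training: read a small batch of volumes per iteration with h5read, then apply one gradient step. A sketch assuming a dataset '/volumes' of size [X Y Z N] and a placeholder sgd_step (MexConv3D's actual training API is not shown here):

fname = 'volumes.h5';                  % hypothetical file name
info  = h5info(fname, '/volumes');
sz    = info.Dataspace.Size;           % [X Y Z N]
batch = 8;                             % volumes per mini-batch
for i = 1:batch:sz(4)
    n = min(batch, sz(4) - i + 1);     % last batch may be smaller
    vols = h5read(fname, '/volumes', [1 1 1 i], [sz(1) sz(2) sz(3) n]);
    net  = sgd_step(net, vols);        % placeholder: forward/backward pass, then update
    clear vols
end

The weight update itself is ordinary stochastic gradient descent: for each parameter w, set w = w - lr * dL/dw, where the gradients come from backpropagation over the mini-batch.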
Accepted Answer
More Answers (0)