During a parfor-loop, I suddenly get the error "unable to read file"
parfor k = 1:100000
    % something else
    tmpStruct = load(filename);
    % something else
end
I have 3 scripts like the one above, and I am running them on 3 different nodes of a cluster.
After some iterations, one job gets the error "unable to read file, no such file or directory". This is confusing, since the file does exist and the other two jobs can read it.
I thought this was due to the limited number of file handles on the Linux system, but I don't understand how the load() function is related to file handles.
And if they are related, how can I avoid this "limited file handle" problem? I tried increasing the file handle limit on Linux, but it seems the limit will always be exceeded if I run several jobs together.
By the way, I am certain that I did not use fopen() anywhere in my script.
9 Comments
Mario Malic
on 12 Dec 2020
If workers are trying to read the same file at the same time, that will most likely cause an issue. Are you loading the same file? Does anything in it change, so that you have to load it every single time?
Xingwang Yong
on 12 Dec 2020
Mario Malic
on 12 Dec 2020
It can be troublesome, yes. I am not able to tell you exactly how to solve your issue; someone with more expertise in parallel computing may help. Try loading the file into a table (or variable) before the parfor call and adjust your code accordingly. It will also save you a lot of computing time, because you don't have to load the file on every iteration. If that doesn't work, see parfeval and spmd.
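The advice above can be sketched as follows. This is a minimal example, not the asker's actual code: someComputation and the data field are hypothetical placeholders, and filename is assumed to hold the path from the original snippet.
    % Load the MAT-file once, outside the loop.
    tmpStruct = load(filename);
    data = tmpStruct.data;   % "data" is an assumed field name; adjust to your file
    result = zeros(1, 100000);
    parfor k = 1:100000
        % "data" is a broadcast variable: it is sent to each worker once,
        % instead of every iteration hitting the file system with load().
        result(k) = someComputation(data, k);   % placeholder for the real work
    end
Note that broadcasting sends a copy of the variable to each worker, so this is only practical if the loaded data fits comfortably in worker memory.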
Xingwang Yong
on 12 Dec 2020
Mario Malic
on 13 Dec 2020
You could also check Composite and parallel.pool.Constant. I think it depends on the file system and the file format. I still think you should avoid load entirely, because it is inefficient.
You mentioned that 3 nodes of the cluster are running this; the question is, how do they load the file? Do you supply a full or a relative path? Does each node have its own copy of the file?
The error message "unable to read file, no such file or directory" implies that the file is either corrupted, MATLAB cannot read it (does not have read access), or the path to the file is incorrect.
I don't know how load works internally, but when a file is opened, access to it may or may not be locked for other processes. Even though it takes almost no time to load the file, you have multiple workers that may access the same file at the same time (which might be problematic, I can't tell for sure) and cause such an error.
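A sketch of the parallel.pool.Constant approach mentioned above, assuming filename is the same path as in the original snippet: each worker runs the function handle once and caches the result, so the file is read once per worker rather than once per iteration.
    % Build the constant once; the function handle is evaluated a single
    % time on each worker and the result is cached there.
    C = parallel.pool.Constant(@() load(filename));
    parfor k = 1:100000
        tmpStruct = C.Value;   % cached per worker; no repeated file reads
        % ... use tmpStruct as before ...
    end
With a pool of N workers this reduces the number of concurrent load() calls from up to 100000 to N, which also sidesteps most file-contention issues.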
Xingwang Yong
on 13 Dec 2020
Mario Malic
on 13 Dec 2020
Edited: Mario Malic
on 13 Dec 2020
Search here for your issue; I have seen comments that network drives can cause read-access problems. You can try creating a copy of the file for each node, though that still won't completely eliminate the problem, since each node has multiple workers that try to access the file. That's why I recommend avoiding load, at least within the parfor loop.
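If the load() call has to stay inside the loop, a retry wrapper can at least ride out transient "no such file or directory" errors that network file systems sometimes report under heavy concurrent access. This is a hypothetical workaround sketch, not a fix for the underlying contention; loadWithRetry is a name introduced here for illustration.
    function s = loadWithRetry(filename, maxTries)
        % Re-attempt load() up to maxTries times, pausing between attempts.
        for attempt = 1:maxTries
            try
                s = load(filename);
                return
            catch err
                if attempt == maxTries
                    rethrow(err)   % give up after the last attempt
                end
                pause(0.5 * attempt);   % simple backoff before retrying
            end
        end
    end
Inside the parfor loop you would then call tmpStruct = loadWithRetry(filename, 5) instead of load(filename).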
Xingwang Yong
on 14 Dec 2020
Answers (0)