Failed to start a 'local' parpool of 32 workers
I tried to start a 'local' parpool of 32 workers, but it failed. This is MATLAB R2017b on an HPC system that has 32 MATLAB DCS licenses.
>> c = parcluster()
Number Pending: 0
Number Queued: 0
Number Running: 0
Number Finished: 0
>> p = c.parpool(32)
Starting parallel pool (parpool) using the 'local' profile ...
Error using parallel.Cluster/parpool (line 86)
Failed to start a parallel pool. (For information in addition to the causing
error, validate the profile 'local' in the Cluster Profile Manager.)
Error using parallel.internal.pool.InteractiveClient>iThrowWithCause (line
Failed to start pool.
Error using parallel.Job/submit (line 351)
An unexpected error occurred accessing properties: "CaptureDiary"
"CreateDateTime" "CreateTime" "DependentFiles" "Diary" "Error"
"ErrorIdentifier" "ErrorMessage" "FinishDateTime" "FinishTime"
"Function" "InputArguments" "DiagnosticWarnings" "Name"
"NumOutputArguments" "OutputArguments" "StartDateTime" "StartTime"
Error using save
Error closing file
The file may be corrupt.
I tried distcomp.feature('LocalUseMpiexec', false) but it didn't solve the problem. Do you have any idea why?
Jason Ross on 2 Jan 2019
There seems to be an issue accessing your home directory. Try changing the JobStorageLocation to a local directory on the host, e.g. make a directory called /tmp/ninhdo/jobstorage and change the "JobStorageLocation" in the "Local" profile to point there. You can change this property through the Parallel > Manage Cluster Configurations menu, just edit the "Local" profile.
As for the underlying issue with your home directory: the job storage file may be corrupt, it may be in use by another local cluster elsewhere, or there may be a permissions problem to/from this host.
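The same change can be made programmatically instead of through the profile manager. A minimal sketch, assuming the /tmp/ninhdo/jobstorage path suggested above (substitute any node-local directory you have write access to):

```matlab
% Create a node-local directory for job storage (example path from above)
storageDir = '/tmp/ninhdo/jobstorage';
if ~exist(storageDir, 'dir')
    mkdir(storageDir);
end

% Point the 'local' cluster at the new job storage location
c = parcluster('local');
c.JobStorageLocation = storageDir;
saveProfile(c);          % persist the change in the 'local' profile

% Retry starting the pool
p = c.parpool(32);
```

Note that JobStorageLocation must be writable by the MATLAB process on the node where the pool starts; if the original error was caused by a corrupt or locked file under the default location in your home directory, switching to a fresh directory like this sidesteps it.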