Matlab and running parallel jobs on our workstations and compute server

Making binaries and using them to run multiple jobs with different parameters and settings on Hico and HyperspecLab*

Let’s say you want to create a standalone binary from a Matlab file. In Matlab, do the following:
mcc -m Main_CDPCA_Veg_eigGram_pt001.m -a ./*

Do this for all Main* functions you want to compile.
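
If you have many Main* files, you can also script the compilation from the system shell rather than typing each mcc call inside Matlab. This is just a sketch: it assumes the R2012a install path used later in this post, uses the standalone mcc driver in Matlab's bin folder, and quotes './*' so that mcc expands the wildcard itself:

for f in Main*.m; do
    /usr/local/MATLAB/R2012a/bin/mcc -m "$f" -a './*'    # one compile per Main* file
done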

Exit Matlab. For every function you compiled, there will be a generated shell script (run_*.sh) in that folder. Use nohup to launch each of these: nohup keeps the job running even after you exit the shell (for example, if you submitted the job remotely via SSH), and the trailing & puts it in the background. Run this way, you will not be checking out a Matlab license for every job, since you are only running the compiled executable on your machine.

nohup ./run_Main_CDPCA_Veg_eigGram_pt001.sh /usr/local/MATLAB/R2012a/ >LogRDWTCDPCA_pt001.txt &
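
The first argument to the generated run_*.sh script is the Matlab installation root, and the redirect captures the program's output in a log file you can check on later. To run several parameter settings at once, launch each script the same way; the pt01 script and log below are hypothetical names for a second compiled job, used only for illustration:

nohup ./run_Main_CDPCA_Veg_eigGram_pt01.sh /usr/local/MATLAB/R2012a/ >LogRDWTCDPCA_pt01.txt &    # hypothetical second job
tail -f LogRDWTCDPCA_pt001.txt    # watch a running job's log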

Now you can run as many executables concurrently as your system will allow without substantially slowing down the workstation. On our Linux desktop workstations, in my experience you can run 4-6 typical jobs involving comparison_analysis before CPU usage reaches 100%.
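
Before launching more jobs, it is worth checking how loaded the machine already is; standard Linux tools are enough for this:

nproc     # number of CPU cores on the machine
uptime    # load averages, to compare against the core count
top       # live per-process CPU and memory usage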

Issues with MCRCache when working in the UH_EGR Unix environment, and how to resolve them:

Matlab creates some MCRCache folders and files in your home directory (~). However, in the UH Engineering Unix environment, your home directory is not on your local desktop but on a server, and it has very little disk quota. As soon as you exhaust the quota, any attempt to compile a binary from within Matlab will likely fail. The fix is to change the folder where Matlab stores these temporary files. Your machine has a local /data folder where you can do this, for example:

mkdir -p /data/saurabh/tmp            # -p: no error if the folder already exists
TMPDIR=/data/saurabh/tmp/
export MCR_CACHE_ROOT=$TMPDIR         # point the MCR cache at local disk instead of your home directory
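
Two optional follow-ups, both sketches assuming a bash shell and the /data/saurabh/tmp folder created above: check how much of your home quota the old cache is consuming (the cache folders are typically named ~/.mcrCache<version>), and make the redirect permanent so it survives logouts:

du -sh ~/.mcrCache*                                              # size of the old cache under your home directory
echo 'export MCR_CACHE_ROOT=/data/saurabh/tmp/' >> ~/.bashrc     # persist the setting for future shells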

That solved my problems with compiling code.
