...
The HPC cluster uses the university site license for SAS. More information is available here: https://software.uconn.edu/software/sas/.
...
Because SAS depends on older system libraries, it can only run in a containerized environment through single-node job submissions; multi-node submissions are not supported.
Using SAS on the cluster
...
You can view a list of all available Apptainer versions with:
Code Block
module avail apptainer
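Then load one of the listed versions, or omit the version to load the default. The version number below is only an example; use one that module avail actually shows:
Code Block
module load apptainer/1.1.3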
...
To submit a job that uses a single core to run a SAS file myprog.sas, create a script called sasSP.sh:
Code Block
#!/bin/bash
#SBATCH --ntasks=1
#SBATCH --output=outputfile.txt
#SBATCH --error=outputfile.txt
module load apptainer
apptainer exec --unsquash -H $HOME:/home -B /gpfs/sharedfs1/admin/hpc2.0/apps/sas/9.4:/gpfs/sharedfs1/admin/hpc2.0/apps/sas/9.4 SAS.sif /gpfs/sharedfs1/admin/hpc2.0/apps/sas/9.4/SASFoundation/9.4/sas myprog.sas
Then submit the script:
Code Block
sbatch sasSP.sh
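Once the job is submitted, you can check its status and, after it finishes, review the log file. A minimal sketch; replace netidhere with your own NetID:
Code Block
squeue -u netidhere
cat outputfile.txt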
Multithreaded job
To submit a job that uses 10 computational threads on one node, create a submission script sasMP.sh:
Code Block
#!/bin/bash
#SBATCH --nodes=1
#SBATCH --ntasks=10
#SBATCH --output=outputfile.txt
#SBATCH --error=outputfile.txt
module load apptainer
apptainer exec --unsquash -H $HOME:/home -B /gpfs/sharedfs1/admin/hpc2.0/apps/sas/9.4:/gpfs/sharedfs1/admin/hpc2.0/apps/sas/9.4 SAS.sif /gpfs/sharedfs1/admin/hpc2.0/apps/sas/9.4/SASFoundation/9.4/sas myprog.sas
Then submit the script:
Code Block
sbatch sasMP.sh
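Note that allocating 10 tasks does not by itself make SAS use 10 threads. As a sketch, assuming SAS's CPUCOUNT system option (which limits the CPUs available to thread-enabled procedures), you could pass -cpucount on the SAS command line so it matches the allocation:
Code Block
# same command as in sasMP.sh, with the SAS thread cap matched to the 10 allocated cores
apptainer exec --unsquash -H $HOME:/home -B /gpfs/sharedfs1/admin/hpc2.0/apps/sas/9.4:/gpfs/sharedfs1/admin/hpc2.0/apps/sas/9.4 SAS.sif /gpfs/sharedfs1/admin/hpc2.0/apps/sas/9.4/SASFoundation/9.4/sas -cpucount 10 myprog.sas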
...
Multithreaded GPU job
To submit a job that uses 10 computational threads and one GPU on one node, create a submission script sasMPGPU.sh:
Code Block
#!/bin/bash
#SBATCH --nodes=1
#SBATCH --ntasks=10
#SBATCH --partition=general-gpu
#SBATCH --output=outputfile.txt
#SBATCH --error=outputfile.txt
#SBATCH --gres=gpu:1
module load apptainer
apptainer exec --nv --unsquash -H $HOME:/home -B /gpfs/sharedfs1/admin/hpc2.0/apps/sas/9.4:/gpfs/sharedfs1/admin/hpc2.0/apps/sas/9.4 SAS.sif /gpfs/sharedfs1/admin/hpc2.0/apps/sas/9.4/SASFoundation/9.4/sas myprog.sas
Then submit the script:
Code Block
sbatch sasMPGPU.sh
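If you want to confirm that the GPU is visible inside the container before running SAS, you can run nvidia-smi through the same image; the --nv flag binds the host's NVIDIA drivers and utilities into the container:
Code Block
module load apptainer
apptainer exec --nv SAS.sif nvidia-smi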
GUI/Interactive SAS use with SLURM
...
To run an interactive SAS session with GUI functionality, you should "ssh -XY" to the cluster from a Linux machine, a macOS machine with X11 enabled, or a Windows machine with X11 enabled. For more info, check out our guide to using X11 on the HPC here. Then, run the commands below.
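For example, from a Linux or macOS terminal (the hostname below is a placeholder; use the cluster's actual login address):
Code Block
ssh -XY netidhere@<cluster-login-address>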
To open an interactive SAS window with 10 cores available, you will want to do the following:
Code Block
[netidhere@login6 ~]$ srun --x11 -N 1 -n 10 --pty bash
srun: job 4261315 queued and waiting for resources
srun: job 4261315 has been allocated resources
[netidhere@cn528 ~]$ module load apptainer
[netidhere@cn528 ~]$ apptainer exec --unsquash -H $HOME:/home -B /gpfs/sharedfs1/admin/hpc2.0/apps/sas/9.4:/gpfs/sharedfs1/admin/hpc2.0/apps/sas/9.4 SAS.sif /gpfs/sharedfs1/admin/hpc2.0/apps/sas/9.4/SASFoundation/9.4/sas
SAS should open all of the needed windows.
The following screenshot shows an example of how SAS should look:
...
To browse for SAS files, use the File drop-down menu and navigate to the directory where your SAS files are located.
Note
Please DO NOT FORGET to EXIT from the node so that other users can use it. Exit out of all the SAS windows and then type exit in the terminal to end the interactive session.
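For example, after closing all SAS windows (prompts follow the session shown above):
Code Block
[netidhere@cn528 ~]$ exit
[netidhere@login6 ~]$
This returns you to the login node and releases the allocated cores.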
...