...
When the job completes, output will be written to helloworld.Rout.
MPI
*If anybody has been using R 3.1.1 with Rmpi and knows what version works with it, please let us know.
If you prefer to use one of the other MPI implementations compatible with Rmpi, such as MPICH, feel free to install it locally. This was how OpenMPI was installed in a session of R started with fisbatch (adjust the paths and version numbers to match your setup):
...
The Rmpi package has to be installed to work with MPI in R. In addition, you have to either load the module of a specific MPI implementation or install one locally. An example of how to install Rmpi using the module openmpi/gcc/64/1.10.7 can be found below. Note that the package snow has to be installed as well.
Code Block |
---|
module load r/3.4.1
module load openmpi/gcc/64/1.10.1-gcc
R

.libPaths("~/rlibs")   # assuming you are installing your
                       # packages at the ~/rlibs folder
install.packages("Rmpi", lib = "~/rlibs",
                 configure.args = '--with-Rmpi-include=/apps2/openmpi/1.10.1-gcc/include --with-Rmpi-libpath=/apps2/openmpi/1.10.1-gcc/lib --with-Rmpi-type=OPENMPI')
# When prompted for the mirror, try TX (i.e. 121 at the time of writing) since some mirrors are problematic |
Note |
---|
This part is to be updated. |
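For reference, installing Rmpi against MPICH (mentioned above) follows the same pattern. The sketch below is only illustrative: the include and library paths are placeholders to be replaced with the location of your MPICH installation, and the --with-Rmpi-type value has to match your MPICH flavour (for example MPICH2).
Code Block |
---|
# Hypothetical MPICH paths; replace /path/to/mpich with your local installation
.libPaths("~/rlibs")
install.packages("Rmpi", lib = "~/rlibs", repo = "https://cloud.r-project.org/",
                 configure.args = paste("--with-Rmpi-include=/path/to/mpich/include",
                                        "--with-Rmpi-libpath=/path/to/mpich/lib",
                                        "--with-Rmpi-type=MPICH2")) |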
...
= "~/rlibs", repo = "https://cloud.r-project.org/",
configure.args = "--with-mpi=/cm/shared/apps/openmpi/gcc/64/1.10.7")
install.packages("snow", lib = "~/rlibs", repo = "https://cloud.r-project.org/") |
To submit an MPI Slurm job, we created the submit-mpi.slurm file (see the code below). It is important to load the module associated with the MPI implementation you used to install Rmpi.
Code Block |
---|
#!/bin/bash
#SBATCH -p general
#SBATCH -n 30

source /etc/profile.d/modules.sh
module purge
module load r/4.2.1 mpi/openmpi/gcc/64/1.10.7

# If MPI tells you that forking is bad, uncomment the line below
# export OMPI_MCA_mpi_warn_on_fork=0

Rscript mpi.R |
...
Code Block |
---|
library(parallel)
.libPaths("~/rlibs")
hello_world <- function() {
  ## Print the hostname and MPI worker rank.
  paste(Sys.info()["nodename"], Rmpi::mpi.comm.rank(), sep = ":")
}
cl <- makeCluster(as.numeric(Sys.getenv("SLURM_NTASKS")), type = "MPI")
clusterCall(cl, hello_world)
stopCluster(cl) |
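The cluster object returned by makeCluster works with the usual functions from the parallel package, such as parLapply, so the same pattern scales from the hello-world test to real work. A minimal sketch (the slow_square function and the 1:100 workload are made-up examples):
Code Block |
---|
library(parallel)
.libPaths("~/rlibs")

# Start an MPI cluster sized from the Slurm allocation, as in mpi.R above
cl <- makeCluster(as.numeric(Sys.getenv("SLURM_NTASKS")), type = "MPI")

# Toy workload: each element of 1:100 is squared on whichever worker is free
slow_square <- function(x) {
  Sys.sleep(0.1)  # stand-in for a real computation
  x^2
}
results <- parLapply(cl, 1:100, slow_square)
print(sum(unlist(results)))

stopCluster(cl) |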
...
Read R's built-in "parallel" package documentation for tips on parallel programming in R: https://stat.ethz.ch/R-manual/R-devel/library/parallel/doc/parallel.pdf

Each version of R may depend on a different version of MPI; the known dependencies as of Thu Jun 22 13:30:50 EDT 2017 are listed below: