The Schrödinger Suite is a collection of software for chemical and biochemical modeling. It offers various tools for investigating the structures, reactivity, and properties of chemical systems. UConn has a campus site license for this software, supported by UITS. More information is available at http://software.uconn.edu/schrodinger/.

Info

It is currently recommended to run Schrodinger through an interactive session because of issues encountered when submitting jobs through batch submission scripts.

Start an interactive session:

Code Block
srun --x11 -N 1 -n 126 -p general --constraint=epyc128 --pty bash
Info

Make sure to include the “--x11” flag if you plan to use a GUI such as Maestro.
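Once the Schrodinger module is loaded (next section), a GUI can be launched from the interactive session over X11; for example, Maestro:

Code Block
maestro &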

Load Modules

Once a node is assigned to the interactive srun job from the previous section, Schrodinger can be loaded from one of the modules available on the HPC cluster.

Code Block
module load schrodinger/2023-3
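Other Schrodinger versions may also be installed; you can list them through the module system:

Code Block
module avail schrodinger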

You can then see a list of executable programs:

Code Block
find /gpfs/sharedfs1/admin/hpc2.0/apps/schrodinger/2023-3/ -maxdepth 1 -executable -type f -printf "%f\n" | sort | pr -tT -8 | column -t
autots    biolumin  blast     bmin      confgen   confgenx  consensu  constant
covalent  desmond   elements  epik      epikx     fep_abso  fep_plus  fep_solu
ffbuilde  generate  gfxinfo   glide     hppmap    ifd       ifd-md    impact
installa  jaguar    jobcontr  jsc       jws       knime     lambda_d  licadmin
ligand_s  ligprep   machid    macromod  maestro   material  mxmd      oned_scr
para_tes  pfam      phase_bu  phase_da  phase_fi  phase_fq  phase_hy  phase_qs
phase_sc  pipeline  prime     prime_mm  primex    qikfit    qikprop   qiksim
qpld      qsite     quick_sh  run       schrodin  shape_sc  sitemap   ska
ssp       sta       structur  testapp   vsw       watermap  wscore    xtb
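Most of these programs print brief usage information when run with -h; for example:

Code Block
glide -h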

You can also list the utilities with a similar find command pointed at the utilities directory:

Code Block
find /gpfs/sharedfs1/admin/hpc2.0/apps/schrodinger/2023-3/utilities/ -maxdepth 1 -executable -type f -printf "%f\n" | sort | pr -tT -8 | column -t
(output truncated: a multi-column list of roughly 200 utilities, with names truncated to eight characters, including 2d_sketc, align_hy, autoqsar, the canvas* family, getpdb, hetgrp_f, ionizer, jobcontr, ligfilte, ligparse, prepwiza, propfilt, rmsdcalc, sdconver, seqconve, stereoiz, the struct* and stu_* families, watermap, and ziputil)

Example Application Usage

...

Code Block
qsite -SAVE -PARALLEL 24 -HOST slurm-parallel-24 3IIS_Per1.in 
Launching JAGUAR under jobcontrol.
Exec: /gpfs/sharedfs1/admin/hpc2.0/apps/schrodinger/2022-4/jaguar-v11.8/bin/Linux-x86_64
JobId: job60-login5-1674022

Note that the numeric value of -PARALLEL should match the number in the -HOST entry (here, slurm-parallel-24) and must not exceed the core count (-n) requested in the earlier srun command.
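For example, a consistent pairing of the interactive allocation and the launch command (a sketch reusing the qsite input above) would be:

Code Block
# request 24 cores interactively, then launch a 24-way parallel job
srun --x11 -N 1 -n 24 -p general --pty bash
qsite -SAVE -PARALLEL 24 -HOST slurm-parallel-24 3IIS_Per1.in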

Jaguar

Code Block
jaguar run nameofinput.in

You can also target a specific Schrodinger application or utility directly through the installation root that the $SCHRODINGER environment variable points to.
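A sketch of the general pattern (the program and utility names in the comments are illustrative; the multisim command under Installation Oddities below uses the same form):

Code Block
"${SCHRODINGER}/<program>" [options]             # e.g. "${SCHRODINGER}/glide"
"${SCHRODINGER}/utilities/<utility>" [options]   # e.g. "${SCHRODINGER}/utilities/ligprep"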

You can then view the status of your running job with sacct.
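For example (the --format field list is optional):

Code Block
sacct -j jobidhere --format=JobID,JobName,Partition,State,Elapsed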

...

Run Test Suite

Code Block
testapp -HOST slurm-parallel-24 -DEBUG
para_testapp -HOST slurm-parallel-48 -DEBUG

Installation Oddities

...

Code Block
"${SCHRODINGER}/utilities/multisim" -JOBNAME desmond_md_job_TREK1model_1ms < restOfCommandOptions >

Example Submission Script CPU

Code Block
#!/bin/bash
#SBATCH --partition=general                    # Name of Partition
#SBATCH --ntasks=126                           # Maximum CPU cores for job
#SBATCH --nodes=1                              # Ensure all cores are from the same node
#SBATCH --constraint=epyc128                   # Request AMD EPYC node for the job
#SBATCH --mail-type=END                        # Event(s) that triggers email notification (BEGIN,END,FAIL,ALL)
#SBATCH --mail-user=first.lastname@uconn.edu   # Destination email address

module load schrodinger/2022-4

host=`srun hostname|head -1`
nproc=`srun hostname|wc -l`
<schrodinger program> -HOST ${host}:${nproc} <other options>

Example Submission Script GPU

Code Block
#!/bin/bash
#SBATCH --partition=general-gpu                # Name of Partition
#SBATCH --ntasks=20                            # Maximum CPU cores for job
#SBATCH --nodes=1                              # Ensure all cores are from the same node
#SBATCH --mem=128G                             # Request 128 GB of available RAM
#SBATCH --gres=gpu:2                           # Request 2 GPU cards for the job
#SBATCH --mail-type=END                        # Event(s) that triggers email notification (BEGIN,END,FAIL,ALL)
#SBATCH --mail-user=first.lastname@uconn.edu   # Destination email address

module load schrodinger/2022-4

host=`srun hostname|head -1`
nproc=`srun hostname|wc -l`
<schrodinger program> -HOST ${host}:${nproc} <other options>

Launching and disconnecting from an interactive fisbatch Schrodinger job

Schrodinger can be run interactively through srun or fisbatch.

The srun approach above works well for a single interactive calculation that can be left up and running without any disconnections. If the network or power is interrupted while an interactive Schrodinger srun job is running, the srun job ends and all progress is lost.

An alternative that avoids losing work to network/power interruptions is to submit an interactive fisbatch job to HPC. Fisbatch is older software and does have some bugs. It allocates a compute node to the job session, which lets users run calculations interactively through a screen session launched on the assigned compute node. Users can also disconnect from the fisbatch job and reattach later to track the progress of their calculations.

Here is an example that allocates an AMD EPYC compute node with 126 cores through fisbatch under the general partition:

Code Block
fisbatch -N 1 -n 126 -p general --constraint='epyc128'

FISBATCH -- waiting for JOBID jobidhere to start on cluster=slurm and partition=general
.........................!
Warning: Permanently added 'cXX,137.99.x.x' (ECDSA) to the list of known hosts.
FISBATCH -- Connecting to head node (cnXX)



Once a compute node is assigned and the fisbatch job is running, Schrodinger can be loaded normally through the module system.

Code Block
module load schrodinger/2023-3

Once the module is loaded, the Schrodinger commands become available and calculations can be launched from any of the suite's programs.
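For example, a Glide docking job could be launched on the allocated node as follows (a sketch; dock_job.in is a placeholder input file, and 126 matches the core count requested above):

Code Block
glide -WAIT -HOST localhost:126 dock_job.in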

To disconnect from a fisbatch job, enter the following keystrokes:

“Ctrl-a then Ctrl-d”

The screen session that fisbatch spawns on the compute node should detach and the fisbatch job will continue running.

To confirm that the job is still running, the following SLURM command can be entered:

Code Block
shist
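The standard SLURM equivalents also work:

Code Block
squeue -u $USER      # list your running and pending jobs
sacct -j jobidhere   # accounting record for a specific job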

To reattach to the fisbatch job, the following command can be entered:

Code Block
fisattach jobidhere

The fisbatch screen session for the specified job should reattach, and the Schrodinger calculation should still be running.

If a network/power interruption happens while you are attached to a fisbatch job, the session could potentially end. Such interruptions will not affect a detached, running job unless the assigned node itself runs into hardware or network problems.