The Schrödinger Suite is a collection of software for chemical and biochemical use. It offers various tools that facilitate the investigation of the structures, reactivity, and properties of chemical systems. There is a campus site license for this software, supported by UITS. More information is available at http://software.uconn.edu/schrodinger/.

We currently recommend running Schrödinger through an interactive session, because of issues encountered when jobs are submitted through batch submission scripts.

Start an interactive session:

srun --x11 -N 1 -n 126 -p general --constraint=epyc128 --pty bash

Make sure to include the “--x11” flag if you intend to use the GUI (e.g., Maestro).
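Once the session starts, a quick way to confirm that X11 forwarding is active (a generic check, not Schrödinger-specific) is:

echo $DISPLAY

An empty result means X11 forwarding is not working and the GUI will not open.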

Load Modules

Once a node is assigned to the interactive srun job from the previous section, Schrödinger can be loaded from one of the various modules available on the HPC cluster.

module load schrodinger/2023-3
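The module should set the $SCHRODINGER environment variable, which the utility examples further down rely on; a quick sanity check is:

echo $SCHRODINGER

This should print the installation root, e.g. /gpfs/sharedfs1/admin/hpc2.0/apps/schrodinger/2023-3.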

You can then see a list of executable programs:

find /gpfs/sharedfs1/admin/hpc2.0/apps/schrodinger/2023-3/ -maxdepth 1 -executable -type f -printf "%f\n" | sort | pr -tT -8 | column -t
autots    elements  hppmap    knime     mxmd      phase_qs  qpld      ska
biolumin  epik      ifd       lambda_d  oned_scr  phase_sc  qsite     ssp
blast     epikx     ifd-md    licadmin  para_tes  pipeline  quick_sh  sta
bmin      fep_abso  impact    ligand_s  pfam      prime     run       structur
confgen   fep_plus  installa  ligprep   phase_bu  prime_mm  schrodin  testapp
confgenx  fep_solu  jaguar    machid    phase_da  primex    shape_sc  vsw
consensu  ffbuilde  jobcontr  macromod  phase_fi  qikfit    shape_sc  watermap
constant  generate  jsc       maestro   phase_fq  qikprop   shape_sc  wscore
covalent  gfxinfo   jws       material  phase_hy  qiksim    sitemap   xtb
desmond   glide

You can also see a list of utilities with the same find command, pointed at the utilities subdirectory:

find /gpfs/sharedfs1/admin/hpc2.0/apps/schrodinger/2023-3/utilities/ -maxdepth 1 -executable -type f -printf "%f\n" | sort | pr -tT -8 | column -t
2d_sketc  canvasHC  cg_chsr   create_w  jaguar_p  mtzprint  project_  structal
abs       canvasHC  check_jo  create_x  jaguar_t  multisim  project_  structca
align_bi  canvasKM  check_re  create_x  jaguar_t  neutrali  project_  structco
align_hy  canvasKP  check_th  create_x  jnanny    numfreqc  propfilt  structsh
align_li  canvasLC  ch_isost  custom_p  jobcontr  obabel    proplist  structsu
anharmon  canvasLi  ch_ligan  desalter  jresults  para_bmi  protassi  structur
apbs      canvas_m  ch_water  elim.sch  jserver   para_epi  py.test   stu_add
applyhtr  canvasMC  cluster_  epharmac  jserver_  para_lig  query_gp  stu_dele
autoqsar  canvasMD  combinat  extract_  lictest   path_fin  queue_bm  stu_exec
AutoTSRe  canvasML  combinat  feature_  licutil   pbs_lic_  randsub   stu_extr
AutoTSRe  canvasMo  combinat  feature_  ligand_i  pdbconve  refconve  stu_modi
AutoTSRe  canvasNn  combinat  ffld_ser  ligfilte  phase_al  render_k  stu_work
AutoTSUn  canvasPC  compare_  flex_ali  ligparse  phase_cl  r_group_  system_b
babel     canvasPC  configur  flexlm_s  lp_filte  phase_co  r_group_  tautomer
bandshap  canvasPC  conf_tem  fragment  lp_label  phase_co  ring_con  thermoch
buildloo  canvasPC  convert_  generate  lp_nored  phase_de  ring_tem  timestam
canvasBa  canvasPh  convert_  generate  macro_pk  phase_hy  rmsdcalc  uffmin
canvasCo  canvasPL  convert_  getpdb    maegears  phase_hy  rsync_pd  unique_n
canvasCS  canvasPr  convert_  glide_en  maetopqr  phase_hy  sdconver  uniquesm
canvasCS  canvasPW  corefind  glide_me  mae_to_s  phase_mm  secstruc  update_B
canvasCS  canvasRP  create_h  glide_so  make_lin  phase_pr  seqconve  update_P
canvasDB  canvasSc  create_h  guardian  make_r_l  phase_qs  serial_s  vis2gc
canvasFP  canvasSD  create_h  hetgrp_f  md5diges  phase_vo  shape_sc  visdump
canvasFP  canvasSe  create_h  hit_expa  merge_du  postmort  show_joi  watermap
canvasFP  canvasSO  create_i  impref    micro_pk  premin    smiles_t  wscore_m
canvasFP  canvasSO  create_m  ionizer   modify_s  prepwiza  spectrum  wscore_r
canvasFP  canvasTr  create_r  ionizer_  mol2conv  profile_  stereoiz  zip_temp
canvasHC  ccp42cns  create_s  jagconve  moldescr  profile_  store_re  ziputil

Example Application Usage

qsite

qsite -SAVE -PARALLEL 24 3IIS_Per1.in 

Note that the numeric value of -PARALLEL should match the value of the -n flag that you specified in the earlier srun command.
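As a sketch of a matched pair (reusing the input file above; the core count of 24 is illustrative):

srun --x11 -N 1 -n 24 -p general --pty bash
module load schrodinger/2023-3
qsite -SAVE -PARALLEL 24 3IIS_Per1.in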

Jaguar

jaguar run nameofinput.in
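Schrödinger's common job-control options can be added here as well; for example, -WAIT keeps the command in the foreground until the job finishes (availability of individual options can vary between releases, so treat this as a sketch):

jaguar run -WAIT nameofinput.in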

There is also a way to target a specific Schrodinger application or utility directly through the ${SCHRODINGER} path; see “Command to call a Schrodinger utility” below for the syntax.

You can then view the status of your running job with sacct.

sacct
       JobID    JobName  Partition    Account  AllocCPUS      State ExitCode 
------------ ---------- ---------- ---------- ---------- ---------- -------- 
39148       j3IIS_Per1   hi-core   abc12345         24    RUNNING      0:0 
39148.0        hostname              abc12345         24  COMPLETED      0:0
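sacct also accepts a job ID and a custom field list for a narrower view (standard SLURM usage, not Schrödinger-specific):

sacct -j 39148 --format=JobID,JobName,Partition,State,Elapsed,ExitCode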

Run Test Suite

testapp -DEBUG
para_testapp -DEBUG

Installation Oddities

Schrödinger comes pre-packaged with an outdated version of MPI (< 1.8.1), which means an old bug in the MPI-to-SLURM interface needs to be patched manually by appending the following line to the default config file of Schrödinger's bundled MPI:

plm_slurm_args = --cpu_bind=boards
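The location of that config file varies by release, so locate it inside the install tree first; the openmpi-mca-params.conf filename below follows the Open MPI convention for MCA defaults and is an assumption, not a verified path:

find $SCHRODINGER -name openmpi-mca-params.conf 2>/dev/null
echo 'plm_slurm_args = --cpu_bind=boards' >> <path found above>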

Command to call a Schrodinger utility

"${SCHRODINGER}/utilities/multisim" -JOBNAME desmond_md_job_TREK1model_1ms < restOfCommandOptions >

Launching and disconnecting from an interactive fisbatch Schrodinger job

Schrodinger can be run interactively through srun or fisbatch.

The srun approach above works well for a single interactive calculation that can be left up and running without any disconnections.

If there is a network or power interruption while the interactive Schrödinger srun job is running, the srun job will end and progress will be lost.

An alternative that avoids losing an interactive SLURM job to network/power interruptions is to submit an interactive fisbatch job to HPC.

Fisbatch is older and does have some bugs.

Fisbatch allocates a compute node to the job session, which allows users to run a calculation interactively through a screen session launched on the assigned compute node.

Users can also disconnect from the fisbatch job, and reattach to the job to track the progress of various calculations.

Here is an example that allocates an AMD EPYC compute node with 126 cores through fisbatch on the general partition:

fisbatch -N 1 -n 126 -p general --constraint='epyc128'

FISBATCH -- waiting for JOBID jobidhere to start on cluster=slurm and partition=general
.........................!
Warning: Permanently added 'cXX,137.99.x.x' (ECDSA) to the list of known hosts.
FISBATCH -- Connecting to head node (cnXX)

Once a compute node is assigned and the fisbatch job is running, Schrödinger can be loaded normally through the module:

module load schrodinger/2023-3

Once the module is loaded, the Schrödinger commands become available and calculations can be launched through any of the suite's programs.
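For example, the qsite calculation from earlier could be launched inside the fisbatch session, with -PARALLEL matching the 126 cores requested above (a sketch reusing the earlier input file):

qsite -SAVE -PARALLEL 126 3IIS_Per1.in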

To disconnect from a fisbatch job, enter the following keystrokes:

“Ctrl-a then Ctrl-d”

The screen session that fisbatch spawned on the compute node will detach, and the fisbatch job will continue running.

To confirm that the job is still running, the following SLURM command can be entered:

shist
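shist is a site-provided wrapper rather than a core SLURM command; if it is unavailable, the standard squeue command confirms the same thing:

squeue -u $USER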

To reattach to the fisbatch job, the following command can be entered:

fisattach jobidhere

The fisbatch screen session for the specific job should reattach, and the Schrödinger calculation should still be running.

If a network/power interruption happens while you are attached to a fisbatch job, the attached session will end. Such interruptions will not affect a job that is detached and running, unless the assigned compute node itself runs into hardware or network problems.
