The Schrödinger Suite is a collection of software for chemical and biochemical research. It offers a range of tools for investigating the structures, reactivity, and properties of chemical systems. There is a campus site license for this software, supported by UITS; more information is available at http://software.uconn.edu/schrodinger/.

We currently recommend running Schrödinger through an interactive session, because of known issues with jobs submitted through batch submission scripts.

Start an interactive session:

srun --x11 -N 1 -n 126 -p general --constraint=epyc128 --pty bash

Make sure to include the "--x11" flag; without it, GUI programs such as Maestro cannot open windows.
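
X11 forwarding also has to be enabled on the SSH connection you logged in with. A minimal sketch, with the NetID and login hostname left as placeholders:

ssh -X <NetID>@<login-node>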

Load Modules

module load schrodinger/2023-3
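
If you are unsure which versions are installed, you can list them first:

module avail schrodinger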

You can then see a list of executable programs:

find /gpfs/sharedfs1/admin/hpc2.0/apps/schrodinger/2023-3/ -maxdepth 1 -executable -type f -printf "%f\n" | sort | pr -tT -8 | column -t
autots    elements  hppmap    knime     mxmd      phase_qs  qpld      ska
biolumin  epik      ifd       lambda_d  oned_scr  phase_sc  qsite     ssp
blast     epikx     ifd-md    licadmin  para_tes  pipeline  quick_sh  sta
bmin      fep_abso  impact    ligand_s  pfam      prime     run       structur
confgen   fep_plus  installa  ligprep   phase_bu  prime_mm  schrodin  testapp
confgenx  fep_solu  jaguar    machid    phase_da  primex    shape_sc  vsw
consensu  ffbuilde  jobcontr  macromod  phase_fi  qikfit    shape_sc  watermap
constant  generate  jsc       maestro   phase_fq  qikprop   shape_sc  wscore
covalent  gfxinfo   jws       material  phase_hy  qiksim    sitemap   xtb
desmond   glide
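
Note that pr -tT -8 truncates each name to fit eight columns (the same applies to the utilities listing below). To see the full program names, drop the column formatting:

find /gpfs/sharedfs1/admin/hpc2.0/apps/schrodinger/2023-3/ -maxdepth 1 -executable -type f -printf "%f\n" | sort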

You can also see a list of utilities with a similar find command:

find /gpfs/sharedfs1/admin/hpc2.0/apps/schrodinger/2023-3/utilities/ -maxdepth 1 -executable -type f -printf "%f\n" | sort | pr -tT -8 | column -t
2d_sketc  canvasHC  cg_chsr   create_w  jaguar_p  mtzprint  project_  structal
abs       canvasHC  check_jo  create_x  jaguar_t  multisim  project_  structca
align_bi  canvasKM  check_re  create_x  jaguar_t  neutrali  project_  structco
align_hy  canvasKP  check_th  create_x  jnanny    numfreqc  propfilt  structsh
align_li  canvasLC  ch_isost  custom_p  jobcontr  obabel    proplist  structsu
anharmon  canvasLi  ch_ligan  desalter  jresults  para_bmi  protassi  structur
apbs      canvas_m  ch_water  elim.sch  jserver   para_epi  py.test   stu_add
applyhtr  canvasMC  cluster_  epharmac  jserver_  para_lig  query_gp  stu_dele
autoqsar  canvasMD  combinat  extract_  lictest   path_fin  queue_bm  stu_exec
AutoTSRe  canvasML  combinat  feature_  licutil   pbs_lic_  randsub   stu_extr
AutoTSRe  canvasMo  combinat  feature_  ligand_i  pdbconve  refconve  stu_modi
AutoTSRe  canvasNn  combinat  ffld_ser  ligfilte  phase_al  render_k  stu_work
AutoTSUn  canvasPC  compare_  flex_ali  ligparse  phase_cl  r_group_  system_b
babel     canvasPC  configur  flexlm_s  lp_filte  phase_co  r_group_  tautomer
bandshap  canvasPC  conf_tem  fragment  lp_label  phase_co  ring_con  thermoch
buildloo  canvasPC  convert_  generate  lp_nored  phase_de  ring_tem  timestam
canvasBa  canvasPh  convert_  generate  macro_pk  phase_hy  rmsdcalc  uffmin
canvasCo  canvasPL  convert_  getpdb    maegears  phase_hy  rsync_pd  unique_n
canvasCS  canvasPr  convert_  glide_en  maetopqr  phase_hy  sdconver  uniquesm
canvasCS  canvasPW  corefind  glide_me  mae_to_s  phase_mm  secstruc  update_B
canvasCS  canvasRP  create_h  glide_so  make_lin  phase_pr  seqconve  update_P
canvasDB  canvasSc  create_h  guardian  make_r_l  phase_qs  serial_s  vis2gc
canvasFP  canvasSD  create_h  hetgrp_f  md5diges  phase_vo  shape_sc  visdump
canvasFP  canvasSe  create_h  hit_expa  merge_du  postmort  show_joi  watermap
canvasFP  canvasSO  create_i  impref    micro_pk  premin    smiles_t  wscore_m
canvasFP  canvasSO  create_m  ionizer   modify_s  prepwiza  spectrum  wscore_r
canvasFP  canvasTr  create_r  ionizer_  mol2conv  profile_  stereoiz  zip_temp
canvasHC  ccp42cns  create_s  jagconve  moldescr  profile_  store_re  ziputil

Example Application Usage

qsite

qsite -SAVE -PARALLEL 24 -HOST slurm-parallel-24 3IIS_Per1.in 
Launching JAGUAR under jobcontrol.
Exec: /gpfs/sharedfs1/admin/hpc2.0/apps/schrodinger/2022-4/jaguar-v11.8/bin/Linux-x86_64
JobId: job60-login5-1674022

Note that the numeric value of -PARALLEL should match the core count in the -HOST entry you specify (here, 24 for slurm-parallel-24).
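
For example, to run the same input on 48 cores instead (assuming a slurm-parallel-48 host entry, as used in the test suite below):

qsite -SAVE -PARALLEL 48 -HOST slurm-parallel-48 3IIS_Per1.in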

You can then view the status of your running job with sacct.

sacct
       JobID    JobName  Partition    Account  AllocCPUS      State ExitCode 
------------ ---------- ---------- ---------- ---------- ---------- -------- 
39148       j3IIS_Per1   hi-core   abc12345         24    RUNNING      0:0 
39148.0        hostname              abc12345         24  COMPLETED      0:0
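
sacct also accepts -j to restrict output to a single job and --format to choose columns:

sacct -j 39148 --format=JobID,JobName,Partition,AllocCPUS,State,Elapsed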

Run Test Suite

testapp -HOST slurm-parallel-24 -DEBUG
para_testapp -HOST slurm-parallel-48 -DEBUG

Installation Oddities

Schrödinger comes pre-packaged with an outdated MPI build (version < 1.8.1), so an old bug in the MPI-to-Slurm interface has to be patched manually by appending the following line to the default configuration file of the bundled MPI:

plm_slurm_args = --cpu_bind=boards
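
plm_slurm_args is an Open MPI MCA parameter, and Open MPI reads default parameters from etc/openmpi-mca-params.conf under its installation prefix. A sketch of the patch, with the bundled MPI's location left as a placeholder since it varies between Schrödinger releases:

# <mpi-prefix> is a placeholder for the Open MPI directory inside the Schrödinger installation
echo "plm_slurm_args = --cpu_bind=boards" >> <mpi-prefix>/etc/openmpi-mca-params.conf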

Command to call a Schrödinger utility

"${SCHRODINGER}/utilities/multisim" -JOBNAME desmond_md_job_TREK1model_1ms < restOfCommandOptions >

Example Submission Script CPU

#!/bin/bash
#SBATCH --partition=general                    # Name of Partition
#SBATCH --ntasks=126                           # Maximum CPU cores for job
#SBATCH --nodes=1                              # Ensure all cores are from the same node
#SBATCH --mem=492G                             # Request 492 GB of available RAM
#SBATCH --constraint='epyc128'                 # Request a 128-core AMD EPYC node for the job
#SBATCH --mail-type=END                        # Event(s) that triggers email notification (BEGIN,END,FAIL,ALL)
#SBATCH --mail-user=first.lastname@uconn.edu   # Destination email address

module load schrodinger/2023-3

<schrodinger program> <other options>
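
For example, the placeholder line could be a LigPrep run (the file names are hypothetical). Using -HOST localhost keeps the computation on the node Slurm has already allocated instead of resubmitting through Schrödinger's job control, and -WAIT keeps the script alive until the job finishes so the allocation is not released early:

ligprep -ismi ligands.smi -omae ligands_prepped.maegz -NJOBS 4 -HOST localhost -WAIT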

Example Submission Script GPU

#!/bin/bash
#SBATCH --partition=general-gpu                # Name of Partition
#SBATCH --ntasks=20                            # Maximum CPU cores for job
#SBATCH --nodes=1                              # Ensure all cores are from the same node
#SBATCH --mem=128G                             # Request 128 GB of available RAM
#SBATCH --gres=gpu:2                           # Request 2 GPU cards for the job
#SBATCH --mail-type=END                        # Event(s) that triggers email notification (BEGIN,END,FAIL,ALL)
#SBATCH --mail-user=first.lastname@uconn.edu   # Destination email address

module load schrodinger/2023-3

<schrodinger program> <other options>
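
Desmond molecular dynamics is the suite's main GPU-accelerated workload. Before launching it, you can confirm that the requested GPUs are visible to the job:

nvidia-smi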
