Welcome to the Storrs High-Performance Computing (HPC) facility! This page is your friendly starting point for launching your computational research here at UConn. With instructions on everything from connecting to the HPC to submitting jobs, our step-by-step guide is tailored to make your journey into high-performance computing as smooth as possible. We’ve also designed this page to be adaptable to meet the needs of both newcomers and experts.
Let's get started so you can take advantage of everything our HPC has to offer!
Step 1: Make an account
The first step to using the HPC is to request an account. If you don't already have an account, please fill out the cluster application form. It’s free!
Setting up your account can take a bit of time, depending on how busy the Storrs HPC Admins are. While you’re waiting for your request to be processed, consider reading through the General HPC Guidelines. We’ll email you when your account is all set up. Then you can proceed to Step 2.
Step 2: Are you connected to UConn-Secure Wifi or an on-campus ethernet port?
Yes
Since you are on campus, you do not need to use the UConn VPN. If you’re in a rush—or excited to get on the HPC ASAP!—feel free to move directly to Step 3. If you’re up for it though, it’s probably worth reading how to log into the HPC when off campus. To learn how, just click “No” to this question.
No
To keep everyone’s data safe and secure, we only allow users to access the HPC from computers/tablets/phones that are connected to the UConn-Secure Wifi or ethernet ports at UConn. But don’t worry! You can still log in to the HPC from home! There are just a couple of extra steps.
To connect to HPC from off campus, you will first need to connect to the UConn Virtual Private Network (VPN). The VPN makes sure you belong to the UConn community by having you log in with your netID and password. Once you connect to UConn’s VPN, you will be able to log in to the HPC.
Once you are connected to the UConn VPN, you can move to Step 3.
Step 3: What kind of computer do you have?
The instructions for logging into the HPC are slightly different depending on what kind of computer you have. Please select yours from the options below:
Windows/PC
To log in to the HPC from a Windows computer, you need to use an SSH client. The Storrs HPC Admins recommend MobaXterm. Here are some abridged instructions:
Download MobaXterm
Install MobaXterm
Open MobaXterm
Click Session (top left corner)
Click SSH
For “Remote Host,” enter the following address: hpc2.storrs.hpc.uconn.edu
Check the “Specify username” checkbox
In the text box to the right of “Specify username,” enter your UConn netID
Click “OK”
A new tab should pop up asking for your password
Enter your password
Success! You are logged into the cluster and ready to move on to Step 4.
Note: For optimal use of some programs, Windows users may also need to install VcXsrv.
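Prefer a command line? Recent versions of Windows 10 and 11 also include a built-in OpenSSH client, so you can connect from PowerShell or Command Prompt with the same command the Mac and Linux instructions use (this route does not provide the graphical X11 support that MobaXterm bundles). Replace the placeholder with your netID:
ssh Your_NetID@hpc2.storrs.hpc.uconn.edu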
Mac
To log in to the HPC from a Mac, you need to use an SSH client and an X-server. We recommend XQuartz as the X-server. Here are some abridged instructions for logging in.
Note: Terminal is an app that is installed by default on all Macs.
Below is the basic command for logging into the HPC from Terminal. Please replace “netid” with your actual netID (format: abc12345). Then hit “Enter.” It should ask for your password.
ssh -Y netid@hpc2.storrs.hpc.uconn.edu
When it asks for your password, type it in and hit Enter. (Nothing will appear on screen as you type; this is normal.)
Once connected, you should see a terminal prompt like this:
Welcome to the Storrs HPC cluster.
If you need technical support, please read the wiki or email us at hpc@uconn.edu
[netID@login4 ~]$
Once you are connected, you can move on to Step 4.
If the above instructions do not work, please see our troubleshooting instructions below.
Troubleshooting for Mac Users
Sometimes the default SSH settings for logging into the HPC from a Mac can get messed up, especially after major updates to the Mac operating system (macOS).
To resolve this problem, a couple of lines may need to be added to the user's ~/.ssh/config file. You will have to open and edit the file from “Terminal” using a text editor, like Vim. Vim is installed by default on all Macs. Here are some abridged instructions on how to edit the file using Vim.
Open Terminal
Open the file like this:
vi ~/.ssh/config
Scroll to the bottom of the file
Press the letter “i” to enter Vim’s insert mode
Copy and paste the two lines below into the file:
HostkeyAlgorithms +ssh-rsa
PubkeyAcceptedAlgorithms +ssh-rsa
Hit the “Esc” key.
Type :wq
Hit enter.
Once the above two lines are added to the local ~/.ssh/config on a Mac, you should be able to access the HPC using the normal instructions we mentioned above. If you are still experiencing issues, please feel free to reach out to the Storrs HPC Admins for help by emailing us at hpc@uconn.edu.
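If you would rather limit these settings to the cluster alone, a sketch like the one below also works; the Host stanza is our optional suggestion rather than part of the official instructions:
Host hpc2.storrs.hpc.uconn.edu
    HostkeyAlgorithms +ssh-rsa
    PubkeyAcceptedAlgorithms +ssh-rsa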
Linux
To log in to the HPC, aka “the cluster,” you need to use an SSH client. On Linux, open the terminal app and simply run:
ssh Your_NetID@hpc2.storrs.hpc.uconn.edu
(Where “Your_NetID” is your own NetID, consisting of 3 letters and 5 numbers.)
Once connected, you should see a terminal prompt like this:
Welcome to the Storrs HPC cluster.
If you need technical support, please read the wiki or email us at hpc@uconn.edu
[netID@login4 ~]$
If the above works, then you can move on to Step 4. If you are experiencing issues, please feel free to reach out to the Storrs HPC Admins for help by emailing us at hpc@uconn.edu.
Note: We also recommend setting up and using SSH keys, which improve security and allow you to log in without entering your password each time. This is not required, though.
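If you would like to try SSH keys, a minimal sketch of the standard OpenSSH workflow from your own computer looks like this (the ed25519 key type is simply our suggested default):
# Generate a key pair; accept the default location and optionally set a passphrase
ssh-keygen -t ed25519
# Copy the public key to the cluster; you will enter your password one last time
ssh-copy-id Your_NetID@hpc2.storrs.hpc.uconn.edu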
Step 4: Have you used the “Unix shell” before?
No…. What is this “shell” you speak of…?
Yes, but I’d love a refresher.
Yes, I use the Unix shell and command line regularly.
Step 5: Have you used an HPC before?
No
Yes
Step 6: Do the commands srun or fisbatch mean anything to you?
No / I am familiar with the commands but would like a refresher.
Yes, I have a long history of starting interactive jobs with various srun options on HPCs.
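For orientation, an interactive session on a SLURM cluster is typically started with srun along these lines; the resource amounts and time limit below are purely illustrative, not Storrs-specific recommendations:
# Ask SLURM for 1 task with 4 cores and 8 GB of memory for 1 hour, then open a shell on the compute node
srun --ntasks=1 --cpus-per-task=4 --mem=8G --time=01:00:00 --pty bash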
Step 7: Are you familiar with how batch jobs work on HPCs?
I’m a beginner
Yes, but I’d like a brief refresher
Yes, I have significant experience using sbatch to submit SLURM jobs
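As a quick reminder of the general shape, a minimal SLURM batch script and its submission look roughly like the sketch below; the job name, resource amounts, and program are placeholders to replace with your own, and no partition is specified, so the cluster’s default would apply:
#!/bin/bash
#SBATCH --job-name=my_test_job     # placeholder job name
#SBATCH --ntasks=1                 # one task
#SBATCH --cpus-per-task=4          # cores for that task
#SBATCH --mem=8G                   # memory for the job
#SBATCH --time=01:00:00            # wall-clock limit (HH:MM:SS)

module purge                       # start from a clean software environment
# module load <your_software>      # load whatever modules your program needs
./my_program                       # placeholder for your actual program
Save it as, say, my_script.sh and submit it with:
sbatch my_script.sh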
Step 8: Are you familiar with the modules system for loading software?
No
I would like a refresher
Yes
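In case a cheat sheet helps, the day-to-day module commands on most HPC systems look like this; the gcc module name is only an illustration, not necessarily what is installed here:
module avail          # list the software available on the cluster
module load gcc       # add a package to your environment (illustrative name)
module list           # show what you currently have loaded
module unload gcc     # remove a single module
module purge          # remove everything you have loaded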
Step 9: Summary of Resources Available on the Storrs HPC
Standard Cluster Nodes
There are two main classes of standard nodes (i.e., nodes without graphics cards, a.k.a. GPUs) available on the HPC cluster.
Name | Nodes | Available cores per node** | Total cores | RAM (GB) | Node names
Epyc64 | 41 | 62 | 2,542 | 503 | cn410-447, cn453-455
Epyc128 | 148 | 126 | 18,648 | 503 | cn456-603
**2 cores are reserved per node for the OS and storage processes.
GPU Cluster Nodes
There are a variety of GPU nodes available on the Storrs HPC as well. Most of our GPU nodes are Epyc64 nodes with between 1 and 3 GPUs.
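For reference, SLURM clusters generally let you request GPUs with a directive along the lines of the sketch below; check our partition documentation for the exact partition names and GPU options to use here:
#SBATCH --gres=gpu:1      # request one GPU for the job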
Partitions
Storrs HPC has several different “partitions,” groups of nodes with similar hardware and usage types. All users have access to general partitions, while additional access to priority nodes can be purchased. The HPC provides various computational setups (a.k.a. “architectures”) with different strengths too—such as high core count, ample RAM, or GPUs—allowing users to select the best hardware for their research.
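To see which partitions your account can submit to and the state of their nodes, the standard SLURM commands are:
sinfo                      # list partitions and node availability
sinfo -p <partition_name>  # details for a single partition (replace the placeholder)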
If you’re new to HPCs, also consider checking out our glossary where we explain the components of an HPC and the basics of how SLURM works.
HPC applications
We have created helpful software guides to demonstrate how to effectively use popular scientific applications on the HPC cluster.
Troubleshooting
If you have trouble using the HPC, please view our troubleshooting guide where we explain how to document errors and get help from Storrs HPC Admins (i.e., “submit a ticket”). Another good stop is our Frequently Asked Questions (FAQ) page where we address many common errors.