Scripts for connecting VS Code to a non-interactive HPC compute node managed by the SLURM workload manager.
This repo has been forked from vscode-remote-hpc and (in parts) heavily modified: the server-side installation requires administrative access to the cluster head node(s), while the client-side installation supports macOS, Linux, and Windows (PowerShell) and does not require special privileges.
This script is designed to be used with the Remote - SSH extension for Visual Studio Code.
- Automatically starts a batch job, or reuses an existing one, for VS Code to connect to.
- No need to manually execute the script on the HPC: just connect from the Remote Explorer and the script handles everything automagically using an ssh `ProxyCommand` (see the illustrative sketch below).
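The client setup script writes this `ProxyCommand`-based host entry into your ssh configuration for you, so none of it has to be created by hand. Purely to illustrate the mechanism, a minimal sketch of such an entry might look roughly like the block below; the host alias and key name follow this README, but the key location, login-node name, and the exact remote command are placeholders, and the block the setup script actually generates will differ:

```
# Illustrative sketch only -- the block generated by the client setup script
# will differ in its details (login node, remote command, extra options).
Host vscode-remote-hpc
    IdentityFile ~/.ssh/vscode-remote-hpc
    # The ProxyCommand logs into the cluster head node, where the server-side
    # vscode-remote tooling starts (or reuses) the SLURM job and then tunnels
    # the connection through to sshd on the allocated compute node.
    ProxyCommand ssh your-login-node "<command that starts the job and connects to it>"
```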
Open PowerShell and run the following command:
irm https://raw.githubusercontent.com/esi-neuroscience/vscode-remote-hpc/refs/heads/main/client/setup.ps1 | iex
Open a terminal (Terminal.app on macOS) and run the following command:
curl -fsSL https://raw.githubusercontent.com/esi-neuroscience/vscode-remote-hpc/refs/heads/main/client/setup.sh | bash
The `vscode-remote-hpc` host is now available in the VS Code Remote Explorer.
Connecting to this host will automatically launch an `sbatch` job on an HPC compute node, wait for it to start, and connect to the node as soon as the job is running. Thus, VS Code remote HPC sessions can be controlled entirely from within VS Code itself.
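Because all of the job handling happens inside the ssh `ProxyCommand`, the same host also works from a plain terminal, which is a convenient way to sanity-check the setup before opening VS Code (this assumes the client setup finished successfully):

```bash
# Starts (or reuses) the SLURM job and prints the hostname of the compute node
# the job is running on.
ssh vscode-remote-hpc hostname
```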
Running jobs are automatically reused. If a running job is found, the script simply connects to it. You can safely open many remote windows and they will all share the same SLURM job.
Note that disconnecting the remote session in VS Code will not kill the corresponding SLURM job: if you close the remote window, the SLURM job keeps running. Jobs are automatically killed by the SLURM controller once they reach their runtime limit. You can kill a job manually by logging on to the cluster head node and running `vscode-remote cancel`, or by using `squeue --me` to find the right SLURM job ID followed by `scancel <jobid>`.
The `vscode-remote` command installed on your HPC offers some additional commands to list or cancel running jobs. You can invoke `vscode-remote help` for more information.
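For a quick reference, a typical management session on the cluster head node might look like the following; `cancel` and `help` are the subcommands mentioned in this README, so check `vscode-remote help` for the full list available on your cluster:

```bash
vscode-remote help     # list the available subcommands
squeue --me            # show your running SLURM jobs (including the VS Code job)
vscode-remote cancel   # cancel the running VS Code job
scancel <jobid>        # alternative: cancel a specific job by its SLURM job ID
```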
You can customize the SLURM job for the `vscode-remote-hpc` host by manually running `vscode-remote start <OPTIONS>` on your cluster head node. A custom partition can be provided via `vscode-remote start <PARTITIONNAME>`. For instance, on the ESI HPC cluster you could use `vscode-remote start 32GBL` to start a `vscode-remote-hpc` host in the 32GBL partition (instead of the 8GBL default).
More `sbatch` options can be provided using their respective `sbatch` command flags, e.g., `vscode-remote start mypartition -c 3 --mem-per-cpu 6000`.
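Any flag accepted by `sbatch` should be usable here in the same way. As a purely illustrative example (the partition name and resource values are placeholders, not recommendations for any particular cluster):

```bash
# Request 4 CPUs, 16 GB of memory, and an 8-hour runtime limit in a custom partition.
vscode-remote start mypartition -c 4 --mem 16G --time 08:00:00
```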
To remove `vscode-remote-hpc`, either manually delete the "vscode-remote-hpc" config block from your ssh configuration file and remove the generated ssh key-pair (`vscode-remote-hpc` + `vscode-remote-hpc.pub`), or run the respective setup command again:
On Windows (PowerShell):
irm https://raw.githubusercontent.com/esi-neuroscience/vscode-remote-hpc/refs/heads/main/client/setup.ps1 | iex
On macOS and Linux:
curl -fsSL https://raw.githubusercontent.com/esi-neuroscience/vscode-remote-hpc/refs/heads/main/client/setup.sh | bash
The following applications must be executable for non-privileged users on all compute nodes:
- `sshd` installed in `/usr/sbin` or available in the `$PATH`
- `nc` (netcat) must be available on the login node(s)
- compute node names must resolve to their internal IP addresses
- compute nodes must be accessible via IP from the login node
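One way to sanity-check these requirements is sketched below: run the first two commands on a compute node and the remaining ones on a login node (the node name is a placeholder):

```bash
# On a compute node: sshd must live in /usr/sbin or be found on the $PATH ...
test -x /usr/sbin/sshd || command -v sshd
# ... and must be executable by non-privileged users
ls -l /usr/sbin/sshd

# On a login node: netcat must be available ...
command -v nc
# ... and compute node names must resolve to their internal IP addresses
getent hosts nodename01
```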
The client-side setup expects `vscode-remote` as well as `vscode-remote-job.sh` to reside in `/usr/local/bin`. The recommended way to set this up is to clone this repository and use symlinks (so that future updates can be deployed with a simple `git pull`):
cd /path/to/cluster-fs/
git clone https://github.com/pantaray/vscode-remote-hpc.git
ln -s /path/to/cluster-fs/vscode-remote-hpc/server/vscode-remote.sh /usr/local/bin/vscode-remote
ln -s /path/to/cluster-fs/vscode-remote-hpc/server/vscode-remote-job.sh /usr/local/bin/vscode-remote-job.sh
Ensure that both scripts can be executed by non-privileged users.
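One way to satisfy this is to make the scripts in the cloned repository world-readable and world-executable (the clone path is the same placeholder used above):

```bash
# Grant read and execute permissions to all users on both server-side scripts.
chmod a+rx /path/to/cluster-fs/vscode-remote-hpc/server/vscode-remote.sh
chmod a+rx /path/to/cluster-fs/vscode-remote-hpc/server/vscode-remote-job.sh
```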