Accessing Datarmor documentation

Datarmor documentation, predominantly in French, can be accessed via the Portail Domicile by logging in with your extranet username and password and then selecting Documentation calculateur. The documentation is also available here when you are inside an Ifremer building.

Making a connection

If you are outside of an Ifremer building, then you will need to connect in two distinct steps:

  1. Activate the PulseSecure VPN using your Ifremer extranet account credentials (i.e., username and password)
  2. Start an SSH session to the cluster using your normal, non-extranet username and password

French instructions for both steps are provided at the documentation links mentioned above. For these steps, you will need to install the PulseSecure VPN client, which can be downloaded from Ifremer, and an SSH client. For Windows, the recommended SSH clients are PuTTY (simple, but effective) and MobaXterm (more modern and feature-rich, including FTP and SFTP capabilities for file transfers in addition to SSH).
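Once the VPN is up, the SSH step can be reduced to a single short command by adding an entry to your local ~/.ssh/config file. The values in angle brackets below are placeholders; use the login node address given in the Datarmor documentation:

```
# ~/.ssh/config -- <...> values are placeholders, not real addresses
Host datarmor
    HostName <datarmor-login-node>
    User <your-ifremer-username>
```

With this in place, `ssh datarmor` opens the session, and MobaXterm or PuTTY can be configured with the same details.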

Basic linux commands

| Command | Description | Example |
|---|---|---|
| `history` | Show previous commands | `history` |
| `mkdir` | Make a directory | `mkdir job-2021-10-12` |
| `ls` | List directory contents | `ls` (current directory); `ls -alh` (more details, including permissions and file sizes); `ls mydir` (a specific directory) |
| `cd` | Change directory | `cd` (to home directory); `cd test` (into directory named `test`); `cd ~` (`~` = home directory); `cd ..` (up one level); `cd .` (`.` = current directory, so do nothing) |
| `rm` | Remove a file | `rm test.txt` (a single file); `rm test*.txt` (all files starting with `test` and ending with `.txt`) |
| `cp` | Copy a file or directory | `cp test.txt newfile.txt`; `cp -r mydir mynewdir` (entire directory tree) |
| `pwd` | Print working directory | `pwd` |
| `cat` | Show the contents of a file in the terminal | `cat test.txt` |
| `more` | Show file contents one page at a time (use `q` to exit) | `more test.txt` |
| `echo` | Print text or variables to the command line | `echo "My home directory=${HOME}"` |
| `grep` | Search text files for a regular expression; useful for finding words in text files | `grep -i error *.txt` |
| `qsub`, `qstat`, `qdel` | Submit and manage jobs in the PBSPro queuing system | See below |
| `conda` | Manage conda environments (e.g., R or Python) | See the conda section below |
| `module` | Load additional functionality into the shell session | See the Datarmor documentation and the Ichthyop section below |
| `man`, `info` | Get help on commands | `man ls`; `info ls` |

For most commands, you can get basic assistance using -h or --help arguments (e.g., ls --help). There are also manuals for most commands that can be accessed with man and/or info (e.g., man ls or info ls).

Disk space on Datarmor

The details of disk space on Datarmor are available in the documentation, but briefly: most users have three spaces in which they can store data, listed below in order of increasing space and decreasing backup frequency/permanence:

| Space | Description |
|---|---|
| DATAHOME | Little space, but regularly backed up |
| DATAWORK | Decent amount of space and relatively regular backups |
| SCRATCH | Lots of space, fast, but files are regularly deleted and there are no backups |
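Whichever space you use, it is worth checking your usage periodically with standard Linux tools. The snippet below is a minimal sketch using a throwaway demo directory; on Datarmor you would point the same commands at $DATAWORK or $SCRATCH instead:

```shell
# Create a demo directory with a small file (stand-in for $DATAWORK/$SCRATCH)
demo=$(mktemp -d)
dd if=/dev/zero of="$demo/file.dat" bs=1024 count=10 2>/dev/null  # 10 KiB file

# du -sh: total size of a directory tree, human-readable
usage=$(du -sh "$demo" | cut -f1)
echo "Usage of $demo: $usage"

# df -h: free space on the filesystem containing the directory
df -h "$demo"
```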

Using the queuing system PBSPro / qsub

The best way to learn the details of how to use qsub is to consult the Datarmor documentation. Here I just provide examples of typical PBS scripts (the files submitted to qsub) and a few examples of how to use them.

The table below provides links to a set of standard PBS scripts:

| PBS script | Description |
|---|---|
| run_r_script_array.pbs | Run an R script on the cluster, either as a single job or as an array job |
| run_r_script_parallel.pbs | Run an R script using foreach and doParallel |
| run_ichthyop.pbs | Run an array of Ichthyop jobs, each defined by a specific configuration file |

These scripts should be modified before placing them on the cluster and using them, in particular to adjust the PBS options at the start of each file and to set any specific directories or files you want to use (e.g., the location of the Ichthyop .jar file).
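To give an idea of what such a file looks like, here is a minimal sketch of the general structure of a PBS script. The job name, resource values, and the final comment are placeholder assumptions to adapt, not the contents of the linked scripts:

```shell
#!/bin/bash
# Minimal sketch of a PBS script; the #PBS lines are directives read by
# PBSPro, and the values below are placeholders to adapt.
#PBS -N example-array-job
#PBS -l walltime=01:00:00
#PBS -J 1-10

# PBS_ARRAY_INDEX is set by PBSPro inside a real array job; default it to 1
# so the script can also be dry-run outside the queue.
: "${PBS_ARRAY_INDEX:=1}"

echo "Running array task ${PBS_ARRAY_INDEX}"
# A real script would do its work here, e.g. activate a conda environment
# and call Rscript on the file passed in via -v RSCRIPTNAME=...
```

Note that the `#PBS` directives are comments to the shell, so the same file can be run directly with bash when debugging.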

As an example of the use of these PBS scripts, to run a single R script contained in the file myscript.R, you can do:

qsub -v RSCRIPTNAME="myscript.R" run_r_script_array.pbs

To run the same script, but this time as an array of 10 jobs:

qsub -J 1-10 -v RSCRIPTNAME="myscript.R" run_r_script_array.pbs

To run a set of 11 Ichthyop configuration files (with names starting with config.xml_), one can do:

bash # Change to a bash shell for generating file list
qsub -J 0-10 -v ICHTHYOP_CONF_LIST="$(ls config.xml_* | paste -s -d ' ')" run_ichthyop.pbs
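Inside the PBS script, each array task then has to pick its own configuration file out of that space-separated list using its array index. The snippet below is a sketch of how this selection might work; the variable names mirror the qsub command above, but the selection logic is an assumption about run_ichthyop.pbs, not its actual contents:

```shell
# Stand-ins for what PBSPro and qsub -v provide inside a real job:
ICHTHYOP_CONF_LIST="config.xml_2000 config.xml_2001 config.xml_2002"
PBS_ARRAY_INDEX=1   # set automatically by PBSPro for each array task

# POSIX-portable indexing: load the list into the positional parameters,
# then shift to the wanted element (index 0 = first file, matching -J 0-10)
set -- $ICHTHYOP_CONF_LIST
shift "$PBS_ARRAY_INDEX"
conf="$1"
echo "This array task will run: $conf"
```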

To check on the status of your jobs, use:

qstat -u $USER

To kill a running job, use the qdel command with the job ID shown by qstat (e.g., qdel 123456).

Suggestions for avoiding common issues

  • For each job on the cluster of any importance whatsoever, always create a separate directory with a descriptive name and date, containing all necessary configuration files.
  • For debugging, make sure to print plenty of diagnostic messages and save all possible outputs. I often echo all shell and R commands so that a detailed record of every step is created. Adding set -x to PBS scripts can be very useful.
  • Interactive jobs can be very useful for debugging. You can start an interactive job using the following command: qsub -I -l walltime=01:00:00. Once started, you can walk line by line through a PBS script to identify issues.
  • Think carefully about the amount of disk space you are going to use and where you will store outputs to avoid issues like lost data.

Setting up R with packages using Conda

In my experience, the easiest way of ensuring that R works on the cluster with all the packages you need installed and functioning is to use conda. Using conda on the cluster involves two steps:

  1. Setting up conda in your account
  2. Creating a specific R conda environment

Setting up your account

It is easiest to make conda available in every shell (e.g., SSH) session by adding the appropriate source command to your shell startup files. Execute the following commands in a shell session:

echo "source /appli/anaconda/latest/etc/profile.d/conda.csh" >> ~/.cshrc
echo "source /appli/anaconda/latest/etc/profile.d/conda.sh" >> ~/.bashrc

You should also make the standard conda environments already on the cluster available by executing the following in a shell session:

cat <<EOF > ~/.condarc
envs_dirs:
  - $DATAWORK/conda-env
  - /appli/conda-env
  - /appli/conda-env/2.7
  - /appli/conda-env/3.6
pkgs_dirs:
  - $DATAWORK/conda/pkgs
EOF

Creating an R environment

Execute the following command in an SSH session to Datarmor:

conda create -c conda-forge -n my_r_env r-essentials r-base r-sf r-rnetcdf r-rgeos r-rgdal

The name my_r_env is arbitrary and can be changed to whatever you prefer. Everything from r-sf onward is optional, but I often use them in my own work. You can add to the end of this list any additional packages that you need for your specific work. See Conda Forge for the full list of available packages.

Once created, you can make R available in a shell session (including within PBS scripts) using:

conda activate my_r_env

Uploading and downloading files

File transfers to and from the cluster can be done using at least a couple of different methods:

  1. For small transfers, use an SFTP session to the login node
  2. For larger transfers, copy the files you want to transfer into $SCRATCH/eftp and then connect using FTP and your extranet credentials

For more instructions, see the documentation.

Installing Ichthyop on the cluster

The easiest way to install Ichthyop on the cluster is to install the version you intend to use on your own machine and then transfer the entire Ichthyop folder to datarmor using the transfer methods described above.

Once Ichthyop is on the cluster, you will need to load Java in each SSH session before using it:

module load java

This command can be added to the .cshrc and .bashrc files to ensure it is executed every time a shell (i.e., SSH) session starts.

To list the available modules, load a specific Java version, or see which modules are loaded, you could execute:

# List all available modules
module avail

# Load a specific Java version
module load java/1.8.0

# List loaded modules
module list