CC-IN2P3 Centre de Calcul de l'Institut National de Physique Nucléaire et de Physique des Particules

Environments & Singularity @CC-IN2P3

Plan
  • Singularity
    • What are containers?
    • What is Singularity?
    • How to use Singularity?
  • Singularity @CC-IN2P3
    • Images storage
    • Local configuration
    • Job submission on the batch
    • Job submission on the GPUs
Singularity

What are containers

What are containers?

Virtualisation vs. Containers

Virtualisation
Singularity

Description of the tool

Docker vs. Singularity

Docker

Micro-services

  • Requires a root daemon
  • Abstraction layer
  • Docker images

Singularity

Scientific computing tasks

  • No daemon
  • GPU support
  • Docker & Singularity images
Singularity

What is Singularity?

  • First developed in Python by Gregory Kurtzer at the Lawrence Berkeley National Laboratory
  • Fully rewritten in Go by the end of 2018 by Sylabs, which currently maintains it
  • Latest release: 3.4.1 (2019-09-23)
  • Documentation
Singularity

Download a container

Singularity: Download a container

Let's start!

Follow these steps:


# Connect to the interactive machine using SSH
$ ssh -l user cca.in2p3.fr

# Activate the Singularity 3.3.0 environment
$ ccenv singularity 3.3.0
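
To check that the right version is active, you can query it (output illustrative):

# Confirm the activated Singularity version
$ singularity --version
singularity version 3.3.0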
        
Singularity: Download a container

Pull

From Singularity Hub


$ singularity pull shub://vsoch/hello-world
        

From Docker Hub


$ singularity pull docker://godlovedc/lolcow
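
By default the image is saved as <name>_<tag>.sif. You can also choose the filename yourself; a sketch (the filename is just an example):

# Save the pulled image under a custom filename
$ singularity pull my-lolcow.sif docker://godlovedc/lolcow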
        
Singularity

How-to

Singularity: Using a container

Shell (interactive mode)


$ singularity shell hello-world_latest.sif
Singularity: Invoking an interactive shell within container...

Singularity hello-world_latest.sif:~> cat /etc/issue
Ubuntu 14.04.6 LTS \n \l
        

To quit: Ctrl-D or exit

Singularity: Using a container

Exec


$ cat /etc/redhat-release 
CentOS Linux release 7.7.1908 (Core)

$ singularity exec hello-world_latest.sif cat /etc/issue
Ubuntu 14.04.6 LTS

$ singularity exec hello-world_latest.sif ls /
anaconda-post.log  etc   lib64       mnt   root  singularity  tmp
bin        home  lost+found  opt   run   srv          usr
dev        lib   media       proc  sbin  sys          var
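
exec runs a single command; to use shell features such as pipes inside the container, wrap them in sh -c, e.g.:

# The whole pipeline runs inside the container
$ singularity exec hello-world_latest.sif sh -c 'ls / | wc -l'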
        
Singularity

Users and permissions

Singularity: Users and permissions

  user1@hst:~$ singularity shell hello-world_latest.sif
  Singularity hello-world_latest.sif:~> whoami
  user1
  Singularity hello-world_latest.sif:~> id
  uid=1000(user1) gid=1000(user1) groups=1000(user1),4(adm),[…]
        

→ The container is instantiated with the permissions of the user spawning it.
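
One consequence, sketched below (file name and listing output hypothetical): files created from inside the container belong to you on the host.

# Create a file from inside the container, then inspect it on the host
$ singularity exec hello-world_latest.sif touch ~/created-inside.txt
$ ls -l ~/created-inside.txt
-rw-r--r-- 1 user1 user1 0 Sep 23 10:00 /home/user1/created-inside.txt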

Singularity

Mount points

Singularity: Mount points

Default mount points

  • /home/${USER}
  • /tmp
  • /proc, /sys, /dev

user1@hst:~$ singularity shell hello-world_latest.sif
Singularity hello-world_latest.sif:~> pwd
/home/user1

test@hst:/home/user1$ singularity shell hello-world_latest.sif
Singularity hello-world_latest.sif:~> pwd
/home/test
Singularity hello-world_latest.sif:~> ls /home/user1
ls: cannot access /home/user1: No such file or directory
        
Singularity: Mount points

Mounting other directories


Requires the --bind (or -B) option:
→ -B src:dst


$ ls ~/tests
test.py

$ singularity shell -B ~/tests:/mnt hello-world_latest.sif
Singularity hello-world_latest.sif:~> ls /mnt
test.py
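
Options can be appended after the destination, for instance to make the mount read-only (sketch):

# Append :ro to make the bind mount read-only inside the container
$ singularity shell -B ~/tests:/mnt:ro hello-world_latest.sif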
        
Singularity: Mount points

Mounting other directories

No need to specify dst if src == dst:


$ cat /mnt/test.txt
This is a test

$ singularity shell -B /mnt hello-world_latest.sif
Singularity hello-world_latest.sif:~> cat /mnt/test.txt
This is a test
        

Mounting several directories at once:


$ singularity shell -B /mnt,~/tests:/mnt1 hello-world_latest.sif
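
The same binds can also be passed via the SINGULARITY_BINDPATH environment variable (same comma-separated src[:dst] syntax):

# Equivalent bind specification via environment variable
$ export SINGULARITY_BINDPATH="/mnt,$HOME/tests:/mnt1"
$ singularity shell hello-world_latest.sif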
        
Singularity: Mount points

Exercise

  • Use Singularity 3.3.0-rc.1 (no default mount points, no underlay)
  • Mount point: /mnt

$ ccenv singularity 3.3.0-rc.1
$ cd ~
$ echo "It's alive!" > test.txt
$ singularity exec -B [...] hello-world_latest.sif cat [...]/test.txt
It's alive!
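
One possible solution (among others), binding your HOME onto /mnt:

# Bind $HOME onto /mnt inside the container, then read the file from there
$ singularity exec -B ~:/mnt hello-world_latest.sif cat /mnt/test.txt
It's alive!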
        
How-to @ CC-IN2P3

Singularity @CC-IN2P3

How-to @ CC-IN2P3

Resources

  • CVMFS: images provided by CC-IN2P3, under /cvmfs/singularity.in2p3.fr/images/
  • PBS: your own images

The images in CVMFS are organised by usage (HPC/HTC, CPU/GPU, ...) and are maintained by CC-IN2P3.

How-to @ CC-IN2P3

Resources

In /pbs, you have:

  • HOME: /pbs/home (20 GB), personal storage
  • THRONG: /pbs/throng (100 GB), shared group storage

How to upload an image to the CC-IN2P3:


$ scp mycontainer.sif formationX@cca.in2p3.fr:
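
To place the image directly in the group space instead (destination path is a sketch, <group> hypothetical):

# Copy straight into the shared THRONG storage
$ scp mycontainer.sif formationX@cca.in2p3.fr:/pbs/throng/<group>/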
        
How-to @ CC-IN2P3

Local configuration

  • Default version available on the worker nodes: 2.6.1
  • More recent versions available here: /pbs/software/centos-7-x86_64/singularity/
  • By default the HOME directory is not mounted
  • All versions are available from the CC-IN2P3 batch system
How-to @ CC-IN2P3

Singularity on the batch

How to submit a batch job using Singularity:

  • Requires a wrapper
  • Submission of the wrapper using qsub

-- hello.sh --
#!/bin/bash
/bin/singularity exec \
  /cvmfs/singularity.in2p3.fr/images/HTC/ubuntu/ubuntu1804-CC3D.simg \
  /bin/hostname

-- Wrapper submission --
$ qsub -q long /pbs/home/john/hello.sh
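
The job can then be followed with the usual Grid Engine commands, e.g.:

# Check the status of your pending and running jobs
$ qstat -u john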
        
How-to @ CC-IN2P3

Singularity @ CC-IN2P3

Singularity 3.X from an interactive machine:

  • You need to activate the right Singularity environment

# List the Singularity versions available @ CC-IN2P3
$ ccenv singularity --list
Software:
  Version:
    singularity:
    - 3.0.3
    - 3.1.1
    - 3.3.0
# Activate the desired version
$ ccenv singularity 3.3.0
        
How-to @ CC-IN2P3

Software @ CC-IN2P3: the ccenv tool

From an interactive machine:


# List the software available @ CC-IN2P3
$ ccenv --list
Software:
- anaconda
- cctools
- cmake
# List the available versions for a specific software
$ ccenv software --list
# Set up a specific version of a given software
$ ccenv software version
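
For instance, with one of the packages listed above (version number illustrative):

# List the anaconda versions, then activate one
$ ccenv anaconda --list
$ ccenv anaconda 5.3.1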
        
How-to @ CC-IN2P3

Singularity on the batch (2)

Singularity 3.X on a worker node, i.e. from the batch system:

  • Activate the right singularity version

source /pbs/software/centos-7-x86_64/singularity/ccenv.[c]sh 3.3.0
        
  • Create a wrapper and submit it with qsub (see the sketch below)
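
A minimal wrapper sketch combining both steps (image path taken from the earlier example, ccenv.sh assumed for bash):

-- wrapper.sh --
#!/bin/bash
# Activate Singularity 3.3.0 on the worker node
source /pbs/software/centos-7-x86_64/singularity/ccenv.sh 3.3.0
# Run the container
singularity exec /cvmfs/singularity.in2p3.fr/images/HTC/ubuntu/ubuntu1804-CC3D.simg /bin/hostname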
How-to @ CC-IN2P3

Singularity on the batch (3)

Submitting a job on the GPU farm?

How-to GPU farm
Images catalog (1/3)

You can find all the available images, as well as the compatible Deep Learning Python modules, here:
https://gitlab.in2p3.fr/ccin2p3-support/c3/hpc/gpu

Workflow
Submission to the batch

Submission command (from an interactive machine)


$ qsub -l sps=1,GPU=<nb_gpus>,GPUtype=<K80-V100> -q <queue> -pe multicores_gpu 4 \
    -o <output_path> -e <error_path> -V <path_to>/batch_launcher.sh
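
A concrete instantiation (GPU count, queue name, and paths illustrative):

$ qsub -l sps=1,GPU=1,GPUtype=V100 -q mc_gpu_long -pe multicores_gpu 4 \
    -o ~/job.out -e ~/job.err -V ~/batch_launcher.sh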
        

batch_launcher.sh (on the worker node)


#!/bin/bash
/bin/singularity exec --nv --bind /sps:/sps --bind /pbs:/pbs <image_path> <path_to>/start.sh
        

start.sh (executed through a Singularity image on the worker node)


#!/bin/bash
source <path_to_python_env> activate <env>
python <path_to>/program.py
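
An illustrative start.sh assuming a conda-based Python environment (paths and environment name hypothetical):

#!/bin/bash
# Path to the conda activate script and the env name are hypothetical
source /pbs/throng/<group>/anaconda3/bin/activate myenv
python ~/program.py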