=== Run Rockylinux Images Interactively ===
Start an interactive session on the '''centos7''' partition.
{{{
idev -t 8 --partition=centos7
}}}
Load the Singularity module.
{{{
module load singularity/3.9.0
}}}
Set up the environment variables.
{{{
source /lustre/project/singularity_images/setup_cypress.sh
}}}
Run a bash shell in the Rockylinux image:
{{{
singularity shell -s /bin/bash /lustre/project/singularity_images/rockylinux-9.2.sif
}}}
You will get a command line prompt. Inside the container, the glibc version is '''2.34'''.
{{{
Apptainer> ldd --version
ldd (GNU libc) 2.34
Copyright (C) 2021 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
Written by Roland McGrath and Ulrich Drepper.
}}}
You can use command line tools just as you usually do on the CentOS 6/7 nodes. The '''module''' command also works, but some modules may not work because of library incompatibilities.
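For example, you can try loading a module from inside the container shell just as you would on a regular node (a minimal sketch reusing the intel-psxe module from the batch example below; whether a particular module actually works depends on its library compatibility with Rockylinux):
{{{
Apptainer> module avail
Apptainer> module load intel-psxe/2016
Apptainer> icc --version
}}}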
=== Run Scripts on Rockylinux in a Batch Job ===
An example Slurm script:
{{{
#!/bin/bash
#SBATCH --job-name=Singularity    # Job Name
#SBATCH --partition=centos7       # Partition must be centos7
#SBATCH --qos=normal              # Quality of Service normal or long
#SBATCH --time=0-01:00:00         # Wall clock time limit in Days-HH:MM:SS
#SBATCH --nodes=1                 # Node count required for the job
#SBATCH --ntasks-per-node=1       # Number of tasks to be launched per Node
#SBATCH --cpus-per-task=1         # Number of threads per task (OMP threads 1-20)

# Load the Singularity module
module load singularity/3.9.0

# Set up the environment variables
SingularityImageDir=/lustre/project/singularity_images
source $SingularityImageDir/setup_cypress.sh

# Run a command on RockyLinux
singularity exec $SingularityImageDir/rockylinux-9.2.sif gcc --version
}}}
The example above runs gcc --version on RockyLinux. Submitting the job works the same way as it does for any other job. (See [https://wiki.hpc.tulane.edu/trac/wiki/cypress/using#IntroductiontoManagedClusterComputing here].)
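For example, if the script above is saved as rocky_job.srun (the file name here is only an illustration), submit it with:
{{{
sbatch rocky_job.srun
}}}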
When the job is finished, the log file should contain the output of the gcc --version command:
{{{
gcc (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4)
Copyright (C) 2021 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
}}}
To run a bash script instead, assuming 'my_script.sh' is in the current working directory, replace the singularity exec line above with:
{{{
singularity run $SingularityImageDir/rockylinux-9.2.sif /bin/bash -l -c ./my_script.sh
}}}
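Alternatively, a roughly equivalent form (shown only as a sketch) uses singularity exec, which runs the given command directly inside the image rather than going through its runscript:
{{{
singularity exec $SingularityImageDir/rockylinux-9.2.sif /bin/bash -l -c ./my_script.sh
}}}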
'my_script.sh' must be executable. You can make it executable with the 'chmod' command (see [https://wiki.hpc.tulane.edu/trac/wiki/cypress/BasicLinuxComands#chmod here]):
{{{
chmod u+x my_script.sh
}}}
For example, if 'my_script.sh' is:
{{{
#!/bin/bash
# Load the Intel compiler suite inside the container
module load intel-psxe/2016
echo "Hello world from $0 running in ${SINGULARITY_NAME}"
pwd
date
icc --version
}}}
When the job is finished, the log file will contain:
{{{
Hello world from ./my_script.sh running in rockylinux-9.2.sif
/home/userid/test
Fri Jun 9 13:21:01 CDT 2023
icc (ICC) 16.0.0 20150815
Copyright (C) 1985-2015 Intel Corporation. All rights reserved.
}}}
If you find that a package you need is missing from rockylinux-9.2.sif, please let us know.
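Before reporting, you can check whether a package is already installed in the image, for example (a sketch; 'gsl' is only an illustrative package name):
{{{
singularity exec /lustre/project/singularity_images/rockylinux-9.2.sif rpm -qa | grep -i gsl
}}}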