Changes between Version 1 and Version 2 of Workshops/cypress/SlurmPractice


Timestamp: 08/22/18 13:06:23 (6 years ago)
Author: fuji
Legend: unmodified lines show both the v1 and v2 line numbers; removed lines are marked '-' and show only the v1 number; added lines are marked '+' and show only the v2 number; '…' marks elided unchanged lines.
  • Workshops/cypress/SlurmPractice

v1    v2
51    51    cypress1
52    52    }}}
53          -This code print a message, time, and the host name.
      53    +This code prints a message, time, and the host name on the screen.
54    54
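For reference, here is a minimal sketch of what ''hello.py'' might contain, inferred from the sample output shown later on this page (the actual workshop file may differ):
{{{
#!/usr/bin/env python
# Sketch of hello.py (assumed contents): print a message,
# the current time, and the host name.
import datetime
import socket

print("Hello, world!")
print(datetime.datetime.now().isoformat())
print(socket.gethostname())
}}}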
55    55    Look at 'slurmscript1'
…
79    79    but those are directives for the '''SLURM''' job scheduler.
80    80
81          -=== qos, partition ===
      81    +==== qos, partition ====
82    82    Those two lines determine the quality of service and the partition.
83    83    {{{
…
96    96    If you are using a workshop account, you can use only the '''workshop''' qos and partition.
97    97
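The two directive lines themselves are elided in the diff above. As a sketch, for a workshop account they would presumably look like this (the qos and partition values available to regular accounts depend on your allocation):
{{{
#SBATCH --qos=workshop          # Quality of Service
#SBATCH --partition=workshop    # Partition
}}}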
98          -=== job-name ===
      98    +==== job-name ====
99    99    {{{
100   100   #SBATCH --job-name=python       # Job Name
…
102   102   This is the job name, which you can specify as you like.
103   103
104         -=== time ===
      104   +==== time ====
105   105   {{{
106   106   #SBATCH --time=00:01:00         # WallTime
…
110   110   Once the walltime limit is reached, the job is terminated regardless of whether its processes are still running.
111   111
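The value is the maximum wall-clock time the job may use. For reference, '''sbatch''' accepts several time formats; for example (a sketch, and only one '''--time''' line should appear in a script):
{{{
#SBATCH --time=00:01:00         # 1 minute (HH:MM:SS)
#SBATCH --time=2-12:00:00       # 2 days and 12 hours (D-HH:MM:SS)
}}}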
112         -=== Resource Rwquest ===
      112   +==== Resource Request ====
113   113   {{{
114   114   #SBATCH --nodes=1               # Number of Nodes
…
123   123   '''#SBATCH --cpus-per-task=c''' determines the number of cores/threads per task. The details are explained in the Parallel Jobs section below.
124   124
125         -
126         -
127         -
128         -
129         -
130   125   This script requests one core on one node.
131   126
…
135   130
136   131   [[Image(https://docs.google.com/drawings/d/e/2PACX-1vQR7ztCNSIQhIjyW28FyYaQn92XC4Zq_vZzoPwALkywmXoyRl8qC2MEpT1t68zMopZv2yHNt2unMf-i/pub?w=155&h=134)]]
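Putting the directives above together, a minimal sketch of a serial job script along the lines of ''slurmscript1'' could look like the following. This is assembled for illustration, not copied from the workshop file; in particular the '''--ntasks-per-node''' line and the bare '''python''' invocation are assumptions (you may need to load a Python module first, depending on the cluster setup):
{{{
#!/bin/bash
#SBATCH --qos=workshop          # Quality of Service
#SBATCH --partition=workshop    # Partition
#SBATCH --job-name=python       # Job Name
#SBATCH --time=00:01:00         # WallTime
#SBATCH --nodes=1               # Number of Nodes
#SBATCH --ntasks-per-node=1     # Number of tasks per node (assumed)
#SBATCH --cpus-per-task=1       # Number of cores per task

python hello.py
}}}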
      132   +
      133   +=== Submit a job ===
      134   +Let's run our program on the cluster.
      135   +To submit our script to SLURM, we invoke the '''sbatch''' command.
      136   +{{{
      137   +[fuji@cypress1 SerialJob]$ sbatch slurmscript1
      138   +Submitted batch job 773944
      139   +}}}
      140   +
      141   +Our job was successfully submitted and was assigned job number 773944.
      142   +This Python code, ''hello.py'', prints a message, the time, and the host name on the screen.
      143   +But this time ''hello.py'' ran on one of the compute nodes, which is not connected to your terminal screen.
      144   +
      145   +After the job completes, you will see a new file, slurm-???????.out,
      146   +{{{
      147   +[fuji@cypress1 SerialJob]$ ls
      148   +hello.py  slurm-773944.out  slurmscript1  slurmscript2
      149   +}}}
      150   +that contains
      151   +{{{
      152   +[fuji@cypress1 SerialJob]$ cat slurm-773944.out
      153   +Hello, world!
      154   +2018-08-22T12:51:34.436170
      155   +cypress01-117
      156   +}}}
      157   +The text that was supposed to appear on the screen went into the file slurm-???????.out. This is the default file name; you can change it by setting
      158   +{{{
      159   +#SBATCH --output=Hi.out       ### File in which to store job output
      160   +#SBATCH --error=Hi.err        ### File in which to store job error messages
      161   +}}}
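These file names also accept the '''%j''' pattern, which '''sbatch''' expands to the job ID; the default name is effectively slurm-%j.out, which is how slurm-773944.out was produced. A sketch:
{{{
#SBATCH --output=Hi-%j.out    ### e.g. Hi-773944.out
#SBATCH --error=Hi-%j.err     ### e.g. Hi-773944.err
}}}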