SLURM usage summary

From crtc.cs.odu.edu

View enqueued jobs

squeue -u <username>
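
For a more detailed listing, squeue accepts a custom output format; the format string below is only one possible choice, with arbitrary field widths:

squeue -u <username> -o "%.10i %.9P %.20j %.8T %.10M %R"

Here %i is the job ID, %P the partition, %j the job name, %T the job state, %M the elapsed time, and %R the reason or node list.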

View available nodes

sinfo --state=idle
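
To see per-node CPU counts, memory, and feature tags (useful when choosing a constraint from the tables below), sinfo's output can be customized; this format string is just one option:

sinfo -N -o "%N %c %m %f"

Here %N is the node name, %c the CPU count, %m the memory in MB, and %f the available features.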

Create an interactive job

salloc [SBATCH ARGUMENTS]

SBATCH Arguments

  • -c <number of cores> OR --cpus-per-task <number of cores>, e.g. -c 12 to allocate 12 cores
  • -N <number of nodes> OR --nodes <number of nodes>, e.g. -N 4 to allocate 4 nodes
  • -p <name> OR --partition <name>, e.g. --partition "himem" to use a node from the high-memory group
  • -C <name> OR --constraint <name>, e.g. --constraint "coreV2*" to use only nodes whose hostname starts with coreV2
  • --exclusive to prevent other jobs from running on the allocated node(s) (a combined example follows this list)
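
As a minimal sketch combining the arguments above (the resource values are arbitrary examples, not recommendations), an interactive allocation could look like:

salloc -N 1 -c 12 --partition "himem" --exclusive

The same arguments work as #SBATCH directives in a batch script submitted with sbatch; ./my_program is a placeholder for the actual executable:

#!/bin/bash
#SBATCH --nodes=1
#SBATCH --cpus-per-task=12
#SBATCH --constraint="coreV2*"
#SBATCH --exclusive
srun ./my_program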

SLURM cheat sheet

https://docs.hpc.odu.edu/#slurm-cheat-sheet

HPC ODU Documentation

https://docs.hpc.odu.edu/

Turing cluster

TODO: add GPU information
hostname          | nodes | memory (GB) | cache (MB) | Model Name                                | Turbo   | CPUs per node | Threads per core | Cores per socket | sockets
coreV1-22-0*      | 28    | 127         | 20         | Intel(R) Xeon(R) CPU E5-2660 @ 2.20GHz    | 3.00GHz | 16            | 1                | 8                | 2
coreV2-22-0*      | 36    | 126         | 25         | Intel(R) Xeon(R) CPU E5-2660 v2 @ 2.20GHz | 3.00GHz | 20            | 1                | 10               | 2
coreV2-25-0*      | 76    | 126         | 25         | Intel(R) Xeon(R) CPU E5-2670 v2 @ 2.50GHz | 3.30GHz | 20            | 1                | 10               | 2
coreV2-23-himem-* | 4     | 757         | 16         | Intel(R) Xeon(R) CPU E5-4610 v2 @ 2.30GHz | 2.70GHz | 32            | 1                | 8                | 4
coreV3-23-0*      | 50    | 125         | 40         | Intel(R) Xeon(R) CPU E5-2698 v3 @ 2.30GHz | 3.20GHz | 32            | 1                | 16               | 2
coreV4-21-0*      | 30    | 125         | 40         | Intel(R) Xeon(R) CPU E5-2683 v4 @ 2.10GHz | 3.00GHz | 32            | 1                | 16               | 2
coreV4-21-himem-* | 3     | 504         | 40         | Intel(R) Xeon(R) CPU E5-2683 v4 @ 2.10GHz | 3.00GHz | 32            | 1                | 16               | 2
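
For example (a sketch only), the table shows 32 CPUs per node for the coreV3-23-0* nodes, so an exclusive single-node allocation on that hardware could be requested with the constraint syntax from the SBATCH arguments above:

salloc -N 1 -C "coreV3*" --exclusive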

Wahab cluster

hostname | nodes | memory (GB) | cache (MB) | Model Name                       | Turbo   | CPUs per node | Threads per core | Cores per socket | sockets
any      | 158   | 384         | -          | Intel® Xeon® Gold 6148 @ 2.40GHz | 3.70GHz | 40            | 1                | 20               | 2