MARVIN Cluster
Revision as of 14:26, 29 May 2020
This is a private cluster.
Hardware
- Head node: marvin.cac.cornell.edu.
- Access modes: ssh
- OpenHPC v1.3.8 with CentOS 7.8
- 86 compute nodes with Dual 6-core X5670 CPUs @ 3 GHz, Hyperthreaded, 24 GB of RAM; 3 high memory nodes with 96 GB of RAM
- Cluster Status: Ganglia.
- "Why use a temporary directory?"
- Submit HELP requests: Help OR by sending email to: help@cac.cornell.edu
File Systems
Home Directories
- Path: ~
User home directories are hosted on the head node and exported to the compute nodes via NFS. Unless special arrangements are made, data in user home directories are NOT backed up.
Globus Access
User home directories can be accessed via Globus. Under the "File Manager" tab in the Globus web GUI:
- Access "cac#marvin" endpoint.
- Authenticate using your CAC user name and password if prompted.
Scheduler/Queues
- The cluster scheduler is Slurm. See Slurm documentation page for details.
- Note: hyperthreading is enabled on the cluster, so Slurm considers each physical core to consist of two logical CPUs. See the slurm options section for using the correct options for your job.
- Partitions:
| Name | Description | Time Limit |
|------|-------------|------------|
| viz | 3 visualization Ensight servers, each with 96 GB RAM | none |
| normal (default) | all nodes except those in the viz queue | none |
| all | all cluster nodes | none |
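Because hyperthreading is enabled, a job that wants one task per physical core must request that explicitly. The following batch-script sketch illustrates this under example values (the partition, task count, and program name are placeholders; consult the Slurm options page linked above for the right settings for your job):

```shell
#!/bin/bash
#SBATCH --partition=normal      # or viz / all, per the table above
#SBATCH --ntasks=12             # example: one task per physical core
#SBATCH --cpus-per-task=2       # give each task both hyperthreads of its core
#SBATCH --job-name=example

srun ./my_program               # my_program is a placeholder for your executable
```

Without `--cpus-per-task=2` (or an equivalent option such as `--ntasks-per-core=1`), Slurm may pack two tasks onto the two logical CPUs of a single physical core.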
Software
Work with Environment Modules
Set up the working environment for each package using the module command. The module command will activate dependent modules if there are any.
To show currently loaded modules (these are loaded by default by the system configuration):
    -bash-4.2$ module list

    Currently Loaded Modules:
      1) autotools   2) prun/1.3   3) gnu8/8.3.0   4) openmpi3/3.1.4   5) ohpc
To show all available modules:
    -bash-4.2$ module avail

    -------------------- /opt/ohpc/pub/moduledeps/gnu8-openmpi3 --------------------
       boost/1.70.0    netcdf/4.6.3    pnetcdf/1.11.1
       fftw/3.3.8      phdf5/1.10.5    py3-scipy/1.2.1

    ------------------------ /opt/ohpc/pub/moduledeps/gnu8 -------------------------
       R/3.5.3       mpich/3.3.1      openblas/0.3.5       py3-numpy/1.15.3
       hdf5/1.10.5   mvapich2/2.3.1   openmpi3/3.1.4 (L)

    -------------------------- /opt/ohpc/pub/modulefiles ---------------------------
       autotools          (L)   intel/19.0.2.187   prun/1.3 (L)
       clustershell/1.8.1       julia/1.2.0        valgrind/3.15.0
       cmake/3.14.3             octave/5.1.0       vim/8.1
       gnu8/8.3.0         (L)   ohpc (L)           visit/3.0.1
       gurobi/8.1.1             pmix/2.2.2

      Where:
       L:  Module is loaded
To load a module and verify:
    -bash-4.2$ module load R/3.5.3
    -bash-4.2$ module list

    Currently Loaded Modules:
      1) autotools   3) gnu8/8.3.0       5) ohpc             7) R/3.5.3
      2) prun/1.3    4) openmpi3/3.1.4   6) openblas/0.3.5
To unload a module and verify:
    -bash-4.2$ module unload R/3.5.3
    -bash-4.2$ module list

    Currently Loaded Modules:
      1) autotools   2) prun/1.3   3) gnu8/8.3.0   4) openmpi3/3.1.4   5) ohpc
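The same module commands work inside batch jobs; a batch shell starts with only the default modules loaded, so load what you need at the top of the script. A minimal sketch using the R module from the listing above (the job name and `analysis.R` are placeholders):

```shell
#!/bin/bash
#SBATCH --job-name=r-job
#SBATCH --ntasks=1

# Batch shells start with only the default modules,
# so load R explicitly before running the script.
module load R/3.5.3
Rscript analysis.R    # analysis.R is a placeholder for your own R script
```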
Manage Modules in Your Python Virtual Environment
Python 3.8.3 is installed. Users can manage their own Python environment (including installing needed modules) using virtual environments. Please see the documentation on virtual environments on python.org for details.
Load python/3.8.3 module
First load the python/3.8.3 module to select Python 3.8.3:

    module load python/3.8.3
Create Virtual Environment
You can create as many virtual environments, each in their own directory, as needed.
python3 -m venv <your virtual environment directory>
Activate Virtual Environment
You need to activate a virtual environment before using it:
source <your virtual environment directory>/bin/activate
Install Python Modules Using pip
After activating your virtual environment, you can install Python modules into the activated environment:
- It's always a good idea to update pip first:
pip install --upgrade pip
- Install the module:
pip install <module name>
- List installed python modules in the environment:
pip list
- Example: install tensorflow and keras like this:
    -bash-4.2$ python3 -m venv tensorflow
    -bash-4.2$ source tensorflow/bin/activate
    (tensorflow) -bash-4.2$ pip install --upgrade pip
    Collecting pip
      Using cached https://files.pythonhosted.org/packages/30/db/9e38760b32e3e7f40cce46dd5fb107b8c73840df38f0046d8e6514e675a1/pip-19.2.3-py2.py3-none-any.whl
    Installing collected packages: pip
      Found existing installation: pip 18.1
        Uninstalling pip-18.1:
          Successfully uninstalled pip-18.1
    Successfully installed pip-19.2.3
    (tensorflow) -bash-4.2$ pip install tensorflow keras
    Collecting tensorflow
      Using cached https://files.pythonhosted.org/packages/de/f0/96fb2e0412ae9692dbf400e5b04432885f677ad6241c088ccc5fe7724d69/tensorflow-1.14.0-cp36-cp36m-manylinux1_x86_64.whl
    :
    :
    :
    Successfully installed absl-py-0.8.0 astor-0.8.0 gast-0.2.2 google-pasta-0.1.7 grpcio-1.23.0 h5py-2.9.0 keras-2.2.5 keras-applications-1.0.8 keras-preprocessing-1.1.0 markdown-3.1.1 numpy-1.17.1 protobuf-3.9.1 pyyaml-5.1.2 scipy-1.3.1 six-1.12.0 tensorboard-1.14.0 tensorflow-1.14.0 tensorflow-estimator-1.14.0 termcolor-1.1.0 werkzeug-0.15.5 wheel-0.33.6 wrapt-1.11.2
    (tensorflow) -bash-4.2$ pip list modules
    Package              Version
    -------------------- -------
    absl-py              0.8.0
    astor                0.8.0
    gast                 0.2.2
    google-pasta         0.1.7
    grpcio               1.23.0
    h5py                 2.9.0
    Keras                2.2.5
    Keras-Applications   1.0.8
    Keras-Preprocessing  1.1.0
    Markdown             3.1.1
    numpy                1.17.1
    pip                  19.2.3
    protobuf             3.9.1
    PyYAML               5.1.2
    scipy                1.3.1
    setuptools           40.6.2
    six                  1.12.0
    tensorboard          1.14.0
    tensorflow           1.14.0
    tensorflow-estimator 1.14.0
    termcolor            1.1.0
    Werkzeug             0.15.5
    wheel                0.33.6
    wrapt                1.11.2
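After installing, you can verify that packages are visible in the currently active environment without importing them. A minimal sketch (the package names are only examples; substitute whatever you installed):

```shell
# Check whether packages resolve in the active environment.
# "json" is stdlib and should always be found; "numpy" is an example name.
python3 <<'EOF'
import importlib.util

for name in ("json", "numpy"):
    spec = importlib.util.find_spec(name)
    print(f"{name}: {'installed' if spec is not None else 'missing'}")
EOF
```

`importlib.util.find_spec` returns `None` when a package is not on the active environment's path, which makes it a quick check that an install actually landed where you expect.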
Software List
| Software | Path | Notes |
|----------|------|-------|
| *GNU Compilers 8.3.0 | /opt/ohpc/pub/compiler/gcc/8.3.0 | module load gnu8/8.3.0 |
| *openmpi 3.1.4 | /opt/ohpc/pub/mpi/openmpi3-gnu8/3.1.4 or /opt/ohpc/pub/mpi/openmpi3-intel/3.1.4 | module load openmpi3/3.1.4 |
| Intel Parallel Studio XE 2020.1.217 | /opt/ohpc/pub/compiler/intel/2020/ | module swap gnu8 intel/20.1.2017 |
| Intel MPI 2020.1.217 | /opt/ohpc/pub/compiler/intel/2020/compilers_and_libraries_2020.1.217/linux/mpi | module load impi/2020.1.217 |
| mvapich2 2.3.2 | /opt/ohpc/pub/mpi/mvapich2-gnu/2.3.2 or /opt/ohpc/pub/mpi/mvapich2-intel/2.3.2 | module load mvapich2/2.3.2 |
| fftw 3.3.8 | /opt/ohpc/pub/libs/gnu8/openmpi3/fftw/3.3.8 or /opt/ohpc/pub/libs/gnu8/mvapich2/fftw/3.3.8 | module load fftw/3.3.8 |
| hypre 2.18.1 | /opt/ohpc/pub/libs/gnu8/openmpi3/hypre/2.18.1, /opt/ohpc/pub/libs/gnu8/impi/hypre/2.18.1, /opt/ohpc/pub/libs/intel/openmpi3/hypre/2.18.1, or /opt/ohpc/pub/libs/intel/impi/hypre/2.18.1 | module load hypre/2.18.1 |
| ensight 10.1.4a | /opt/ohpc/pub/apps/ensight/10.1.4a | module load ensight/10.1.4a |
| VisIt 3.0.1 | /opt/ohpc/pub/apps/visit/3.0.1/bin | module load visit/3.0.1 |
| python 3.8.3 | /opt/ohpc/pub/utils/python/3.8.3 | module load python/3.8.3 |