Connecting to RCS HPC Systems
Research Computing Services operates and maintains various clusters and services which can only be connected to from within the University of Calgary campus network or via the University of Calgary IT General VPN.


= How to connect/login to ARC  =


One connects to ARC by using an SSH client.
Secure Shell (SSH) is an encrypted network protocol that allows secure communication between your computer and a server.
ARC only accepts connections from the UofC campus network.
Thus, if you are trying to connect to ARC from outside the campus,
you will have to connect to the UofC VPN service first.


== From Linux and MacOS ==


On Linux or MacOS computers, SSH is most likely installed and can be used by opening a terminal and running:
ssh username@arc.ucalgary.ca


For work requiring X11 forwarding, pass in the <code>-X</code> flag.
ssh -X username@arc.ucalgary.ca
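As an optional convenience, these settings can also be stored in your OpenSSH client configuration so that a short alias is enough to connect. A minimal sketch, where the alias <code>arc</code> and the user name are placeholders:
<pre>
# ~/.ssh/config -- "arc" is just a local alias; replace "username" with your UofC IT account name.
Host arc
    HostName arc.ucalgary.ca
    User username
    ForwardX11 yes
</pre>
With such an entry in place, <code>ssh arc</code> is equivalent to the commands above.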


== From Windows ==


On Windows, there are various clients that can be used to connect to our cluster including:
* PuTTY
* MobaXterm


=== Connecting with MobaXTerm ===


'''MobaXTerm''' is an implementation of an SSH client and more.
: https://mobaxterm.mobatek.net/
Get the '''Home Edition Installer''' and run it on your Windows machine to install.




To connect to a remote computer, you have to know '''four''' things:
* The name of the '''remote computer'''.
* The '''protocol''' that is used to connect.
* Your user name ('''account''') on that computer.
* Your '''password''' for that account on that computer.




The name depends on the computer you want to use.
The '''ARC cluster''' uses the login node named '''arc.ucalgary.ca'''.
Thus, this would be the name of the '''remote host''' field in the dialog.


'''Remote host''' simply means "the name of the remote computer you want to connect to".




Your '''user name''' on that computer corresponds to '''your account''' name on it.
In the case of ARC it is your '''UofC IT name''', the part of your '''UofC email address''' before the '''@''' character.
For example, if your email is <code>user.name@ucalgary.ca</code>, then the account name is <code>user.name</code>.




The '''protocol''' you use to connect is '''SSH'''; you select it by choosing the SSH connection tab in the dialog.
The protocol affects the '''connection port''' on the remote system.
In the case of SSH it is set to '''22''' by default, which is correct for ARC.


You have to know the '''password''' and enter it when prompted.
It is the '''same password''' that you use to access your '''UofC email'''.
If you ever change the UofC email password, it will automatically change for ARC as well.
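For reference, the same four pieces of information map directly onto the command-line form of the connection shown earlier. A small sketch, using the example account name from above:
<pre>
# remote computer: arc.ucalgary.ca, protocol: SSH on port 22, account: user.name
ssh -p 22 user.name@arc.ucalgary.ca
# the password is then typed at the prompt
</pre>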


= RCS Services =


== Service Overview ==
RCS operates the following clusters and services. Most services can only be accessed from the campus network or through the IT General VPN.
{| class="wikitable"
! Cluster
! Service
! Accessible by
! Protocol
! Hostname
! VPN/Campus
! Citrix
|-
| rowspan="4" | ARC
|Login Node
| rowspan="4" | All ARC users
|SSH
| arc.ucalgary.ca
| Required
|Not required
|-
|[[Jupyter Notebooks|JupyterHub]]
|HTTPS
| https://jupyter.ucalgary.ca/
| Required
|Not required
|-
|[[Open OnDemand]]
|HTTPS
|https://ood-arc.rcs.ucalgary.ca/
|Not required
|Not required
|-
|[[How to transfer data|Data Transfer Node]]
|SSH
| arc-dtn.ucalgary.ca
| Required
|Not required
|-
| rowspan="2" | TALC
|Login Node
| rowspan="2" | All TALC users
|SSH
| talc.ucalgary.ca
| Required
|Not required
|-
|[[Jupyter Notebooks|JupyterHub]]
|HTTPS
|https://talc.ucalgary.ca/
|Required
|Not required
|-
|[[MARC_Cluster_Guide|MARC]]
|Login Node
| All MARC users
|SSH
| marc.ucalgary.ca
| N/A
|Required
|}


== Connecting to the University IT General VPN ==
Certain services can only be accessed from the University of Calgary campus network or via the University of Calgary IT General VPN. You may only connect to one of our login nodes via SSH or the Citrix service from either the campus network or after you have connected to the IT General VPN.


=== Connecting using FortiClient VPN ===
You may install the FortiClient VPN program to connect your computer to the IT General VPN.


Refer to the following pages based on your operating system:  
{| class="wikitable"
!Operating System
!Documentation
|-
|Windows
|https://ucalgary.service-now.com/kb_view.do?sysparm_article=KB0033671
|-
|MacOS
|https://ucalgary.service-now.com/kb_view.do?sysparm_article=KB0033673
|-
|Linux
|https://ucalgary.service-now.com/kb_view.do?sysparm_article=KB0030086
|}
Once you are connected to the VPN, you may access any web services with your web browser or connect to login nodes via SSH using a SSH client of your choice.




If you have '''difficulties connecting''' the FortiClient VPN to the University's network, please first make sure that you have the latest
version of the client.


You can download and install it from the University software distribution centre:
: https://iac01.ucalgary.ca/SDSWeb/LandingPage.aspx?ReturnUrl=%2fSDSWeb%2fdefault.aspx


For '''further assistance with connecting''' to the IT General VPN,
please contact the '''University IT team''' through [https://ucalgary.ca/it UService].


===Connecting using Fortinet Web Portal===
The Fortinet web portal allows VPN access through your web browser. This option is available for users who prefer not to use the FortiClient program or cannot install the FortiClient VPN program.
{| class="wikitable"
!VPN
!Notes
!URL
|-
|IT General VPN
|To log in, click on 'Single Sign-On'.
|https://generalconnect.ucalgary.ca:10443/remote/login
|}
Once connected, you may use the 'Quick Connection' option to connect to a website or SSH to a login node.


====Connecting to a website====
* Click 'Quick Connection'
* Click HTTP/HTTPS
* Enter the URL that you want to go to, e.g. jupyter.ucalgary.ca or talc.ucalgary.ca
* The website will open in a new browser tab and you may proceed normally.
====Connecting via SSH====
* Click 'Quick Connection'
* Click SSH
* Fill out the "Host" box with <code>username@hostname</code>. For example, <code>uofcusername@arc.ucalgary.ca</code>
* A new browser tab will open with a prompt for your SSH username and password. If connecting to a login node, enter your UC credentials.
* The SSH terminal will open in a new tab and you can proceed typing commands into the terminal normally.


==Connecting with Citrix==
Citrix Workspace app is only required for users connecting to the secured compute cluster MARC.  


Obtain the [https://www.citrix.com/downloads/workspace-app/ Citrix Workspace App] and then navigate to https://myappmf.ucalgary.ca/. You may access the Citrix Workspace without first connecting to the IT General VPN.


== Connecting with Open OnDemand ==
[[Open OnDemand]] is accessible both on and off campus without a VPN connection. You may access nodes via SSH using a browser-based SSH client and web services through a virtual desktop.


== Connecting with SSH ==
SSH connection instructions for Linux, MacOS, and Windows (including MobaXterm) are given in the [[#How to connect/login to ARC|How to connect/login to ARC]] section above.


= Connecting from Compute Canada systems =


'''Compute Canada clusters''' are allowed to connect to ARC (arc.ucalgary.ca) and ARC-DTN (arc-dtn.ucalgary.ca) directly, '''without a VPN client''', using an SSH client.
The main purpose is to simplify data transfer between ARC and Compute Canada systems with either <code>scp</code> or <code>rsync</code>.
Hence, if you are having difficulties connecting to ARC using the FortiClient VPN and you have a Compute Canada account, you can connect to the '''Cedar''', '''Graham''', or '''Beluga''' clusters
and ssh to ARC from there.
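For example, a transfer from a Compute Canada cluster to ARC through the data transfer node could look like the sketch below; the user name and both paths are placeholders:
<pre>
# Run on a Compute Canada login node; "username" and the paths are placeholders.
rsync -avP my_results/ username@arc-dtn.ucalgary.ca:my_results/

# or, using scp:
scp -r my_results username@arc-dtn.ucalgary.ca:
</pre>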


= External Collaborators =


The main requirement for a researcher to have an account on ARC, or any other compute cluster at UofC, is
that the researcher must have a '''UofC IT account''' and a corresponding '''UofC e-mail address'''.
This is because access to the clusters is controlled by the same central authentication system as the rest of the electronic services on campus.
Thus, external collaborators cannot get an account on ARC unless they have a UofC IT account.




To address this issue and to facilitate a collaborative environment, there is
a hiring template for '''External Research Collaborators''' who need to remotely access the secure compute and
High Performance Computing (HPC) resources at the University of Calgary.




This new designation allows you to request '''General Associate (GA)''' access for external research collaborators
who are not University of Calgary employees or associated with AHS, to our HPC and Secure Compute services in an expedited manner.
Researchers in this category require a '''Principal Investigator (PI)''' or a PI delegate to submit a '''Template Based Hire (TBH)'''
with the new GA template.
Please note that AHS external researchers have their own GA template available for them and they don’t need to use this new GA template.




Principal Investigators or their delegates can request the creation of a new General Associate External Collaborator
echo "Using mpiexec: $(which mpiexec)"
following the Template Base Hire form process in PeopleSoft and selecting the template
“'''UC_CWR_EXT_RES_CL – Gen Associate – External Research Collaborator'''”.
These requests will need to be approved by Research Computing Services, who is managing HPC.
Once the transaction is approved, it will go to HR to complete the hiring process and the new account will be ready for your associate.




== Actions Required ==
* For new External Research Collaborators, please follow the '''Template Based Hire''' process described above.
* If you currently have '''External Collaborators''' under a '''different template''' who fit into the new category, please let us know their names and email addresses and we will transfer them to the proper template.


[[Category:Guides]]
echo "Finished at $(date)"
{{Navbox Guides}}
 
echo "Running reconstructPar at $(date)."
reconstructPar -newTimes
echo "Finished reconstructPar at $(date)."
echo "Manually delete processor directories if reconstruction succeeded. "
</syntaxhighlight >
 
OpenFOAM can produce large numbers of files per run when many processors (CPU cores) are used.  The reconstructPar command should be used to consolidate the per-processor files into a single directory per time step.  As noted in a comment in the above script, after you have verified that the reconstruction has succeeded you shoulld delete the processor directories.
 
= Support =
Please send any questions regarding using OpenFOAM on ARC to support@hpc.ucalgary.ca.
 
= Links =
 
[[ARC Software pages]]
 
[[Category:OpenFOAM]]
[[Category:Software]]
[[Category:ARC]]
{{Navbox ARC}}

OpenFOAM

= Background =

OpenFOAM (for "Open-source Field Operation And Manipulation") is a free, open-source, toolkit for creation of computational fluid dynamics (CFD) applications. It includes solver libraries and pre- and post-processing utilities. Common variants of OpenFOAM include those from openfoam.org and openfoam.com .

Typically, researchers will install OpenFOAM on their own computers to learn how to use the software, run simulations that exceed their local hardware capabilities on ARC, and then transfer the output data back to their own computers for visualization.

There are '''three main variants of OpenFOAM''' software that are released as free and open-source software under the GNU General Public License Version 3.

* '''OpenFOAM Foundation Inc.''' variant, released by The OpenFOAM Foundation Inc. (since 2012),
: and transferred in 2015 to the English company The OpenFOAM Foundation Ltd.
: https://openfoam.org/

* '''OpenFOAM variant by OpenCFD Ltd.''' (with the name trademarked since 2007), first released as open-source in 2004.
: (Note that since 2012, OpenCFD Ltd is an affiliate of ESI Group.)
: https://www.openfoam.com/

* '''FOAM-Extend''' variant by Wikki Ltd. (since 2009) - a fork of the OpenFOAM® open-source library for Computational Fluid Dynamics (CFD).
: http://wikki.co.uk/index.php/foam-extend/
: http://foam-extend.org

= OpenFOAM Apptainer Containers =

The more recent versions of OpenFOAM installed on ARC have been built as Apptainer containers based on the Docker containers available on Docker Hub or custom-built by the ARC team.

The path to the directory with the container files is <code>/global/software/openfoam/containers</code>. This directory contains three subdirectories, <code>com</code>, <code>extend</code>, and <code>org</code>. These directories are for containerized versions of OpenFOAM provided by [https://www.openfoam.com OpenCFD Ltd.], [http://foam-extend.org Wikki Ltd.], and [https://openfoam.org OpenFOAM Foundation Inc.], respectively.


To see what versions of containerized OpenFOAM are available, simply list the content of the corresponding directory:

<pre>
$ ls -l /global/software/openfoam/containers/
drwxr-xr-x 2 drozmano drozmano 4096 Jan 20  2023 com
drwxr-xr-x 2 drozmano drozmano 4096 Jan 20  2023 extend
drwxr-xr-x 2 drozmano drozmano 4096 Jan 20  2023 org

$ ls -l /global/software/openfoam/containers/com
-rwxr-xr-x 1 drozmano drozmano 1521639424 Jan 20  2023 openfoam2012.sif
-rwxr-xr-x 1 drozmano drozmano  418091008 Jan 20  2023 openfoam2206.sif

$ ls -l /global/software/openfoam/containers/extend/

$ ls -l /global/software/openfoam/containers/org
-rwxr-xr-x 1 drozmano drozmano 853811200 Oct 16 15:53 openfoam10-paraview56-arc.sif
-rwxr-xr-x 1 drozmano drozmano 894136320 Oct 16 15:53 openfoam11-paraview510-arc.sif
-rwxr-xr-x 1 drozmano drozmano 814657536 Oct 16 15:53 openfoam9-paraview56-arc.sif
</pre>

Currently, no versions of '''OpenFOAM-Extend''' are provided.

The '''OpenFOAM.org''' containers were slightly modified to simplify their use. This is indicated by the <code>-arc</code> suffix in their filenames. They are based on the [https://hub.docker.com/u/openfoam official containers from DockerHub].

== Running commands from the containers ==

You can run a command from the container of your choice with a command line like this:

<pre>
$ apptainer exec <container> <command> [command arguments]
</pre>

=== OpenFOAM.org ===

For example, suppose you want to use the container with OpenFOAM.org v10, <code>openfoam10-paraview56-arc.sif</code>, in the <code>org</code> subdirectory, and you want to see the help page for the <code>buoyantReactingFoam</code> command. This is how you can do that:

<pre>
$ apptainer exec /global/software/openfoam/containers/org/openfoam10-paraview56-arc.sif buoyantReactingFoam -help

Usage: buoyantReactingFoam [OPTIONS]
options:
  -case <dir>       specify alternate case directory, default is the cwd
....
....
</pre>

To make it easier to type the long container file name, it can be assigned to an environment variable, <code>CONTAINER</code>, for example:

<pre>
$ export CONTAINER=/global/software/openfoam/containers/org/openfoam10-paraview56-arc.sif

$ apptainer exec $CONTAINER buoyantReactingFoam -help
....
....
</pre>

The variable will persist in your session and can be reused for other commands.
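Putting this together, a typical pattern is to run the individual OpenFOAM utilities of a case through the container. A minimal sketch, where <code>~/myCase</code> is a hypothetical case directory you have already prepared:
<pre>
$ cd ~/myCase                            # hypothetical case directory
$ apptainer exec $CONTAINER blockMesh    # run the mesh generation utility from the container
$ apptainer exec $CONTAINER interFoam    # run a solver (serial run)
</pre>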

= Natively Installed OpenFOAM =

== Activating a natively installed OpenFOAM ==

Due to technical specifics of the software, no modules are provided for '''OpenFOAM''' on ARC. Instead, one has to <code>source</code> the activation script from the '''OpenFOAM''' installation, <code><OFDIR>/etc/bashrc</code>.

To see the available installed versions of '''OpenFOAM.org''':

<pre>
$ ls -ld /global/software/openfoam/OpenFOAM*

drwxr-xr-x 11 drozmano drozmano 4096 Feb 22 15:20 /global/software/openfoam/OpenFOAM-10
drwxr-xr-x 11 drozmano drozmano 4096 Feb 22 16:16 /global/software/openfoam/OpenFOAM-11
drwxr-xr-x 11 drozmano drozmano 4096 Feb 22 14:25 /global/software/openfoam/OpenFOAM-9
drwxr-xr-x 11 drozmano drozmano 4096 Feb 23 16:10 /global/software/openfoam/OpenFOAM-8
</pre>

To activate the version of choice in your current terminal session (or in your job script):

<source lang=bash>
# Activate
$ module load openmpi/4.1.1-gnu
$ source /global/software/openfoam/OpenFOAM-10/etc/bashrc

# Test if it works.
$ foamVersion 
OpenFOAM-10

$
</source>

You have to '''source''' a special activation file from the installed OpenFOAM. The path to the activation file is '''<OpenFOAM_PATH>/etc/bashrc'''; this is the path that has to be '''sourced'''.
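Once the environment is active, OpenFOAM commands are available directly in your session. As a quick serial test, you could copy one of the bundled tutorial cases and run it; this is only a sketch and assumes the standard <code>$FOAM_TUTORIALS</code> layout set by the activation script:
<source lang=bash>
# Copy a bundled tutorial case and run it serially (a sketch; paths depend on the OpenFOAM version).
mkdir -p ~/openfoam-test && cd ~/openfoam-test
cp -r $FOAM_TUTORIALS/incompressible/simpleFoam/pitzDaily .
cd pitzDaily
blockMesh
simpleFoam
</source>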


The '''activation''' is for the current '''bash session'''. If you start a new bash session, you have to activate OpenFOAM again before using it. If you want to '''deactivate''' your OpenFOAM environment, you have to '''close the current bash session''' and start a new one.
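If you use the same version regularly, one option is to wrap the activation commands in a small shell function in your <code>~/.bashrc</code>, so that each new session can re-activate it with a single command. A sketch with a hypothetical function name:
<source lang=bash>
# In ~/.bashrc -- "activate_of10" is a hypothetical name; the two commands are the same as above.
activate_of10() {
    module load openmpi/4.1.1-gnu
    source /global/software/openfoam/OpenFOAM-10/etc/bashrc
}
</source>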

== Running OpenFOAM batch jobs on ARC ==

Researchers using OpenFOAM on ARC are expected to be generally familiar with OpenFOAM capabilities, input file format and the use of restart files.

Like other jobs on ARC, OpenFOAM calculations are run by submitting an appropriate script for batch scheduling using the <code>sbatch</code> command. See the documentation on [[Running_jobs|running batch jobs]] for more information.

Several different versions of OpenFOAM have been installed on ARC under <code>/global/software/openfoam</code>, but some researchers have chosen to install particular versions in their own home directories or take advantage of the wide range of [https://docs.computecanada.ca/wiki/Available_software versions installed on Compute Canada clusters (external link)].

Here is a sample script that was used to test OpenFOAM on ARC with one of the supplied tutorial cases ([http://cfd.direct/openfoam/user-guide/dambreak/ damBreakFine (external link)]), which used interFoam, modified to use the scotch decomposition option. The job script and input files can be copied and run on ARC with:

<syntaxhighlight lang="bash">
cp -R /global/software/openfoam/examples/damBreak/damBreakVeryFine_scotch_template dambreak
cd dambreak
sbatch dambreak.slurm
</syntaxhighlight>

The version ("6.x") of OpenFOAM used is from openfoam.org and was built with GNU 4.8.5 compilers and OpenMPI version 2.1.3. OpenFOAM build options used were WM_LABEL_SIZE=64 and FOAMY_HEX_MESH=yes.

<syntaxhighlight lang="bash">
#!/bin/bash

#SBATCH --nodes=2
#SBATCH --ntasks-per-node=40    # number of MPI processes per node - adjust according to the partition
#SBATCH --mem=0                 # Use --mem=0 to request all the available memory on a node  
#SBATCH --time=05:00:00         # Maximum run time in hh:mm:ss, or d-hh:mm
#SBATCH --partition=pawson-bf,apophis-bf,razi-bf,cpu2019

# Check on some basics:

echo "Running on host: $(hostname)"
echo "Current working directory is: $(pwd)"
echo "Starting job at $(date)"

# Initialize OpenFOAM environment.
module load openmpi/2.1.3-gnu
export OMPI_MCA_mpi_cuda_support=0

source /global/software/openfoam/6x_20181025_gcc485_mpi_213gnu/OpenFOAM-6/etc/bashrc FOAMY_HEX_MESH=yes

export FOAM_RUN=$PWD

echo "Working in $PWD"

CORES=$SLURM_NTASKS
echo "Running on $CORES cores."

echo "Make a new decomposeParDict file"
DATE=$(date)

cat > system/decomposeParDict <<EOF
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    location    "system";
    object      decomposeParDict;
}

// decomposeParDict created at ${DATE}.

numberOfSubdomains $CORES;

method          scotch;

EOF

echo "Forcing new decomposition"

decomposePar -force

echo "Using mpiexec: $(which mpiexec)"

FOAM=$(which interFoam)
echo "About to run $FOAM at $(date)"

mpiexec $FOAM -parallel > dambreakveryfine_scotch_arc_${CORES}cores_${SLURM_JOB_ID}.out

echo "Finished at $(date)"

echo "Running reconstructPar at $(date)."
reconstructPar -newTimes
echo "Finished reconstructPar at $(date)."
echo "Manually delete processor directories if reconstruction succeeded. "

OpenFOAM can produce large numbers of files per run when many processors (CPU cores) are used. The reconstructPar command should be used to consolidate the per-processor files into a single directory per time step. As noted in a comment in the above script, after you have verified that the reconstruction has succeeded you should delete the processor directories.
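For example, after checking that the reconstructed time directories are in place, the per-processor directories can be removed from the case directory; this is only a sketch:
<pre>
$ ls -d processor*     # the per-processor decomposition directories
$ ls                   # confirm that the reconstructed time directories are present
$ rm -rf processor*    # remove the per-processor copies once you are satisfied
</pre>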

= Support =

Please send any questions regarding using OpenFOAM on ARC to support@hpc.ucalgary.ca.

= Links =

[[ARC Software pages]]

[[Category:OpenFOAM]]
[[Category:Software]]
[[Category:ARC]]
{{Navbox ARC}}