NERSC Spack Infrastructure

Welcome to the NERSC Spack Infrastructure. This project contains the Spack configurations for our Spack stacks built for NERSC systems such as Cori and Perlmutter. We leverage GitLab to automate Spack deployments; the project is located at https://software.nersc.gov/NERSC/spack-infrastructure. You must have a NERSC account in order to access our systems and GitLab server.

There is a push mirror of this repository at https://github.com/NERSC/spack-infrastructure for public consumption.

Spack Infrastructure

The Spack Infrastructure project uses the Spack package manager to install the software stack on NERSC systems. This project contains the Spack configurations (spack.yaml) required to build the stacks. The stack is based on the Extreme-Scale Scientific Software Stack (E4S): we install the Spack packages provided by E4S and use the recommended Spack branch. We leverage GitLab CI to automate deployment and ensure reproducible, automated builds. For more details about this project, see the documentation at https://nersc-spack-infrastructure.rtfd.io

Software Deployment Overview

The software deployment consists of the following steps:

  1. Acquire the Spack configuration from the E4S project at https://github.com/E4S-Project/e4s

  2. Create one or more Spack configuration files (spack.yaml) with the list of E4S packages, and integrate the Spack configuration for the NERSC system

  3. Create a GitLab job to trigger the pipeline for the TDS and deployment systems

  4. Create a modulefile as the entry point to the stack

  5. Write user documentation

  6. Share the Spack configuration with the open-source community

  7. Send an announcement to all NERSC users

Step 1: Acquire Spack Configuration

At NERSC, we plan our software deployments around E4S releases, which occur roughly every 3 months; we perform a deployment every 6 months. Once E4S publishes a release, we acquire its Spack configuration, typically found at https://github.com/E4S-Project/e4s/tree/master/environments. We also adopt the Spack branch used by the E4S team as our baseline, which is documented in the release notes. The branch name maps to the E4S version, so version 23.05 has a branch named e4s-23.05.

Next, we copy the packages into our project and create the Spack configuration, as sketched below.
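
A minimal sketch of this step, where $SPACK_REPO is the Spack repository URL used in our CI jobs (paths and versions are illustrative):

# Clone the E4S project to obtain the reference spack.yaml files
git clone https://github.com/E4S-Project/e4s.git
ls e4s/environments

# Check out the matching Spack baseline branch, e.g. e4s-23.05
git clone -c feature.manyFiles=true -b e4s-23.05 $SPACK_REPO
. spack/share/spack/setup-env.sh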

Step 2: Create Spack Configuration

In this step we create the Spack configuration. First, we create a sub-directory in spack-configs with a naming convention that distinguishes the E4S version: the name of the system (such as cori or perlmutter) followed by the E4S version (such as e4s-23.05).

$ tree -L 1 spack-configs
spack-configs
├── cori-e4s-20.10
├── cori-e4s-21.02
├── cori-e4s-21.05
├── cori-e4s-22.02
├── perlmutter-e4s-21.11
├── perlmutter-e4s-22.05
├── perlmutter-e4s-22.11
├── perlmutter-e4s-23.05
├── perlmutter-spack-develop
└── perlmutter-user-spack

10 directories, 0 files

Inside each stack you will see several sub-directories, each defining a sub-stack. These sub-stacks correspond to Spack environments. The prod directory is used for production deployment, which installs from the buildcache.

$ tree -L 3 spack-configs/perlmutter-e4s-22.11
spack-configs/perlmutter-e4s-22.11
├── cce
│   └── spack.yaml
├── cuda
│   └── spack.yaml
├── definitions.yaml
├── gcc
│   └── spack.yaml
├── nvhpc
│   └── spack.yaml
└── prod
    ├── cce
    │   └── spack.yaml
    ├── cuda
    │   └── spack.yaml
    ├── gcc
    │   └── spack.yaml
    └── nvhpc
        └── spack.yaml

9 directories, 9 files

We create a special file named definitions.yaml for declaring the definitions that are referenced in each spack.yaml. This file is appended to every Spack configuration so that all specs are defined in one place.
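
For illustration, a trimmed-down definitions.yaml might look like this (the spec names are drawn from the production file shown later on this page):

definitions:
- gcc_compilers: ['%gcc@11.2.0']
- gcc_specs:
  - hdf5
  - papi
- tools:
  - cmake
  - git

Each sub-stack's spack.yaml then references these lists in its specs matrix, for example [$gcc_specs] and [$gcc_compilers].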

During this step, we create the Spack configuration and specify our preferred compilers and package preferences. We install software into a buildcache so it can be relocated to the production path. To accomplish this, we use Spack pipelines, which rely on spack ci generate and spack ci rebuild to perform parallel pipeline execution. We also determine which packages to install from E4S and add our own packages to comply with our site preferences.
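
For example, the CI environments pad the install tree so that binaries pushed to the buildcache can later be relocated to the shorter production prefix. A minimal sketch of the relevant section (the full configurations appear later on this page):

config:
  install_tree:
    # pad install paths so buildcache binaries can be relocated
    padded_length: 128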

Step 3: Create GitLab Job for Automation

Once the Spack configuration is written, we create a GitLab job to trigger the pipeline. This is done by specifying a job in .gitlab-ci.yml.

The GitLab job can be triggered through scheduled pipelines, the web interface, or a merge request to the project. A typical GitLab job looks like the example below, which shows the E4S 23.05 generate job. We make use of the GitLab extends keyword, which allows us to reuse configuration; the spack ci generate command is the same for each sub-stack. There are two jobs: the first is the generate step performed by spack ci generate, which then triggers the downstream pipeline created by Spack.

.perlmutter-e4s-23.05-generate:
  stage: generate
  needs: ["perlmutter:check_spack_dependencies"]
  tags: [perlmutter-e4s]
  interruptible: true
  allow_failure: true
  rules:
    - if: ($CI_PIPELINE_SOURCE == "schedule" || $CI_PIPELINE_SOURCE == "web") && ($PIPELINE_NAME == "PERLMUTTER_E4S_23.05")
    - if: ($CI_PIPELINE_SOURCE == "merge_request_event")
      changes:
      - spack-configs/perlmutter-e4s-23.05/$STACK_NAME/spack.yaml
      - spack-configs/perlmutter-e4s-23.05/definitions.yaml
  before_script:
    - *copy_perlmutter_settings
    - *startup_modules
  script:
    - *e4s_23_05_setup 
    - cd $CI_PROJECT_DIR/spack-configs/perlmutter-e4s-23.05/$STACK_NAME
    - cat $CI_PROJECT_DIR/spack-configs/perlmutter-e4s-23.05/definitions.yaml >> spack.yaml
    - spack env activate --without-view  .
    - spack env st
    #- spack -d concretize -f | tee $CI_PROJECT_DIR/concretize.log    
    - spack -d ci generate --check-index-only --artifacts-root "$CI_PROJECT_DIR/jobs_scratch_dir" --output-file "${CI_PROJECT_DIR}/jobs_scratch_dir/pipeline.yml"
  artifacts: 
    paths:
    - ${CI_PROJECT_DIR}/jobs_scratch_dir


perlmutter-e4s-23.05-cce-generate:
  extends: .perlmutter-e4s-23.05-generate
  variables:
    STACK_NAME: cce

perlmutter-e4s-23.05-cce-build:
  stage: build
  needs: ["perlmutter:check_spack_dependencies", "perlmutter-e4s-23.05-cce-generate"]
  allow_failure: true
  rules:
    - if: ($CI_PIPELINE_SOURCE == "schedule" || $CI_PIPELINE_SOURCE == "web") && ($PIPELINE_NAME == "PERLMUTTER_E4S_23.05")
    - if: ($CI_PIPELINE_SOURCE == "merge_request_event")
      changes:
      - spack-configs/perlmutter-e4s-23.05/cce/spack.yaml
      - spack-configs/perlmutter-e4s-23.05/definitions.yaml
  trigger:
    include:
      - artifact: jobs_scratch_dir/pipeline.yml
        job: perlmutter-e4s-23.05-cce-generate
    strategy: depend

Step 4: Create Modulefile

In this step, we create a modulefile that serves as the entry point to the software stack and sets up Spack. We do not create Spack-generated modules for Spack packages; instead, users are expected to use spack load. Shown below are the modulefiles available on the NERSC system; they are typically named e4s/<version> with a symbolic link to the module spack/e4s-<version>.

siddiq90@login37> ml -t av e4s
/global/common/software/nersc/pm-2022.12.0/extra_modulefiles:
e4s/22.05
e4s/22.11
spack/e4s-22.05
spack/e4s-22.11

Shown below is the content of our modulefile; the setup is subject to change.

siddiq90@login37> ml --raw show e4s
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
   /global/common/software/nersc/pm-2022.12.0/extra_modulefiles/e4s/22.11.lua:
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
whatis([[
        The Extreme-scale Scientific Software Stack (E4S) is a collection of open source software packages for running scientific applications on high-performance computing (HPC) platforms.
        ]])
help([[ The Extreme-scale Scientific Software Stack (E4S) is a community effort to provide open source software packages for developing, deploying and running scientific applications on high-performance computing (HPC) platforms. E4S provides from-source builds and containers of a broad collection of HPC software packages.

References:
  - E4S User Docs: https://e4s.readthedocs.io/en/latest/index.html
  - E4S 22.11 Docs: https://docs.nersc.gov/applications/e4s/perlmutter/22.11/
  - E4S Homepage: https://e4s-project.github.io/
  - E4S GitHub: https://github.com/E4S-Project/e4s
        ]])

local root = "/global/common/software/spackecp/perlmutter/e4s-22.11/default/spack"

setenv("SPACK_GNUPGHOME", pathJoin(os.getenv("HOME"), ".gnupg"))
setenv("SPACK_SYSTEM_CONFIG_PATH", "/global/common/software/spackecp/perlmutter/spack_settings")
-- setup spack shell functionality
local shell = myShellType()
if (mode() == "load") then
    local spack_setup = ''
    if (shell == "sh" or shell == "bash" or shell == "zsh") then
         spack_setup = pathJoin(root, "share/spack/setup-env.sh")
    elseif (shell == "csh") then
         spack_setup = pathJoin(root, "share/spack/setup-env.csh")
    elseif (shell == "fish")  then
         spack_setup = pathJoin(root, "share/spack/setup-env.fish")
    end

    -- If we are unable to find spack setup script let's terminate now.
    if not isFile(spack_setup) then
        LmodError("Unable to find spack setup script " .. spack_setup .. "\n")
    end

    execute{cmd="source " .. spack_setup, modeA={"load"}}

    LmodMessage([[
    _______________________________________________________________________________________________________
     The Extreme-Scale Scientific Software Stack (E4S) is accessible via the Spack package manager.

     In order to access the production stack, you will need to load a spack environment. Here are some tips to get started:


     'spack env list' - List all Spack environments
     'spack env activate gcc' - Activate the "gcc" Spack environment
     'spack env status' - Display the active Spack environment
     'spack load amrex' - Load the "amrex" Spack package into your user environment

     For additional support, please refer to the following references:

       NERSC E4S Documentation: https://docs.nersc.gov/applications/e4s/
       E4S Documentation: https://e4s.readthedocs.io
       Spack Documentation: https://spack.readthedocs.io/en/latest/
       Spack Slack: https://spackpm.slack.com

    ______________________________________________________________________________________________________
    ]])
-- To remove spack from shell we need to remove a few environment variables, alias and remove $SPACK_ROOT/bin from $PATH
elseif (mode() == "unload" or mode() == "purge") then
    if (shell == "sh" or shell == "bash" or shell == "zsh") then
      execute{cmd="unset SPACK_ENV",modeA={"unload"}}
      execute{cmd="unset SPACK_ROOT",modeA={"unload"}}
      execute{cmd="unset -f spack",modeA={"unload"}}
    elseif (shell == "csh") then
      execute{cmd="unsetenv SPACK_ENV",modeA={"unload"}}
      execute{cmd="unsetenv SPACK_ROOT",modeA={"unload"}}
      execute{cmd="unalias spack",modeA={"unload"}}
    end

    -- Need to remove $SPACK_ROOT/bin from $PATH which removes the 'spack' command
    remove_path("PATH", pathJoin(root, "bin"))

    -- Remove alias spacktivate. Need to pipe to /dev/null as invalid alias can report error to stderr
    execute{cmd="unalias spacktivate > /dev/null",modeA={"unload"}}
end
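
A typical user session, following the tips printed by the module above, might look like this (the package name is illustrative):

siddiq90@login37> module load e4s/22.11
siddiq90@login37> spack env list
siddiq90@login37> spack env activate gcc
siddiq90@login37> spack env status
siddiq90@login37> spack load amrex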

Step 5: User Documentation

User documentation is fundamental to helping users get started with E4S at NERSC. We document every E4S release with its release date and end-of-support date, along with a documentation page outlining the software stack. Our E4S documentation is available at https://docs.nersc.gov/applications/e4s/. The release date is when the documentation goes live. We do this in conjunction with the release of the modulefile so that users gain access to the software stack.

Upon completion of this task, we are ready to make an announcement to our NERSC users.

Step 6: Share Spack Configuration with the Open-Source Community

In this step, we share our Spack configuration with the open-source community at https://github.com/spack/spack-configs so it may benefit the wider community. In addition, we update the E4S Facility Dashboard, which shows all E4S deployments across the facilities.

Step 7: Public Announcement

This is the final step of the deployment process, where we make a public announcement in the NERSC weekly email and in various Slack workspaces, including the NERSC User Group (NUG), Spack, ECP, and E4S Slack.

Current Challenges

There are several challenges with building the Spack stack at NERSC, which can be summarized as follows:

  • System OS + Cray Programming Environment (CPE) changes: A system upgrade, such as a change to glibc or an upgrade to CPE, can lead to a full software stack rebuild, especially if you have externals set for packages like cray-mpich and cray-libsci, which generally change between versions.

  • Incompatible compilers: Some packages can't be built with certain compilers (nvhpc, aocc), which could be due to several factors.

    • An application lacks support for a compiler; the support may have been added in a newer version that is not in the Spack release used for deployment.

    • Lack of support in the Spack package recipe or Spack core, including Spack's Cray detection. This may require cherry-picking a commit with the fix or waiting for a new release.

    • Spack's Cray detection plays an important role in build errors, including how externals are specified (via modules vs. prefix); both can be provided, and finding the right combination requires experimentation. For example, to configure cray-mpich as an external, one could set something like the following with modules, a prefix, or both:

      cray-mpich:
        buildable: false
        externals:
        - spec: cray-mpich@8.1.11 %gcc@9.3.0
          prefix: /opt/cray/pe/mpich/8.1.11/ofi/gnu/9.1
          modules:
          - cray-mpich/8.1.11
          - cudatoolkit/21.9_11.4
    
    • The Spack concretizer may prevent one from choosing a build configuration for a spec. This requires a few troubleshooting steps, which usually boil down to:

      • Read the Spack package file (spack edit <package>) for conflicts, and try spack spec to see the concretized spec.

      • Try a different version, compiler, or dependency. Some packages have conflicting variants; for instance, one can't enable both +openmp and +pthread since they are mutually exclusive. See the sketch after this list.
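
A minimal troubleshooting session might look like this (the package, variants, and compiler versions are illustrative):

# Inspect the package recipe for conflicts and variants
spack edit superlu-dist

# View the concretized spec, then retry with a different compiler
spack spec superlu-dist +openmp %gcc@11.2.0
spack spec superlu-dist +openmp %cce@15.0.0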

There is a document, Spack E4S Issues on Perlmutter, outlining current issues with Spack. If you need access to the document, please contact Shahzeb Siddiqui.

Contact

If you need elevated privileges or assistance with this project, please contact one of the maintainers:

Spack Configuration

This page shows all of our Spack configuration files (spack.yaml) used for our production deployments. The Spack configurations are located in the spack-configs directory, organized into one subdirectory per deployment.

At NERSC, we build the Extreme-scale Scientific Software Stack (E4S), a collection of open-source software packages, part of the Spack ecosystem, for running scientific applications on high-performance computing (HPC) platforms. Upon each release, we acquire the Spack configuration from https://github.com/E4S-Project/e4s, with its list of specs and reference Spack branch, in order to build the E4S stack. Please see our E4S documentation at https://docs.nersc.gov/applications/e4s/

Perlmutter Spack Develop

This Spack configuration builds all packages using the Spack develop branch on a weekly basis. All specs are specified without versions so that Spack builds the latest version of each package, which evolves over time.

You may add the mirror to your Spack environment by running:

spack mirror add perlmutter-spack-buildcache /global/common/software/spackecp/mirrors/perlmutter-spack-develop

Or you can explicitly add the following lines to your spack.yaml:

mirrors:
  perlmutter-spack-buildcache: file:///global/common/software/spackecp/mirrors/perlmutter-spack-develop
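
Once the mirror is added, you can inspect the buildcache and install from it directly; a short sketch (the spec name is illustrative):

spack mirror list                  # confirm the mirror is registered
spack buildcache list              # list specs available in the buildcache
spack install --cache-only hdf5    # install from the buildcache without building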

Spack Configuration for spack@develop

spack:
  view: false
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree:
      padded_length: 128    
  mirrors:
    perlmutter-spack-buildcache: file:///global/common/software/spackecp/mirrors/perlmutter-spack-develop 
    source_mirror: file:///global/common/software/spackecp/mirrors/source_mirror
  cdash:
    build-group: DOE nightly E4S builds
    url: https://cdash.spack.io
    project: Spack
    site: NERSC - Perlmutter spack@develop 
     
  definitions:
  - gcc_compilers: ['%gcc@11.2.0']
  - nvhpc_compilers: ['%nvhpc@21.11']
  - cray_compilers: ['%cce@13.0.2']
  - cray_specs:
    - adios2   
    - fftw
    - hdf5    
    - papi
    - petsc +openmp +strumpack    
    - superlu
    - superlu-dist +openmp

  - gcc_specs:
    #- adios2
    - hdf5
    #- hypre +openmp +superlu-dist
    - papi
    #- petsc +openmp +strumpack
    - raja
    #- strumpack ~slate
    #- sundials +openmp +hypre
    #- superlu
    #- superlu-dist +openmp
    
  - cuda_specs:
    - amrex +cuda cuda_arch=80
    - blaspp +cuda cuda_arch=80
    - hipace compute=cuda
    - hpctoolkit +cuda +cray +mpi
    - hypre +cuda cuda_arch=80
    - kokkos-kernels +openmp +cuda cuda_arch=80 ^kokkos +openmp +wrapper +cuda cuda_arch=80
    - kokkos +openmp +wrapper +cuda cuda_arch=80
    - lapackpp ^blaspp +cuda cuda_arch=80
    - magma@2.6.1+cuda cuda_arch=80
    - mfem@4.3.0+cuda cuda_arch=80
    - petsc +cuda cuda_arch=80
    - py-warpx ^warpx dims=1 compute=cuda
    - py-warpx ^warpx dims=2 compute=cuda
    - py-warpx ^warpx dims=3 compute=cuda
    - py-warpx ^warpx dims=rz compute=cuda
    - qmcpack +cuda cuda_arch=80
    - raja +cuda cuda_arch=80
    - slepc +cuda cuda_arch=80
    - trilinos +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack
      +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox
      +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos
      +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
    - strumpack ~slate +cuda cuda_arch=80
    - slate +cuda cuda_arch=80
    - superlu-dist +openmp +cuda cuda_arch=80
    - sundials +openmp +cuda cuda_arch=80
    - upcxx +gasnet +mpi
    - umpire ~shared +cuda cuda_arch=80
    - upcxx +cuda
    - warpx dims=1 compute=cuda
    - warpx dims=2 compute=cuda
    - warpx dims=3 compute=cuda
    - warpx dims=rz compute=cuda
    - zfp +cuda cuda_arch=80
  - nvhpc_specs:
    #- adios2 failed due to libffi see https://github.com/libffi/libffi/issues/691
    - amrex +cuda cuda_arch=80
    - blaspp +cuda cuda_arch=80
    - hypre +cuda cuda_arch=80
    - kokkos +openmp +wrapper +cuda cuda_arch=80
    - kokkos-kernels +openmp +cuda cuda_arch=80 ^kokkos +openmp +wrapper +cuda cuda_arch=80
    - lapackpp ^blaspp +cuda cuda_arch=80
    - openpmd-api
    - petsc +cuda cuda_arch=80
    - py-numba
    - raja +cuda cuda_arch=80
    - umpire ~shared +cuda cuda_arch=80
    - upcxx +cuda
    - zfp +cuda cuda_arch=80

  - nersc_specs:
    #- amber+openmp requires tarball and license
    # skipping arm-forge for now this requires a license and gets stuck in CI job.
    #- arm-forge
    - abinit +wannier90
    - amdblis
    - amdfftw
    # requested by user INC0176750. See https://github.com/NVIDIA/AMGX/issues/165 
    - amgx +cuda cuda_arch=80 
    - amdscalapack
    - atompaw
    - berkeleygw
    - boost cxxstd=11
    - boost cxxstd=14
    - boost cxxstd=98
    - cmake
    - dpcpp +openmp
    - eigen
    - elpa
    - fpm
    - lammps
    - llvm-openmp
    - metis
    - mt-metis
    - mumps
    - nccmp
    - nco
    - octave
    - parmetis
    - parallel
    - plumed
    - qmcpack
    - quantum-espresso
    - scotch
    - sparskit
    - superlu-mt
    - wannier90
    - valgrind
    # cuda_arch=80 not supported in spack package yet. See https://github.com/spack/spack/issues/28554
    - cp2k +cuda cuda_arch=70 +elpa +cosma

  specs:
  #- matrix:
  #  - [$cray_specs]
  #  - [$cray_compilers]
  - matrix:
    - [$gcc_specs]
    - [$gcc_compilers]
  #- matrix:
  #  - [$cuda_specs]
  #  - [$gcc_compilers]
  #- matrix:
  #  - [$nvhpc_specs]
  #  - [$nvhpc_compilers]
  #- matrix:      
  #  - [$nersc_specs]
  #  - [$gcc_compilers]

  gitlab-ci:
    enable-artifacts-buildcache: false
    rebuild-index: true
    before_script:
    - module reset
    - module use /global/common/software/nersc/$(cat /etc/nersc_modules_rev)/extra_modulefiles
    - module load cpu
    - module list
    - source setup-env.sh
    - git clone ${SPACK_REPO}
    - pushd spack && git checkout ${SPACK_CHECKOUT_VERSION} && popd    
    - . spack/share/spack/setup-env.sh
    - spack --version
    - spack-python --path
    script:
    - cd ${SPACK_CONCRETE_ENV_DIR} 
    - spack env activate --without-view -d .
    - spack env st
    - spack -d ci rebuild
    after_script:
    - rm -rf $SPACK_ROOT
    service-job-attributes:
      tags: [perlmutter-e4s]
      script:
      - echo "End Pipeline"
    mappings:
    - match: [os=sles15]
      runner-attributes:
        tags: [perlmutter-e4s]

Perlmutter E4S 23.05

Production Spack Configuration

GCC Spack Environment

spack:
  view: false
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree: $spack/opt/spack
  concretizer:
    reuse: false
    unify: false
  mirrors:       
    perlmutter-e4s-23.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-23.05
  specs:
  - matrix:
    - [$gcc_specs]
    - [$gcc_compilers]
  - matrix:
    - [$nersc_specs]
    - [$gcc_compilers]
    

CCE Spack Environment

spack:
  view: false
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree: $spack/opt/spack
  concretizer:
    reuse: false
    unify: false
  mirrors:       
    perlmutter-e4s-23.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-23.05
  specs:
  - matrix:
    - [$cce_specs]
    - [$cce_compilers]

NVHPC Spack Environment

spack:
  view: false
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree: $spack/opt/spack
  concretizer:
    reuse: false
    unify: false
  mirrors:       
    perlmutter-e4s-23.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-23.05
  packages:
    cmake::
      require: '%gcc'
    # build failures with xz %nvhpc so reverting to gcc
    xz::
      require: '%gcc' 
  specs:
  - matrix:
    - [$nvhpc_specs]
    - [$nvhpc_compilers]
    

CUDA Spack Environment

spack:
  view: false
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree: $spack/opt/spack
  concretizer:
    reuse: false
    unify: false
  mirrors:       
    perlmutter-e4s-23.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-23.05
  specs:
  - matrix:
    - [$cuda_specs]
    - [$gcc_compilers]
    

DATA Spack Environment

spack:
  view: false
  include:
  - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
  - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree: $spack/opt/spack
  concretizer:
    reuse: false
    unify: false
  mirrors:
    perlmutter-e4s-23.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-23.05
  specs:
    - matrix:
      - [$data_specs]
      - [$gcc_compilers]
      

MATH-LIBS Spack Environment

spack:
  view: false
  include:
  - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
  - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree: $spack/opt/spack
  concretizer:
    reuse: false
    unify: false
  mirrors:
    perlmutter-e4s-23.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-23.05
  specs:
  - matrix:
    - [$math-libs]
    - [$gcc_compilers]
    

TOOLS Spack Environment

spack:
  view: false
  include:
  - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
  - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree: $spack/opt/spack
  concretizer:
    reuse: false
    unify: false
  mirrors:
    perlmutter-e4s-23.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-23.05
  specs:
  - matrix:
    - [$tools]
    - [$gcc_compilers]

Shown below is the list of definitions used for all of our spack environments.

Definitions for Spack Environments

  
  definitions:
  - cce_compilers: ['%cce@=15.0.0']
  - gcc_compilers: ['%gcc@=11.2.0']
  - nvhpc_compilers: ['%nvhpc@=22.7']

  - cce_specs:
    - fftw   
    - hdf5 +fortran +hl +shared
    - hypre    
    - kokkos +openmp
    - kokkos-kernels +openmp
    - netcdf-c
    - netcdf-fortran
    - papi
    - petsc
    - slepc
    - sundials
    - superlu
    - superlu-dist 

  - math-libs:    
    - arborx
    - fftw@=3.3.8
    - fftw@=3.3.9
    - fftw@=3.3.10
    - ginkgo
    - heffte
    - hypre
    - intel-mkl
    - openblas
    - petsc
    - phist
    - pumi
    - slate
    - slepc
    - sundials
    - superlu
    - superlu-dist

  - data_specs:
    - datatransferkit
    - hdf5
    - hdf5-vol-async
    - hdf5-vol-cache
    - netcdf-c
    - netcdf-fortran
    - parallel-netcdf

    #- hdf5-vol-log Build error

  - nvhpc_specs: 
    - amrex
    - boost
    - hdf5 ~mpi    
    - sundials
    - superlu

    #- superlu-dist
    #- hypre
    #- kokkos
    #- kokkos-kernels 
    #- papi   Build failed due to nvc-Error-Unknown switch: -Wno-error


  - cuda_specs:
    - adios2@=2.8.3 +cuda cuda_arch=80 # ecp-data-vis-sdk 
    - amrex +cuda cuda_arch=80
    - arborx +cuda cuda_arch=80 ^kokkos +wrapper
    - cabana +cuda ^kokkos +wrapper +cuda_lambda +cuda cuda_arch=80
    - caliper +cuda cuda_arch=80
    - chai ~benchmarks ~tests +cuda cuda_arch=80 ^umpire ~shared
    - cusz +cuda cuda_arch=80
    - flecsi +cuda cuda_arch=80
    - ginkgo +cuda cuda_arch=80
    - heffte +cuda cuda_arch=80
    - hpx max_cpu_count=512 +cuda cuda_arch=80
    - hypre +cuda cuda_arch=80
    - kokkos +wrapper +cuda cuda_arch=80
    - kokkos-kernels +cuda cuda_arch=80 ^kokkos +wrapper +cuda cuda_arch=80
    - lammps +cuda cuda_arch=80
    - legion +cuda cuda_arch=80
    - magma +cuda cuda_arch=80
    - mfem +cuda cuda_arch=80
    - mgard +serial +openmp +timing +unstructured +cuda cuda_arch=80
    - omega-h +cuda cuda_arch=80
    - parsec +cuda cuda_arch=80
    - petsc +cuda cuda_arch=80    
    - slate +cuda cuda_arch=80
    - slepc +cuda cuda_arch=80
    - strumpack ~slate +cuda cuda_arch=80
    - sundials +cuda cuda_arch=80
    - superlu-dist +cuda cuda_arch=80
    - tasmanian +cuda cuda_arch=80
    - umpire ~shared +cuda cuda_arch=80
    - zfp +cuda cuda_arch=80

    # - libpressio +bitgrooming +bzip2 +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf +cusz +mgard +cuda cuda_arch=80 ^cusz +cuda cuda_arch=80    # concretization issue with cuda

    # CUDA NOARCH

    - hpctoolkit +cuda
    - papi +cuda
    - tau +mpi +cuda

    #- bricks +cuda
    # - ecp-data-vis-sdk ~rocm +adios2 ~ascent +hdf5 ~paraview ~pnetcdf ~sz +vtkm +zfp +cuda cuda_arch=80
    #- raja +cuda cuda_arch=80
  - gcc_specs:
    - alquimia
    - aml
    - amrex
    - arborx
    - argobots
    - bolt
    - boost
    - butterflypack
    - cabana
    - caliper
    - chai ~benchmarks ~tests
    - datatransferkit
    - dyninst ^intel-tbb
    - flecsi
    - flit
    - fortrilinos
    - ginkgo
    - gotcha
    - h5bench
    - hdf5
    - hdf5-vol-async
    - hdf5-vol-cache
    - heffte +fftw
    - hpctoolkit
    - hpx max_cpu_count=512 networking=mpi
    - hypre
    - kokkos +openmp
    - kokkos-kernels +openmp
    - lammps
    - legion
    - libnrm
    - libpressio +bitgrooming +bzip2 ~cuda ~cusz +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf
    - libquo
    - libunwind
    - likwid
    - loki
    - mercury
    - metall
    - mfem
    - mgard +serial +openmp +timing +unstructured ~cuda
    - mpark-variant
    - nccmp
    - nco
    - netlib-scalapack
    - omega-h
    - openpmd-api
    - papi
    - papyrus
    - parsec ~cuda
    - pdt
    - petsc
    - phist
    - plasma
    - plumed
    - precice
    - pumi
    - py-h5py
    - py-jupyterhub
    - py-libensemble
    - py-petsc4py
    - qthreads scheduler=distrib
    - raja  
    - slate ~cuda
    - slepc
    - stc
    - strumpack ~slate
    - sundials
    - superlu
    - superlu-dist
    - swig@4.0.2-fortran
    - sz3
    - tasmanian
    - tau +mpi +python
    - trilinos +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
    - turbine
    - umap
    - umpire
    - unifyfs
    - variorum

    # build failures 
    # - axom ~mpi
    #- globalarrays
    #- wannier90
    # - rempi
    # - openfoam
    # - mpifileutils ~xattr
    # - lbann
    # - hdf5-vol-log
    # - flux-core
        # - ecp-data-vis-sdk ~cuda ~rocm +adios2 +ascent +cinema +darshan +faodel +hdf5 ~paraview +pnetcdf +sz +unifyfs +veloc ~visit +vtkm +zfp # visit: font-util: [Makefile:756: install-data-hook] Error 1 share/fonts/X11/cyrillic: failed to write cache paraview: ispc: lex.cpp:398:9: error: 'yywrap' macro redefined [-Werror,-Wmacro-redefined]
    # - exaworks
    # - conduit ~mpi     
    
  - nersc_specs:
    - chapel
    - gsl    
    - netcdf-c ~mpi 
    - netcdf-fortran
    - metis
    - parmetis
    - gromacs
    - cdo
    # trouble building esmf +mpi
    #- ncl ^esmf~mpi
    - ncl 
    - nco
    - ncview
    - libxc
    - libcint
    - libint tune=molgw-lmax-7
    - intel-mkl

  - tools:
    - autoconf
    - automake
    - ccache
    - cmake
    - git
    - gmake
    - gawk
    - nano
    - subversion

Each spack environment is built in a separate directory using spack ci in order to push specs to the buildcache. We use the following spack configuration for each spack environment.

Spack Environments for Spack CI

GCC Spack Environment

spack:
  view: false
  include:
  - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
  - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree:
      padded_length: 128
  concretizer:
    reuse: false
    unify: false
  mirrors:
    perlmutter-e4s-23.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-23.05
  ci:
    enable-artifacts-buildcache: false
    rebuild-index: true
    target: gitlab
    pipeline-gen:
    - submapping:
      - match: [os=sles15]
        build-job:
          tags: [perlmutter-e4s]
      match_behavior: first
    - any-job:
        before_script:
        - git clone -c feature.manyFiles=true -b e4s-23.05 $SPACK_REPO        
        - . spack/share/spack/setup-env.sh
        - which spack
        - spack --version
    - build-job:
        tags: [perlmutter-e4s]
        before_script:
        - module reset
        - module load cpu cray-pmi        
        - module list
        - source setup-env.sh
        - git clone -c feature.manyFiles=true -b e4s-23.05 $SPACK_REPO
        - . spack/share/spack/setup-env.sh
        - which spack
        - spack --version
        - spack-python --path
        script:
        - cd ${SPACK_CONCRETE_ENV_DIR}
        - spack env activate --without-view .
        - spack env st
        - export SPACK_GNUPGHOME=$HOME/.gnupg
        - spack gpg list
        - spack -d ci rebuild
    - reindex-job:
        tags: [perlmutter-e4s]
        script:
        - echo "End Pipeline"
    - noop-job:
        tags: [perlmutter-e4s]
        script:
        - echo "End Pipeline"
  specs:
  - matrix:
    - [$gcc_specs]
    - [$gcc_compilers]
  - matrix:
    - [$nersc_specs]
    - [$gcc_compilers]

CCE Spack Environment

spack:
  view: false
  include:
  - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
  - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree:
      padded_length: 128
  concretizer:
    reuse: false
    unify: false
  mirrors:
    perlmutter-e4s-23.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-23.05
  ci:
    enable-artifacts-buildcache: false
    rebuild-index: true
    target: gitlab
    pipeline-gen:
    - submapping:
      - match: [os=sles15]
        build-job:
          tags: [perlmutter-e4s]
      match_behavior: first
    - any-job:
        before_script:
        - git clone -c feature.manyFiles=true -b e4s-23.05 $SPACK_REPO        
        - . spack/share/spack/setup-env.sh
        - which spack
        - spack --version
    - build-job:
        tags: [perlmutter-e4s]
        before_script:
        - module reset
        - module load cpu cray-pmi
        - module list
        - source setup-env.sh        
        - git clone -c feature.manyFiles=true -b e4s-23.05 $SPACK_REPO        
        - . spack/share/spack/setup-env.sh
        - which spack
        - spack --version
        - spack-python --path
        script:      
        - cd ${SPACK_CONCRETE_ENV_DIR}
        - spack env activate --without-view .
        - spack env st
        - export SPACK_GNUPGHOME=$HOME/.gnupg
        - spack gpg list
        - spack -d ci rebuild
    - reindex-job:
        tags: [perlmutter-e4s]
        script:
        - echo "End Pipeline"
    - noop-job:
        tags: [perlmutter-e4s]
        script:
        - echo "End Pipeline"
  specs:
    - matrix:
      - [$cce_specs]
      - [$cce_compilers]

NVHPC Spack Environment

spack:
  view: false
  include:
  - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
  - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree:
      padded_length: 128
  concretizer:
    reuse: false
    unify: false
  packages:
    cmake::
      require: '%gcc'
    # build failures with xz %nvhpc so reverting to gcc
    xz::
      require: '%gcc'      
  mirrors:
    perlmutter-e4s-23.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-23.05
  
  ci:
    enable-artifacts-buildcache: false
    rebuild-index: true
    target: gitlab
    pipeline-gen:
    - submapping:
      - match: [os=sles15]
        build-job:
          tags: [perlmutter-e4s]
      match_behavior: first
    - any-job:
        before_script:
        - git clone -c feature.manyFiles=true -b e4s-23.05 $SPACK_REPO        
        - . spack/share/spack/setup-env.sh
        - which spack
        - spack --version      
    - build-job:
        tags: [perlmutter-e4s]
        before_script:
        - module reset
        - module load cpu cray-pmi        
        - module list
        - source setup-env.sh
        - git clone -c feature.manyFiles=true -b e4s-23.05 $SPACK_REPO
        - . spack/share/spack/setup-env.sh
        - which spack
        - spack --version
        - spack-python --path
        script:
        - cd ${SPACK_CONCRETE_ENV_DIR}
        - spack env activate --without-view .
        - spack env st
        - export SPACK_GNUPGHOME=$HOME/.gnupg
        - spack gpg list
        - spack -d ci rebuild
    - reindex-job:
        tags: [perlmutter-e4s]
        script:
        - echo "End Pipeline"
    - noop-job:
        tags: [perlmutter-e4s]
        script:
        - echo "End Pipeline"
  specs:
    - matrix:
      - [$nvhpc_specs]
      - [$nvhpc_compilers]

CUDA Spack Environment

spack:
  view: false
  include:
  - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
  - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree:
      padded_length: 128
  concretizer:
    reuse: false
    unify: false
  mirrors:
    perlmutter-e4s-23.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-23.05
  ci:
    enable-artifacts-buildcache: false
    rebuild-index: true
    target: gitlab
    pipeline-gen:
    - submapping:
      - match: [os=sles15]
        build-job:
          tags: [perlmutter-e4s]
      match_behavior: first
    - any-job:
       before_script:
       - git clone -c feature.manyFiles=true -b e4s-23.05 $SPACK_REPO        
       - . spack/share/spack/setup-env.sh
       - which spack
       - spack --version
    - build-job:
        tags: [perlmutter-e4s]
        before_script:
        - module reset        
        - module load cpu cray-pmi
        - module list
        - source setup-env.sh
        - git clone -c feature.manyFiles=true -b e4s-23.05 $SPACK_REPO
        - . spack/share/spack/setup-env.sh
        - which spack
        - spack --version
        - spack-python --path
        script:
        - cd ${SPACK_CONCRETE_ENV_DIR}
        - spack env activate --without-view .
        - spack env st
        - export SPACK_GNUPGHOME=$HOME/.gnupg
        - spack gpg list
    - reindex-job:
        tags: [perlmutter-e4s]
        script:
        - echo "End Pipeline"
    - noop-job:
        tags: [perlmutter-e4s]
        script:
        - echo "End Pipeline"
  specs:
    - matrix:
      - [$cuda_specs]
      - [$gcc_compilers]

DATA Spack Environment

spack:
  view: false
  include:
  - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
  - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree:
      padded_length: 128
  concretizer:
    reuse: false
    unify: false
  mirrors:
    perlmutter-e4s-23.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-23.05
  ci:
    enable-artifacts-buildcache: false
    rebuild-index: true
    target: gitlab
    pipeline-gen:
    - submapping:
      - match: [os=sles15]
        build-job:
          tags: [perlmutter-e4s]
      match_behavior: first
    - any-job:
        before_script:
        - git clone -c feature.manyFiles=true -b e4s-23.05 $SPACK_REPO        
        - . spack/share/spack/setup-env.sh
        - which spack
        - spack --version
    - build-job:
        tags: [perlmutter-e4s]
        before_script:
        - module reset
        - module load cpu cray-pmi        
        - module list
        - source setup-env.sh        
        - git clone -c feature.manyFiles=true -b e4s-23.05 $SPACK_REPO        
        - . spack/share/spack/setup-env.sh
        - which spack
        - spack --version
        - spack-python --path
        script:
        - cd ${SPACK_CONCRETE_ENV_DIR}
        - spack env activate --without-view .
        - spack env st
        - export SPACK_GNUPGHOME=$HOME/.gnupg
        - spack gpg list
        - spack -d ci rebuild
    - reindex-job:
        tags: [perlmutter-e4s]
        script:
        - echo "End Pipeline"
    - noop-job:
        tags: [perlmutter-e4s]
        script:
        - echo "End Pipeline"
  specs:
    - matrix:
      - [$data_specs]
      - [$gcc_compilers]

MATH-LIBS Spack Environment

spack:
  view: false
  include:
  - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
  - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree:
      padded_length: 128
  concretizer:
    reuse: false
    unify: false
  mirrors:
    perlmutter-e4s-23.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-23.05
  ci:
    enable-artifacts-buildcache: false
    rebuild-index: true
    target: gitlab
    pipeline-gen:
    - submapping:
      - match: [os=sles15]
        build-job:
          tags: [perlmutter-e4s]
      match_behavior: first
    - any-job:
        before_script:
        - git clone -c feature.manyFiles=true -b e4s-23.05 $SPACK_REPO        
        - . spack/share/spack/setup-env.sh
        - which spack
        - spack --version      
    - build-job:
        tags: [perlmutter-e4s]
        before_script:
        - module reset        
        - module load cpu cray-pmi        
        - module list
        - source setup-env.sh
        - git clone -c feature.manyFiles=true -b e4s-23.05 $SPACK_REPO
        - . spack/share/spack/setup-env.sh
        - which spack
        - spack --version
        - spack-python --path
        script:
        - cd ${SPACK_CONCRETE_ENV_DIR}
        - spack env activate --without-view .
        - spack env st
        - export SPACK_GNUPGHOME=$HOME/.gnupg
        - spack gpg list
        - spack -d ci rebuild
    - reindex-job:
        tags: [perlmutter-e4s]
        script:
        - echo "End Pipeline"
    - noop-job:
        tags: [perlmutter-e4s]
        script:
        - echo "End Pipeline"
  specs:
  - matrix:
    - [$math-libs]
    - [$gcc_compilers]

TOOLS Spack Environment

spack:
  view: false
  include:
  - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
  - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree:
      padded_length: 128
  concretizer:
    reuse: false
    unify: false
  mirrors:
    perlmutter-e4s-23.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-23.05
  ci:
    enable-artifacts-buildcache: false
    rebuild-index: true
    target: gitlab
    pipeline-gen:
    - submapping:
      - match: [os=sles15]
        build-job:
          tags: [perlmutter-e4s]
      match_behavior: first
    - any-job:
        before_script:
        - git clone -c feature.manyFiles=true -b e4s-23.05 $SPACK_REPO        
        - . spack/share/spack/setup-env.sh
        - which spack
        - spack --version
    - build-job:
        tags: [perlmutter-e4s]
        before_script:
        - module reset
        - module load cpu cray-pmi        
        - module list
        - source setup-env.sh
        - git clone -c feature.manyFiles=true -b e4s-23.05 $SPACK_REPO
        - . spack/share/spack/setup-env.sh
        - which spack
        - spack --version
        - spack-python --path
        script:
        - cd ${SPACK_CONCRETE_ENV_DIR}
        - spack env activate --without-view .
        - spack env st
        - export SPACK_GNUPGHOME=$HOME/.gnupg
        - spack gpg list
        - spack -d ci rebuild
    - reindex-job:
        tags: [perlmutter-e4s]
        script:
        - echo "End Pipeline"
    - noop-job:
        tags: [perlmutter-e4s]
        script:
        - echo "End Pipeline"
  specs:
  - matrix:
    - [$tools]
    - [$gcc_compilers]

Perlmutter E4S 22.11

Production Spack Configuration

GCC Spack Environment

spack:
  view: false
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree: $spack/opt/spack
  concretizer:
    reuse: false
  mirrors:       
    perlmutter-e4s-22.11: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-22.11
  specs:
  - matrix:
    - [$gcc_specs]
    - [$gcc_compilers]
  - matrix:
    - [$utilities]
    - [$gcc_compilers]
  - matrix:
    - [$nersc_specs]
    - [$gcc_compilers]

CCE Spack Environment

spack:
  view: false
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree: $spack/opt/spack
  concretizer:
    reuse: false
  mirrors:       
    perlmutter-e4s-22.11: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-22.11
  specs:
  - matrix:
    - [$cce_specs]
    - [$cce_compilers]

NVHPC Spack Environment

spack:
  view: false
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree: $spack/opt/spack
  concretizer:
    reuse: false
  mirrors:       
    perlmutter-e4s-22.11: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-22.11
  packages:
    # build failures with xz %nvhpc so reverting to gcc
    xz::
      require: '%gcc' 
  specs:
  - matrix:
    - [$nvhpc_specs]
    - [$nvhpc_compilers]

CUDA Spack Environment

spack:
  view: false
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree: $spack/opt/spack
  concretizer:
    reuse: false
  mirrors:       
    perlmutter-e4s-22.11: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-22.11
  specs:
  - matrix:
    - [$cuda_specs]
    - [$gcc_compilers]

Shown below is the list of definitions used for all of our spack environments.

Definitions for Spack Environments

  definitions:
  - cce_compilers: ['%cce@15.0.0']
  - gcc_compilers: ['%gcc@11.2.0']
  - nvhpc_compilers: ['%nvhpc@22.7']  

  - gcc_specs:
    - adios2@2.8.3
    - alquimia@1.0.10
    - aml@0.2.0
    - amrex@22.11
    - arborx@1.3
    - argobots@1.1    
    - axom@0.7.0 
    - bolt@2.0
    - butterflypack@2.2.2
    - cabana@0.5.0
    - caliper@2.8.0
    - chai@2022.03.0 ~benchmarks ~tests
    - conduit@0.8.4 ~blt_find_mpi
    - datatransferkit@3.1-rc3
    - dyninst@12.2.0 ^intel-tbb
    #- ecp-data-vis-sdk@1.0 ~cuda ~rocm +adios2 +ascent +cinema +darshan +faodel +hdf5 +paraview +pnetcdf +sz +unifyfs +veloc +visit +vtkm +zfp
    - flecsi@1.4.2
    - flit@2.1.0
    - flux-core@0.44.0
    - fortrilinos@2.1.0
    - gasnet@2022.9.0
    - ginkgo@1.4.0
    - globalarrays@5.8
    - gotcha@1.0.4
    - gptune@3.0.0
    - h5bench@1.3
    - hdf5@1.12.2 +fortran +hl +shared
    - hdf5-vol-async@1.3
    - heffte@2.3.0 +fftw
    - hpctoolkit@2022.10.01
    - hpx@1.8.1 networking=mpi
    - hypre@2.26.0
    - kokkos@3.7.00 +openmp
    - kokkos-kernels@3.7.00 +openmp
    - lammps@20220623
    - legion@21.03.0
    - libquo@1.3.1
    - libunwind@1.6.2
    - mercury@2.1.0
    - mfem@4.5.0
    - mpark-variant@1.4.0
    #- mpifileutils@0.11.1 ~xattr
    - nccmp@1.9.0.1
    - nco@5.0.1
    - netlib-scalapack@2.2.0
    - openblas@0.3.20 threads=openmp
    - omega-h@9.34.13
    - openpmd-api@0.14.5 ~adios2
    - papi@6.0.0.1
    - papyrus@1.0.2
    - parallel-netcdf@1.12.3
    - parsec@3.0.2209 ~cuda
    - pdt@3.25.1
    - petsc@3.18.1    
    - precice@2.5.0
    - pumi@2.2.7
    - py-libensemble@0.9.3
    - py-h5py +mpi
    - py-h5py ~mpi
    - py-petsc4py@3.18.1
    - py-warpx@22.10 ^warpx dims=2
    - py-warpx@22.10 ^warpx dims=3
    - py-warpx@22.10 ^warpx dims=rz
    - qt@5.14.2
    - qthreads@1.16 scheduler=distrib
    - quantum-espresso@7.1
    - raja@2022.03.0
    - slate@2022.07.0 ~cuda
    - slepc@3.18.1
    - strumpack@7.0.1 ~slate
    - sundials@6.4.1
    - superlu@5.3.0
    - superlu-dist@8.1.1
    - tasmanian@7.9
    - tau@2.32 +mpi +python
    - trilinos@13.0.1 +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext
      +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu
      +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos
      +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long    
    - umap@2.1.0
    - umpire@2022.03.1
    - upcxx@2022.9.0
    - vtk-m@1.9.0
    - zfp@0.5.5

    # - bricks@r0.1 build failure
    # - plasma@22.9.29 Could NOT find MKL (missing: MKL_INCLUDE_DIRS MKL_LIBRARIES) 
    # - phist@1.11.2 build error
    # - stc@0.9.0 build failure on swig
    # - turbine@1.3.0  failed to build swig
    # - upcxx+gasnet^gasnet@2022.9.2 # see https://software.nersc.gov/NERSC/spack-infrastructure/-/issues/45  Issue with getting checksum for gasnet   
     # - visit@3.2.2 ~hdf5
    # - wannier90@3.1.0 Error: Type mismatch between actual argument at (1) and actual argument at (2) (COMPLEX(8)/INTEGER(4)).../comms.F90:1214:22: 1214 |     call MPI_scatterv(rootglobalarray, counts, displs, MPI_double_precision, &
  - cuda_specs:
    - adios2@2.8.3 +cuda cuda_arch=80   # ecp-data-vis-sdk
    - amrex@22.11 +cuda cuda_arch=80
    - arborx@1.3 +cuda cuda_arch=80 ^kokkos@3.7.00 +wrapper
    - cabana@0.5.0 +cuda ^kokkos@3.7.00 +wrapper +cuda_lambda +cuda cuda_arch=80
    - caliper@2.8.0 +cuda cuda_arch=80
    - chai@2022.03.0 ~benchmarks ~tests +cuda cuda_arch=80 ^umpire@2022.03.1 ~shared
    - cusz@0.3 +cuda cuda_arch=80    
    - flecsi@2.1.0 +cuda cuda_arch=80
    - ginkgo@1.4.0 +cuda cuda_arch=80
    - heffte@2.3.0 +cuda cuda_arch=80
    - hpx@1.8.1 +cuda cuda_arch=80
    - hypre@2.26.0 +cuda cuda_arch=80
    - kokkos-kernels@3.7.00 +cuda cuda_arch=80 ^kokkos@3.7.00 +wrapper +cuda cuda_arch=80
    - kokkos@3.7.00 +wrapper +cuda cuda_arch=80    
    - mfem@4.5.0 +cuda cuda_arch=80
    - omega-h@9.34.13 +cuda cuda_arch=80
    - petsc@3.18.1 +cuda cuda_arch=80    
    - slate@2022.07.0 +cuda cuda_arch=80
    - slepc@3.18.1 +cuda cuda_arch=80
    - strumpack@7.0.1 ~slate +cuda cuda_arch=80
    - sundials@6.4.1 +cuda cuda_arch=80
    - superlu-dist@8.1.1 +cuda cuda_arch=80
    - tasmanian@7.9 +cuda cuda_arch=80    
    - umpire@2022.03.1 ~shared +cuda cuda_arch=80
    - vtk-m@1.9.0 +cuda cuda_arch=80    # ecp-data-vis-sdk
    - zfp@0.5.5 +cuda cuda_arch=80      # ecp-data-vis-sdk
        
    #- ascent@0.8.0 +cuda cuda_arch=80   # unable to build vtk-h -- Could NOT find MPI_C (missing: MPI_C_LIB_NAMES) (found version "3.1")    
    # - dealii@9.4.0 +cuda cuda_arch=80 # CUDA_LIBRARIES: *** Required variable "CUDA_cusparse_LIBRARY" set to NOTFOUND ***    
    #- magma@2.6.2 +cuda cuda_arch=80 # CMake Error: The following variables are used in this project, but they are set to NOTFOUND.    
    #- raja@2022.03.0 +cuda cuda_arch=80 # error: "RAJA::expt::Register<int32_t, RAJA::expt::avx2_register> &(const int32_t &)" contains a vector, which is not supported in device code
    #- trilinos@13.4.0 +cuda cuda_arch=80
    
  - nvhpc_specs:
    - hdf5@1.12.2 +fortran +hl +shared
    - kokkos@3.7.00 +openmp
    - kokkos-kernels@3.7.00 +openmp
    - sundials@6.4.1
    - superlu@5.3.0
    #- superlu-dist@8.1.1 # error: this OpenMP construct is not supported in NVIDIA subset: The 'taskloop' construct is not supported.

  - cce_specs:
    - fftw@3.3.10    
    - hdf5@1.12.2 +fortran +hl +shared
    - hypre@2.26.0    
    - kokkos@3.7.00 +openmp
    - kokkos-kernels@3.7.00 +openmp
    - netcdf-c@4.9.0
    - netcdf-fortran@4.6.0 
    - sundials@6.4.1
    - superlu@5.3.0
    - superlu-dist@8.1.1 

  - nersc_specs:
    - chapel@1.24.1
    - gsl@2.7.1
    - fftw@3.3.10
    - fftw@3.3.9
    - fftw@3.3.8        
    - netcdf-c@4.9.0 ~mpi 
    - netcdf-fortran@4.6.0 
    - metis@5.1.0    
    - parmetis@4.0.3
    - gromacs@2022.3
    - cdo
    # trouble building esmf +mpi
    - ncl ^esmf~mpi
    - nco
    - ncview
    - libxc
    - libcint
    - libint tune=molgw-lmax-7
    - intel-mkl
  - utilities:
    - autoconf
    - automake
    - cmake
    - git
    - gmake
    - gawk
    - nano
    - subversion

Each spack environment is built in a separate directory using spack ci in order to push specs to the buildcache. We use the following spack configuration for each spack environment.

Spack Environments for Spack CI

GCC Spack Environment

spack:
  view: false
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree:
      padded_length: 128
  concretizer:
    reuse: false
  mirrors:       
    perlmutter-e4s-22.11: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-22.11
    # source_mirror: file:///global/cfs/cdirs/m3503/mirrors/source_mirror    
  gitlab-ci:
    enable-artifacts-buildcache: false
    rebuild-index: true
    before_script:
    - module reset
    - module load cpu cray-pmi    
    - module list
    - source setup-env.sh
    - git clone -c feature.manyFiles=true -b e4s-22.11 $SPACK_REPO     
    - . spack/share/spack/setup-env.sh
    - which spack
    - spack --version
    - spack-python --path    
    script:
    - cd ${SPACK_CONCRETE_ENV_DIR} 
    - spack env activate --without-view . 
    - spack env st
    - export SPACK_GNUPGHOME=$HOME/.gnupg
    - spack gpg list
    - spack -d ci rebuild
    service-job-attributes:
      tags: [perlmutter-e4s]
      before_script:
        - git clone -b e4s-22.11 $SPACK_REPO
        - . spack/share/spack/setup-env.sh        
        - spack --version
        - ls -l /global/common/software/spackecp/mirrors/perlmutter-e4s-22.11/build_cache/_pgp
      script:
      - echo "End Pipeline"
    mappings:
    - match: [os=sles15]
      runner-attributes:
        tags: [perlmutter-e4s]
  specs:
  - matrix:
    - [$gcc_specs]
    - [$gcc_compilers]
  - matrix:
    - [$utilities]
    - [$gcc_compilers]
  - matrix:
    - [$nersc_specs]
    - [$gcc_compilers]

CCE Spack Environment

spack:
  view: false
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree:
      padded_length: 128
  concretizer:
    reuse: false
  mirrors:       
    perlmutter-e4s-22.11: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-22.11
  gitlab-ci:
    enable-artifacts-buildcache: false
    rebuild-index: true
    before_script:
    - module reset
    - module load cpu cray-pmi    
    - module list
    - source setup-env.sh
    - git clone -c feature.manyFiles=true -b e4s-22.11 $SPACK_REPO     
    - . spack/share/spack/setup-env.sh
    - which spack
    - spack --version
    - spack-python --path    
    script:
    - cd ${SPACK_CONCRETE_ENV_DIR} 
    - spack env activate --without-view . 
    - spack env st
    - export SPACK_GNUPGHOME=$HOME/.gnupg
    - spack gpg list
    - spack -d ci rebuild
    service-job-attributes:
      tags: [perlmutter-e4s]
      before_script:
        - git clone -b e4s-22.11 $SPACK_REPO
        - . spack/share/spack/setup-env.sh        
        - spack --version
        - ls -l /global/common/software/spackecp/mirrors/perlmutter-e4s-22.11/build_cache/_pgp
      script:
      - echo "End Pipeline"
    mappings:
    - match: [os=sles15]
      runner-attributes:
        tags: [perlmutter-e4s]
  specs:
  - matrix:
    - [$cce_specs]
    - [$cce_compilers]

NVHPC Spack Environment

spack:
  view: false
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree:
      padded_length: 128
  concretizer:
    reuse: false
  packages:
    # build failures with xz %nvhpc so reverting to gcc
    xz::
      require: '%gcc' 
  mirrors:       
    perlmutter-e4s-22.11: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-22.11  
  gitlab-ci:
    enable-artifacts-buildcache: false
    rebuild-index: true
    before_script:
    - module reset
    - module load cpu cray-pmi    
    - module list
    - source setup-env.sh
    - git clone -c feature.manyFiles=true -b e4s-22.11 $SPACK_REPO     
    - . spack/share/spack/setup-env.sh
    - which spack
    - spack --version
    - spack-python --path    
    script:
    - cd ${SPACK_CONCRETE_ENV_DIR} 
    - spack env activate --without-view . 
    - spack env st
    - export SPACK_GNUPGHOME=$HOME/.gnupg
    - spack gpg list
    - spack -d ci rebuild
    service-job-attributes:
      tags: [perlmutter-e4s]
      before_script:
        - git clone -b e4s-22.11 $SPACK_REPO
        - . spack/share/spack/setup-env.sh        
        - spack --version
        - ls -l /global/common/software/spackecp/mirrors/perlmutter-e4s-22.11/build_cache/_pgp
      script:
      - echo "End Pipeline"
    mappings:
    - match: [os=sles15]
      runner-attributes:
        tags: [perlmutter-e4s]
  specs:
  - matrix:
    - [$nvhpc_specs]
    - [$nvhpc_compilers]

CUDA Spack Environment

spack:
  view: false
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree:
      padded_length: 128
  concretizer:
    reuse: false
  mirrors:       
    perlmutter-e4s-22.11: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-22.11
    # source_mirror: file:///global/cfs/cdirs/m3503/mirrors/source_mirror    
  gitlab-ci:
    enable-artifacts-buildcache: false
    rebuild-index: true
    before_script:
    - module reset
    - module load cpu cray-pmi    
    - module list
    - source setup-env.sh
    - git clone -c feature.manyFiles=true -b e4s-22.11 $SPACK_REPO     
    - . spack/share/spack/setup-env.sh
    - which spack
    - spack --version
    - spack-python --path    
    script:
    - cd ${SPACK_CONCRETE_ENV_DIR} 
    - spack env activate --without-view . 
    - spack env st
    - export SPACK_GNUPGHOME=$HOME/.gnupg
    - spack gpg list
    - spack -d ci rebuild
    service-job-attributes:
      tags: [perlmutter-e4s]
      before_script:
        - git clone -b e4s-22.11 $SPACK_REPO
        - . spack/share/spack/setup-env.sh        
        - spack --version
        - ls -l /global/common/software/spackecp/mirrors/perlmutter-e4s-22.11/build_cache/_pgp
      script:
      - echo "End Pipeline"
    mappings:
    - match: [os=sles15]
      runner-attributes:
        tags: [perlmutter-e4s]
  specs:
  - matrix:
    - [$cuda_specs]
    - [$gcc_compilers]

Perlmutter E4S 22.05

Shown below is the production Spack configuration for Perlmutter E4S 22.05. You can access this stack via module load e4s/22.05 on Perlmutter. Please see our user documentation for this stack at https://docs.nersc.gov/applications/e4s/perlmutter/22.05/.
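
As a quick orientation, a typical session looks roughly like the following (a sketch only; the sub-stack names are assumed to match the gcc, cce, nvhpc and cuda environments shown in this section, and the linked documentation covers the full workflow):

$ module load e4s/22.05     # entry point to the stack
$ spack env list            # list the available sub-stacks (gcc, cce, nvhpc, cuda)
$ spack env activate gcc    # select a sub-stack
$ spack load hdf5@1.10.7    # add a package from $gcc_specs to your shell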

Production Spack Configuration

GCC spack environment

spack:
  view:
    default:
      root: $spack/var/spack/environments/gcc/views/default
      select: ['%gcc']
      exclude: ['py-warpx']
      link_type: symlink
      link: roots
      projections:
        all: '{name}/{version}-{compiler.name}-{compiler.version}'
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree: $spack/opt/spack
  mirrors:
    perlmutter-e4s-22.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-22.05
    source_mirror: file:///global/cfs/cdirs/m3503/mirrors/source_mirror  
  specs:
  - $nersc_specs
  - $utilities  
  - matrix:
    - [$gcc_specs]
    - [$gcc_compilers]
  - matrix:
    - - py-libensemble@0.9.1 ^py-numpy ~blas ~lapack
    - ['%gcc@10.3.0']
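
The view stanza at the top of this environment controls where installed packages are symlinked and how those paths are named. With the projection '{name}/{version}-{compiler.name}-{compiler.version}', a root spec such as hdf5@1.10.7 %gcc@11.2.0 would surface at a path like the following (illustrative):

$ ls $spack/var/spack/environments/gcc/views/default/hdf5
1.10.7-gcc-11.2.0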

CCE Spack Environment

spack:
  view:
    default:
      root: $spack/var/spack/environments/cce/views/default
      select: ['%cce']
      link_type: symlink
      link: roots
      projections:
        all: '{name}/{version}-{compiler.name}-{compiler.version}'
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree: $spack/opt/spack
  mirrors:
    perlmutter-e4s-22.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-22.05
    source_mirror: file:///global/cfs/cdirs/m3503/mirrors/source_mirror  
  specs:
  - matrix:
    - [$cce_specs]
    - [$cce_compilers]

NVHPC Spack Environment

spack:
  view:
    default:
      root: $spack/var/spack/environments/nvhpc/views/default
      select: ['%nvhpc']
      link_type: symlink
      link: roots
      projections:
        all: '{name}/{version}-{compiler.name}-{compiler.version}'
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree: $spack/opt/spack
  mirrors:
    perlmutter-e4s-22.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-22.05
    source_mirror: file:///global/cfs/cdirs/m3503/mirrors/source_mirror  
  specs:
  - matrix:
    - [$nvhpc_specs]
    - [$nvhpc_compilers]

CUDA Spack Environment

spack:
  view:
    default:
      root: $spack/var/spack/environments/cuda/views/default
      select: ['%gcc +cuda']
      link_type: symlink
      link: roots
      projections:
        all: '{name}/{version}-{compiler.name}-{compiler.version}'
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree: $spack/opt/spack
  mirrors:
    perlmutter-e4s-22.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-22.05
    source_mirror: file:///global/cfs/cdirs/m3503/mirrors/source_mirror  
  specs:
  - matrix:
    - [$cuda_specs]
    - [$gcc_compilers]

Shown below is the list of definitions used by all of our spack environments.

Definitions for Spack Environments

  definitions:
  - cce_compilers: ['%cce@15.0.0']
  - gcc_compilers: ['%gcc@11.2.0']
  - nvhpc_compilers: ['%nvhpc@22.7']  
  - gcc_specs:
    - adios2@2.8.0
    - amrex@22.05
    - butterflypack@2.1.1
    - conduit@0.8.3
    - dyninst@12.1.0
    - fortrilinos@2.0.0
    - gasnet@2022.3.0
    - hdf5@1.10.7 +fortran +hl +shared
    - heffte@2.2.0 +fftw
    - hpctoolkit@2022.04.15
    - hpx@1.7.1 networking=mpi
    - hypre@2.24.0
    - kokkos@3.6.00 +openmp
    - kokkos-kernels@3.6.00 +openmp
    - lammps@20220107
    - libquo@1.3.1
    - nccmp@1.9.0.1
    - nco@5.0.1
    - mfem@4.4.0
    - openblas@0.3.20 threads=openmp
    - openpmd-api@0.14.4
    - papi@6.0.0.1
    - parallel-netcdf@1.12.2
    - pdt@3.25.1
    - petsc@3.17.1
    - plasma@21.8.29 ^openblas
    - py-warpx@22.05 ^warpx dims=2 ^openblas
    - py-warpx@22.05 ^warpx dims=3 ^openblas
    - py-warpx@22.05 ^warpx dims=rz ^openblas
    - qthreads@1.16 scheduler=distrib
    - raja@0.14.0
    - slate@2021.05.02 ~cuda
    - slepc@3.17.1
    - strumpack@6.3.1 ~slate
    - sundials@6.2.0
    - superlu@5.3.0
    - superlu-dist@7.2.0
    - tasmanian@7.7
    - trilinos@13.0.1 +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext
      +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu
      +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos
      +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
    - vtk-m@1.7.1
    - upcxx@2022.3.0
    - zfp@0.5.5
    #- ascent@0.8.0     # ascent (DIFFERENT ERROR in ParaTools deployment %gcc@11.2.0)
    #- plumed@2.6.3     # plumed (SAME ERROR in ParaTools deployment %gcc@11.2.0)    
    #- variorum@0.4.1   # variorum (SAME ERROR in ParaTools deployment %gcc@11.2.0)
    #- wannier90@3.1.0  # wannier90 (SAME ERROR in ParaTools deployment %gcc@11.2.0)
    # ascent: Could NOT find MPI_C (missing: MPI_C_LIB_NAMES) (found version "3.1")
    # plumed: tools/../../tools/lepton/../../lepton/Operation.h:902:39: error: 'numeric_limits' is not a member of 'std'
    # variorum (1/2): make[2]: *** [variorum/CMakeFiles/variorum.dir/build.make:196: variorum/libvariorum.so] Error 1
    # variorum (2/2): /usr/bin/ld: Intel/CMakeFiles/variorum_intel.dir/msr_core.c.o:(.bss+0x0): multiple definition of `g_platform'; CMakeFiles/variorum.dir/config_architecture.c.o:(.bss+0x0): first defined here
    # wannier90: Error: Type mismatch between actual argument at (1) and actual argument at (2) (COMPLEX(8)/INTEGER(4)).

  - cuda_specs:
    - adios2@2.8.0 +cuda cuda_arch=80
    - hypre@2.24.0 +cuda cuda_arch=80
    - kokkos@3.6.00 +wrapper +cuda cuda_arch=80
    - kokkos-kernels@3.6.00 +cuda cuda_arch=80 ^kokkos@3.6.00 +wrapper +cuda cuda_arch=80
    - lammps@20220107 +cuda cuda_arch=80 +cuda_mps
    - hpctoolkit@2022.04.15 +cuda +mpi ~papi
    - likwid +cuda
    - papi@6.0.0.1 +cuda
    - petsc@3.17.1+cuda cuda_arch=80
    - raja@0.14.0 +cuda cuda_arch=80
    - slate@2021.05.02 +cuda cuda_arch=80
    - slepc@3.17.1 +cuda cuda_arch=80
    - strumpack@6.3.1 ~slate +cuda cuda_arch=80
    - tau@2.31.1 +mpi +cuda
    - zfp@0.5.5 +cuda cuda_arch=80
    #- heffte@2.2.0 +cuda cuda_arch=80        # heffte (WORKED in ParaTools deployment %gcc@11.2.0)
    #- hpx@1.7.1 +cuda cuda_arch=80           # hpx (WORKED in ParaTools deployment %gcc@11.2.0)
    #- magma@2.6.2 +cuda cuda_arch=80         # magma (WORKED in ParaTools deployment %gcc@11.2.0)
    #- parsec@3.0.2012 +cuda cuda_arch=80     # parsec (SAME ERROR in ParaTools deployment %gcc@11.2.0)
    #- sundials@6.2.0 +cuda cuda_arch=80      # sundials (WORKED in ParaTools deployment %gcc@11.2.0)
    #- superlu-dist@7.2.0 +cuda cuda_arch=80  # superlu-dist (WORKED in ParaTools deployment %gcc@11.2.0)
    #- tasmanian@7.7 +cuda cuda_arch=80       # tasmanian (WORKED in ParaTools deployment %gcc@11.2.0)
    #- trilinos@13.2.0 +cuda cuda_arch=80     # trilinos (SAME ERROR in ParaTools deployment %gcc@11.2.0)
    #- vtk-m@1.7.1 +cuda cuda_arch=80         # vtk-m (DIFF ERROR in ParaTools deployment %gcc@11.2.0)
    # heffte: CMake Error: The following variables are used in this project, but they are set to NOTFOUND: CUDA_cufft_LIBRARY
    # hpx: CMake Error: The following variables are used in this project, but they are set to NOTFOUND: CUDA_cublas_LIBRARY
    # magma: CMake Error: The following variables are used in this project, but they are set to NOTFOUND: CUDA_cublas_LIBRARY
    # parsec: transfer.c:168: multiple definition of `parsec_CUDA_d2h_max_flows';
    # sundials: spack-src/examples/sunmatrix/cusparse/test_sunmatrix_cusparse.cu:167: undefined reference to `cusparseCreate'
    # superlu-dist: make[2]: *** No rule to make target '/opt/nvidia/hpc_sdk/Linux_x86_64/21.11/cuda/11.5/lib64/libcublas.so', needed by 'SRC/CMakeFiles/superlu_dist.dir/cmake_device_link.o'.
    # tasmanian: CMake Error at /global/cfs/cdirs/m3503/ci-builds/perlmutter/yUW7FC66... /Modules/FindPackageHandleStandardArgs.cmake:230 (message): Could NOT find TasmanianCudaMathLibs
    # trilinos +cuda: CMake Error at /global/cfs....CMakeTestCXXCompiler.cmake:62 (message): The C++ compiler "/opt/cray/pe/mpich/8.1.13/ofi/gnu/9.1/bin/mpicxx" is not able to compile a simple test program.
    # vtk-m: spack-src/vtkm/internal/brigand.hpp:1061:131: error: expected class-name before '{' token struct find<true, false, L1, L2, Ls...> : find<true , F<Ts..., L2>::value, L2, Ls...>
    
  - cce_specs:
    - adios2@2.8.0
    - hdf5@1.10.7 +fortran +hl +shared
    
    - kokkos-kernels@3.6.00 +openmp
    - kokkos@3.6.00 +openmp
    - petsc@3.17.1 ~strumpack
    - sundials@6.2.0
    - superlu-dist@7.2.0
    - superlu@5.3.0
    # - hypre@2.24.0 # error in compilation: clang-15: error: unknown argument: '-qsmp=omp'
    #- openblas@0.3.20 threads=openmp   # openblas (SAME ERROR in ParaTools deployment %cce@14.0.0)
    #- strumpack@6.3.1 ~slate           # butterflypack
    # openblas: ftn-2307 ftn: ERROR in command line The "-m" option must be followed by 0, 1, 2, 3 or 4. ftn-2103 ftn: WARNING in command line.The -W all option is not supported or invalid and will be ignored.

  - nvhpc_specs:
    - hdf5@1.10.7 +fortran +hl +shared
    - kokkos@3.6.00 +openmp
    - kokkos-kernels@3.6.00 +openmp
    - sundials@6.2.0
    - superlu@5.3.0
    - zfp@0.5.5

    # - raja@0.14.0 +cuda cuda_arch=80 build failure 
    # - slate@2021.05.02 +cuda cuda_arch=80
    # - strumpack@6.3.1 ~slate build failure 
    # - superlu-dist@7.2.0 NVIDIA subset: The 'taskloop' construct is not supported.

    # ==> Warning: Skipping build of superlu-dist-7.2.0-kmvqtie752awoenzob5pgemabeuzyaiv since parmetis-4.0.3-ew3itm7cvpheb7h4e6lmyteghbq7toeo failed
    # ==> Warning: Skipping build of hypre-2.24.0-s2nqxywgft5dfjeps4jgiim6dxvedz3y since superlu-dist-7.2.0-kmvqtie752awoenzob5pgemabeuzyaiv failed
    #- hypre@2.24.0 +cuda cuda_arch=80
    # failed to build HDF5 dependency with nvhpc
    #- raja@0.14.0 +cuda cuda_arch=80

  - nersc_specs:
    - ccache@4.5.1
    - cdo
    - gnuplot@5.4.3 +X
    - grads
    - gromacs@2021.5
    - gsl@2.7
    - metis@5.1.0
    - intel-mkl
    - ncl
    - ncview@2.1.8
    - parmetis@4.0.3
    - parallel
    - quantum-espresso@7.0    
    - nwchem@7.0.2 ^cray-libsci
  - utilities:
    - autoconf
    - automake
    - cmake
    - git
    - gmake
    - gawk
    - nano
    - subversion
    - xterm

Shown below are the spack environments used to build the stack into the buildcache with spack ci. Unlike the 22.11 pipelines above, these environments also report build results to CDash (see the cdash section in each environment).

Spack Environments for Spack CI

GCC Spack Environment

spack:
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree:
      padded_length: 128
  concretizer:
    reuse: false
  cdash:
    build-group: DOE nightly E4S builds
    url: https://cdash.spack.io
    project: Spack
    site: NERSC - Perlmutter E4S-22.05 
  mirrors:       
    perlmutter-e4s-22.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-22.05
  gitlab-ci:
    enable-artifacts-buildcache: false
    rebuild-index: true
    before_script:
    - module reset
    - module use /global/common/software/nersc/$(cat /etc/nersc_modules_rev)/extra_modulefiles
    - module load cpu
    - module load cray-pmi
    - module list
    - source setup-env.sh
    - git clone --depth=10 -b e4s-22.05 $SPACK_REPO     
    - . spack/share/spack/setup-env.sh
    - which spack
    - spack --version
    - spack-python --path    
    script:
    - cd ${SPACK_CONCRETE_ENV_DIR} 
    - spack env activate --without-view . 
    - spack env st
    - export SPACK_GNUPGHOME=$HOME/.gnupg
    - spack gpg list
    - spack -d ci rebuild
    service-job-attributes:
      tags: [perlmutter-e4s]
      before_script:
        - git clone --depth=10 -b e4s-22.05 $SPACK_REPO
        - . spack/share/spack/setup-env.sh        
        - spack --version
        - ls -l /global/common/software/spackecp/mirrors/perlmutter-e4s-22.05/build_cache/_pgp
      script:
      - echo "End Pipeline"
    mappings:
    - match: [os=sles15]
      runner-attributes:
        tags: [perlmutter-e4s]
  specs:
  - $nersc_specs
  - $utilities
  - matrix:
    - [$gcc_specs]
    - [$gcc_compilers]
  - matrix:
    - - py-libensemble@0.9.1 ^py-numpy ~blas ~lapack
    - ['%gcc@10.3.0']

CCE Spack Environment

spack: 
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree:
      padded_length: 128
  concretizer:
    reuse: false
  cdash:
    build-group: DOE nightly E4S builds
    url: https://cdash.spack.io
    project: Spack
    site: NERSC - Perlmutter E4S-22.05 
  mirrors:       
    perlmutter-e4s-22.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-22.05 
  gitlab-ci:
    enable-artifacts-buildcache: false
    rebuild-index: true
    before_script:
    - module reset
    - module use /global/common/software/nersc/$(cat /etc/nersc_modules_rev)/extra_modulefiles
    - module load cpu
    - module load cray-pmi
    - module list
    - source setup-env.sh
    - git clone --depth=10 -b e4s-22.05 $SPACK_REPO     
    - . spack/share/spack/setup-env.sh
    - which spack
    - spack --version
    - spack-python --path    
    script:
    - cd ${SPACK_CONCRETE_ENV_DIR} 
    - spack env activate --without-view . 
    - spack env st
    - export SPACK_GNUPGHOME=$HOME/.gnupg
    - spack gpg list
    - spack -d ci rebuild
    service-job-attributes:
      tags: [perlmutter-e4s]
      before_script:
        - git clone --depth=10 -b e4s-22.05 $SPACK_REPO
        - . spack/share/spack/setup-env.sh        
        - spack --version
        - ls -l /global/common/software/spackecp/mirrors/perlmutter-e4s-22.05/build_cache/_pgp
      script:
      - echo "End Pipeline"
    mappings:
    - match: [os=sles15]
      runner-attributes:
        tags: [perlmutter-e4s]
  specs:
  - matrix:
    - [$cce_specs]
    - [$cce_compilers]
    

NVHPC Spack Environment

spack:
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree:
      padded_length: 128
  concretizer:
    reuse: false
  cdash:
    build-group: DOE nightly E4S builds
    url: https://cdash.spack.io
    project: Spack
    site: NERSC - Perlmutter E4S-22.05 
  mirrors:       
    perlmutter-e4s-22.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-22.05   
  gitlab-ci:
    enable-artifacts-buildcache: false
    rebuild-index: true
    before_script:
    - module reset
    - module use /global/common/software/nersc/$(cat /etc/nersc_modules_rev)/extra_modulefiles
    - module load cpu
    - module load cray-pmi
    - module list
    - source setup-env.sh
    - git clone --depth=10 -b e4s-22.05 $SPACK_REPO     
    - . spack/share/spack/setup-env.sh
    - which spack
    - spack --version
    - spack-python --path    
    script:
    - cd ${SPACK_CONCRETE_ENV_DIR} 
    - spack env activate --without-view . 
    - spack env st
    - export SPACK_GNUPGHOME=$HOME/.gnupg
    - spack gpg list
    - spack -d ci rebuild
    service-job-attributes:
      tags: [perlmutter-e4s]
      before_script:
        - git clone --depth=10 -b e4s-22.05 $SPACK_REPO
        - . spack/share/spack/setup-env.sh        
        - spack --version
        - ls -l /global/common/software/spackecp/mirrors/perlmutter-e4s-22.05/build_cache/_pgp
      script:
      - echo "End Pipeline"
    mappings:
    - match: [os=sles15]
      runner-attributes:
        tags: [perlmutter-e4s]
  specs:
  - matrix:
    - [$nvhpc_specs]
    - [$nvhpc_compilers]
    

CUDA Spack Environment

spack:
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  config:
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree:
      padded_length: 128
  concretizer:
    reuse: false
  cdash:
    build-group: DOE nightly E4S builds
    url: https://cdash.spack.io
    project: Spack
    site: NERSC - Perlmutter E4S-22.05 
  mirrors:       
    perlmutter-e4s-22.05: file:///global/common/software/spackecp/mirrors/perlmutter-e4s-22.05   
  gitlab-ci:
    enable-artifacts-buildcache: false
    rebuild-index: true
    before_script:
    - module reset
    - module use /global/common/software/nersc/$(cat /etc/nersc_modules_rev)/extra_modulefiles
    - module load cpu
    - module load cray-pmi
    - module list
    - source setup-env.sh
    - git clone --depth=10 -b e4s-22.05 $SPACK_REPO     
    - . spack/share/spack/setup-env.sh
    - which spack
    - spack --version
    - spack-python --path    
    script:
    - cd ${SPACK_CONCRETE_ENV_DIR} 
    - spack env activate --without-view . 
    - spack env st
    - export SPACK_GNUPGHOME=$HOME/.gnupg
    - spack gpg list
    - spack -d ci rebuild
    service-job-attributes:
      tags: [perlmutter-e4s]
      before_script:
        - git clone --depth=10 -b e4s-22.05 $SPACK_REPO
        - . spack/share/spack/setup-env.sh        
        - spack --version
        - ls -l /global/common/software/spackecp/mirrors/perlmutter-e4s-22.05/build_cache/_pgp
      script:
      - echo "End Pipeline"
    mappings:
    - match: [os=sles15]
      runner-attributes:
        tags: [perlmutter-e4s]
  specs:
  - matrix:
    - [$cuda_specs]
    - [$gcc_compilers]

Perlmutter E4S 21.11

Shown below is the production Spack configuration for Perlmutter E4S 21.11. You can access this stack via module load e4s/21.11 on Perlmutter. Please see our user documentation for this stack at https://docs.nersc.gov/applications/e4s/perlmutter/21.11/.

Production Spack Environment

# This is a Spack Environment file.
#
# It describes a set of packages to be installed, along with
# configuration settings.

spack:
  view: false
  config:
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree:
      root: $spack/opt/spack
  include:
    - /global/common/software/spackecp/perlmutter/spack_settings/compilers.yaml
    - /global/common/software/spackecp/perlmutter/spack_settings/packages.yaml
  specs:
  - matrix:
    - [$gcc_specs]
    - [$gcc_compilers]
  - matrix:
    - [$cuda_specs]
    - [$gcc_compilers]
  - $nersc_specs
  definitions:
  - gcc_compilers: ['%gcc@11.2.0']
  - gcc_specs:
    - adios2@2.7.1
    - amrex@21.11 +fortran +openmp +shared
    - conduit@0.7.2
    - dyninst@11.0.1
    - gasnet@2021.9.0
    - globalarrays@5.8
    - hdf5@1.12.1
    - kokkos-kernels@3.4.01 +openmp
    - kokkos@3.4.01 +openmp
    - mercury@2.0.1
    - mpark-variant@1.4.0
    - openpmd-api@0.14.3
    - papi@6.0.0.1
    - papyrus@1.0.2
    - pdt@3.25.1
    - qthreads@1.16 scheduler=distrib
    - raja@0.14.0
    - strumpack@6.1.0 ~slate
    - sundials@5.8.0 +openmp
    - superlu-dist@7.1.1 +openmp
    - superlu@5.3.0
    - swig@4.0.2
    - sz@2.1.12
    - tau +mpi +python
    - trilinos@13.0.1 +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
    - umap@2.1.0
    - upcxx@2021.9.0 +gasnet +mpi
    #- slepc@3.16.0 --depends on petsc
    #- mfem@4.3.0 --depends on hypre
    #- petsc@3.16.1 +openmp +strumpack --failed to install
    #- hypre@2.23.0 +openmp +superlu-dist --failed to install
    #- parsec@3.0.2012 ~cuda --failed to install
    #- warpx dims=2
    #- warpx dims=3
  - cuda_specs:
    - kokkos-kernels@3.4.01 +openmp +cuda cuda_arch=80 ^kokkos +openmp +wrapper +cuda cuda_arch=80
    - kokkos@3.4.01 +openmp +wrapper +cuda cuda_arch=80
    - umpire@6.0.0 ~shared +cuda cuda_arch=80
    - upcxx@2021.9.0 +cuda
    - zfp@0.5.5 +cuda cuda_arch=80
    - raja@0.14.0+cuda cuda_arch=80 # CUB in your include path is not compatible with this release of Thrust
    #- mfem@4.3.0+cuda cuda_arch=80 --depends on hypre
    #- slepc@3.16.0 +cuda cuda_arch=80 ^petsc@3.16.1 +cuda cuda_arch=80 --depends on petsc
    #- petsc@3.16.1 +cuda cuda_arch=80 --failed to install
    #- hypre@2.23.0+cuda cuda_arch=80 --failed to install
    #- parsec@3.0.2012+cuda cuda_arch=80 # parsec/mca/device/cuda/transfer.c:168: multiple definition of `parsec_CUDA_d2h_max_flows'
    #- amrex@21.11 +cuda cuda_arch=80
    #- magma@2.6.1+cuda cuda_arch=80
    #- strumpack@6.1.0 ~slate +cuda cuda_arch=80
    #- slate@2021.05.02 +cuda cuda_arch=80
    #- superlu-dist@7.1.1 +openmp +cuda cuda_arch=80
    #- sundials@5.8.0 +openmp +cuda cuda_arch=80
  - nersc_specs:
    - chapel@1.24.1
    - gsl@2.7
    - fftw@3.3.10
    - nccmp@1.9.0.1
    - netcdf-c@4.8.1
    - netcdf-fortran@4.5.3
    - nco@5.0.1
    - metis@5.1.0
    - parallel-netcdf@1.12.2
    - parmetis@4.0.3
    - gromacs@2021.3
    #- plumed@2.6.3
    #- wannier90@3.1.0

Cori E4S 22.02

Production Spack Environment

# This is a Spack Environment file.
#
# It describes a set of packages to be installed, along with
# configuration settings.
spack:
  view: false
  config:
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
    install_tree:
      root: /global/common/software/spackecp/cori/e4s-22.02/software
    module_roots:
      tcl: /global/common/software/spackecp/cori/e4s-22.02/modules
  mirrors:
    source_mirror: file:///global/cfs/cdirs/m3503/mirrors/source_mirror 
  modules::
    prefix_inspections:
      bin:
        - PATH
      lib:
        - LIBRARY_PATH
        - LD_LIBRARY_PATH
      lib64:
        - LIBRARY_PATH
        - LD_LIBRARY_PATH
      include:
        - C_INCLUDE_PATH
        - CPLUS_INCLUDE_PATH
        - CPATH
      man:
        - MANPATH
      share/man:
        - MANPATH
      share/aclocal:
        - ACLOCAL_PATH
      lib/pkgconfig:
        - PKG_CONFIG_PATH
      lib64/pkgconfig:
        - PKG_CONFIG_PATH
      share/pkgconfig:
        - PKG_CONFIG_PATH
      '':
        - CMAKE_PREFIX_PATH
    enable:
    - tcl
    tcl:
      blacklist_implicits: true
      hash_length: 0
      naming_scheme: '{name}/{version}-{compiler.name}-{compiler.version}'
      all:
        autoload: direct
        conflict:
        - '{name}'
        environment:
          set:
            '{name}_ROOT': '{prefix}'
      darshan-runtime:
        conflict:
        - darshan
      darshan-util:
        conflict:
        - darshan
      projections:
        all: '{name}/{version}-{compiler.name}-{compiler.version}'
        warpx dims=1: '{name}/{version}-{compiler.name}-{compiler.version}-dims1'
        warpx dims=2: '{name}/{version}-{compiler.name}-{compiler.version}-dims2'
        warpx dims=3: '{name}/{version}-{compiler.name}-{compiler.version}-dims3'
  compilers:
  - compiler:
      spec: gcc@11.2.0
      paths:
        cc: /opt/cray/pe/craype/default/bin/cc
        cxx: /opt/cray/pe/craype/default/bin/CC
        f77: /opt/cray/pe/craype/default/bin/ftn
        fc: /opt/cray/pe/craype/default/bin/ftn
      flags: {}
      operating_system: cnl7
      target: any
      modules:
      - PrgEnv-gnu
      - gcc/11.2.0
      - craype-haswell
  - compiler:
      spec: intel@19.1.2.254
      paths:
        cc: /opt/cray/pe/craype/default/bin/cc
        cxx: /opt/cray/pe/craype/default/bin/CC
        f77: /opt/cray/pe/craype/default/bin/ftn
        fc: /opt/cray/pe/craype/default/bin/ftn
      flags: {}
      operating_system: cnl7
      target: any
      modules:
      - PrgEnv-intel
      - intel/19.1.2.254
      - craype-haswell
  definitions:
  - gcc_compilers: ['%gcc@11.2.0']
  - intel_compilers: ['%intel@19.1.2.254']
  - gcc_specs:
    - adios2@2.7.1
    - amrex@22.02
    - aml@0.1.0
    - arborx@1.1
    - argobots@1.1
    - axom@0.6.1
    - bolt@2.0
    - caliper@2.7.0
    - chai@2.4.0 ~benchmarks ~tests
    - conduit@0.8.2
    - darshan-runtime@3.3.1
    - darshan-util@3.3.1
    - dyninst@12.0.1
    - faodel@1.2108.1
    - flecsi@1.4.2
    - flit@2.1.0
    - gasnet@2021.9.0
    - ginkgo@1.4.0
    - globalarrays@5.8
    - gotcha@1.0.3
    - hdf5@1.10.7 +fortran +hl +shared
    - hdf5@1.12.1 +fortran +hl +shared
    - hdf5@1.13.0 +fortran +hl +shared
    - heffte@2.2.0 +fftw
    - hpx@1.7.1 networking=mpi
    - hypre@2.24.0
    - kokkos@3.5.00 +openmp +wrapper
    - kokkos-kernels@3.5.00 +openmp ^kokkos@3.5.00 +openmp +wrapper
    - legion@21.03.0
    - libquo@1.3.1
    - libunwind@1.5.0
    - mercury@2.1.0
    - metall@0.17
    - mfem@4.3.0
    - mpark-variant@1.4.0
    - nccmp@1.9.0.1 ^netcdf-c@4.8.1
    - nco@5.0.1
    - netlib-scalapack@2.1.0
    - ninja@1.10.2
    - nvhpc@22.1
    - openpmd-api@0.14.4
    - papi@6.0.0.1
    - papyrus@1.0.1
    - parallel-netcdf@1.12.2
    - parsec@3.0.2012 ~cuda
    - pdt@3.25.1
    - petsc@3.16.4 +openmp
    - pumi@2.2.6
    - qthreads@1.16 scheduler=distrib
    - raja@0.14.0
    - stc@0.9.0
    - strumpack@6.3.0~butterflypack~slate
    - sundials@6.1.1
    - superlu@5.3.0
    - superlu-dist@7.2.0
    - swig@4.0.2
    - sz@2.1.12
    - tasmanian@7.7
    - tau@2.31 +mpi +python
    - turbine@1.3.0
    - umap@2.1.0
    - umpire@6.0.0
    - upcxx@2021.9.0
    - veloc@1.5
    - vtk-m@1.7.1
    - zfp@0.5.5
    #- hpctoolkit@2022.01.15
    #- phist@1.9.5 # Unable to locate cray-libsci headers in /opt/cray/pe/libsci/20.09.1/gnu/8.1/include
    #- mpifileutils@0.11.1 ~xattr # failed to install libcircle: Unable to find suitable MPI Compiler. Try setting MPICC.
    #- plumed@2.6.3 # Build failed
    #- precice@2.3.0 # Build failed due to petsc
    #- py-warpx@22.02 ^warpx dims=rz
    #- rempi@1.1.0 # failed to find MPICC
    #- slate@2021.05.02 ~cuda # Build failure on blaspp. BLAS++ requires a BLAS library and none was found.
    #- scr@3.0rc2 # Build failure
    #- trilinos@13.0.1 +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
    #- unifyfs@0.9.1 # Build failure on dependency mercury
    #- wannier90@3.1.0 # Error: A PrgEnv-* modulefile must be loaded.
  - intel_specs:
    - adios2@2.7.1
    - arborx@1.1
    - argobots@1.1
    - caliper@2.7.0
    - conduit@0.8.2
    - chai@2.4.0 ~benchmarks ~tests
    - darshan-runtime@3.3.1
    - darshan-util@3.3.1
    - faodel@1.2108.1
    - flecsi@1.4.2
    - flit@2.1.0
    - gasnet@2021.9.0
    - ginkgo@1.4.0
    - globalarrays@5.8
    - gotcha@1.0.3
    - hdf5@1.10.7 +fortran +hl +shared
    - hdf5@1.12.1 +fortran +hl +shared
    - hdf5@1.13.0 +fortran +hl +shared
    - heffte@2.2.0 +fftw
    - hypre@2.24.0
    - kokkos@3.5.00 +openmp +wrapper
    - legion@21.03.0
    - libquo@1.3.1
    - libunwind@1.5.0
    - loki@0.1.7
    - mercury@2.1.0
    - metall@0.17
    - mfem@4.3.0
    - mpark-variant@1.4.0
    - nccmp@1.9.0.1 ^netcdf-c@4.8.1
    - netlib-scalapack@2.1.0
    - ninja@1.10.2
    - openpmd-api@0.14.4
    - papi@6.0.0.1
    - parallel-netcdf@1.12.2
    - parsec@3.0.2012 ~cuda
    - papyrus@1.0.1
    - petsc@3.16.4 +openmp
    - pdt@3.25.1
    - precice@2.3.0
    - pumi@2.2.6
    - qthreads@1.16 scheduler=distrib
    - raja@0.14.0
    - slepc@3.16.2
    - strumpack@6.3.0~butterflypack ~slate
    - sundials@6.1.1
    - superlu@5.3.0
    - superlu-dist@7.2.0
    - swig@4.0.2
    - sz@2.1.12
    - tasmanian@7.7
    - turbine@1.3.0
    - umap@2.1.0
    - umpire@6.0.0
    - upcxx@2021.9.0
    - variorum@0.4.1
    - veloc@1.5
    - vtk-m@1.7.1
    - wannier90@3.1.0
    - warpx dims=1
    - warpx dims=2
    - warpx dims=3
    - zfp@0.5.5
    #- axom@0.6.1 # Build failure in cmake for axom
    #- butterflypack@2.1.0 # Build failure  sed: can't read *.inc: No such file or directory
    #- dyninst@12.0.1 # %intel conflict
    # - hpx@1.7.1 networking=mpi # Failed on asio
    #- lammps@20220107 # Build Failure during cmake
    #- kokkos-kernels@3.5.00 +openmp ^kokkos@3.5.00 +openmp +wrapper  # Build failure
    #- plasma@21.8.29 # %intel conflict
    #- phist@1.9.5  # Error: NoHeadersError: Unable to locate cray-libsci headers in /opt/cray/pe/libsci/20.09.1/intel/16.0/include
    #- plumed@2.6.3
    #- rempi@1.1.0  # FAILED to find MPICC
    #- slate@2021.05.02 ~cuda # %intel conflict
    #- scr@3.0rc2   Build failure
    #- trilinos@13.0.1 +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
    #- warpx dims=rz failed on blaspp

  - nersc_specs:
    - cdo@2.0.3 +curl
    - chapel@1.24.1
    - ffmpeg@4.4.1
    - elpa@2021.11.001
    - grads@2.2.1
    - gsl@2.7
    - gromacs@2021.5
    - libxc@5.1.7
    - libxsmm@1.17 +shared
    - libint@2.6.0
    - nano@4.9
    - maven@3.8.4
    - metis@5.1.0
    - octave@6.4.0
    - openjdk@11.0.12_7
    - parallel@20210922
    - parmetis@4.0.3
    - texlive
    - xerces-c@3.2.3

    #- abinit+openmp failed to install netcdf-fortran
    #- gnuplot build failure on gnuplot
    #- valgrind@3.18.1 Build failure
  specs:
  - matrix:
    - [$gcc_specs]
    - [$gcc_compilers]
  - matrix:
    - [$intel_specs]
    - [$intel_compilers]
  - matrix:
    - [$nersc_specs]
    - [$gcc_compilers]
  packages:
    all:
      compiler: [gcc@11.2.0, intel@19.1.2.254]
      providers:
        blas: [cray-libsci, intel-mkl]
        fftw-api: [cray-fftw]
        mpi: [cray-mpich]
        scalapack: [cray-libsci, intel-mkl]
    amrex:
      variants: +fortran +hypre +openmp +shared
    bash:
      buildable: false
      externals:
      - spec: bash@4.4.23
        prefix: /
    bzip2:
      version: [1.0.6]
      externals:
      - spec: bzip2@1.0.6
        prefix: /usr
    coreutils:
      buildable: false
      version: [8.29]
      externals:
      - spec: coreutils@8.29
        prefix: /usr
    cpio:
      buildable: false
      externals:
      - spec: cpio@2.12
        prefix: /
    cray-libsci:
      buildable: false
      version: [20.09.1]
      externals:
      - spec: cray-libsci@20.09.1 %gcc
        prefix: /opt/cray/pe/libsci/20.09.1/gnu/8.1
        modules:
        - cray-libsci/20.09.1
      - spec: cray-libsci@20.09.1 %intel
        prefix: /opt/cray/pe/libsci/20.09.1/intel/16.0
        modules:
        - cray-libsci/20.09.1
    cray-fftw:
      buildable: false
      externals:
      - spec: cray-fftw@3.3.8.10
        modules:
        - cray-fftw/3.3.8.10
    cray-mpich:
      buildable: false
      externals:
      - spec: cray-mpich@7.7.19 %intel
        prefix: /opt/cray/pe/mpt/7.7.19/gni/mpich-intel/16.0
        modules:
        - cray-mpich/7.7.19
      - spec: cray-mpich@7.7.19 %gcc
        prefix: /opt/cray/pe/mpt/7.7.19/gni/mpich-gnu/8.2
        modules:
        - cray-mpich/7.7.19
    curl:
      externals:
      - spec: curl@7.66.0+gssapi+ldap+nghttp2
        prefix: /usr
    diffutils:
      version: [3.6]
      externals:
      - spec: diffutils@3.6
        prefix: /usr
    elfutils:
      version: [0.168]
      externals:
      - spec: elfutils@0.168
        prefix: /usr
    findutils:
      version: [4.6.0]
      externals:
      - spec: findutils@4.6.0
        prefix: /usr
    hdf5:
      variants: +fortran +hl +shared api=v18
      version: [1.12.1]
      externals:
      - spec: hdf5@1.12.1.1%intel+shared+fortran+hl~mpi
        modules:
        - cray-hdf5/1.12.1.1
      - spec: hdf5@1.12.1.1%intel+shared+fortran+hl+mpi
        modules:
        - cray-hdf5-parallel/1.12.1.1
      - spec: hdf5@1.12.1.1%gcc+shared+fortran+hl~mpi
        modules:
        - cray-hdf5/1.12.1.1
      - spec: hdf5@1.12.1.1%gcc+shared+fortran+hl+mpi
        modules:
        - cray-hdf5-parallel/1.12.1.1
    hypre:
      variants: +openmp +superlu-dist
    gawk:
      buildable: false
      externals:
      - spec: gawk@4.2.1
        prefix: /usr
    git:
      version: [2.26.2]
      buildable: false
      externals:
      - spec: git@2.26.2
        prefix: /usr
    gmake:
      buildable: false
      externals:
      - spec: gmake@4.2.1
        prefix: /usr
    intel-mkl:
      buildable: false
      externals:
      - spec: intel-mkl@19.1.2.254
        modules:
        - intel/19.1.2.254
    krb5:
      buildable: false
      externals:
      - spec: krb5@1.16.3
        prefix: /usr/lib/mit
    libunwind:
      variants: +pic +xz
    mercury:
      variants: ~bmi
    mesa:
      variants: ~llvm
    mesa18:
      variants: ~llvm
    m4:
      buildable: false
      externals:
      - spec: m4@1.4.18
        prefix: /usr
    mpich:
      variants: ~wrapperrpath
    ncurses:
      variants: +termlib
      externals:
      - spec: ncurses@6.1
        prefix: /usr
    netcdf-c:
      version: [4.8.1.1]
      externals:
      - spec: netcdf-c@4.8.1.1~mpi
        modules:
        - cray-netcdf/4.8.1.1
      - spec: netcdf-c@4.8.1.1+mpi
        modules:
        - cray-netcdf-hdf5parallel/4.8.1.1
    openssl:
      buildable: false
      version: [1.1.1d]
      externals:
      - spec: openssl@1.1.1d
        prefix: /usr
    openssh:
      buildable: false
      externals:
      - spec: openssh@8.1p1
        prefix: /usr
    pdsh:
      buildable: false
      externals:
      - spec: pdsh@2.33
        prefix: /usr
    petsc:
      variants: +openmp
    perl:
      version: [5.32.0]
    pkg-config:
      buildable: false
      version: [0.29.2]
      externals:
      - spec: pkg-config@0.29.2
        prefix: /usr
    readline:
      version: [7.0]
      buildable: false
      externals:
      - spec: readline@7.0
        prefix: /usr
    sed:
      buildable: false
      externals:
      - spec: sed@4.4
        prefix: /usr
    slurm:
      buildable: false
      version: [20-11-8-1]
      externals:
      - spec: slurm@20-11-8-1
        prefix: /usr
    superlu-dist:
      variants: +openmp
    strumpack:
      variants: ~slate
    tar:
      version: [1.30]
      buildable: false
      externals:
      - spec: tar@1.30
        prefix: /usr
    tcsh:
      version: [6.20.0]
      buildable: false
      externals:
      - spec: tcsh@6.20.0
        prefix: /usr
    trilinos:
      variants: +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext
        +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu
        +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos
        +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
    unzip:
      version: [6.0]
      buildable: false
      externals:
      - spec: unzip@6.0
        prefix: /usr
    util-linux-uuid:
      version: [2.31.1]
      buildable: false
      externals:
      - spec: util-linux-uuid@2.31.1
        prefix: /usr
    zsh:
      version: [5.6]
      buildable: false
      externals:
      - spec: zsh@5.6
        prefix: /usr
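
Most entries in the packages section above pin system-provided software as non-buildable externals, so spack reuses Cray PE modules and OS packages instead of rebuilding them. Entries of this shape can be generated or sanity-checked with spack's external detection; a minimal sketch:

$ spack external find --not-buildable git gmake gawk   # register /usr binaries as non-buildable externals
$ spack config get packages                            # inspect the resulting packages.yaml entries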

Cori E4S 21.05

Production Spack Environment

spack:
  view: false
  concretization: separately
  config:
    install_tree:
      root: /global/common/software/spackecp/e4s-21.05/software
    module_roots:
      tcl: /global/common/software/spackecp/e4s-21.05/modules/
    build_stage: $tempdir/$user/spack-stage
  modules:
    enable:
    - tcl
    tcl:
      blacklist_implicits: true
      hash_length: 0
      naming_scheme: '{name}/{version}-{compiler.name}-{compiler.version}'
      all:
        conflict:
        - '{name}'
        environment:
          set:
            '{name}_ROOT': '{prefix}'
      darshan-runtime:
        conflict:
        - darshan
      darshan-util:
        conflict:
        - darshan
      projections:
        all: '{name}/{version}-{compiler.name}-{compiler.version}'
        py-warpx ^warpx dims=2: '{name}/{version}-dims2'
        py-warpx ^warpx dims=3: '{name}/{version}-dims3'
        py-warpx ^warpx dims=rz: '{name}/{version}-dimsRZ'
        warpx dims=2: '{name}/{version}-dims2'
        warpx dims=3: '{name}/{version}-dims3'
        warpx dims=rz: '{name}/{version}-dimsRZ'
        boost cxxstd=98: '{name}/{version}-cxxstd=98'
        boost cxxstd=17: '{name}/{version}-cxxstd=17'
        kokkos +openmp: '{name}/{version}-openmp'
        kokkos ~openmp: '{name}/{version}'
  mirrors:
    cori-e4s-21.05: https://cache.e4s.io/21.05
  compilers:
  - compiler:
      spec: intel@19.1.3.304
      paths:
        cc: cc
        cxx: CC
        f77: ftn
        fc: ftn
      flags: {}
      operating_system: cnl7
      target: any
      modules:
      - PrgEnv-intel
      - intel/19.1.3.304
      environment: {}
      extra_rpaths: []
  packages:
    all:
      compiler:
      - intel@19.1.3.304
      providers:
        blas:
        - openblas
        mpi:
        - mpich
      target:
      - haswell
      variants: +mpi
    slurm:
      buildable: false
      version: [20-02-4-1]
      externals:
      - spec: slurm@20-02-4-1
        prefix: /usr
    mpich:
      buildable: false
      externals:
      - spec: mpich@3.1
        modules:
        - cray-mpich/7.7.10
    autoconf:
      version:
      - '2.69'
    automake:
      version:
      - 1.16.3
    berkeley-db:
      version:
      - 18.1.40
    binutils:
      variants: +ld +gold +headers +libiberty ~nls +plugins
      version:
      - 2.33.1
    boost:
      version:
      - 1.75.0
    bzip2:
      version:
      - 1.0.8
    c-blosc:
      version:
      - 1.21.0
    cmake:
      version:
      - 3.20.2
    curl:
      version:
      - 7.76.0
    diffutils:
      version:
      - 3.7
    elfutils:
      version:
      - 0.182
      variants: +bzip2 ~nls +xz
    expat:
      version:
      - 2.2.10
    findutils:
      version:
      - 4.8.0
    gdbm:
      version:
      - 1.18.1
    gettext:
      version:
      - 0.21
    git:
      version:
      - 2.31.0
    glib:
      version:
      - 2.66.8
    hdf5:
      variants: +fortran +hl +shared
      version:
      - 1.10.7
    help2man:
      version:
      - 1.47.16
    hwloc:
      version:
      - 2.4.1
    json-c:
      version:
      - 0.13.1
    libbsd:
      version:
      - 0.10.0
    libfabric:
      version:
      - 1.12.1
      variants: fabrics=sockets,tcp,udp,rxm
    libiconv:
      version:
      - 1.16
    libsigsegv:
      version:
      - 2.12
    libpciaccess:
      version:
      - 0.16
    libtool:
      version:
      - 2.4.6
    libunwind:
      version:
      - 1.5.0
      variants: +pic +xz
    libxml2:
      version:
      - 2.9.10
    lz4:
      version:
      - 1.9.3
    m4:
      version:
      - 1.4.18
    mesa:
      variants: ~llvm
    mesa18:
      variants: ~llvm
    ncurses:
      version:
      - 6.2
      variants: +termlib
    numactl:
      version:
      - 2.0.14
    openblas:
      version:
      - 0.3.10
      variants: threads=openmp
    perl:
      version:
      - 5.32.0
    pkgconf:
      version:
      - 1.7.3
    python:
      version:
      - 3.8.10
    readline:
      version:
      - 8
    sqlite:
      version:
      - 3.34.0
    tar:
      version:
      - 1.32
    texinfo:
      version:
      - 6.5
    xz:
      version:
      - 5.2.5
      variants: +pic
    zlib:
      version:
      - 1.2.11
    zstd:
      version:
      - 1.4.9

  definitions:
  - cuda_specs:
    - amrex@21.05 +cuda cuda_arch=70
    - axom@0.5.0 +cuda cuda_arch=70 ^umpire~shared
    - caliper@2.5.0 +cuda cuda_arch=70
    - chai@2.3.0 +cuda ~benchmarks ~tests cuda_arch=70 ^umpire~shared
    - ginkgo@1.3.0 +cuda cuda_arch=70
    - hpx@1.6.0 +cuda cuda_arch=70
    - kokkos@3.4.00 +cuda +wrapper cuda_arch=70
    - kokkos-kernels@3.2.00 +cuda cuda_arch=70 ^kokkos +cuda +wrapper cuda_arch=70
    - magma@2.5.4 cuda_arch=70
    - raja@0.13.0 +cuda cuda_arch=70
    - slate@2021.05.02 +cuda cuda_arch=70
    - strumpack@5.1.1 +cuda ~slate cuda_arch=70
    - sundials@5.7.0 +cuda cuda_arch=70
    - superlu-dist@6.4.0 +cuda cuda_arch=70
    - tasmanian@7.5 +cuda cuda_arch=70
    - umpire@4.1.2 +cuda ~shared cuda_arch=70
    - zfp +cuda cuda_arch=70
    #- ascent@0.7.1 +cuda ~shared cuda_arch=70
    #- hypre@2.20.0 +cuda cuda_arch=70
    #- mfem@4.2.0 +cuda cuda_arch=70

  - default_specs:
    - adios2@2.7.1
    - adios@1.13.1
    - aml@0.1.0
    - amrex@21.05
    - arborx@1.0
    - argobots@1.1
    - ascent@0.7.1 ~fortran
    - bolt@2.0
    - cabana@0.3.0
    - caliper@2.5.0
    - chai@2.3.0 ~benchmarks ~tests
    - conduit@0.7.2
    - darshan-runtime@3.3.0
    - darshan-util@3.3.0
    - faodel@1.1906.1
    - flecsi@1.4 +cinch
    - flit@2.1.0
    - gasnet@2021.3.0
    - ginkgo@1.3.0
    - globalarrays@5.8
    - gmp@6.2.1
    - gotcha@1.0.3
    - hdf5@1.10.7
    - hypre@2.20.0
    - kokkos-kernels@3.2.00 +openmp
    - kokkos@3.4.00 +openmp
    - legion@21.03.0
    - libnrm@0.1.0
    - libquo@1.3.1
    - libunwind@1.5.0
    - loki@0.1.7
    - mercury@2.0.1
    - metall@0.13
    - mfem@4.2.0
    - mpark-variant@1.4.0
    - ninja@1.10.2
    - openpmd-api@0.13.4
    - papi@6.0.0.1
    - papyrus@1.0.1
    - parallel-netcdf@1.12.2
    - pdt@3.25.1
    - petsc@3.15.0
    - precice@2.2.1
    - pumi@2.2.5
    - py-libensemble@0.7.2
    - py-petsc4py@3.15.0
    - py-warpx@21.05 ^warpx dims=2
    - py-warpx@21.05 ^warpx dims=3
    - py-warpx@21.05 ^warpx dims=rz
    - qthreads@1.16 scheduler=distrib
    - raja@0.13.0
    - scr@3.0rc1
    - slepc@3.15.0
    - stc@0.9.0
    - strumpack@5.1.1 ~slate
    - sundials@5.7.0
    - superlu-dist@6.4.0
    - superlu@5.2.1
    - swig@4.0.2
    - swig@4.0.2-fortran
    - sz@2.1.11.1
    - tasmanian@7.5
    - tau@2.30.1
    - turbine@1.3.0
    - umap@2.1.0
    - umpire@4.1.2
    - upcxx@2021.3.0
    - zfp@0.5.5

    # Explicit conflicts with Cray -or- Intel compiler (prohibited via package.py)
   #- dyninst@11.0.0
   #- hpctoolkit@2021.03.01
   #- plasma@20.9.20
   #- qt@5.15.2
   #- qwt@6.1.6
   #- slate@2021.05.02 ~cuda

    # Cannot build suite-sparse due to OOM killer
   #- fortrilinos@2.0.0 ^trilinos +nox +superlu-dist +stratimikos
   #- omega-h@9.32.5
   #- trilinos@13.0.1
   #- trilinos@13.0.1 +nox +superlu-dist

    # Failed builds
   #- archer@2.0.0 # llvm@8 fails
   #- axom@0.5.0 # thirdparty/sol/sol.hpp(11408): rvalue ref cannot be bound to an lvalue
   #- heffte@2.0.0 # test/test_units_nompi.cpp(499): error: more than one instance of constructor "heffte::box3d::box3d"
   #- hpx@1.6.0 # include/boost/asio/impl/read.hpp(377): no instance of overloaded function "hpx::util::detail::bound
   #- mpifileutils@0.11 ~xattr # libcap: _caps_output.gperf:96:53: unknown type name 'size_t', libcircle: configure: check if MPI setup correctly
   #- nrm@0.1.0 # py-gevent: configure: compiler doesn't halt on function prototype mismatch
   #- py-jupyterhub@1.0.0
   #- rempi@1.1.0 # configure: couldn't find MPI
   #- unifyfs@0.9.1 # configure: couldn't find MPI

  specs:
  - $default_specs
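
The tcl module settings at the top of this environment map directly onto the module names users load. Under the naming scheme and warpx projections above, the generated modulefiles would be loaded along these lines (names are illustrative, derived from the configuration):

$ module load hdf5/1.10.7-intel-19.1.3.304   # default '{name}/{version}-{compiler.name}-{compiler.version}' scheme
$ module load py-warpx/21.05-dims2           # 'py-warpx ^warpx dims=2' projection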

Cori E4S 21.02

Production Spack Environment

spack:
  view: false
  concretization: separately
  config:
    install_tree:
      root: /global/common/software/spackecp/e4s-21.02/software
    module_roots:
      tcl: /global/common/software/spackecp/e4s-21.02/modules/
    build_stage: $tempdir/$user/spack-stage
  modules:
    enable:
    - tcl
    tcl:
      blacklist_implicits: true
      hash_length: 0
      naming_scheme: '{name}/{version}-{compiler.name}-{compiler.version}'
      all:
        conflict:
        - '{name}'
        environment:
          set:
            '{name}_ROOT': '{prefix}'
      darshan-runtime:
        conflict:
        - 'darshan'
      darshan-util:
        conflict:
        - 'darshan'
      projections:
        all: '{name}/{version}-{compiler.name}-{compiler.version}'
  mirrors:
    cori-e4s-21.02: /global/common/software/spackecp/mirrors/cori-e4s-21.02
  compilers:
  - compiler:
      spec: intel@19.1.2.254
      paths:
        cc: cc
        cxx: CC
        f77: ftn
        fc: ftn
      flags: {}
      operating_system: cnl7
      target: any
      modules:
      - PrgEnv-intel
      - intel/19.1.2.254
      environment: {unset: []}
      extra_rpaths: []
  - compiler:
      spec: gcc@10.1.0
      paths:
        cc: cc
        cxx: CC
        f77: ftn
        fc: ftn
      operating_system: cnl7
      modules:
      - PrgEnv-gnu
      - gcc/10.1.0
  
  definitions:
  - intel_compiler: ['%intel@19.1.2.254']
  - gcc_compiler: ['%gcc@10.1.0']
  - e4s_intel:    
    - adios2@2.7.1 +hdf5 
    - aml@0.1.0
    - arborx@0.9-beta +openmp     
    - bolt@2.0
    - caliper@2.5.0 +fortran
    - faodel@1.1906.1
    - flecsi@1.4 +cinch +caliper +graphviz +tutorial
    - flit@2.1.0  
    - gasnet@2020.3.0 +udp
    - ginkgo@1.3.0
    - gotcha@1.0.3 +test
    - hdf5@1.10.7    
    - hypre@2.20.0 +mixedint +superlu-dist +openmp
    - libnrm@0.1.0
    - libquo@1.3.1
    - mercury@2.0.0 +udreg
    - mfem@4.2.0 +examples +gnutls +gslib +lapack +libunwind +openmp +threadsafe +pumi +umpire    
    - ninja@1.10.2
    - omega-h@9.32.5 ~trilinos
    - openpmd-api@0.13.2 
    - papi@6.0.0.1 +example +static_tools +powercap +infiniband
    - papyrus@1.0.1 
    - pdt@3.25.1 +pic 
    - precice@2.2.0 +python
    - pumi@2.2.5 +fortran
    - qthreads@1.16 ~hwloc
    - raja@0.13.0 +tests    
    - slepc@3.14.2
    - strumpack@5.1.1 +shared
    - sundials@5.7.0 +examples-cxx +hypre +klu +lapack 
    - superlu@5.2.1
    - superlu-dist@6.4.0 +openmp
    - swig@4.0.2-fortran
    - tasmanian@7.3 +blas +fortran +mpi +python +xsdkflags
    - tau@2.30.1 +mpi ~pdt
    - turbine@1.2.3 +hdf5 +python
    - umap@2.1.0 +tests
    - umpire@4.1.2 +fortran +numa +openmp
    - upcxx@2020.10.0
    - zfp@0.5.5 +aligned +c +fortran +openmp +profile
        
  - e4s_gcc:
    - darshan-runtime@3.2.1 +slurm 
    - darshan-util@3.2.1 +bzip2    
    - dyninst@10.2.1 
    - legion@20.03.0 
    - plasma@20.9.20 
    - slate@2020.10.00 ~cuda  
    

    # skipping package
    #  - adios@1.13.1 +bzip2 +fortran +hdf5 +netcdf
    # - kokkos-kernels@3.2.00 +mkl +openmp
    # - kokkos@3.2.00  +compiler_warnings +deprecated_code +examples +hwloc +memkind +numactl +openmp +pic +tests    
    # - openmpi@4.0.5 +cxx +thread_multiple schedulers=slurm
    # - parallel-netcdf@1.12.1 +burstbuffer
    # - petsc@3.14.4 +X +fftw +jpeg +libpng +libyaml +memkind 
    # - py-jupyterhub@1.0.0
    # - py-libensemble@0.7.1 +mpi +nlopt +petsc4py  +scipy
    # - py-petsc4py@3.14.1
    # - trilinos@13.0.1 

    # _______________________________ ISSUES TO SOLVE _______________________________
    # issue installing vtkh using intel compiler
    # using the gcc compiler, ascent has a dependency on conduit@develop which is not pinned to a version. There was a build error related to a missing HDF5 library
    # - ascent@0.6.0
    
    # Issue detecting the fortran compiler https://cdash.spack.io/viewConfigure.php?buildid=105216. Also an issue installing conduit since it's tied to 'develop'. Tried using conduit@0.7.1 and it still failed, see https://cdash.spack.io/viewBuildError.php?buildid=105206. Talk to @cyrush on spack slack.
    # - axom@0.4.0 +mfem +python

    # /usr/lib64/gcc/x86_64-suse-linux/7/../../../../x86_64-suse-linux/bin/ld: /usr/lib/libm.so: error adding symbols: file in wrong format see https://cdash.spack.io/buildSummary.php?buildid=104952
    # - adios2@2.7.1 +hdf5 +dataman +dataspaces   

    # skipping this version for now: 5.7 is the latest version in the spack repo, 5.8 doesn't exist
    # - globalarrays@5.8 +blas +lapack +scalapack
    
    #  Warning: Linking the shared library libhpcrun.la against the static library see  https://cdash.spack.io/viewBuildError.php?buildid=104938
    # - hpctoolkit@2020.08.03 %gcc

    # error: identifier "HPX_SMT_PAUSE" is undefined see https://cdash.spack.io/viewBuildError.php?buildid=105747
    # - hpx@1.6.0 +async_mpi +examples

    # Error building legion with intel compiler https://cdash.spack.io/viewBuildError.php?buildid=105190
    # - legion@20.03.0

    # skip magma because it's a GPU package
    #- magma@2.5.4 cuda_arch=70 ^cuda@10.2.89
   
    # error with intel compiler:  building dtcmp 
    # error with gnu compiler: Could NOT find LibCircle (missing: LibCircle_LIBRARIES
    # - mpifileutils@0.10.1

    # /global/cfs/cdirs/m3503/spack-NSewtxLx/spack-stage/siddiq90/spack-stage-phist-1.9.3-rznbmfuo2mt2erku4rit4peyqxu7iji4/spack-src/fortran_bindings/test/kernels.F90(63): catastrophic error: **Internal compiler error: internal abort** Please report this error along with the circumstances in which it occurred in a Software Problem Report.  Note: File and line given may not be explicit cause of this error. see https://cdash.spack.io/buildSummary.php?buildid=104915
    # - phist@1.9.3

    
    # configure: error: Failed to find C MPI Wrapper. see https://cdash.spack.io/buildSummary.php?buildid=104940
    # - rempi@1.1.0  

    # "%intel@19:" conflicts with "slate" [Does not currently build with icpc >= 2019]
    # - slate@2020.10.00 ^cuda@10.2.89

    # build Error with Intel: error building dtcmp: configure: error: C compiler cannot create executables
    # build Error with GCC: make[2]: *** No rule to make target '/global/cfs/cdirs/m3503/spack-qhLmtUlQ/spack_path_placeholder/spack_path_placeholder/spack_path_placeholder/spack_path_placehol/cray-cnl7-haswell/gcc-10.1.0/libyogrt-1.24-6wngjuplxnjjsivzvilwjsp4gwu4ziuj/lib/libyogrt.a', needed by 'examples/test_ckpt_F'.  Stop. see https://cdash.spack.io/buildSummary.php?buildid=104889
    # - scr@2.0.0
    # error installing ant Error: JAVA_HOME is not defined correctly. https://software.nersc.gov/NERSC/e4s-2102/-/jobs/87103
    # - stc@0.8.3 
    
    # Issue building sundials with raja support see https://cdash.spack.io/viewBuildError.php?buildid=105455. Not sure if raja support with sundials is neccessary.
    # sundials@5.7.0 +examples-cxx +examples-f2003 +f2003 +hypre +klu +lapack +openmp +raja +superlu-dist

    # https://cdash.spack.io/viewBuildError.php?buildid=105513
    # - sz@2.1.11.1 +fortran +python +time_compression +random_access +pastri

    # issue building tau with intel see https://cdash.spack.io/viewBuildError.php?buildid=105235 one of the error points to missing `-lpdb` library. Tau has `+pdt` enabled by default
    # - tau@2.30.1 +adios2 +gasnet +likwid  +ompt +openmp +mpi  +python +scorep +shmem +sqlite

    #  Could NOT find AXL (missing: AXL_LIBRARIES AXL_INCLUDE_DIRS) see https://cdash.spack.io/buildSummary.php?buildid=105476
    # - veloc@1.4

    # issue configure: error: "Couldn't find MPI" see https://cdash.spack.io/viewConfigure.php?buildid=105491
    # issue installing mercury (dependency) for unifyfs with gcc https://cdash.spack.io/buildSummary.php?buildid=105497
    # - unifyfs@0.9.1

  specs:
  - matrix:
    - [$e4s_intel]
    - [$intel_compiler]
  - matrix:    
    - [$e4s_gcc]
    - [$gcc_compiler]

  packages:
    all:
      compiler: [intel@19.1.2.254, gcc@10.1.0]
      target: [haswell]
      providers:
        mpi: [mpich]
        mkl: [cray-libsci, intel-mkl]
        blas: [cray-libsci, intel-mkl]
        scalapack: [cray-libsci, intel-mkl]
        pkgconfig: [pkg-config]
    
    cray-libsci:
      buildable: false
      externals:
      - spec: cray-libsci@19.06.1%intel
        modules:
        - cray-libsci/19.06.1

    fftw:
      buildable: false
      externals:
      - spec: fftw@3.3.8.4%intel
        modules:
        - cray-fftw/3.3.8.4

    hdf5:
      variants: +cxx +debug +fortran +szip +threadsafe +hl
      
    hwloc:
      buildable: false
      externals:
      - spec: hwloc
        prefix: /usr
    intel-mkl:
      buildable: false
      externals:
      - spec: intel-mkl@19.1.2.254
        modules:
        - intel/19.1.2.254

    mpich:
      buildable: false
      externals:
      - spec: mpich@3.1
        modules:
        - cray-mpich/7.7.10
    
    netcdf-c:
      buildable: false
      externals:
        - spec: netcdf-c@4.7.4
          modules:
          - cray-netcdf/4.7.4.0

    openssl:
      buildable: false
      externals:
      - spec: openssl@1.1.1g
        prefix: /usr
    # issue installing version 5.32.1 and confirmed 5.32.0 works
    perl:
      version: [5.32.0]

    # disable the slate variant in strumpack since slate can't build with icc >= 19
    strumpack:
      variants: ~slate

Cori E4S 20.10

Production Spack Environment

spack:
  concretization: separately
  view: false
  config:
    install_tree: /global/common/software/spackecp/e4s-20.10/software
    build_stage: $tempdir/$user/spack-stage
    module_roots:
      tcl: /global/common/software/spackecp/e4s-20.10/modules/
  mirrors::
    e4s-2020-10: /global/common/software/spackecp/mirrors/e4s-2020-10
  modules:
    enable:
    - tcl
    tcl:
      hash_length: 8
      projections:
        all: '{name}/{version}-{compiler.name}-{compiler.version}'
      all:
        conflict:
        - '{name}'
        filter:
          environment_blacklist: []
        load: []
        environment:
          unset: []
      verbose: false
      whitelist: []
      blacklist: []
      blacklist_implicits: false
  
  definitions:
  - e4s:   
    - adios2@2.6.0 
    - aml@0.1.0
    - arborx@0.9-beta +openmp
    - bolt@1.0
    - caliper@2.4.0
    - darshan-runtime@3.2.1 +slurm
    - darshan-util@3.2.1 +bzip2 
    - flit@2.1.0
    - gasnet@2020.3.0 +udp
    - ginkgo@1.2.0
    - globalarrays@5.7 +blas +lapack +scalapack
    - gotcha@1.0.3 +test
    - hdf5@1.10.6 +cxx +debug +fortran +szip +threadsafe +hl
    - hypre@2.20.0 +mixedint
    - kokkos-kernels@3.2.00 +mkl +openmp
    - kokkos@3.2.00 +debug +debug_dualview_modify_check +compiler_warnings +examples +hwloc +memkind +numactl +openmp +pic +tests
    - libnrm@0.1.0
    - libquo@1.3.1
    - mercury@1.0.1 +udreg
    - mfem@4.1.0 +examples +gnutls +gslib +lapack +libunwind +openmp +threadsafe +pumi +umpire  
    - ninja@1.10.1
    - openpmd-api@0.12.0
    - papi@6.0.0.1 +example +static_tools
    - parallel-netcdf@1.12.1
    - pdt@3.25.1 +pic
    - petsc@3.14.0    
    - pumi@2.2.2 +fortran
    - py-libensemble@0.7.0 +mpi +nlopt +scipy
    - py-petsc4py@3.13.0
    - qthreads@1.14 ~hwloc
    - raja@0.12.1
    - slepc@3.14.0 
    - stc@0.8.3 
    - sundials@5.4.0 +examples-cxx +examples-f2003 ~examples-f77 +f2003 +klu +openmp +hypre +lapack 
    - superlu@5.2.1
    - superlu-dist@6.3.1
    - swig@4.0.2
    - sz@2.1.10 +fortran +hdf5 +python +time_compression +random_access +netcdf +pastri
    - tasmanian@7.3 +blas +fortran +mpi +python +xsdkflags 
    - turbine@1.2.3 +hdf5 +python
    - umap@2.1.0 +tests
    - umpire@4.0.1 +fortran +numa +openmp
    - upcxx@2020.3.0
    - veloc@1.4
    - zfp@0.5.5

    
  # - adios@1.13.1 +netcdf +szip +fortran +bzip2  module already installed
  # - ascent@develop skipping package because it's on develop
  # - axom@0.3.3 skip build for now, spack ci rebuild issue
  # issue with concretization of dyninst:  "%intel" conflicts with "dyninst" see https://software.nersc.gov/ecp/nersc-e4s/-/jobs/46526
  # - dyninst@10.2.1 +static
  # - faodel@1.1906.1 network=libfabric issue with build https://software.nersc.gov/NERSC/nersc-e4s/-/jobs/60284
  # flecsi concretization issue: https://github.com/spack/spack/issues/19292
  # - flecsi@1 +cinch +coverage +doc +doxygen +graphviz +hdf5 +tutorial
  # - hpctoolkit@2020.08.03 +all-static +cray +mpi  # depends on dyninst
  # - hpx@1.5.1 issue with installing boost
  # - legion@20.03.0 failed to build, see https://software.nersc.gov/ecp/e4s/facilitypipelines/nersc-e4s/-/jobs/59787
  # - magma@2.5.3 this package requires GPU, this is not applicable for Cori
  # - mpifileutils@0.10.1 +gpfs +lustre # fails on libcircle requires MPICC wrapper see https://software.nersc.gov/ecp/e4s/facilitypipelines/nersc-e4s/-/jobs/57907 we could install OpenMPI and build this with the wrapper.
  # - openmpi  # skip openmpi
  # - omega-h@9.29.0
  # - papi@6.0.0.1 +example +static_tools   module already installed  
  # - phist@1.9.1 see https://software.nersc.gov/NERSC/nersc-e4s/-/jobs/63409
  # concretization issue "%intel" conflicts with "plasma" see https://software.nersc.gov/ecp/nersc-e4s/-/jobs/46533
  # - plasma@20.9.20
  # - precice@2.1.0 issue finding PETSC
  # - py-jupyterhub@1.0.0
  # error building rempi https://software.nersc.gov/ecp/nersc-e4s/-/jobs/32884
  # - rempi@1.1.0
  # - scr@2.0.0  # async_api=CRAY_DW issue finding DATAWARP libraries -- Could NOT find DATAWARP (missing: DATAWARP_LIBRARIES DATAWARP_INCLUDE_DIRS) see https://software.nersc.gov/ecp/e4s/facilitypipelines/nersc-e4s/-/jobs/59782
  # - slate@develop package tied to develop skipping this build
  # - strumpack@4.0.0 +shared ~butterflypack ~cuda +count_flops +build_dev_tests +build_tests 
  # error building otf2; we can disable otf2 support in tau
  # - tau@2.29 +craycnl +openmp +mpi
  # - trilinos@13.0.0 +debug +float +openmp +pnetcdf +zlib
  # - unifyfs@0.9.0 +hdf5. can't find MPICC see https://software.nersc.gov/NERSC/nersc-e4s/-/jobs/63408
  
  - arch:
    - '%intel@19.1.2.254 arch=cray-cnl7-haswell'
  specs:
  - matrix:
    - - $e4s
    - - $arch

  compilers:
  - compiler:
      spec: intel@19.1.2.254
      paths:
        cc: cc
        cxx: CC
        f77: ftn
        fc: ftn
      flags: {}
      operating_system: cnl7
      target: any
      modules:
      - PrgEnv-intel
      - intel/19.1.2.254
      environment: {unset: []}
      extra_rpaths: []


  packages:
    all:
      compiler: [intel@19.1.2.254]
      target: [haswell]
      providers:
        mpi: [mpich]
        mkl: [intel-mkl, cray-libsci]
        blas: [intel-mkl, cray-libsci]
        scalapack: [intel-mkl, cray-libsci]
        pkgconfig: [pkg-config]
            
    berkeley-db:
      version: [18.1.4]

    boost:
      version: [1.74.0]   

    bzip2:
      buildable: false
      externals:
      - spec: bzip2
        prefix: /usr

    cmake:
      version: [3.16.5]  # issue with cmake 3.17.3 using 3.16.5 for now see https://github.com/spack/spack/issues/17605  

    cray-libsci:
      buildable: false
      externals:
      - spec: cray-libsci@19.06.1%intel
        modules:
        - cray-libsci/19.06.1
    
    diffutils:
      version: [3.7]
    
    elfutils:
      version: [0.180]
    
    expat:
      version: [2.2.9]

    fftw:
      buildable: false
      externals:
      - spec: fftw@3.3.8.4%intel
        modules: 
          - cray-fftw/3.3.8.4
    
    gdbm:
      version: [1.18.1]
    
    gettext:
      buildable: false      
      externals:
      - spec: gettext
        prefix: /usr
    
    help2man:
      version: [1.47.11]

    hwloc:
      buildable: false
      externals:
      - spec: hwloc
        prefix: /usr 
    
    hypre:
      version: [2.20.0]

    intel-mkl:
      buildable: false
      externals:
      - spec: intel-mkl@19.1.2.254
        modules:
          - intel/19.1.2.254    
    
    libbsd:
      version: [0.10.0]

    libiconv:
      version: [1.16]
    
    libsigsegv:
      version: [2.12]

    libxml2:
      version: [2.9.10]  

    lz4:
      buildable: false
      externals:
      - spec: lz4
        prefix: /usr

    m4:
      buildable: false
      externals:
      - spec: m4
        prefix: /usr  

    mpi:
      buildable: false

    mpich:
      buildable: false
      externals:
      - spec: mpich@3.1
        modules:
        - cray-mpich/7.7.10 

    openssl:
      buildable: false
      externals:
      - spec: openssl@1.1.1g
        prefix: /usr 
     
    ncurses:
      version: [6.2]  

    netcdf:
      buildable: false
      externals:
      - spec: netcdf@4.6.3.2%intel
        modules: 
          - cray-netcdf/4.6.3.2
       
    perl:
      buildable: false
      externals:
      - spec: perl
        prefix: /usr 

    petsc:
      version: [3.14.0]

    pdsh: # required for scr 
      buildable: false
      externals:
      - spec: pdsh
        prefix: /usr    
        
    pkgconf:
      version: [1.7.3]
    
    sqlite:
      version: [3.31.1]
        
    tar:
      buildable: false
      externals:
      - spec: tar
        prefix: /usr

    zlib:
      version: [1.2.11]

How To Guide

How to set up a schedule pipeline

First go to CI/CD > Schedules and create a schedule pipeline. You should see a similar pipeline for another stack. The schedule pipeline contains a unique variable PIPELINE_NAME, which is the name of the E4S stack to run. The value is in all caps, so if you want to trigger the E4S 23.05 stack for Perlmutter, the value will be PERLMUTTER_E4S_23.05. Please make sure the variable PIPELINE_NAME matches the one you defined in .gitlab-ci.yml for your job. The pipeline can also be run via the web interface; if you choose this route, just set PIPELINE_NAME to the appropriate value.

We support both schedule pipelines and the web interface so that a pipeline can be triggered automatically at a scheduled interval or manually, without requiring a commit. This is useful when one needs to check whether the pipeline can rebuild at any given time, for instance after a system change.
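
For reference, here is a minimal sketch of what such a job in .gitlab-ci.yml might look like; the job name, stage, and script lines are hypothetical, while the tag matches the Perlmutter runner described below:

perlmutter-e4s-23.05:
  stage: deploy
  tags: [perlmutter-e4s]
  rules:
    - if: $PIPELINE_NAME == "PERLMUTTER_E4S_23.05"
  script:
    # hypothetical deployment steps: activate the stack's environment and build it
    - spack env activate -d spack-configs/perlmutter-e4s-23.05/gcc
    - spack install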

How to find available runners

You can find all runners by going to Settings > CI/CD > Runners.

This project is configured with gitlab runners for Perlmutter and Muller using the production account e4s. Shown below are the available runners.

Gitlab Runner by NERSC system

System         Runner Name
Perlmutter     perlmutter-e4s
Muller         muller-e4s

The runner configuration files are located in the ~/.gitlab-runner directory of the e4s user.

How to register gitlab runner

We have a script named register.sh that is responsible for registering a gitlab runner. The script expects a registration token, which can be found at Settings > CI/CD > Runners. Shown below is the script used to register the runner on Perlmutter; once you execute the script, you will be prompted for the registration token.

e4s:login37> cat ~/register.sh
#!/bin/bash

read -sp "Registration Token?" TOKEN
gitlab-runner register \
    --url https://software.nersc.gov \
    --registration-token ${TOKEN} \
    --tag-list ${NERSC_HOST}-${USER} \
    --name "E4S Runner on Perlmutter" \
    --executor custom \
    --custom-config-exec "/global/homes/e/e4s/jacamar/binaries/jacamar-auth" \
    --custom-config-args "-u config --configuration /global/homes/e/e4s/.gitlab-runner/jacamar.toml" \
    --custom-prepare-exec "/global/homes/e/e4s/jacamar/binaries/jacamar-auth" \
    --custom-prepare-args "prepare" \
    --custom-run-exec  "/global/homes/e/e4s/jacamar/binaries/jacamar-auth" \
    --custom-run-args "run" \
    --custom-cleanup-exec "/global/homes/e/e4s/jacamar/binaries/jacamar-auth" \
    --custom-cleanup-args "cleanup --configuration /global/homes/e/e4s/.gitlab-runner/jacamar.toml" \
    --config /global/homes/e/e4s/.gitlab-runner/perlmutter.config.toml
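
Once the runner is registered, it can be started by pointing gitlab-runner at the generated configuration file. A sketch, assuming the config path used above:

e4s:login37> gitlab-runner run --config /global/homes/e/e4s/.gitlab-runner/perlmutter.config.toml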

Spack Training

Goal

The goal of this training is to provide guidance on how one can use Spack to install packages and manage a software stack on Perlmutter. We will cover the following topics:

  • User Environment

  • Defining Compilers in Spack

  • Define Package Preference and Externals

  • Create a source mirror

  • Building CUDA packages

  • Generating modulefiles

After completing the training, one can expect to be familiar with the customizations needed for an optimal Spack experience on Perlmutter.

Pre-Requisite

To perform this training, you need a NERSC account and access to Perlmutter. We assume you already have a basic understanding of Spack.

Setup

To get started, please connect to Perlmutter via ssh. Once you have access, please clone the following Git repository into your $HOME directory.

git clone https://github.com/NERSC/spack-infrastructure.git

User Environment

Spack builds can be sensitive to your user environment and any configuration setup in your shell startup files. We recommend you review your startup configuration files. Some things to look out for are the following:

  1. Loading or unloading of any modules

  2. Activating a Python or Conda environment

  3. Any user environment variables such as $PATH

Note

We have seen that purging modules (module purge) can break Spack builds by removing most of the Cray programming environment. For more details see spack/#27124.
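
If you have already purged your modules, Lmod can usually restore the system defaults with its reset command; a sketch:

elvis@login34> module reset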

When performing Spack builds, we encourage using the default modules. This should look as follows:

elvis@login34> module list

Currently Loaded Modules:
  1) craype-x86-milan     4) perftools-base/22.06.0                 7) craype/2.7.16      10) cray-libsci/21.08.1.2  13) darshan/3.3.1 (io)
  2) libfabric/1.15.0.0   5) xpmem/2.3.2-2.2_7.5__g93dd7ee.shasta   8) cray-dsmml/0.2.2   11) PrgEnv-gnu/8.3.3
  3) craype-network-ofi   6) gcc/11.2.0                             9) cray-mpich/8.1.17  12) xalt/2.10.2

  Where:
   io:  Input/output software

To set up our environment, let’s source the setup script, which will create a new Python virtual environment for performing the Spack builds. Please run the following commands:

elvis@login34> cd spack-infrastructure/
elvis@login34> source setup-env.sh
Collecting clingo
  Using cached clingo-5.5.2-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.2 MB)
Collecting cffi
  Using cached cffi-1.15.1-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl (402 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Installing collected packages: pycparser, cffi, clingo
Successfully installed cffi-1.15.1 clingo-5.5.2 pycparser-2.21
WARNING: You are using pip version 20.2.3; however, version 21.3.1 is available.
You should consider upgrading via the '/global/homes/e/elvis/spack-infrastructure/spack-pyenv/bin/python3 -m pip install --upgrade pip' command.
/global/homes/e/elvis/spack-infrastructure/spack-pyenv/bin/python
Package    Version
---------- -------
cffi       1.15.1
clingo     5.5.2
pip        20.2.3
pycparser  2.21
setuptools 44.1.1
WARNING: You are using pip version 20.2.3; however, version 21.3.1 is available.
You should consider upgrading via the '/global/homes/e/elvis/spack-infrastructure/spack-pyenv/bin/python3 -m pip install --upgrade pip' command.

The setup-env.sh script will install clingo, which is required by Spack, into your Python environment, along with a few other configurations relevant for building with Spack.

Note

Spack normally bootstraps clingo on its own; however, we observed issues where Spack was unable to bootstrap clingo (see spack/28315). We found that installing clingo as a Python package addressed the issue.
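
If you are managing your own virtual environment, the equivalent manual step is installing the clingo package from PyPI, which is what setup-env.sh does for you; a sketch:

python3 -m pip install clingo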

Acquiring Spack

Clone the following Spack branch from the Git repository and source the setup script.

git clone -b e4s-22.05 https://github.com/spack/spack.git
source spack/share/spack/setup-env.sh

Once you have acquired Spack and sourced the activation script, please run the following commands to ensure your setup is correct. We have configured the environment variable SPACK_PYTHON to point to the Python interpreter in the virtual environment.

(spack-pyenv) elvis@login34> spack --version
0.18.0.dev0 (6040c82740449632aa1d6faab08f93f5e4c54615)

(spack-pyenv) elvis@login34> echo $SPACK_PYTHON
/global/homes/e/elvis/spack-infrastructure/spack-pyenv/bin/python

(spack-pyenv) elvis@login34> which python
/global/homes/e/elvis/spack-infrastructure/spack-pyenv/bin/python

The command below will print the full path to the Python interpreter used by Spack, which should match the path set by the environment variable SPACK_PYTHON.

(spack-pyenv) elvis@login34> spack-python --path
/global/homes/e/elvis/spack-infrastructure/spack-pyenv/bin/python

Creating a Spack Environment

When using Spack, you may be tempted to start installing packages via spack install in your Spack instance. Note that it’s best to organize your Spack stacks in their own Spack environments, similar to how one would organize Python or Conda environments.

Let’s start by creating a Spack environment named data_viz, and activating it.

spack env create data_viz
spack env activate data_viz

Upon completion you should confirm the output of spack env status matches the following:

(spack-pyenv) elvis@login34> spack env status
==> In environment data_viz

Let’s navigate to the directory for Spack environment data_viz. You will see a file spack.yaml that is used to specify your Spack configuration. This includes configuration options such as which compilers to use in your Spack builds.

(spack-pyenv) elvis@login34> spack cd -e data_viz
(spack-pyenv) elvis@login34> ls -l
total 1
-rw-rw-r-- 1 elvis elvis 199 Aug  3 19:09 spack.yaml

Defining Compilers

To build packages with Spack, one must define a list of compilers. On Perlmutter, we have the gcc/11.2.0 and cce/13.0.2 compilers available as modulefiles, which correspond to the GCC and Cray compilers. To specify a compiler definition, we must use the corresponding PrgEnv-* module.

(spack-pyenv) elvis@login34> ml -t av gcc/11.2.0 cce/13.0.2
/opt/cray/pe/lmod/modulefiles/core:
cce/13.0.2
gcc/11.2.0

Let’s add the following content to spack.yaml. Please open the file in your preferred editor and paste the contents. Note that cc, cxx, f77, and fc point to the Cray compiler wrappers (cc, CC, and ftn); for the cce compiler we specify their full paths.

# This is a Spack Environment file.
#
# It describes a set of packages to be installed, along with
# configuration settings.
spack:
  config:
    view: false
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo

  compilers:
  - compiler:
      spec: gcc@11.2.0
      paths:
        cc: cc
        cxx: CC
        f77: ftn
        fc: ftn
      flags: {}
      operating_system: sles15
      target: any
      modules:
      - PrgEnv-gnu
      - gcc/11.2.0
      - craype-x86-milan
      - libfabric
      extra_rpaths: []
  - compiler:
      spec: cce@13.0.2
      paths:
        cc: /opt/cray/pe/craype/default/bin/cc
        cxx: /opt/cray/pe/craype/default/bin/CC
        f77: /opt/cray/pe/craype/default/bin/ftn
        fc: /opt/cray/pe/craype/default/bin/ftn
      flags: {}
      operating_system: sles15
      target: any
      modules:
      - PrgEnv-cray
      - cce/13.0.2
      - craype-x86-milan
      - libfabric
      environment: {}
      extra_rpaths: []

  # add package specs to the `specs` list
  specs: []
  packages:
    all:
      compiler: [gcc@11.2.0, cce@13.0.2]

  view: true

Note

The directory /opt/cray/pe/craype/default resolves to the default Cray programming environment (craype), in this case 2.7.16, and the cc wrapper should come from this corresponding directory.

(spack-pyenv) elvis@login34> ls -ld /opt/cray/pe/craype/default
lrwxrwxrwx 1 root root 6 Jun  1 14:56 /opt/cray/pe/craype/default -> 2.7.16

(spack-pyenv) elvis@login34> which cc
/opt/cray/pe/craype/2.7.16/bin/cc

On Perlmutter, the craype/2.7.16 modulefile, which is loaded by default as shown below, is responsible for providing the Cray wrappers:

(spack-pyenv) elvis@login34> ml -t list craype/2.7.16
craype/2.7.16

If this modulefile is removed, you will not have access to the Cray wrappers cc, CC, or ftn, which may result in several errors.

Now let’s check all available compilers by running spack compiler list:

(spack-pyenv) elvis@login34> spack compiler list
==> Available compilers
-- cce sles15-any -----------------------------------------------
cce@13.0.2

-- gcc sles15-any -----------------------------------------------
gcc@11.2.0

Package Preference

Now let’s try to run spack spec -Il hdf5. You will notice Spack will try to install all the packages from source, including dependencies that should not be built but rather set as external packages. For instance, utilities like openssl, bzip2, diffutils, openmpi, and openssh should not be installed from source. We have documented Recommended External Packages for Spack, which outlines a list of packages where we recommend using the NERSC system installations.

(spack-pyenv) elvis@login34> spack spec -Il hdf5
Input spec
--------------------------------
 -   hdf5

Concretized
--------------------------------
 -   z4dfikd  hdf5@1.12.2%gcc@11.2.0~cxx~fortran~hl~ipo~java+mpi+shared~szip~threadsafe+tools api=default build_type=RelWithDebInfo arch=cray-sles15-zen3
 -   auepzq2      ^cmake@3.23.1%gcc@11.2.0~doc+ncurses+ownlibs~qt build_type=Release arch=cray-sles15-zen3
 -   2t22mc5          ^ncurses@6.2%gcc@11.2.0~symlinks+termlib abi=none arch=cray-sles15-zen3
 -   nugfov2              ^pkgconf@1.8.0%gcc@11.2.0 arch=cray-sles15-zen3
 -   i2r3jpl          ^openssl@1.1.1o%gcc@11.2.0~docs~shared certs=system arch=cray-sles15-zen3
 -   ekj3iat              ^perl@5.34.1%gcc@11.2.0+cpanm+shared+threads arch=cray-sles15-zen3
 -   hafeanv                  ^berkeley-db@18.1.40%gcc@11.2.0+cxx~docs+stl patches=b231fcc arch=cray-sles15-zen3
 -   blbwwl4                  ^bzip2@1.0.8%gcc@11.2.0~debug~pic+shared arch=cray-sles15-zen3
 -   gvbyw6w                      ^diffutils@3.8%gcc@11.2.0 arch=cray-sles15-zen3
 -   3xwztgy                          ^libiconv@1.16%gcc@11.2.0 libs=shared,static arch=cray-sles15-zen3
 -   bxrz7zm                  ^gdbm@1.19%gcc@11.2.0 arch=cray-sles15-zen3
 -   avhrefq                      ^readline@8.1%gcc@11.2.0 arch=cray-sles15-zen3
 -   ozmcyfj                  ^zlib@1.2.12%gcc@11.2.0+optimize+pic+shared patches=0d38234 arch=cray-sles15-zen3
 -   gdm5qma      ^openmpi@4.1.3%gcc@11.2.0~atomics~cuda~cxx~cxx_exceptions~gpfs~internal-hwloc~java~legacylaunchers~lustre~memchecker~pmi+pmix+romio+rsh~singularity+static+vt+wrapper-rpath fabrics=none schedulers=none arch=cray-sles15-zen3
 -   6rkjosk          ^hwloc@2.7.1%gcc@11.2.0~cairo~cuda~gl~libudev+libxml2~netloc~nvml~opencl+pci~rocm+shared arch=cray-sles15-zen3
 -   oyeiwvg              ^libpciaccess@0.16%gcc@11.2.0 arch=cray-sles15-zen3
 -   56oycjj                  ^libtool@2.4.7%gcc@11.2.0 arch=cray-sles15-zen3
 -   flsruli                      ^m4@1.4.19%gcc@11.2.0+sigsegv patches=9dc5fbd,bfdffa7 arch=cray-sles15-zen3
 -   wcuq435                          ^libsigsegv@2.13%gcc@11.2.0 arch=cray-sles15-zen3
 -   koitq65                  ^util-macros@1.19.3%gcc@11.2.0 arch=cray-sles15-zen3
 -   u2ai4xj              ^libxml2@2.9.13%gcc@11.2.0~python arch=cray-sles15-zen3
 -   tyswlp4                  ^xz@5.2.5%gcc@11.2.0~pic libs=shared,static arch=cray-sles15-zen3
 -   w2itznc          ^libevent@2.1.12%gcc@11.2.0+openssl arch=cray-sles15-zen3
 -   t4jyphv          ^numactl@2.0.14%gcc@11.2.0 patches=4e1d78c,62fc8a8,ff37630 arch=cray-sles15-zen3
 -   al4xc7v              ^autoconf@2.69%gcc@11.2.0 patches=35c4492,7793209,a49dd5b arch=cray-sles15-zen3
 -   2uxxcnx              ^automake@1.16.5%gcc@11.2.0 arch=cray-sles15-zen3
 -   w5aq2sc          ^openssh@9.0p1%gcc@11.2.0 arch=cray-sles15-zen3
 -   mkoju5b              ^libedit@3.1-20210216%gcc@11.2.0 arch=cray-sles15-zen3
 -   t3wpbom          ^pmix@4.1.2%gcc@11.2.0~docs+pmi_backwards_compatibility~restful arch=cray-sles15-zen3

Let’s try to update our Spack configuration with the external packages as follows:

# This is a Spack Environment file.
#
# It describes a set of packages to be installed, along with
# configuration settings.
spack:
  config:
    view: false
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo

  compilers:
  - compiler:
      spec: gcc@11.2.0
      paths:
        cc: cc
        cxx: CC
        f77: ftn
        fc: ftn
      flags: {}
      operating_system: sles15
      target: any
      modules:
      - PrgEnv-gnu
      - gcc/11.2.0
      - craype-x86-milan
      - libfabric
      extra_rpaths: []
  - compiler:
      spec: cce@13.0.2
      paths:
        cc: /opt/cray/pe/craype/default/bin/cc
        cxx: /opt/cray/pe/craype/default/bin/CC
        f77: /opt/cray/pe/craype/default/bin/ftn
        fc: /opt/cray/pe/craype/default/bin/ftn
      flags: {}
      operating_system: sles15
      target: any
      modules:
      - PrgEnv-cray
      - cce/13.0.2
      - craype-x86-milan
      - libfabric
      environment: {}
      extra_rpaths: []

  # add package specs to the `specs` list
  specs: []
  packages:
    all:
      compiler: [gcc@11.2.0, cce@13.0.2]
    bzip2:
      version: [1.0.6]
      externals:
      - spec: bzip2@1.0.6
        prefix: /usr
    diffutils:
      version: [3.6]
      externals:
      - spec: diffutils@3.6
        prefix: /usr
    findutils:
      version: [4.6.0]
      externals:
      - spec: findutils@4.6.0
        prefix: /usr
    openssl:
      version: [1.1.0i]
      buildable: false
      externals:
      - spec: openssl@1.1.0i
        prefix: /usr
    openssh:
      version: [7.9p1]
      buildable: false
      externals:
      - spec: openssh@7.9p1
        prefix: /usr
    readline:
      version: [7.0]
      buildable: false
      externals:
      - spec: readline@7.0
        prefix: /usr
    tar:
      version: [1.30]
      buildable: false
      externals:
      - spec: tar@1.30
        prefix: /usr
    unzip:
      version: [6.0]
      buildable: false
      externals:
      - spec: unzip@6.0
        prefix: /usr

  view: true

Many software packages depend on MPI, BLAS, PMI, and libfabric, and these are typically available on Perlmutter. Shown below is a breakdown of each provider and its corresponding module typically available on Perlmutter:

  • MPI: cray-mpich

  • BLAS: cray-libsci

  • PMI: cray-pmi

  • libfabric: libfabric

Shown below are the corresponding modules that you should consider when setting up external packages.

(spack-pyenv) elvis@login34> ml -d av cray-mpich cray-libsci cray-pmi libfabrics

--------------------------------------------------- Cray Compiler/Network Dependent Packages ----------------------------------------------------
   cray-mpich-abi/8.1.17    cray-mpich/8.1.17 (L)

--------------------------------------------------------------- Cray Core Modules ---------------------------------------------------------------
   cray-libsci/21.08.1.2 (L)    cray-pmi-lib/6.0.17    cray-pmi/6.1.3

  Where:
   L:  Module is loaded

Use "module spider" to find all possible modules and extensions.
Use "module keyword key1 key2 ..." to search for all possible modules matching any of the "keys".

In Spack, you can use the spack providers command to find the Spack packages that map to a given provider. These are referred to as virtual packages; each virtual package represents functionality that can be provided by a collection of Spack packages.

(spack-pyenv) elvis@login34> spack providers
Virtual packages:
    D     daal      flame  glu     iconv  jpeg     lua-lang        mkl  mysql-client  osmesa  pkgconfig  sycl  unwind  yacc
    awk   elf       fuse   glx     ipp    lapack   luajit          mpe  onedal        pbs     rpc        szip  uuid    ziglang
    blas  fftw-api  gl     golang  java   libllvm  mariadb-client  mpi  opencl        pil     scalapack  tbb   xxd

For instance, if you want to see all the MPI providers you can run the following. Note that cray-mpich is in the list.

(spack-pyenv) elvis@login34> spack providers mpi
mpi:
cray-mpich     intel-mpi              mpich@:1.1  mpich          mpt@1:         mvapich2@2.3:  openmpi         spectrum-mpi
cray-mvapich2  intel-oneapi-mpi       mpich@:1.2  mpilander      mpt@3:         mvapich2-gdr   openmpi@1.6.5
fujitsu-mpi    intel-parallel-studio  mpich@:3.1  mpitrampoline  mvapich2       mvapich2x      openmpi@1.7.5:
hpcx-mpi       mpich@:1.0             mpich@:3.2  mpt            mvapich2@2.1:  nvhpc          openmpi@2.0.0:

Now let’s try to update our Spack configuration as follows:

# This is a Spack Environment file.
#
# It describes a set of packages to be installed, along with
# configuration settings.
spack:
  config:
    view: false
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo

  compilers:
  - compiler:
      spec: gcc@11.2.0
      paths:
        cc: cc
        cxx: CC
        f77: ftn
        fc: ftn
      flags: {}
      operating_system: sles15
      target: any
      modules:
      - PrgEnv-gnu
      - gcc/11.2.0
      - craype-x86-milan
      - libfabric
      extra_rpaths: []
  - compiler:
      spec: cce@13.0.2
      paths:
        cc: /opt/cray/pe/craype/default/bin/cc
        cxx: /opt/cray/pe/craype/default/bin/CC
        f77: /opt/cray/pe/craype/default/bin/ftn
        fc: /opt/cray/pe/craype/default/bin/ftn
      flags: {}
      operating_system: sles15
      target: any
      modules:
      - PrgEnv-cray
      - cce/13.0.2
      - craype-x86-milan
      - libfabric
      environment: {}
      extra_rpaths: []

  # add package specs to the `specs` list
  specs: []
  packages:
    all:
      compiler: [gcc@11.2.0, cce@13.0.2]
      providers:
        blas: [cray-libsci]
        mpi: [cray-mpich]
    bzip2:
      version: [1.0.6]
      externals:
      - spec: bzip2@1.0.6
        prefix: /usr
    cray-libsci:
      buildable: false
      externals:
      - spec: cray-libsci@21.08.1.2
        modules:
        - cray-libsci/21.08.1.2
    cray-mpich:
      buildable: false
      externals:
      - spec: cray-mpich@8.1.15 %gcc@11.2.0
        prefix: /opt/cray/pe/mpich/8.1.15/ofi/gnu/9.1
        modules:
        - cray-mpich/8.1.15
        - cudatoolkit/11.5
      - spec: cray-mpich@8.1.15 %cce@13.0.2
        prefix: /opt/cray/pe/mpich/8.1.15/ofi/cray/10.0/
        modules:
        - cray-mpich/8.1.15
        - cudatoolkit/11.5
    cray-pmi:
      buildable: false
      externals:
      - spec: cray-pmi@6.1.1
        modules:
        - cray-pmi/6.1.1
    diffutils:
      version: [3.6]
      externals:
      - spec: diffutils@3.6
        prefix: /usr
    findutils:
      version: [4.6.0]
      externals:
      - spec: findutils@4.6.0
        prefix: /usr
    libfabric:
      buildable: false
      variants: fabrics=sockets,tcp,udp,rxm
      externals:
      - spec: libfabric@1.11.0.4.114
        prefix: /opt/cray/libfabric/1.11.0.4.114
        modules:
        - libfabric/1.11.0.4.114
    openssl:
      version: [1.1.0i]
      buildable: false
      externals:
      - spec: openssl@1.1.0i
        prefix: /usr
    openssh:
      version: [7.9p1]
      buildable: false
      externals:
      - spec: openssh@7.9p1
        prefix: /usr
    readline:
      version: [7.0]
      buildable: false
      externals:
      - spec: readline@7.0
        prefix: /usr
    tar:
      version: [1.30]
      buildable: false
      externals:
      - spec: tar@1.30
        prefix: /usr
    unzip:
      version: [6.0]
      buildable: false
      externals:
      - spec: unzip@6.0
        prefix: /usr

  view: true

Let’s try to run spack spec hypre and notice that Spack will now use cray-libsci and cray-mpich as the dependencies, because we have set these packages as externals.

(spack-pyenv) elvis@login34> spack spec hypre
Input spec
--------------------------------
hypre@2.24.0

Concretized
--------------------------------
hypre@2.24.0%gcc@11.2.0~complex~cuda~debug+fortran~gptune~int64~internal-superlu~mixedint+mpi~openmp~rocm+shared~superlu-dist~unified-memory arch=cray-sles15-zen3
    ^cray-libsci@21.08.1.2%gcc@11.2.0~mpi~openmp+shared arch=cray-sles15-zen3
    ^cray-mpich@8.1.15%gcc@11.2.0+wrappers arch=cray-sles15-zen3

Now let’s try to add some packages to our Spack configuration by adding the following lines:

# This is a Spack Environment file.
#
# It describes a set of packages to be installed, along with
# configuration settings.
spack:
  config:
    view: false
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
  compilers:
  - compiler:
      spec: gcc@11.2.0
      paths:
        cc: cc
        cxx: CC
        f77: ftn
        fc: ftn
      flags: {}
      operating_system: sles15
      target: any
      modules:
      - PrgEnv-gnu
      - gcc/11.2.0
      - craype-x86-milan
      - libfabric
      extra_rpaths: []
  - compiler:
      spec: cce@13.0.2
      paths:
        cc: /opt/cray/pe/craype/default/bin/cc
        cxx: /opt/cray/pe/craype/default/bin/CC
        f77: /opt/cray/pe/craype/default/bin/ftn
        fc: /opt/cray/pe/craype/default/bin/ftn
      flags: {}
      operating_system: sles15
      target: any
      modules:
      - PrgEnv-cray
      - cce/13.0.2
      - craype-x86-milan
      - libfabric
      environment: {}
      extra_rpaths: []
  # add package specs to the `specs` list
  specs:
  - papi %gcc
  - papi %cce
  - hypre %gcc
  - hypre %cce
  - darshan-runtime %gcc
  - darshan-runtime %cce
  packages:
    all:
      compiler: [gcc@11.2.0, cce@13.0.2]
      providers:
        blas: [cray-libsci]
        mpi: [cray-mpich]
    bzip2:
      version: [1.0.6]
      externals:
      - spec: bzip2@1.0.6
        prefix: /usr
    cray-libsci:
      buildable: false
      externals:
      - spec: cray-libsci@21.08.1.2
        modules:
        - cray-libsci/21.08.1.2
    cray-mpich:
      buildable: false
      externals:
      - spec: cray-mpich@8.1.15 %gcc@11.2.0
        prefix: /opt/cray/pe/mpich/8.1.15/ofi/gnu/9.1
        modules:
        - cray-mpich/8.1.15
        - cudatoolkit/11.5
      - spec: cray-mpich@8.1.15 %cce@13.0.2
        prefix: /opt/cray/pe/mpich/8.1.15/ofi/cray/10.0/
        modules:
        - cray-mpich/8.1.15
        - cudatoolkit/11.5
    cray-pmi:
      buildable: false
      externals:
      - spec: cray-pmi@6.1.1
        modules:
        - cray-pmi/6.1.1
    diffutils:
      version: [3.6]
      externals:
      - spec: diffutils@3.6
        prefix: /usr
    findutils:
      version: [4.6.0]
      externals:
      - spec: findutils@4.6.0
        prefix: /usr
    libfabric:
      buildable: false
      variants: fabrics=sockets,tcp,udp,rxm
      externals:
      - spec: libfabric@1.11.0.4.114
        prefix: /opt/cray/libfabric/1.11.0.4.114
        modules:
        - libfabric/1.11.0.4.114
    openssl:
      version: [1.1.0i]
      buildable: false
      externals:
      - spec: openssl@1.1.0i
        prefix: /usr
    openssh:
      version: [7.9p1]
      buildable: false
      externals:
      - spec: openssh@7.9p1
        prefix: /usr
    readline:
      version: [7.0]
      buildable: false
      externals:
      - spec: readline@7.0
        prefix: /usr
    tar:
      version: [1.30]
      buildable: false
      externals:
      - spec: tar@1.30
        prefix: /usr
    unzip:
      version: [6.0]
      buildable: false
      externals:
      - spec: unzip@6.0
        prefix: /usr
  view: true

Next, we will concretize the environment; you should see papi, hypre, and darshan-runtime built with each compiler.

(spack-pyenv) elvis@login34> spack concretize
==> Starting concretization pool with 6 processes
==> Environment concretized in 18.58 seconds.
==> Concretized papi%gcc
 -   s2y4nrv  papi@6.0.0.1%gcc@11.2.0~cuda+example~infiniband~lmsensors~nvml~powercap~rapl~rocm~rocm_smi~sde+shared~static_tools arch=cray-sles15-zen3

==> Concretized papi%cce
 -   3aprcx5  papi@6.0.0.1%cce@13.0.2~cuda+example~infiniband~lmsensors~nvml~powercap~rapl~rocm~rocm_smi~sde+shared~static_tools patches=b6d6caa arch=cray-sles15-zen3

==> Concretized hypre%gcc
 -   mbn7bum  hypre@2.24.0%gcc@11.2.0~complex~cuda~debug+fortran~gptune~int64~internal-superlu~mixedint+mpi~openmp~rocm+shared~superlu-dist~unified-memory arch=cray-sles15-zen3
 -   jzbnd6y      ^cray-libsci@21.08.1.2%gcc@11.2.0~mpi~openmp+shared arch=cray-sles15-zen3
 -   3zy6uvs      ^cray-mpich@8.1.15%gcc@11.2.0+wrappers arch=cray-sles15-zen3

==> Concretized hypre%cce
 -   62ofdsf  hypre@2.24.0%cce@13.0.2~complex~cuda~debug+fortran~gptune~int64~internal-superlu~mixedint+mpi~openmp~rocm+shared~superlu-dist~unified-memory arch=cray-sles15-zen3
 -   7uzhxpv      ^cray-libsci@21.08.1.2%cce@13.0.2~mpi~openmp+shared arch=cray-sles15-zen3
 -   tb5uxwe      ^cray-mpich@8.1.15%cce@13.0.2+wrappers arch=cray-sles15-zen3

==> Concretized darshan-runtime%gcc
 -   hkxzwvt  darshan-runtime@3.3.1%gcc@11.2.0~apmpi~apmpi_sync~apxc~hdf5+mpi scheduler=NONE arch=cray-sles15-zen3
 -   3zy6uvs      ^cray-mpich@8.1.15%gcc@11.2.0+wrappers arch=cray-sles15-zen3
 -   ozmcyfj      ^zlib@1.2.12%gcc@11.2.0+optimize+pic+shared patches=0d38234 arch=cray-sles15-zen3

==> Concretized darshan-runtime%cce
 -   uj3wa4a  darshan-runtime@3.3.1%cce@13.0.2~apmpi~apmpi_sync~apxc~hdf5+mpi scheduler=NONE arch=cray-sles15-zen3
 -   tb5uxwe      ^cray-mpich@8.1.15%cce@13.0.2+wrappers arch=cray-sles15-zen3
 -   e2hl6cx      ^zlib@1.2.12%cce@13.0.2+optimize+pic+shared patches=0d38234 arch=cray-sles15-zen3

Let’s install all the packages via spack install. This would be a good time to get a cup of coffee since it will likely take a few minutes.

(spack-pyenv) elvis@login34> spack install
==> Installing environment data_viz
==> Installing papi-6.0.0.1-s2y4nrvu6whr6hhgi63aa3nqwz2d35af
==> No binary for papi-6.0.0.1-s2y4nrvu6whr6hhgi63aa3nqwz2d35af found: installing from source
==> Fetching https://mirror.spack.io/_source-cache/archive/3c/3cd7ed50c65b0d21d66e46d0ba34cd171178af4bbf9d94e693915c1aca1e287f.tar.gz
==> No patches needed for papi
==> papi: Executing phase: 'autoreconf'
==> papi: Executing phase: 'configure'
==> papi: Executing phase: 'build'
==> papi: Executing phase: 'install'
==> papi: Successfully installed papi-6.0.0.1-s2y4nrvu6whr6hhgi63aa3nqwz2d35af
  Fetch: 1.49s.  Build: 28.94s.  Total: 30.43s.
[+] /global/u1/e/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/gcc-11.2.0/papi-6.0.0.1-s2y4nrvu6whr6hhgi63aa3nqwz2d35af
==> Installing papi-6.0.0.1-3aprcx5klzafe7xt6aq57jx5sequpue2
==> No binary for papi-6.0.0.1-3aprcx5klzafe7xt6aq57jx5sequpue2 found: installing from source
==> Using cached archive: /global/u1/e/elvis/spack-infrastructure/spack/var/spack/cache/_source-cache/archive/3c/3cd7ed50c65b0d21d66e46d0ba34cd171178af4bbf9d94e693915c1aca1e287f.tar.gz
==> Applied patch /global/u1/e/elvis/spack-infrastructure/spack/var/spack/repos/builtin/packages/papi/crayftn-fixes.patch
==> papi: Executing phase: 'autoreconf'
==> papi: Executing phase: 'configure'
==> papi: Executing phase: 'build'
==> papi: Executing phase: 'install'
==> papi: Successfully installed papi-6.0.0.1-3aprcx5klzafe7xt6aq57jx5sequpue2
  Fetch: 0.01s.  Build: 28.94s.  Total: 28.95s.
[+] /global/u1/e/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/cce-13.0.2/papi-6.0.0.1-3aprcx5klzafe7xt6aq57jx5sequpue2
==> cray-libsci@21.08.1.2 : has external module in ['cray-libsci/21.08.1.2']
[+] /opt/cray/pe/libsci/21.08.1.2/GNU/9.1/x86_64 (external cray-libsci-21.08.1.2-jzbnd6ycupy2ycs5jiavwyvkxv3rpuru)
==> cray-mpich@8.1.15 : has external module in ['cray-mpich/8.1.15', 'cudatoolkit/11.5']
[+] /opt/cray/pe/mpich/8.1.15/ofi/gnu/9.1 (external cray-mpich-8.1.15-3zy6uvszbd5a3rniq2xd2v5a3d27qstw)
==> cray-libsci@21.08.1.2 : has external module in ['cray-libsci/21.08.1.2']
[+] /opt/cray/pe/libsci/21.08.1.2/CRAY/9.0/x86_64 (external cray-libsci-21.08.1.2-7uzhxpvoka7ixfxs44354dkishquwyhq)
==> cray-mpich@8.1.15 : has external module in ['cray-mpich/8.1.15', 'cudatoolkit/11.5']
[+] /opt/cray/pe/mpich/8.1.15/ofi/cray/10.0/ (external cray-mpich-8.1.15-tb5uxwezfzx4xth7azefyrhzlvf7koqb)
==> Installing zlib-1.2.12-ozmcyfjfv7i5gjjgklfsh43h67vzsuc5
==> No binary for zlib-1.2.12-ozmcyfjfv7i5gjjgklfsh43h67vzsuc5 found: installing from source
==> Fetching https://mirror.spack.io/_source-cache/archive/91/91844808532e5ce316b3c010929493c0244f3d37593afd6de04f71821d5136d9.tar.gz
==> Applied patch /global/u1/e/elvis/spack-infrastructure/spack/var/spack/repos/builtin/packages/zlib/configure-cc.patch
==> zlib: Executing phase: 'install'
==> zlib: Successfully installed zlib-1.2.12-ozmcyfjfv7i5gjjgklfsh43h67vzsuc5
  Fetch: 0.62s.  Build: 2.10s.  Total: 2.72s.
[+] /global/u1/e/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/gcc-11.2.0/zlib-1.2.12-ozmcyfjfv7i5gjjgklfsh43h67vzsuc5
==> Installing zlib-1.2.12-e2hl6cxmzbg5psoh5upqmqqltjftc3pb
==> No binary for zlib-1.2.12-e2hl6cxmzbg5psoh5upqmqqltjftc3pb found: installing from source
==> Using cached archive: /global/u1/e/elvis/spack-infrastructure/spack/var/spack/cache/_source-cache/archive/91/91844808532e5ce316b3c010929493c0244f3d37593afd6de04f71821d5136d9.tar.gz
==> Applied patch /global/u1/e/elvis/spack-infrastructure/spack/var/spack/repos/builtin/packages/zlib/configure-cc.patch
==> zlib: Executing phase: 'install'
==> zlib: Successfully installed zlib-1.2.12-e2hl6cxmzbg5psoh5upqmqqltjftc3pb
  Fetch: 0.00s.  Build: 2.45s.  Total: 2.45s.
[+] /global/u1/e/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/cce-13.0.2/zlib-1.2.12-e2hl6cxmzbg5psoh5upqmqqltjftc3pb
==> Installing hypre-2.24.0-mbn7bumcoqmjhf5y2sm3hnr64vml4dvf
==> No binary for hypre-2.24.0-mbn7bumcoqmjhf5y2sm3hnr64vml4dvf found: installing from source
==> Fetching https://mirror.spack.io/_source-cache/archive/f4/f480e61fc25bf533fc201fdf79ec440be79bb8117650627d1f25151e8be2fdb5.tar.gz
==> No patches needed for hypre
==> hypre: Executing phase: 'autoreconf'
==> hypre: Executing phase: 'configure'
==> hypre: Executing phase: 'build'
==> hypre: Executing phase: 'install'
==> hypre: Successfully installed hypre-2.24.0-mbn7bumcoqmjhf5y2sm3hnr64vml4dvf
  Fetch: 0.77s.  Build: 37.43s.  Total: 38.20s.
[+] /global/u1/e/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/gcc-11.2.0/hypre-2.24.0-mbn7bumcoqmjhf5y2sm3hnr64vml4dvf
==> Installing hypre-2.24.0-62ofdsfxckay53ewpiidg4nlamhnzq3b
==> No binary for hypre-2.24.0-62ofdsfxckay53ewpiidg4nlamhnzq3b found: installing from source
==> Using cached archive: /global/u1/e/elvis/spack-infrastructure/spack/var/spack/cache/_source-cache/archive/f4/f480e61fc25bf533fc201fdf79ec440be79bb8117650627d1f25151e8be2fdb5.tar.gz
==> No patches needed for hypre
==> hypre: Executing phase: 'autoreconf'
==> hypre: Executing phase: 'configure'
==> hypre: Executing phase: 'build'
==> hypre: Executing phase: 'install'
==> hypre: Successfully installed hypre-2.24.0-62ofdsfxckay53ewpiidg4nlamhnzq3b
  Fetch: 0.01s.  Build: 1m 5.86s.  Total: 1m 5.87s.
[+] /global/u1/e/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/cce-13.0.2/hypre-2.24.0-62ofdsfxckay53ewpiidg4nlamhnzq3b
==> Installing darshan-runtime-3.3.1-hkxzwvtw5rlmsvwt4irwnxxuwzwbuzoj
==> No binary for darshan-runtime-3.3.1-hkxzwvtw5rlmsvwt4irwnxxuwzwbuzoj found: installing from source
==> Fetching https://mirror.spack.io/_source-cache/archive/28/281d871335977d0592a49d053df93d68ce1840f6fdec27fea7a59586a84395f7.tar.gz
==> No patches needed for darshan-runtime
==> darshan-runtime: Executing phase: 'autoreconf'
==> darshan-runtime: Executing phase: 'configure'
==> darshan-runtime: Executing phase: 'build'
==> darshan-runtime: Executing phase: 'install'
==> darshan-runtime: Successfully installed darshan-runtime-3.3.1-hkxzwvtw5rlmsvwt4irwnxxuwzwbuzoj
  Fetch: 1.07s.  Build: 9.24s.  Total: 10.31s.
[+] /global/u1/e/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/gcc-11.2.0/darshan-runtime-3.3.1-hkxzwvtw5rlmsvwt4irwnxxuwzwbuzoj
==> Installing darshan-runtime-3.3.1-uj3wa4au7kphj52syka4w3dxiadosagh
==> No binary for darshan-runtime-3.3.1-uj3wa4au7kphj52syka4w3dxiadosagh found: installing from source
==> Using cached archive: /global/u1/e/elvis/spack-infrastructure/spack/var/spack/cache/_source-cache/archive/28/281d871335977d0592a49d053df93d68ce1840f6fdec27fea7a59586a84395f7.tar.gz
==> No patches needed for darshan-runtime
==> darshan-runtime: Executing phase: 'autoreconf'
==> darshan-runtime: Executing phase: 'configure'
==> darshan-runtime: Executing phase: 'build'
==> darshan-runtime: Executing phase: 'install'
==> darshan-runtime: Successfully installed darshan-runtime-3.3.1-uj3wa4au7kphj52syka4w3dxiadosagh
  Fetch: 0.01s.  Build: 9.58s.  Total: 9.58s.
[+] /global/u1/e/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/cce-13.0.2/darshan-runtime-3.3.1-uj3wa4au7kphj52syka4w3dxiadosagh
==> Updating view at /global/u1/e/elvis/spack-infrastructure/spack/var/spack/environments/data_viz/.spack-env/view
==> Warning: Skipping external package: cray-libsci@21.08.1.2%gcc@11.2.0~mpi~openmp+shared arch=cray-sles15-zen3/jzbnd6y
==> Warning: Skipping external package: cray-mpich@8.1.15%gcc@11.2.0+wrappers arch=cray-sles15-zen3/3zy6uvs
==> Warning: Skipping external package: cray-libsci@21.08.1.2%cce@13.0.2~mpi~openmp+shared arch=cray-sles15-zen3/7uzhxpv
==> Warning: Skipping external package: cray-mpich@8.1.15%cce@13.0.2+wrappers arch=cray-sles15-zen3/tb5uxwe
==> Error: 178 fatal error(s) when merging prefixes:
    `/global/u1/e/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/gcc-11.2.0/papi-6.0.0.1-s2y4nrvu6whr6hhgi63aa3nqwz2d35af/.spack/archived-files/src/removed_la_files.txt` and `/global/u1/e/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/cce-13.0.2/papi-6.0.0.1-3aprcx5klzafe7xt6aq57jx5sequpue2/.spack/archived-files/src/removed_la_files.txt` both project to `.spack/papi/archived-files/src/removed_la_files.txt`
    `/global/u1/e/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/gcc-11.2.0/papi-6.0.0.1-s2y4nrvu6whr6hhgi63aa3nqwz2d35af/.spack/install_environment.json` and `/global/u1/e/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/cce-13.0.2/papi-6.0.0.1-3aprcx5klzafe7xt6aq57jx5sequpue2/.spack/install_environment.json` both project to `.spack/papi/install_environment.json`
    `/global/u1/e/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/gcc-11.2.0/papi-6.0.0.1-s2y4nrvu6whr6hhgi63aa3nqwz2d35af/.spack/install_manifest.json` and `/global/u1/e/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/cce-13.0.2/papi-6.0.0.1-3aprcx5klzafe7xt6aq57jx5sequpue2/.spack/install_manifest.json` both project to `.spack/papi/install_manifest.json`

Upon completion you can run spack find to see all installed packages.

(spack-pyenv) elvis@login34> spack find
==> In environment data_viz
==> Root specs
-- no arch / cce ------------------------------------------------
darshan-runtime%cce  hypre%cce  papi%cce

-- no arch / gcc ------------------------------------------------
darshan-runtime%gcc  hypre%gcc  papi%gcc

==> 12 installed packages
-- cray-sles15-zen3 / cce@13.0.2 --------------------------------
cray-libsci@21.08.1.2  cray-mpich@8.1.15  darshan-runtime@3.3.1  hypre@2.24.0  papi@6.0.0.1  zlib@1.2.12

-- cray-sles15-zen3 / gcc@11.2.0 --------------------------------
cray-libsci@21.08.1.2  cray-mpich@8.1.15  darshan-runtime@3.3.1  hypre@2.24.0  papi@6.0.0.1  zlib@1.2.12

Defining a Source Mirror

You may have noticed that Spack fetches tarballs from the web when installing packages, which can be time-consuming for large tarballs. It is a good idea to store tarballs on the filesystem once and then let Spack reuse them for any Spack builds. You should have one location where the tarballs are stored. Let’s run the following command:

(spack-pyenv) elvis@login34> spack mirror create -d $CI_PROJECT_DIR/spack_mirror -a
==> Adding package cray-libsci@21.08.1.2 to mirror
==> Adding package cray-libsci@21.08.1.2 to mirror
==> Adding package cray-mpich@8.1.15 to mirror
==> Adding package cray-mpich@8.1.15 to mirror
==> Adding package darshan-runtime@3.3.1 to mirror
==> Using cached archive: /global/u1/e/elvis/spack-infrastructure/spack/var/spack/cache/_source-cache/archive/28/281d871335977d0592a49d053df93d68ce1840f6fdec27fea7a59586a84395f7.tar.gz
==> Adding package darshan-runtime@3.3.1 to mirror
==> Adding package hypre@2.24.0 to mirror
==> Using cached archive: /global/u1/e/elvis/spack-infrastructure/spack/var/spack/cache/_source-cache/archive/f4/f480e61fc25bf533fc201fdf79ec440be79bb8117650627d1f25151e8be2fdb5.tar.gz
==> Adding package hypre@2.24.0 to mirror
==> Adding package papi@6.0.0.1 to mirror
==> Using cached archive: /global/u1/e/elvis/spack-infrastructure/spack/var/spack/cache/_source-cache/archive/3c/3cd7ed50c65b0d21d66e46d0ba34cd171178af4bbf9d94e693915c1aca1e287f.tar.gz
==> Fetching https://mirror.spack.io/_source-cache/archive/64/64c57b3ad4026255238cc495df6abfacc41de391a0af497c27d0ac819444a1f8
==> Adding package papi@6.0.0.1 to mirror
==> Adding package zlib@1.2.12 to mirror
==> Using cached archive: /global/u1/e/elvis/spack-infrastructure/spack/var/spack/cache/_source-cache/archive/91/91844808532e5ce316b3c010929493c0244f3d37593afd6de04f71821d5136d9.tar.gz
==> Adding package zlib@1.2.12 to mirror
==> Successfully created mirror in file:///global/homes/e/elvis/spack-infrastructure/spack_mirror
  Archive stats:
    4    already present
    4    added
    0    failed to fetch.

If you inspect the directory, you will notice the tarballs are present.

(spack-pyenv) elvis@login34> ls -l $CI_PROJECT_DIR/spack_mirror/*
/global/homes/e/elvis/spack-infrastructure/spack_mirror/darshan-runtime:
total 1
lrwxrwxrwx 1 elvis elvis 99 Aug  4 08:28 darshan-runtime-3.3.1.tar.gz -> ../_source-cache/archive/28/281d871335977d0592a49d053df93d68ce1840f6fdec27fea7a59586a84395f7.tar.gz

/global/homes/e/elvis/spack-infrastructure/spack_mirror/hypre:
total 1
lrwxrwxrwx 1 elvis elvis 99 Aug  4 08:28 hypre-2.24.0.tar.gz -> ../_source-cache/archive/f4/f480e61fc25bf533fc201fdf79ec440be79bb8117650627d1f25151e8be2fdb5.tar.gz

/global/homes/e/elvis/spack-infrastructure/spack_mirror/papi:
total 2
lrwxrwxrwx 1 elvis elvis 99 Aug  4 08:28 papi-6.0.0.1.tar.gz -> ../_source-cache/archive/3c/3cd7ed50c65b0d21d66e46d0ba34cd171178af4bbf9d94e693915c1aca1e287f.tar.gz
lrwxrwxrwx 1 elvis elvis 92 Aug  4 08:28 raw-64c57b3 -> ../_source-cache/archive/64/64c57b3ad4026255238cc495df6abfacc41de391a0af497c27d0ac819444a1f8

/global/homes/e/elvis/spack-infrastructure/spack_mirror/_source-cache:
total 1
drwxrwxr-x 7 elvis elvis 512 Aug  4 08:28 archive

/global/homes/e/elvis/spack-infrastructure/spack_mirror/zlib:
total 1
lrwxrwxrwx 1 elvis elvis 99 Aug  4 08:28 zlib-1.2.12.tar.gz -> ../_source-cache/archive/91/91844808532e5ce316b3c010929493c0244f3d37593afd6de04f71821d5136d9.tar.gz
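
Note that spack mirror create only populates the directory; to have Spack consult it for future builds, register it as a mirror. A minimal sketch, where the name local_mirror is arbitrary and the path matches the directory created above:

(spack-pyenv) elvis@login34> spack mirror add local_mirror file:///global/homes/e/elvis/spack-infrastructure/spack_mirror
(spack-pyenv) elvis@login34> spack mirror list

With the mirror registered, Spack will check it for source tarballs before reaching out to the web.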

Building CUDA Packages

On Perlmutter, the standalone CUDA package is available by loading the following modulefile:

(spack-pyenv) elvis@login34> ml -t av cudatoolkit
/opt/cray/pe/lmod/modulefiles/core:
cudatoolkit/11.5
cudatoolkit/11.7

NVIDIA also provides CUDA as part of the NVHPC compiler suite, which is installed on Perlmutter and accessible via the nvhpc modulefile.

(spack-pyenv) elvis@login34> ml -t av nvhpc
/opt/cray/pe/lmod/modulefiles/mix_compilers:
nvhpc-mixed/21.11
nvhpc-mixed/22.5
/opt/cray/pe/lmod/modulefiles/core:
nvhpc/21.11
nvhpc/22.5

The root of nvhpc/21.11 is available at /opt/nvidia/hpc_sdk/Linux_x86_64/21.11. You can see the content of this modulefile by running module show nvhpc/21.11. Shown below is the directory structure for the root of the NVHPC stack.

(spack-pyenv) elvis@login34> ls -l /opt/nvidia/hpc_sdk/Linux_x86_64/21.11
total 0
drwxr-xr-x  2 root root  72 Aug  1 07:03 cmake
drwxrwxr-x  6 root root 144 Aug  1 07:07 comm_libs
drwxrwxr-x 14 root root 235 Aug  1 07:07 compilers
drwxrwxr-x  3 root root  78 Aug  1 07:07 cuda
drwxrwxr-x 11 root root 205 Aug  1 07:05 examples
drwxrwxr-x  3 root root  55 Aug  1 07:07 math_libs
drwxrwxr-x  4 root root  71 Aug  1 07:07 profilers
drwxrwxr-x  6 root root  90 Aug  1 07:03 REDIST

cuda/11.5 is installed in the following directory and can be activated by loading the cudatoolkit/11.5 modulefile.

(spack-pyenv) elvis@login34> ls -l /opt/nvidia/hpc_sdk/Linux_x86_64/21.11/cuda/11.5
total 65
drwxrwxr-x 3 root root   335 Aug  1 07:04 bin
drwxrwxr-x 4 root root   385 Aug  1 07:04 compute-sanitizer
-rw-r--r-- 1 root root   160 Dec  8  2021 DOCS
-rw-r--r-- 1 root root 61727 Dec  8  2021 EULA.txt
drwxrwxr-x 4 root root    44 Aug  1 07:04 extras
lrwxrwxrwx 1 root root    28 Dec  8  2021 include -> targets/x86_64-linux/include
lrwxrwxrwx 1 root root    24 Dec  8  2021 lib64 -> targets/x86_64-linux/lib
drwxrwxr-x 7 root root   242 Aug  1 07:04 libnvvp
drwxrwxr-x 3 root root    30 Aug  1 07:04 nvml
drwxrwxr-x 7 root root   106 Aug  1 07:04 nvvm
drwxrwxr-x 7 root root    94 Aug  1 07:04 nvvm-prev
-rw-r--r-- 1 root root   524 Dec  8  2021 README
drwxrwxr-x 3 root root    26 Aug  1 07:04 share
drwxrwxr-x 3 root root    35 Aug  1 07:04 targets
drwxrwxr-x 2 root root    52 Aug  1 07:05 tools
-rw-r--r-- 1 root root  2669 Dec  8  2021 version.json

We can confirm that the nvcc compiler provided by CUDA is available in this directory, along with the libcudart.so (CUDA Runtime) library:

(spack-pyenv) elvis@login34> /opt/nvidia/hpc_sdk/Linux_x86_64/21.11/cuda/11.5/bin/nvcc --version
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2021 NVIDIA Corporation
Built on Thu_Nov_18_09:45:30_PST_2021
Cuda compilation tools, release 11.5, V11.5.119
Build cuda_11.5.r11.5/compiler.30672275_0

(spack-pyenv) elvis@login34> ls /opt/nvidia/hpc_sdk/Linux_x86_64/21.11/cuda/11.5/lib64/libcudart.so
/opt/nvidia/hpc_sdk/Linux_x86_64/21.11/cuda/11.5/lib64/libcudart.so

Let’s define our CUDA package preference in our Spack configuration. To illustrate, we will install papi with the spec papi +cuda %gcc. This indicates that we want PAPI installed with CUDA support using the GCC compiler. Please copy the following content into your spack.yaml.

# This is a Spack Environment file.
#
# It describes a set of packages to be installed, along with
# configuration settings.
spack:
  config:
    view: false
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
  compilers:
  - compiler:
      spec: gcc@11.2.0
      paths:
        cc: cc
        cxx: CC
        f77: ftn
        fc: ftn
      flags: {}
      operating_system: sles15
      target: any
      modules:
      - PrgEnv-gnu
      - gcc/11.2.0
      - craype-x86-milan
      - libfabric
      extra_rpaths: []
  - compiler:
      spec: cce@13.0.2
      paths:
        cc: /opt/cray/pe/craype/default/bin/cc
        cxx: /opt/cray/pe/craype/default/bin/CC
        f77: /opt/cray/pe/craype/default/bin/ftn
        fc: /opt/cray/pe/craype/default/bin/ftn
      flags: {}
      operating_system: sles15
      target: any
      modules:
      - PrgEnv-cray
      - cce/13.0.2
      - craype-x86-milan
      - libfabric
      environment: {}
      extra_rpaths: []

  # add package specs to the `specs` list
  specs:
  - papi %gcc
  - papi %cce
  - hypre %gcc
  - hypre %cce
  - darshan-runtime %gcc
  - darshan-runtime %cce
  - papi +cuda %gcc
  packages:
    all:
      compiler: [gcc@11.2.0, cce@13.0.2]
      providers:
        blas: [cray-libsci]
        mpi: [cray-mpich]
    bzip2:
      version: [1.0.6]
      externals:
      - spec: bzip2@1.0.6
        prefix: /usr
    cray-libsci:
      buildable: false
      externals:
      - spec: cray-libsci@21.08.1.2
        modules:
        - cray-libsci/21.08.1.2
    cray-mpich:
      buildable: false
      externals:
      - spec: cray-mpich@8.1.15 %gcc@11.2.0
        prefix: /opt/cray/pe/mpich/8.1.15/ofi/gnu/9.1
        modules:
        - cray-mpich/8.1.15
        - cudatoolkit/11.5
      - spec: cray-mpich@8.1.15 %cce@13.0.2
        prefix: /opt/cray/pe/mpich/8.1.15/ofi/cray/10.0/
        modules:
        - cray-mpich/8.1.15
        - cudatoolkit/11.5
    cray-pmi:
      buildable: false
      externals:
      - spec: cray-pmi@6.1.1
        modules:
        - cray-pmi/6.1.1
    cuda:
      buildable: false
      version: [11.5.0]
      externals:
      - spec: cuda@11.5.0
        prefix: /opt/nvidia/hpc_sdk/Linux_x86_64/21.11/cuda/11.5
        modules:
        - cudatoolkit/11.5
    diffutils:
      version: [3.6]
      externals:
      - spec: diffutils@3.6
        prefix: /usr
    findutils:
      version: [4.6.0]
      externals:
      - spec: findutils@4.6.0
        prefix: /usr
    libfabric:
      buildable: false
      variants: fabrics=sockets,tcp,udp,rxm
      externals:
      - spec: libfabric@1.11.0.4.114
        prefix: /opt/cray/libfabric/1.11.0.4.114
        modules:
        - libfabric/1.11.0.4.114
    openssl:
      version: [1.1.0i]
      buildable: false
      externals:
      - spec: openssl@1.1.0i
        prefix: /usr
    openssh:
      version: [7.9p1]
      buildable: false
      externals:
      - spec: openssh@7.9p1
        prefix: /usr
    readline:
      version: [7.0]
      buildable: false
      externals:
      - spec: readline@7.0
        prefix: /usr
    tar:
      version: [1.30]
      buildable: false
      externals:
      - spec: tar@1.30
        prefix: /usr
    unzip:
      version: [6.0]
      buildable: false
      externals:
      - spec: unzip@6.0
        prefix: /usr
  view: true
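
Before installing, it can be useful to confirm that the concretizer resolves cuda to the external installation rather than a source build. For example (the -I flag shows the install status of each node in the spec):

(spack-pyenv) elvis@login34> spack spec -I papi +cuda %gcc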

Now let’s try to install.

(spack-pyenv) elvis@login34> spack install
==> Installing environment data_viz
==> cuda@11.5.0 : has external module in ['cudatoolkit/11.5']
[+] /opt/nvidia/hpc_sdk/Linux_x86_64/21.11/cuda/11.5 (external cuda-11.5.0-puekfe32hbj72iftffa3etecesmlqwqg)
==> Installing papi-6.0.0.1-x43djbqgyb64susljh3vu4czlqapbyie
==> No binary for papi-6.0.0.1-x43djbqgyb64susljh3vu4czlqapbyie found: installing from source
==> Using cached archive: /global/u1/e/elvis/spack-infrastructure/spack/var/spack/cache/_source-cache/archive/3c/3cd7ed50c65b0d21d66e46d0ba34cd171178af4bbf9d94e693915c1aca1e287f.tar.gz
==> No patches needed for papi
==> papi: Executing phase: 'autoreconf'
==> papi: Executing phase: 'configure'
==> papi: Executing phase: 'build'
==> papi: Executing phase: 'install'
==> papi: Successfully installed papi-6.0.0.1-x43djbqgyb64susljh3vu4czlqapbyie
  Fetch: 0.01s.  Build: 4m 46.76s.  Total: 4m 46.76s.
[+] /global/u1/e/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/gcc-11.2.0/papi-6.0.0.1-x43djbqgyb64susljh3vu4czlqapbyie
==> Updating view at /global/u1/e/elvis/spack-infrastructure/spack/var/spack/environments/data_viz/.spack-env/view
==> Warning: Skipping external package: cray-libsci@21.08.1.2%gcc@11.2.0~mpi~openmp+shared arch=cray-sles15-zen3/jzbnd6y
==> Warning: Skipping external package: cray-mpich@8.1.15%gcc@11.2.0+wrappers arch=cray-sles15-zen3/3zy6uvs
==> Warning: Skipping external package: cray-libsci@21.08.1.2%cce@13.0.2~mpi~openmp+shared arch=cray-sles15-zen3/7uzhxpv
==> Warning: Skipping external package: cray-mpich@8.1.15%cce@13.0.2+wrappers arch=cray-sles15-zen3/tb5uxwe
==> Warning: Skipping external package: cuda@11.5.0%gcc@11.2.0~allow-unsupported-compilers~dev arch=cray-sles15-zen3/puekfe3
==> Error: 193 fatal error(s) when merging prefixes:
    `/global/u1/e/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/gcc-11.2.0/papi-6.0.0.1-s2y4nrvu6whr6hhgi63aa3nqwz2d35af/.spack/archived-files/src/removed_la_files.txt` and `/global/u1/e/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/cce-13.0.2/papi-6.0.0.1-3aprcx5klzafe7xt6aq57jx5sequpue2/.spack/archived-files/src/removed_la_files.txt` both project to `.spack/papi/archived-files/src/removed_la_files.txt`
    `/global/u1/e/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/gcc-11.2.0/papi-6.0.0.1-s2y4nrvu6whr6hhgi63aa3nqwz2d35af/.spack/install_environment.json` and `/global/u1/e/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/cce-13.0.2/papi-6.0.0.1-3aprcx5klzafe7xt6aq57jx5sequpue2/.spack/install_environment.json` both project to `.spack/papi/install_environment.json`
    `/global/u1/e/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/gcc-11.2.0/papi-6.0.0.1-s2y4nrvu6whr6hhgi63aa3nqwz2d35af/.spack/install_manifest.json` and `/global/u1/e/elvis/spack-infrastructure/spack/opt/spack/cray-sles15-zen3/cce-13.0.2/papi-6.0.0.1-3aprcx5klzafe7xt6aq57jx5sequpue2/.spack/install_manifest.json` both project to `.spack/papi/install_manifest.json`

These fatal errors come from the environment view (the view: true setting at the bottom of spack.yaml): papi is installed with both gcc and cce, and the two installations project to the same paths when merged into a single view. The package installations themselves succeeded; the view errors can be avoided by setting view: false or by defining view projections that separate the packages by compiler.

Generating Modulefiles

In this section we let Spack generate modulefiles for the Spack packages we installed. Perlmutter uses Lmod as its module system, which supports both tcl and lua modules. You may want to refer to Modules for more information.

(spack-pyenv) elvis@login34> module --version

Modules based on Lua: Version 8.3.1  2020-02-16 19:46 :z
    by Robert McLay mclay@tacc.utexas.edu

For this training we will cover how to generate tcl modules in a flat hierarchy. To get started, let’s add the following to our Spack configuration:

# This is a Spack Environment file.
#
# It describes a set of packages to be installed, along with
# configuration settings.
spack:
  config:
    view: false
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
  compilers:
  - compiler:
      spec: gcc@11.2.0
      paths:
        cc: cc
        cxx: CC
        f77: ftn
        fc: ftn
      flags: {}
      operating_system: sles15
      target: any
      modules:
      - PrgEnv-gnu
      - gcc/11.2.0
      - craype-x86-milan
      - libfabric
      extra_rpaths: []
  - compiler:
      spec: cce@13.0.2
      paths:
        cc: /opt/cray/pe/craype/default/bin/cc
        cxx: /opt/cray/pe/craype/default/bin/CC
        f77: /opt/cray/pe/craype/default/bin/ftn
        fc: /opt/cray/pe/craype/default/bin/ftn
      flags: {}
      operating_system: sles15
      target: any
      modules:
      - PrgEnv-cray
      - cce/13.0.2
      - craype-x86-milan
      - libfabric
      environment: {}
      extra_rpaths: []

  # add package specs to the `specs` list
  specs:
  - papi %gcc
  - papi %cce
  - hypre %gcc
  - hypre %cce
  - darshan-runtime %gcc
  - darshan-runtime %cce
  - papi +cuda %gcc
  packages:
    all:
      compiler: [gcc@11.2.0, cce@13.0.2]
      providers:
        blas: [cray-libsci]
        mpi: [cray-mpich]
    bzip2:
      version: [1.0.6]
      externals:
      - spec: bzip2@1.0.6
        prefix: /usr
    cray-libsci:
      buildable: false
      externals:
      - spec: cray-libsci@21.08.1.2
        modules:
        - cray-libsci/21.08.1.2
    cray-mpich:
      buildable: false
      externals:
      - spec: cray-mpich@8.1.15 %gcc@11.2.0
        prefix: /opt/cray/pe/mpich/8.1.15/ofi/gnu/9.1
        modules:
        - cray-mpich/8.1.15
        - cudatoolkit/11.5
      - spec: cray-mpich@8.1.15 %cce@13.0.2
        prefix: /opt/cray/pe/mpich/8.1.15/ofi/cray/10.0/
        modules:
        - cray-mpich/8.1.15
        - cudatoolkit/11.5
    cray-pmi:
      buildable: false
      externals:
      - spec: cray-pmi@6.1.1
        modules:
        - cray-pmi/6.1.1
    cuda:
      buildable: false
      version: [11.5.0]
      externals:
      - spec: cuda@11.5.0
        prefix: /opt/nvidia/hpc_sdk/Linux_x86_64/21.11/cuda/11.5
        modules:
        - cudatoolkit/11.5
    diffutils:
      version: [3.6]
      externals:
      - spec: diffutils@3.6
        prefix: /usr
    findutils:
      version: [4.6.0]
      externals:
      - spec: findutils@4.6.0
        prefix: /usr
    libfabric:
      buildable: false
      variants: fabrics=sockets,tcp,udp,rxm
      externals:
      - spec: libfabric@1.11.0.4.114
        prefix: /opt/cray/libfabric/1.11.0.4.114
        modules:
        - libfabric/1.11.0.4.114
    openssl:
      version: [1.1.0i]
      buildable: false
      externals:
      - spec: openssl@1.1.0i
        prefix: /usr
    openssh:
      version: [7.9p1]
      buildable: false
      externals:
      - spec: openssh@7.9p1
        prefix: /usr
    readline:
      version: [7.0]
      buildable: false
      externals:
      - spec: readline@7.0
        prefix: /usr
    tar:
      version: [1.30]
      buildable: false
      externals:
      - spec: tar@1.30
        prefix: /usr
    unzip:
      version: [6.0]
      buildable: false
      externals:
      - spec: unzip@6.0
        prefix: /usr
  modules:
    default:
      enable:
      - tcl
      tcl:
        blacklist_implicits: true
        hash_length: 0
        naming_scheme: '{name}/{version}-{compiler.name}-{compiler.version}'
        all:
          autoload: direct
          conflict:
          - '{name}'
          environment:
            set:
              '{name}_ROOT': '{prefix}'
          suffixes:
            ^cuda: cuda

  view: true

Setting blacklist_implicits: true skips module generation for dependencies, which is useful when building a large software stack: you don’t want an explosion of modulefiles for utilities that nobody will ever load directly. The hash_length: 0 setting avoids appending hash characters to the modulefile names, and naming_scheme instructs Spack how to format the modulefile paths written to the filesystem; under this scheme, for example, papi built with gcc@11.2.0 becomes papi/6.0.0.1-gcc-11.2.0, and the suffixes rule appends -cuda to any spec that depends on cuda. Now let’s generate the modulefiles. It is generally a good idea to run this in debug mode (spack -d) to understand how files are being generated. The spack module tcl refresh command generates the tcl modules; it is a good idea to specify --delete-tree, which deletes the root of the module tree, and -y, which accepts the confirmation prompt. In the output, take note of where modulefiles are being written. You will also see a list of specs marked BLACKLISTED_AS_IMPLICIT; these are the specs for which no modulefiles will be generated.

(spack-pyenv) elvis@login34> spack -d module tcl refresh --delete-tree -y
==> [2022-08-04-09:42:35.558437] Reading config file /global/u1/e/elvis/spack-infrastructure/spack/etc/spack/defaults/config.yaml
==> [2022-08-04-09:42:35.708144] Reading config file /global/u1/e/elvis/spack-infrastructure/spack/var/spack/environments/data_viz/spack.yaml
==> [2022-08-04-09:42:35.767338] Using environment 'data_viz'
==> [2022-08-04-09:42:35.968497] Imported module from built-in commands
==> [2022-08-04-09:42:35.975354] Imported module from built-in commands
==> [2022-08-04-09:42:35.991742] Reading config file /global/u1/e/elvis/spack-infrastructure/spack/etc/spack/defaults/bootstrap.yaml
==> [2022-08-04-09:42:36.044748] DATABASE LOCK TIMEOUT: 3s
==> [2022-08-04-09:42:36.044959] PACKAGE LOCK TIMEOUT: No timeout
==> [2022-08-04-09:42:36.161175] Reading config file /global/u1/e/elvis/spack-infrastructure/spack/etc/spack/defaults/repos.yaml
==> [2022-08-04-09:42:36.634555] Reading config file /global/u1/e/elvis/spack-infrastructure/spack/etc/spack/defaults/modules.yaml
==> [2022-08-04-09:42:36.691668] Reading config file /global/u1/e/elvis/spack-infrastructure/spack/etc/spack/defaults/cray/modules.yaml
==> [2022-08-04-09:42:38.077573]    BLACKLISTED_AS_IMPLICIT : cray-libsci@21.08.1.2%cce@13.0.2~mpi~openmp+shared arch=cray-sles15-zen3/7uzhxpv
==> [2022-08-04-09:42:38.079387]    BLACKLISTED_AS_IMPLICIT : cray-libsci@21.08.1.2%gcc@11.2.0~mpi~openmp+shared arch=cray-sles15-zen3/jzbnd6y
==> [2022-08-04-09:42:38.081189]    BLACKLISTED_AS_IMPLICIT : cray-mpich@8.1.15%cce@13.0.2+wrappers arch=cray-sles15-zen3/tb5uxwe
==> [2022-08-04-09:42:38.082661]    BLACKLISTED_AS_IMPLICIT : cray-mpich@8.1.15%gcc@11.2.0+wrappers arch=cray-sles15-zen3/3zy6uvs
==> [2022-08-04-09:42:38.084601]    BLACKLISTED_AS_IMPLICIT : cuda@11.5.0%gcc@11.2.0~allow-unsupported-compilers~dev arch=cray-sles15-zen3/puekfe3
==> [2022-08-04-09:42:38.097284]    BLACKLISTED_AS_IMPLICIT : zlib@1.2.12%cce@13.0.2+optimize+pic+shared patches=0d38234 arch=cray-sles15-zen3/e2hl6cx
==> [2022-08-04-09:42:38.099494]    BLACKLISTED_AS_IMPLICIT : zlib@1.2.12%gcc@11.2.0+optimize+pic+shared patches=0d38234 arch=cray-sles15-zen3/ozmcyfj
==> [2022-08-04-09:44:22.697989] Regenerating tcl module files
==> [2022-08-04-09:44:22.872234]    WRITE: darshan-runtime@3.3.1%cce@13.0.2~apmpi~apmpi_sync~apxc~hdf5+mpi scheduler=NONE arch=cray-sles15-zen3/uj3wa4a [/global/u1/e/elvis/spack-infrastructure/spack/share/spack/modules/cray-sles15-zen3/darshan-runtime/3.3.1-cce-13.0.2]
==> [2022-08-04-09:44:23.696894] Module name: cce/13.0.2
==> [2022-08-04-09:44:23.697138] Package directory variable prefix: CCE
==> [2022-08-04-09:44:23.959854] Module name: cce/13.0.2
==> [2022-08-04-09:44:23.960027] Package directory variable prefix: CCE
==> [2022-08-04-09:44:24.183730] Module name: cce/13.0.2
==> [2022-08-04-09:44:24.183920] Package directory variable prefix: CCE
==> [2022-08-04-09:44:24.810258] Module name: cce/13.0.2
==> [2022-08-04-09:44:24.810473] Package directory variable prefix: CCE
==> [2022-08-04-09:44:25.037930] Module name: cce/13.0.2
==> [2022-08-04-09:44:25.038163] Package directory variable prefix: CCE
==> [2022-08-04-09:44:25.052737]    BLACKLISTED_AS_IMPLICIT : cray-mpich@8.1.15%cce@13.0.2+wrappers arch=cray-sles15-zen3/tb5uxwe
==> [2022-08-04-09:44:25.056012]    BLACKLISTED_AS_IMPLICIT : zlib@1.2.12%cce@13.0.2+optimize+pic+shared patches=0d38234 arch=cray-sles15-zen3/e2hl6cx
==> [2022-08-04-09:44:25.060927] Reading config file /global/u1/e/elvis/spack-infrastructure/spack/etc/spack/defaults/packages.yaml
==> [2022-08-04-09:44:25.113314]    WRITE: darshan-runtime@3.3.1%gcc@11.2.0~apmpi~apmpi_sync~apxc~hdf5+mpi scheduler=NONE arch=cray-sles15-zen3/hkxzwvt [/global/u1/e/elvis/spack-infrastructure/spack/share/spack/modules/cray-sles15-zen3/darshan-runtime/3.3.1-gcc-11.2.0]
==> [2022-08-04-09:44:25.219719]    BLACKLISTED_AS_IMPLICIT : cray-mpich@8.1.15%gcc@11.2.0+wrappers arch=cray-sles15-zen3/3zy6uvs
==> [2022-08-04-09:44:25.222960]    BLACKLISTED_AS_IMPLICIT : zlib@1.2.12%gcc@11.2.0+optimize+pic+shared patches=0d38234 arch=cray-sles15-zen3/ozmcyfj
==> [2022-08-04-09:44:25.258546]    WRITE: hypre@2.24.0%cce@13.0.2~complex~cuda~debug+fortran~gptune~int64~internal-superlu~mixedint+mpi~openmp~rocm+shared~superlu-dist~unified-memory arch=cray-sles15-zen3/62ofdsf [/global/u1/e/elvis/spack-infrastructure/spack/share/spack/modules/cray-sles15-zen3/hypre/2.24.0-cce-13.0.2]
==> [2022-08-04-09:44:25.550468] Module name: cce/13.0.2
==> [2022-08-04-09:44:25.550681] Package directory variable prefix: CCE
==> [2022-08-04-09:44:25.785678] Module name: cce/13.0.2
==> [2022-08-04-09:44:25.785853] Package directory variable prefix: CCE
==> [2022-08-04-09:44:25.995944] Module name: cce/13.0.2
==> [2022-08-04-09:44:25.996162] Package directory variable prefix: CCE
==> [2022-08-04-09:44:26.212011] Module name: cce/13.0.2
==> [2022-08-04-09:44:26.212283] Package directory variable prefix: CCE
==> [2022-08-04-09:44:26.225681]    BLACKLISTED_AS_IMPLICIT : cray-libsci@21.08.1.2%cce@13.0.2~mpi~openmp+shared arch=cray-sles15-zen3/7uzhxpv
==> [2022-08-04-09:44:26.230079]    BLACKLISTED_AS_IMPLICIT : cray-mpich@8.1.15%cce@13.0.2+wrappers arch=cray-sles15-zen3/tb5uxwe
==> [2022-08-04-09:44:26.238876]    WRITE: hypre@2.24.0%gcc@11.2.0~complex~cuda~debug+fortran~gptune~int64~internal-superlu~mixedint+mpi~openmp~rocm+shared~superlu-dist~unified-memory arch=cray-sles15-zen3/mbn7bum [/global/u1/e/elvis/spack-infrastructure/spack/share/spack/modules/cray-sles15-zen3/hypre/2.24.0-gcc-11.2.0]
==> [2022-08-04-09:44:26.385208]    BLACKLISTED_AS_IMPLICIT : cray-libsci@21.08.1.2%gcc@11.2.0~mpi~openmp+shared arch=cray-sles15-zen3/jzbnd6y
==> [2022-08-04-09:44:26.388329]    BLACKLISTED_AS_IMPLICIT : cray-mpich@8.1.15%gcc@11.2.0+wrappers arch=cray-sles15-zen3/3zy6uvs
==> [2022-08-04-09:44:26.398423]    WRITE: papi@6.0.0.1%cce@13.0.2~cuda+example~infiniband~lmsensors~nvml~powercap~rapl~rocm~rocm_smi~sde+shared~static_tools patches=b6d6caa arch=cray-sles15-zen3/3aprcx5 [/global/u1/e/elvis/spack-infrastructure/spack/share/spack/modules/cray-sles15-zen3/papi/6.0.0.1-cce-13.0.2]
==> [2022-08-04-09:44:26.749919] Module name: cce/13.0.2
==> [2022-08-04-09:44:26.750092] Package directory variable prefix: CCE
==> [2022-08-04-09:44:26.762459]    WRITE: papi@6.0.0.1%gcc@11.2.0~cuda+example~infiniband~lmsensors~nvml~powercap~rapl~rocm~rocm_smi~sde+shared~static_tools arch=cray-sles15-zen3/s2y4nrv [/global/u1/e/elvis/spack-infrastructure/spack/share/spack/modules/cray-sles15-zen3/papi/6.0.0.1-gcc-11.2.0]
==> [2022-08-04-09:44:26.897249]    WRITE: papi@6.0.0.1%gcc@11.2.0+cuda+example~infiniband~lmsensors~nvml~powercap~rapl~rocm~rocm_smi~sde+shared~static_tools arch=cray-sles15-zen3/x43djbq [/global/u1/e/elvis/spack-infrastructure/spack/share/spack/modules/cray-sles15-zen3/papi/6.0.0.1-gcc-11.2.0-cuda]
==> [2022-08-04-09:44:27.240985] Module name: gcc/11.2.0
==> [2022-08-04-09:44:27.241199] Package directory variable prefix: GCC
==> [2022-08-04-09:44:27.316093]    BLACKLISTED_AS_IMPLICIT : cuda@11.5.0%gcc@11.2.0~allow-unsupported-compilers~dev arch=cray-sles15-zen3/puekfe3

Spack will generate the modulefiles in its default location, $SPACK_ROOT/share/spack/modules, which is organized by architecture (spack arch) as shown below:

(spack-pyenv) elvis@login34> ls $SPACK_ROOT/share/spack/modules/$(spack arch)/*
/global/homes/e/elvis/spack-infrastructure/spack/share/spack/modules/cray-sles15-zen3/darshan-runtime:
3.3.1-cce-13.0.2  3.3.1-gcc-11.2.0

/global/homes/e/elvis/spack-infrastructure/spack/share/spack/modules/cray-sles15-zen3/hypre:
2.24.0-cce-13.0.2  2.24.0-gcc-11.2.0

/global/homes/e/elvis/spack-infrastructure/spack/share/spack/modules/cray-sles15-zen3/papi:
6.0.0.1-cce-13.0.2  6.0.0.1-gcc-11.2.0  6.0.0.1-gcc-11.2.0-cuda

Let’s change the directory path so that the modulefiles are not inside Spack’s root directory and are easier to remember. For this exercise, let’s generate the modulefiles in your $HOME/spack-infrastructure/modules directory by setting the roots option in the modules section.

# This is a Spack Environment file.
#
# It describes a set of packages to be installed, along with
# configuration settings.
spack:
  config:
    view: false
    concretization: separately
    build_stage: $spack/var/spack/stage
    misc_cache: $spack/var/spack/misc_cache
    concretizer: clingo
  compilers:
  - compiler:
      spec: gcc@11.2.0
      paths:
        cc: cc
        cxx: CC
        f77: ftn
        fc: ftn
      flags: {}
      operating_system: sles15
      target: any
      modules:
      - PrgEnv-gnu
      - gcc/11.2.0
      - craype-x86-milan
      - libfabric
      extra_rpaths: []
  - compiler:
      spec: cce@13.0.2
      paths:
        cc: /opt/cray/pe/craype/default/bin/cc
        cxx: /opt/cray/pe/craype/default/bin/CC
        f77: /opt/cray/pe/craype/default/bin/ftn
        fc: /opt/cray/pe/craype/default/bin/ftn
      flags: {}
      operating_system: sles15
      target: any
      modules:
      - PrgEnv-cray
      - cce/13.0.2
      - craype-x86-milan
      - libfabric
      environment: {}
      extra_rpaths: []

  # add package specs to the `specs` list
  specs:
  - papi %gcc
  - papi %cce
  - hypre %gcc
  - hypre %cce
  - darshan-runtime %gcc
  - darshan-runtime %cce
  - papi +cuda %gcc
  packages:
    all:
      compiler: [gcc@11.2.0, cce@13.0.2]
      providers:
        blas: [cray-libsci]
        mpi: [cray-mpich]
    bzip2:
      version: [1.0.6]
      externals:
      - spec: bzip2@1.0.6
        prefix: /usr
    cray-libsci:
      buildable: false
      externals:
      - spec: cray-libsci@21.08.1.2
        modules:
        - cray-libsci/21.08.1.2
    cray-mpich:
      buildable: false
      externals:
      - spec: cray-mpich@8.1.15 %gcc@11.2.0
        prefix: /opt/cray/pe/mpich/8.1.15/ofi/gnu/9.1
        modules:
        - cray-mpich/8.1.15
        - cudatoolkit/11.5
      - spec: cray-mpich@8.1.15 %cce@13.0.2
        prefix: /opt/cray/pe/mpich/8.1.15/ofi/cray/10.0/
        modules:
        - cray-mpich/8.1.15
        - cudatoolkit/11.5
    cray-pmi:
      buildable: false
      externals:
      - spec: cray-pmi@6.1.1
        modules:
        - cray-pmi/6.1.1
    cuda:
      buildable: false
      version: [11.5.0]
      externals:
      - spec: cuda@11.5.0
        prefix: /opt/nvidia/hpc_sdk/Linux_x86_64/21.11/cuda/11.5
        modules:
        - cudatoolkit/11.5
    diffutils:
      version: [3.6]
      externals:
      - spec: diffutils@3.6
        prefix: /usr
    findutils:
      version: [4.6.0]
      externals:
      - spec: findutils@4.6.0
        prefix: /usr
    libfabric:
      buildable: false
      variants: fabrics=sockets,tcp,udp,rxm
      externals:
      - spec: libfabric@1.11.0.4.114
        prefix: /opt/cray/libfabric/1.11.0.4.114
        modules:
        - libfabric/1.11.0.4.114
    openssl:
      version: [1.1.0i]
      buildable: false
      externals:
      - spec: openssl@1.1.0i
        prefix: /usr
    openssh:
      version: [7.9p1]
      buildable: false
      externals:
      - spec: openssh@7.9p1
        prefix: /usr
    readline:
      version: [7.0]
      buildable: false
      externals:
      - spec: readline@7.0
        prefix: /usr
    tar:
      version: [1.30]
      buildable: false
      externals:
      - spec: tar@1.30
        prefix: /usr
    unzip:
      version: [6.0]
      buildable: false
      externals:
      - spec: unzip@6.0
        prefix: /usr
  modules:
    default:
      enable:
      - tcl
      roots:
        tcl: /global/homes/e/elvis/spack-infrastructure/modules
      tcl:
        blacklist_implicits: true
        hash_length: 0
        naming_scheme: '{name}/{version}-{compiler.name}-{compiler.version}'
        all:
          autoload: direct
          conflict:
          - '{name}'
          environment:
            set:
              '{name}_ROOT': '{prefix}'
          suffixes:
            ^cuda: cuda

  view: true

Now you will see that the modulefiles are written to $HOME/spack-infrastructure/modules.

(spack-pyenv) elvis@login34> spack -d module tcl refresh --delete-tree -y
==> [2022-08-04-09:53:00.452047] Reading config file /global/u1/e/elvis/spack-infrastructure/spack/etc/spack/defaults/config.yaml
==> [2022-08-04-09:53:00.563502] Reading config file /global/u1/e/elvis/spack-infrastructure/spack/var/spack/environments/data_viz/spack.yaml
==> [2022-08-04-09:53:00.617365] Using environment 'data_viz'
==> [2022-08-04-09:53:00.625951] Imported module from built-in commands
==> [2022-08-04-09:53:00.632039] Imported module from built-in commands
==> [2022-08-04-09:53:00.637512] Reading config file /global/u1/e/elvis/spack-infrastructure/spack/etc/spack/defaults/bootstrap.yaml
==> [2022-08-04-09:53:00.654001] DATABASE LOCK TIMEOUT: 3s
==> [2022-08-04-09:53:00.654065] PACKAGE LOCK TIMEOUT: No timeout
==> [2022-08-04-09:53:00.657750] Reading config file /global/u1/e/elvis/spack-infrastructure/spack/etc/spack/defaults/repos.yaml
==> [2022-08-04-09:53:00.670487] Reading config file /global/u1/e/elvis/spack-infrastructure/spack/etc/spack/defaults/modules.yaml
==> [2022-08-04-09:53:00.687615] Reading config file /global/u1/e/elvis/spack-infrastructure/spack/etc/spack/defaults/cray/modules.yaml
==> [2022-08-04-09:53:00.891563]    BLACKLISTED_AS_IMPLICIT : cray-libsci@21.08.1.2%cce@13.0.2~mpi~openmp+shared arch=cray-sles15-zen3/7uzhxpv
==> [2022-08-04-09:53:00.892858]    BLACKLISTED_AS_IMPLICIT : cray-libsci@21.08.1.2%gcc@11.2.0~mpi~openmp+shared arch=cray-sles15-zen3/jzbnd6y
==> [2022-08-04-09:53:00.894129]    BLACKLISTED_AS_IMPLICIT : cray-mpich@8.1.15%cce@13.0.2+wrappers arch=cray-sles15-zen3/tb5uxwe
==> [2022-08-04-09:53:00.895334]    BLACKLISTED_AS_IMPLICIT : cray-mpich@8.1.15%gcc@11.2.0+wrappers arch=cray-sles15-zen3/3zy6uvs
==> [2022-08-04-09:53:00.896502]    BLACKLISTED_AS_IMPLICIT : cuda@11.5.0%gcc@11.2.0~allow-unsupported-compilers~dev arch=cray-sles15-zen3/puekfe3
==> [2022-08-04-09:53:00.904007]    BLACKLISTED_AS_IMPLICIT : zlib@1.2.12%cce@13.0.2+optimize+pic+shared patches=0d38234 arch=cray-sles15-zen3/e2hl6cx
==> [2022-08-04-09:53:00.905394]    BLACKLISTED_AS_IMPLICIT : zlib@1.2.12%gcc@11.2.0+optimize+pic+shared patches=0d38234 arch=cray-sles15-zen3/ozmcyfj
==> [2022-08-04-09:53:03.555915] Regenerating tcl module files
==> [2022-08-04-09:53:03.577058]    WRITE: darshan-runtime@3.3.1%cce@13.0.2~apmpi~apmpi_sync~apxc~hdf5+mpi scheduler=NONE arch=cray-sles15-zen3/uj3wa4a [/global/homes/e/elvis/spack-infrastructure/modules/cray-sles15-zen3/darshan-runtime/3.3.1-cce-13.0.2]
==> [2022-08-04-09:53:04.003818] Module name: cce/13.0.2
==> [2022-08-04-09:53:04.004044] Package directory variable prefix: CCE
==> [2022-08-04-09:53:04.248393] Module name: cce/13.0.2
==> [2022-08-04-09:53:04.248675] Package directory variable prefix: CCE
==> [2022-08-04-09:53:04.484157] Module name: cce/13.0.2
==> [2022-08-04-09:53:04.484420] Package directory variable prefix: CCE
==> [2022-08-04-09:53:04.766465] Module name: cce/13.0.2
==> [2022-08-04-09:53:04.766692] Package directory variable prefix: CCE
==> [2022-08-04-09:53:05.024080] Module name: cce/13.0.2
==> [2022-08-04-09:53:05.024335] Package directory variable prefix: CCE
==> [2022-08-04-09:53:05.043781]    BLACKLISTED_AS_IMPLICIT : cray-mpich@8.1.15%cce@13.0.2+wrappers arch=cray-sles15-zen3/tb5uxwe
==> [2022-08-04-09:53:05.048836]    BLACKLISTED_AS_IMPLICIT : zlib@1.2.12%cce@13.0.2+optimize+pic+shared patches=0d38234 arch=cray-sles15-zen3/e2hl6cx
==> [2022-08-04-09:53:05.055298] Reading config file /global/u1/e/elvis/spack-infrastructure/spack/etc/spack/defaults/packages.yaml
==> [2022-08-04-09:53:05.111091]    WRITE: darshan-runtime@3.3.1%gcc@11.2.0~apmpi~apmpi_sync~apxc~hdf5+mpi scheduler=NONE arch=cray-sles15-zen3/hkxzwvt [/global/homes/e/elvis/spack-infrastructure/modules/cray-sles15-zen3/darshan-runtime/3.3.1-gcc-11.2.0]
==> [2022-08-04-09:53:05.161578]    BLACKLISTED_AS_IMPLICIT : cray-mpich@8.1.15%gcc@11.2.0+wrappers arch=cray-sles15-zen3/3zy6uvs
==> [2022-08-04-09:53:05.164707]    BLACKLISTED_AS_IMPLICIT : zlib@1.2.12%gcc@11.2.0+optimize+pic+shared patches=0d38234 arch=cray-sles15-zen3/ozmcyfj
==> [2022-08-04-09:53:05.171012]    WRITE: hypre@2.24.0%cce@13.0.2~complex~cuda~debug+fortran~gptune~int64~internal-superlu~mixedint+mpi~openmp~rocm+shared~superlu-dist~unified-memory arch=cray-sles15-zen3/62ofdsf [/global/homes/e/elvis/spack-infrastructure/modules/cray-sles15-zen3/hypre/2.24.0-cce-13.0.2]
==> [2022-08-04-09:53:05.469562] Module name: cce/13.0.2
==> [2022-08-04-09:53:05.469791] Package directory variable prefix: CCE
==> [2022-08-04-09:53:05.767046] Module name: cce/13.0.2
==> [2022-08-04-09:53:05.767239] Package directory variable prefix: CCE
==> [2022-08-04-09:53:06.050449] Module name: cce/13.0.2
==> [2022-08-04-09:53:06.050663] Package directory variable prefix: CCE
==> [2022-08-04-09:53:06.295722] Module name: cce/13.0.2
==> [2022-08-04-09:53:06.295923] Package directory variable prefix: CCE
==> [2022-08-04-09:53:06.307895]    BLACKLISTED_AS_IMPLICIT : cray-libsci@21.08.1.2%cce@13.0.2~mpi~openmp+shared arch=cray-sles15-zen3/7uzhxpv
==> [2022-08-04-09:53:06.313024]    BLACKLISTED_AS_IMPLICIT : cray-mpich@8.1.15%cce@13.0.2+wrappers arch=cray-sles15-zen3/tb5uxwe
==> [2022-08-04-09:53:06.321590]    WRITE: hypre@2.24.0%gcc@11.2.0~complex~cuda~debug+fortran~gptune~int64~internal-superlu~mixedint+mpi~openmp~rocm+shared~superlu-dist~unified-memory arch=cray-sles15-zen3/mbn7bum [/global/homes/e/elvis/spack-infrastructure/modules/cray-sles15-zen3/hypre/2.24.0-gcc-11.2.0]
==> [2022-08-04-09:53:06.366559]    BLACKLISTED_AS_IMPLICIT : cray-libsci@21.08.1.2%gcc@11.2.0~mpi~openmp+shared arch=cray-sles15-zen3/jzbnd6y
==> [2022-08-04-09:53:06.369882]    BLACKLISTED_AS_IMPLICIT : cray-mpich@8.1.15%gcc@11.2.0+wrappers arch=cray-sles15-zen3/3zy6uvs
==> [2022-08-04-09:53:06.377335]    WRITE: papi@6.0.0.1%cce@13.0.2~cuda+example~infiniband~lmsensors~nvml~powercap~rapl~rocm~rocm_smi~sde+shared~static_tools patches=b6d6caa arch=cray-sles15-zen3/3aprcx5 [/global/homes/e/elvis/spack-infrastructure/modules/cray-sles15-zen3/papi/6.0.0.1-cce-13.0.2]
==> [2022-08-04-09:53:06.656390] Module name: cce/13.0.2
==> [2022-08-04-09:53:06.656565] Package directory variable prefix: CCE
==> [2022-08-04-09:53:06.670466]    WRITE: papi@6.0.0.1%gcc@11.2.0~cuda+example~infiniband~lmsensors~nvml~powercap~rapl~rocm~rocm_smi~sde+shared~static_tools arch=cray-sles15-zen3/s2y4nrv [/global/homes/e/elvis/spack-infrastructure/modules/cray-sles15-zen3/papi/6.0.0.1-gcc-11.2.0]
==> [2022-08-04-09:53:06.719655]    WRITE: papi@6.0.0.1%gcc@11.2.0+cuda+example~infiniband~lmsensors~nvml~powercap~rapl~rocm~rocm_smi~sde+shared~static_tools arch=cray-sles15-zen3/x43djbq [/global/homes/e/elvis/spack-infrastructure/modules/cray-sles15-zen3/papi/6.0.0.1-gcc-11.2.0-cuda]
==> [2022-08-04-09:53:07.034250] Module name: gcc/11.2.0
==> [2022-08-04-09:53:07.034531] Package directory variable prefix: GCC
==> [2022-08-04-09:53:07.055549]    BLACKLISTED_AS_IMPLICIT : cuda@11.5.0%gcc@11.2.0~allow-unsupported-compilers~dev arch=cray-sles15-zen3/puekfe3

We can see that Spack generated the modulefiles in the format {name}/{version}-{compiler.name}-{compiler.version}, and that the -cuda suffix was added for the PAPI module built with CUDA support.

(spack-pyenv) elvis@login34> ls -l $CI_PROJECT_DIR/modules/$(spack arch)/*
/global/homes/e/elvis/spack-infrastructure/modules/cray-sles15-zen3/darshan-runtime:
total 8
-rw-r--r-- 1 elvis elvis 2245 Aug  4 09:53 3.3.1-cce-13.0.2
-rw-r--r-- 1 elvis elvis 2243 Aug  4 09:53 3.3.1-gcc-11.2.0

/global/homes/e/elvis/spack-infrastructure/modules/cray-sles15-zen3/hypre:
total 8
-rw-r--r-- 1 elvis elvis 1951 Aug  4 09:53 2.24.0-cce-13.0.2
-rw-r--r-- 1 elvis elvis 1943 Aug  4 09:53 2.24.0-gcc-11.2.0

/global/homes/e/elvis/spack-infrastructure/modules/cray-sles15-zen3/papi:
total 12
-rw-r--r-- 1 elvis elvis 2441 Aug  4 09:53 6.0.0.1-cce-13.0.2
-rw-r--r-- 1 elvis elvis 2425 Aug  4 09:53 6.0.0.1-gcc-11.2.0
-rw-r--r-- 1 elvis elvis 2503 Aug  4 09:53 6.0.0.1-gcc-11.2.0-cuda

We can add this directory to MODULEPATH by running the following:

(spack-pyenv) elvis@login34> module use $CI_PROJECT_DIR/modules/$(spack arch)

Next, if we run ml av, we will see the modules generated by Spack that correspond to the installed Spack packages.

(spack-pyenv) elvis@login34> ml av

------------------------------------ /global/homes/e/elvis/spack-infrastructure/modules/cray-sles15-zen3 -------------------------------------
   darshan-runtime/3.3.1-cce-13.0.2        hypre/2.24.0-cce-13.0.2        papi/6.0.0.1-cce-13.0.2         papi/6.0.0.1-gcc-11.2.0
   darshan-runtime/3.3.1-gcc-11.2.0 (D)    hypre/2.24.0-gcc-11.2.0 (D)    papi/6.0.0.1-gcc-11.2.0-cuda
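
To inspect what one of these generated modules does (the autoloaded dependencies, the conflict, and the {name}_ROOT environment variable we configured), you can display it with module show, for example:

(spack-pyenv) elvis@login34> module show papi/6.0.0.1-gcc-11.2.0-cuda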

This concludes the Spack training.

Administration Guide

This page holds documentation for the processes that support the development, deployment and continuous integration activities of Spack infrastructure at NERSC.

Login Access

You can access Cori and Perlmutter; for more details, see Connecting to NERSC. If either system is down, you can access the data transfer nodes (dtn[01-04].nersc.gov) and then hop to the appropriate system. Please check the NERSC MOTD at https://www.nersc.gov/live-status/motd/ for live system updates.

In order to access TDS systems like muller or gerty, you will need to log in to one of the systems (cori, perlmutter, dtn) and then run the following:

ssh dtn01.nersc.gov
ssh gerty

It is probably a good idea to run usgrsu once you are on the correct login node; otherwise, you may be prompted for a password for the e4s user.

The e4s user is a collaboration account, a shared account used for spack deployments. You will need to log in as the e4s user via the usgrsu command, or use sshproxy to get a 24-hour credential and then ssh as the collaboration account. Either way you will be prompted for a password, which is the NERSC password for your own username, not for the e4s user.

Only members of the c_e4s unix group have access to the collaboration account. You can run the following to see the list of users that belong to the group. If you don’t belong to this group and should be part of it, please file a ticket at https://help.nersc.gov

getent group c_e4s

Production Software Stack

The spack stack is installed on the shared filesystem at /global/common/software/spackecp. The project space has a quota limit on both space and inode count. To check the quota, please run the following

cfsquota /global/common/software/spackecp

The production installation of the e4s stack on Perlmutter is stored in the sub-directory perlmutter, with a directory for each stack version as follows

(spack-pyenv) e4s:login22> ls -ld /global/common/software/spackecp/perlmutter/e4s-*
drwxrwsr-x+ 8 e4s spackecp 512 Jun  6 10:09 /global/common/software/spackecp/perlmutter/e4s-21.11
drwxrwsr-x+ 9 e4s spackecp 512 Jan 12 07:34 /global/common/software/spackecp/perlmutter/e4s-22.05
drwxrwsr-x+ 5 e4s spackecp 512 Mar 28 10:24 /global/common/software/spackecp/perlmutter/e4s-22.11

Within the installation you will see several subdirectories, each named with a unique identifier from the CI job. The default symbolic link points to the active production stack

(spack-pyenv) e4s:login22> ls -l /global/common/software/spackecp/perlmutter/e4s-22.11/
total 4
drwxrwsr-x+ 3 e4s spackecp 512 Mar  6 14:40 82028
drwxrwsr-x+ 3 e4s spackecp 512 Mar 28 10:16 82069
drwxrwsr-x+ 3 e4s spackecp 512 Mar 28 08:34 83104
lrwxrwxrwx  1 e4s spackecp   5 Mar 28 10:24 default -> 83104

We have one modulefile per e4s stack, named e4s/<version>, with a symbolic link spack/e4s-<version>. In the modulefile you will see the path to the root installation of spack. As we can see from the example below, the root of spack is located in /global/common/software/spackecp/perlmutter/e4s-22.11/default/spack

(spack-pyenv) e4s:login22> module --redirect --raw show e4s/22.11 | grep root
local root = "/global/common/software/spackecp/perlmutter/e4s-22.11/default/spack"
         spack_setup = pathJoin(root, "share/spack/setup-env.sh")
         spack_setup = pathJoin(root, "share/spack/setup-env.csh")
         spack_setup = pathJoin(root, "share/spack/setup-env.fish")
   remove_path("PATH", pathJoin(root, "bin"))

(spack-pyenv) e4s:login22> ls -l /global/common/software/spackecp/perlmutter/e4s-22.11/default/spack
total 100
drwxrwsr-x+ 2 e4s spackecp   512 Mar 28 08:54 bin
-rw-rw-r--  1 e4s spackecp 55695 Mar 28 08:35 CHANGELOG.md
-rw-rw-r--  1 e4s spackecp  1941 Mar 28 08:35 CITATION.cff
-rw-rw-r--  1 e4s spackecp  3262 Mar 28 08:35 COPYRIGHT
drwxrwsr-x+ 3 e4s spackecp   512 Mar 28 08:35 etc
drwxrwsr-x+ 3 e4s spackecp   512 Mar 28 08:35 lib
-rw-rw-r--  1 e4s spackecp 11358 Mar 28 08:35 LICENSE-APACHE
-rw-rw-r--  1 e4s spackecp  1107 Mar 28 08:35 LICENSE-MIT
-rw-rw-r--  1 e4s spackecp  1167 Mar 28 08:35 NOTICE
drwxrwsr-x+ 3 e4s spackecp   512 Mar 28 08:35 opt
-rw-rw-r--  1 e4s spackecp  2946 Mar 28 08:35 pyproject.toml
-rw-rw-r--  1 e4s spackecp   764 Mar 28 08:35 pytest.ini
-rw-rw-r--  1 e4s spackecp  6522 Mar 28 08:35 README.md
-rw-rw-r--  1 e4s spackecp   699 Mar 28 08:35 SECURITY.md
drwxrwsr-x+ 3 e4s spackecp   512 Mar 28 08:35 share
drwxrwsr-x+ 3 e4s spackecp   512 Mar 28 08:35 var
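
To use this stack, load the modulefile; per the module shown above, it wires up Spack’s setup-env scripts so the spack command becomes available. A hypothetical session (output omitted):

(spack-pyenv) e4s:login22> module load e4s/22.11
(spack-pyenv) e4s:login22> spack --version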

Changing Production stack within a release

To change the production path, you will need to point the default symbolic link at the desired run. First, navigate to the directory containing the production installation. For example, let’s change to the root of e4s-22.11 and remove the symbolic link

cd  /global/common/software/spackecp/perlmutter/e4s-22.11/
unlink default

Next create a symbolic link to the new directory

ln -s <DIRECTORY_ID> default
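
For example, using the e4s-22.11 listing shown earlier, pointing production at build 82069 (an arbitrary choice here) and verifying the link would look like:

cd /global/common/software/spackecp/perlmutter/e4s-22.11/
unlink default
ln -s 82069 default
ls -l default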

Troubleshooting GitLab Runner

Once you are logged in as the e4s user, you can log in to the desired system to restart the runner. You can check the runner status in GitLab by navigating to Settings > CI/CD > Runners. If the GitLab runner is down, you will need to restart it. To check the status of the runner on the system, run systemctl --user status perlmutter-e4s; if you see the following message, the runner is active and running.

● perlmutter-e4s.service - Gitlab runner for e4s runner on perlmutter
  Loaded: loaded (/global/homes/e/e4s/.config/systemd/user/perlmutter-e4s.service; enabled; vendor preset: disabled)
  Active: active (running) since Mon 2023-06-05 10:36:39 PDT; 23h ago
Main PID: 140477 (gitlab-runner)
   Tasks: 47 (limit: 39321)
  Memory: 11.9G
     CPU: 1d 5h 43min 43.685s
  CGroup: /user.slice/user-93315.slice/user@93315.service/app.slice/perlmutter-e4s.service
          └─ 140477 /global/homes/e/e4s/jacamar/gitlab-runner run -c /global/homes/e/e4s/.gitlab-runner/perlmutter.config.toml

If the runner is not active, you can restart it by running

systemctl --user restart perlmutter-e4s

The systemd service files are used for managing the gitlab runners. The files are as follows

(spack-pyenv) e4s:login22> ls -l ~/.config/systemd/user/*.service
-rw-rw-r-- 1 e4s e4s 326 May  9 07:32 /global/homes/e/e4s/.config/systemd/user/muller-e4s.service
-rw-rw-r-- 1 e4s e4s 334 May  9 07:30 /global/homes/e/e4s/.config/systemd/user/perlmutter-e4s.service

The gitlab-runner command should be accessible via the e4s user. To register a runner, run gitlab-runner register and follow the prompts. By default the runner configuration is written to ~/.gitlab-runner/config.toml; however, we recommend you create a separate config file per system or copy the file to a separate location. For instance, to register a runner for muller you can run gitlab-runner register -c ~/.gitlab-runner/muller.config.toml, which writes the runner configuration to ~/.gitlab-runner/muller.config.toml. For more details regarding runner registration please see https://docs.gitlab.com/runner/register/.
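
After registering a runner, the corresponding systemd user service can be enabled and started. For example, for the muller runner managed by the muller-e4s.service file listed above, a hypothetical session would be:

systemctl --user enable muller-e4s
systemctl --user start muller-e4s
systemctl --user status muller-e4s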

Sometimes you may see unexpected results during CI jobs if you have made changes to the GitLab configuration and multiple gitlab-runner processes are running on different nodes. Therefore, we recommend using pdsh to search across all nodes for the process and then terminate it. The command below checks the status of the perlmutter-e4s service across all Perlmutter login nodes.

pdsh -w login[01-40] systemctl --user status perlmutter-e4s 2>&1 < /dev/null
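
If a stray gitlab-runner process turns up on a particular node, a hypothetical cleanup is to log in to that node and stop the user service there:

ssh login05   # illustrative: whichever node the stray process was found on
systemctl --user stop perlmutter-e4s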

Jacamar

The GitLab runners use Jacamar CI; there should be a jacamar.toml file in the following location:

e4s:login27> ls -l ~/.gitlab-runner/jacamar.toml
-rw-rw-r-- 1 e4s e4s 758 Aug 11 08:57 /global/homes/e/e4s/.gitlab-runner/jacamar.toml

Any updates to the Jacamar configuration are picked up by the runner; there is no need to restart the GitLab runner.

The jacamar and jacamar-auth binaries are located in the following directory; if we need to upgrade Jacamar, we should place the new binaries in this location.

e4s:login27> ls -l ~/jacamar/binaries/
total 15684
-rwxr-xr-x 1 e4s e4s 6283264 Jul  7 15:50 jacamar
-rwxr-xr-x 1 e4s e4s 9773296 Jul  7 15:50 jacamar-auth

Test for NERSC System Changes

NERSC uses ReFrame to test system health after maintenance. In order to ensure the earliest possible notification of system changes that will affect E4S builds, a test has been added. This test can be found at https://gitlab.nersc.gov/nersc/consulting/reframe-at-nersc/reframe-nersc-tests.

Contributing Guide

This guide will discuss how one can contribute back to this project. First, you will need to clone this repository locally.

git clone https://software.nersc.gov/NERSC/spack-infrastructure.git

You will need to set up a Personal Access Token in order to clone via HTTPS, since git over ssh is disabled.
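
For example, once you have created a token, one common form of an HTTPS clone with a GitLab personal access token looks like the following, where <TOKEN> is a placeholder for your token:

git clone https://oauth2:<TOKEN>@software.nersc.gov/NERSC/spack-infrastructure.git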

The typical contribution process will be as follows:

git checkout -b <featureX>
git add <file1> <file2> ... <fileN>
git commit -m "COMMIT MESSAGE"
git push 

Please create a feature branch, add the files that need to be changed, commit, and push your changes (for a new branch, the first push may need git push -u origin <featureX>). Next, create a merge request to the main branch.

If you want to reproduce the steps in the CI, we encourage you to review .gitlab-ci.yml and run the instructions to recreate the environment.

Once you clone this repo locally, you can source the setup-env.sh script.

cd spack-infrastructure
source setup-env.sh

This script will create a python environment where you can perform spack builds, and it will set CI_PROJECT_DIR to the root of the spack-infrastructure project.

(spack-pyenv) siddiq90@login31> echo $CI_PROJECT_DIR
/global/homes/s/siddiq90/gitrepos/software.nersc.gov/spack-infrastructure

Troubleshooting CI builds

First, you will need to review the pipeline build at https://software.nersc.gov/NERSC/spack-infrastructure/-/pipelines and log in as the e4s user on the appropriate system. At the top of the pipeline log you will see the location where the project was cloned; typically this is under $CFS/m3503. The content will look something like

Reinitialized existing Git repository in /global/cfs/cdirs/m3503/ci/oGV2kxLA/0/NERSC/spack-infrastructure/.git/

You will need to navigate to that directory and then repeat the steps specified in .gitlab-ci.yml to reproduce the issue.
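
For example, using the clone path from the log line above, a debugging session might begin as follows (the exact commands to rerun depend on the failing job’s script in .gitlab-ci.yml):

cd /global/cfs/cdirs/m3503/ci/oGV2kxLA/0/NERSC/spack-infrastructure
source setup-env.sh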

How to add a new E4S stack

All spack configurations are stored in spack-configs, so if you want to add a new spack configuration, please create a new directory to store it. Use the directory format <site>-e4s-<e4s version> to map an E4S release to a particular system; for example, cori-e4s-22.02 refers to the E4S 22.02 release built for Cori.

You will need to create one or more gitlab jobs in .gitlab-ci.yml so that gitlab can run the pipeline. We recommend you create a scheduled pipeline in order to run the job on a recurring basis. The scheduled pipeline must define the variable PIPELINE_NAME with the name of the gitlab job to run, as sketched below.
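
Shown below is a hypothetical sketch of such a gated job in .gitlab-ci.yml; the job name, tags, and script are illustrative, not the project’s actual CI definition:

perlmutter-e4s-23.05:
  rules:
    - if: '$PIPELINE_NAME == "perlmutter-e4s-23.05"'
  tags: [perlmutter-e4s]
  script:
    - source setup-env.sh
    - spack env activate -d spack-configs/perlmutter-e4s-23.05/gcc
    - spack install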

The production pipelines should not run via a scheduled pipeline; instead, they should be run manually via the web interface. A production pipeline should only be run when one needs to redeploy the entire stack due to a rebuild.

Building User Documentation

The documentation is built using sphinx and hosted on readthedocs. We have enabled Preview Documentation from Pull Requests. When a pull request or push event occurs, the documentation will be rebuilt and hosted. This allows us to preview the changes during the review process.

If you want to build documentation locally, use the following steps:

  1. Create a python virtual environment and activate the environment

python3 -m venv $HOME/nersc-spack
source $HOME/nersc-spack/bin/activate

  2. Install the dependencies

pip install -r docs/requirement.txt

  3. Build the documentation

cd docs
make clean
make html

Shown below is a typical documentation build

(nersc-spack)  ~/Documents/github/spack-infrastructure/docs/ [update_contributing_guide*] make clean
Removing everything under '_build'...
(nersc-spack)  ~/Documents/github/spack-infrastructure/docs/ [update_contributing_guide*] make html 
Running Sphinx v5.1.1
making output directory... done
myst v0.18.0: MdParserConfig(commonmark_only=False, gfm_only=False, enable_extensions=[], disable_syntax=[], all_links_external=False, url_schemes=('http', 'https', 'mailto', 'ftp'), ref_domains=None, highlight_code_blocks=True, number_code_blocks=[], title_to_header=False, heading_anchors=None, heading_slug_func=None, footnote_transition=True, words_per_minute=200, sub_delimiters=('{', '}'), linkify_fuzzy_links=True, dmath_allow_labels=True, dmath_allow_space=True, dmath_allow_digits=True, dmath_double_inline=False, update_mathjax=True, mathjax_classes='tex2jax_process|mathjax_process|math|output_area')
building [mo]: targets for 0 po files that are out of date
building [html]: targets for 3 source files that are out of date
updating environment: [new config] 3 added, 0 changed, 0 removed
reading sources... [100%] spack_configs                                                                                                                                                                                                    
looking for now-outdated files... none found
pickling environment... done
checking consistency... done
preparing documents... done
writing output... [100%] spack_configs                                                                                                                                                                                                     
generating indices... genindex done
writing additional pages... search done
copying static files... done
copying extra files... done
dumping search index in English (code: en)... done
dumping object inventory... done
build succeeded.

The HTML pages are in _build/html.

The documentation page can be accessed by opening the file _build/html/index.html in your browser. Alternatively, you can use the open command from your terminal (open is macOS-specific; on Linux, xdg-open is the equivalent)

open _build/html/index.html

Sphinx will read a configuration file, conf.py, when building the documentation.
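
Shown below is a heavily trimmed, hypothetical sketch of what such a conf.py might contain; the MyST extension is suggested by the myst line in the build output above, and everything else is illustrative rather than the project’s actual configuration.

# conf.py (illustrative sketch, not the project's actual configuration)
project = "NERSC Spack Infrastructure"
extensions = ["myst_parser"]      # parse Markdown sources with MyST
source_suffix = [".rst", ".md"]   # accept both reStructuredText and Markdown
html_theme = "sphinx_rtd_theme"   # assumed: the Read the Docs theme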

Please refer to the Configuration section of the Sphinx documentation for more details.

Conferences

Conference                                          Talk                            Date            Link
SEA Improving Scientific Software Conference 2022   Spack Infrastructure at NERSC   Apr 5th, 2022   PPTX, VIDEO
HPC Software Summit                                 NERSC Spack Infrastructure      Aug 17th, 2022  PDF
E4S at NERSC 2022                                   NERSC Spack Infrastructure      Aug 25th, 2022  PDF
