Example CI workflows from DESC repositories

Here we describe a selection of CI workflows taken from active DESC repositories. These demonstrate CI at DESC in action.

We go through each example step by step, highlighting any additional capabilities of GitHub Actions that were not covered by the examples in the previous sections.

The DESC CI workflows shown here were taken from their repositories in January 2023.

The Core Cosmology Library (CCL)

  1 name: continuous-integration
  2 on:
  3  push:
  4    branches:
  5      - main
  6      - master
  7      - releases/*
  8  pull_request: null
  9
 10 jobs:
 11  build:
 12    runs-on: ${{ matrix.os }}
 13    defaults:
 14      run:
 15        shell: bash -l {0}
 16    strategy:
 17      fail-fast: false
 18      matrix:
 19        os:
 20          - macos-11
 21          - ubuntu-latest
 22        py:
 23          - 3.8
 24
 25    steps:
 26      - name: cancel previous runs
 27        uses: styfle/cancel-workflow-action@0.6.0
 28        with:
 29          access_token: ${{ github.token }}
 30
 31      - uses: actions/checkout@v2
 32
 33      - uses: conda-incubator/setup-miniconda@v2
 34        with:
 35          python-version: ${{ matrix.py }}
 36          channels: conda-forge,defaults
 37          channel-priority: strict
 38          show-channel-urls: true
 39          miniforge-version: latest
 40          miniforge-variant: Mambaforge
 41
 42      - name: remove homebrew
 43        if: matrix.os == 'macos-11'
 44        run: |
 45          curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/uninstall.sh -o uninstall_homebrew.sh
 46          chmod u+x ./uninstall_homebrew.sh
 47          ./uninstall_homebrew.sh -f -q >& /dev/null
 48          rm -f uninstall_homebrew.sh
 49
 50      - name: lint
 51        run: |
 52          export MAMBA_NO_BANNER=1
 53          mamba install flake8
 54          flake8 pyccl
 55          flake8 --exclude=data benchmarks
 56          if [[ `grep "$(printf '\t')" pyccl/*.py` != "" ]]; then
 57            exit 1
 58          fi
 59
 60      - name: install deps
 61        run: |
 62          mamba install \
 63            pip \
 64            numpy nose coveralls pyyaml gsl fftw cmake swig scipy \
 65            compilers pkg-config setuptools_scm pytest pandas pytest-cov \
 66            cython "camb>=1.3" isitgr traitlets fast-pt
 67
 68          if [[ ${MATRIX_OS} == "macos-11" ]]; then
 69            mamba install llvm-openmp
 70            echo "DYLD_FALLBACK_LIBRARY_PATH=${CONDA_PREFIX}/lib" >> $GITHUB_ENV
 71            SDKROOT=$(xcrun --sdk macosx --show-sdk-path)
 72            echo "SDKROOT: ${SDKROOT}"
 73            echo "SDKROOT=${SDKROOT}" >> $GITHUB_ENV
 74            echo "CONDA_BUILD_SYSROOT=${SDKROOT}" >> $GITHUB_ENV
 75          fi
 76        env:
 77          MATRIX_OS: ${{ matrix.os }}
 78
 79      - name: install class
 80        run: |
 81          if [[ ${MATRIX_OS} == "macos-11" ]]; then
 82            export LDFLAGS="-L/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/lib"
 83            export LDFLAGS="$LDFLAGS -L/Users/runner/miniconda3/envs/test/lib"
 84            . ci_scripts/install_class_osx.sh
 85          else
 86            . ci_scripts/install_class_linux.sh
 87          fi
 88        env:
 89          MATRIX_OS: ${{ matrix.os }}
 90
 91      - name: build CCL
 92        run: |
 93          python setup.py build
 94          python setup.py develop
 95
 96      - name: c unit tests
 97        run: |
 98          cd build
 99          make -j4
100          CLASS_PARAM_DIR=./extern/share/class/ OMP_NUM_THREADS=2 ./check_ccl
101
102      - name: python unit tests
103        run: |
104          OMP_NUM_THREADS=2 pytest -vv pyccl --cov=pyccl
105
106      - name: benchmarks
107        run: |
108          OMP_NUM_THREADS=2 pytest -vv benchmarks --cov=pyccl --cov-append
109
110      - name: coveralls
111        env:
112          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
113        run: |
114          coveralls --service=github
115
116 #  build-doc-only-ubuntu:
117 #    runs-on: ubuntu-latest
118 #
119 #    steps:
120 #      - uses: actions/checkout@v2
121 #      - name: install latex
122 #        run: |
123 #
124 #          sudo apt install \
125 #            texlive texlive-bibtex-extra texlive-science texlive-publishers \
126 #            latexmk python3-sphinx python3-sphinx-rtd-theme python3-nbconvert \
127 #            python3-jupyter-client jupyter-client jupyter-nbconvert sphinx-common \
128 #            pandoc python3-setuptools \
129 #            -y
130 #          sudo pip3 install mkauthlist
131 #
132 #      - name: build docs
133 #        run: |
134 #          cd doc/0000-ccl_note
135 #          make

  • The workflow is triggered by pushes to the main, master, or any releases/* branch. It is also triggered by pull requests to any branch.

  • There is one job in the workflow, called build, which uses a strategy: matrix: to spawn two jobs, testing the code on ubuntu-latest and macos-11, both using Python version 3.8.

  • Now for the steps of the build job:

    1. The styfle/cancel-workflow-action@0.6.0 GitHub Action is called to cancel any previous runs of this CI workflow that are still in progress (a built-in alternative is sketched after this list). The github.token variable is an automatically created and unique GITHUB_TOKEN secret that can be used to authenticate in a workflow run (more here).

    2. The actions/checkout@v2 GitHub Action is called to check out the repository onto the host machine.

    3. The conda-incubator/setup-miniconda@v2 GitHub Action is called to install Miniconda onto the host machine with the specified version of Python from the strategy matrix.

    4. If this is the macos-11 runner, uninstall the Homebrew package manager.

    5. Lint the code in the pyccl and benchmarks folders using Flake8.

    6. Install the Python dependencies using Mamba (see the tips below about setting environment variables in this step).

    7. Install CLASS (Cosmic Linear Anisotropy Solving System) using the correct script (depending on the runner) within the ci_scripts directory.

    8. Build CCL using setup.py.

    9. Run the C unit tests, then the Python unit tests and benchmarks using pytest.

    10. Upload code coverage results output by pytest.
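
As an aside, newer workflows can get the same cancellation behaviour as step 1 without a third-party action, using GitHub Actions' built-in concurrency setting (a minimal sketch, not part of the CCL workflow):

    concurrency:
      # One group per workflow and branch; cancel older runs still in progress.
      group: ${{ github.workflow }}-${{ github.ref }}
      cancel-in-progress: true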

This example has quite a bit more setup to get the host machine ready for the unit tests compared to the simple examples in our demo code, particularly to accommodate testing on the macOS operating system. Below are a few more details about GitHub Actions syntax not covered by the examples in the previous sections of this guide.

Tip

You can download resources onto the host machine using terminal commands just like you would on your own machine (using wget, curl, etc.).
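
For example (a minimal sketch, not from any of the workflows in this section; the URL is a placeholder), a step can fetch and unpack a data file before the tests run:

    - name: download test data
      run: |
        # The URL below is a placeholder for wherever your data lives.
        curl -fsSL https://example.com/test_data.tgz -o test_data.tgz
        tar -xzf test_data.tgz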

Tip

You can evaluate expressions in workflows to give you more control over which parts of your workflow run under certain conditions, for example the if: expression on line 43. See the documentation here for more details.
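
As a minimal sketch (the step and condition are illustrative, not from the workflow above), an if: expression can restrict a step to a single matrix entry:

    - name: linux-only step
      # This step is skipped on every runner except ubuntu-latest.
      if: matrix.os == 'ubuntu-latest'
      run: echo "running on Linux"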

Tip

You can set environment variables on the host machine using env:. The scope of these variables depends on where they are set. For example, MATRIX_OS, set on lines 76-77, is only visible to the commands within the install deps step.
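
The sketch below (with made-up variable names) illustrates the three scopes: a workflow-level env: is visible to every job, a job-level env: to every step of that job, and a step-level env: only to the commands of that step:

    on: push

    env:
      GLOBAL_VAR: seen-by-all-jobs       # workflow scope

    jobs:
      build:
        runs-on: ubuntu-latest
        env:
          JOB_VAR: seen-by-all-steps     # job scope
        steps:
          - name: print variables
            env:
              STEP_VAR: seen-here-only   # step scope
            run: echo "$GLOBAL_VAR $JOB_VAR $STEP_VAR"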

Tip

You can also set environment variables within a step that are available to any subsequent steps in a workflow job by defining or updating the environment variable and writing this to the GITHUB_ENV environment file (see lines 68-75). The step that creates or updates the environment variable does not have access to the new value, but all subsequent steps in a job will have access.
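
For example (a minimal sketch with a made-up variable name):

    - name: set a variable for later steps
      # MY_VAR is written to the GITHUB_ENV file; it is not yet
      # visible as an environment variable within this step.
      run: echo "MY_VAR=hello" >> $GITHUB_ENV

    - name: use the variable
      # Every subsequent step in the job sees MY_VAR.
      run: echo "$MY_VAR"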

imSim

name: imSim CI

on:
    push:
        branches:
            - main
            - releases/*

    pull_request:
        branches:
            - main
            - releases/*

jobs:
    build:
        runs-on: ${{ matrix.os }}

        strategy:
            matrix:
                # For now, just ubuntu, 3.8.  Can add more later.
                os: [ ubuntu-latest ]
                py: [ 3.8 ]
                CC: [ gcc ]
                CXX: [ g++ ]

        defaults:
            run:
                # cf. https://github.com/conda-incubator/setup-miniconda#important
                shell: bash -l {0}

        steps:
            - uses: actions/checkout@v2

            - name: Setup conda
              uses: conda-incubator/setup-miniconda@v2
              with:
                  activate-environment: stack
                  python-version: 3.8
                  condarc-file: etc/.condarc

            - name: Install conda deps
              run: |
                conda info
                conda list
                conda install -y mamba
                mamba install -y --file etc/standalone_conda_requirements.txt
                conda info
                conda list
            - name: Install pip deps
              run: |
                # We need to get batoid onto conda, but for now, this is a separate step.
                pip install batoid
                pip install skyCatalogs==1.2.0
                conda info
                conda list
            - name: Install rubin_sim and rubin_sim_data
              run: |
                # Do the minimal installation from source to avoid installing
                # a bunch of unneeded packages.
                git clone https://github.com/lsst/rubin_sim.git
                cd rubin_sim
                pip install -e .
                cd ..
                mkdir rubin_sim_data
                # Just get the skybrightness and throughputs data for now.
                curl https://s3df.slac.stanford.edu/groups/rubin/static/sim-data/rubin_sim_data/skybrightness_may_2021.tgz | tar -C rubin_sim_data -xz
                curl https://s3df.slac.stanford.edu/groups/rubin/static/sim-data/rubin_sim_data/throughputs_aug_2021.tgz | tar -C rubin_sim_data -xz
            - name: Install imSim
              run:
                pip install .

            - name: Install test deps
              run:
                conda install -y pytest nose

            - name: Run tests
              run: |
                export RUBIN_SIM_DATA_DIR=`pwd`/rubin_sim_data
                eups list lsst_distrib
                pytest

  • The workflow is triggered by pushes and pull requests to the main and releases/* branches.

  • There is one job in the workflow, called build, which uses a strategy: matrix: to spawn one job, testing the code on ubuntu-latest using Python version 3.8.

  • Now for the steps of the build job:

    1. The actions/checkout@v2 Action is called to check out the repository onto the host machine.

    2. The conda-incubator/setup-miniconda@v2 Action is called to install Miniconda onto the host machine with a specified condarc file.

    3. Install Mamba and use it to install dependencies from the standalone_conda_requirements.txt file.

    4. Install any dependencies not on Conda using pip.

    5. Manually install rubin_sim from source and download the skybrightness and throughputs data.

    6. Install imSim onto the host machine using pip.

    7. Install dependencies needed for testing.

    8. Run unit tests.

tables_io

# This workflow will install Python dependencies, run tests and lint with a variety of Python versions
# For more information see: https://help.github.com/actions/language-and-framework-guides/using-python-with-github-actions

name: Python package

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

jobs:
  build:

    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: [3.8, 3.9, "3.10"]

    steps:
    - uses: actions/checkout@v2
    - name: Set up Python ${{ matrix.python-version }}
      uses: actions/setup-python@v2
      with:
        python-version: ${{ matrix.python-version }}
    - name: Install dependencies
      run: |
        sudo apt install -y libopenmpi-dev libhdf5-mpi-dev
        python -m pip install --upgrade pip
        python -m pip install pylint pytest pytest-cov
        python -m pip install jupyter
        if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
        CC="mpicc" HDF5_MPI="ON" pip install --upgrade --force-reinstall --no-binary=h5py h5py
        pip install .
        pip install .[dev]
      env:
        CC: mpicc
        HDF5_MPI: ON
    - name: Lint with pylint
      run: |
        # stop the build if there are Pylint errors
        pylint --disable=all --extension-pkg-whitelist=numpy --init-hook='import sys; sys.setrecursionlimit(8 * sys.getrecursionlimit())' src/tables_io
    - name: Lint with flake8
      run: |
        # stop the build if there are Python syntax errors or undefined names
        flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics
        # exit-zero treats all errors as warnings. The GitHub editor is 127 chars wide
        flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics
    - name: Test with pytest
      run: |
        python -m pytest --cov-report=xml tests
    - name: Upload coverage to Codecov
      uses: codecov/codecov-action@v1
      with:
        file: ./coverage.xml
        flags: unittests
        env_vars: OS,PYTHON
        name: codecov-umbrella
        fail_ci_if_error: true

  • The workflow is triggered by pushes and pull requests to the main branch.

  • There is one job in the workflow, called build, which uses a strategy: matrix: to spawn three jobs, testing the code on Ubuntu using Python versions 3.8, 3.9 and 3.10.

  • Now for the steps of the build job:

    1. The actions/checkout@v2 Action is called to check out the repository onto the host machine.

    2. The actions/setup-python@v2 Action is called to install the specified version of Python onto the host machine.

    3. Install the OpenMPI and parallel HDF5 system libraries, upgrade pip, install the Python dependencies, then rebuild h5py against parallel HDF5 (a way to verify this is sketched after this list).

    4. Lint the code using pylint and flake8.

    5. Run unit tests.

    6. Upload code coverage results to Codecov.
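
Following up on step 3, a quick way to verify that the rebuilt h5py really was compiled against parallel HDF5 is to check h5py.get_config().mpi in an extra step (a hypothetical addition, not part of the tables_io workflow):

    - name: check h5py MPI support
      run: |
        # Prints True if h5py was built against a parallel (MPI) HDF5.
        python -c "import h5py; print(h5py.get_config().mpi)"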

Note

Testing your code on many different Python versions is a great way to keep your code stable for a wide userbase. However, in some cases, like the dependency list in this repository, you will have to manually pin the correct versions (or version ranges) of your package dependencies, depending on the version of Python being installed. The format for this is the same whether you list the dependencies in the requirements.txt file or the pyproject.toml file. For example, numpy>=1.21.0;python_version>="3.8" says to install numpy version 1.21.0 or higher only if the Python version is 3.8 or higher.
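
As a short illustrative requirements.txt fragment (the package names and version bounds here are placeholders, not taken from tables_io):

    # Select a numpy version range based on the Python version being installed.
    numpy>=1.21.0;python_version>="3.8"
    numpy>=1.17.0,<1.22.0;python_version<"3.8"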