Configuring Python test coverage reporting for SonarCloud with pytest-cov in GitHub monorepos

Kilian Kluge

You have a Python package within a GitHub monorepo and would like to report test coverage to SonarCloud?

Unfortunately, neither SonarCloud's Python Test Coverage documentation nor its Monorepo Support documentation provides clear instructions. The error messages SonarCloud shows when it cannot match the submitted coverage data to the files it scanned offer some hints, but they aren't instructive either.

Here are the full step-by-step instructions.

Setup

The how-to below assumes that you have the following directory structure:

repository/
├─ ...
├─ package_home/
│  ├─ package/
│  │  ├─ __init__.py
│  │  ├─ main.py
│  │  ├─ ...
│  ├─ tests/
│  │  ├─ __init__.py
│  │  ├─ test_main.py
│  │  ├─ ...
│  ├─ pyproject.toml
│  ├─ ...
├─ ...

Your Python package lives in the package_home directory at the root of your GitHub repository. The package directory contains the actual source code, and the tests are in the tests folder.

We'll assume that you're using pytest with the pytest-cov plugin and that you can run your tests from within the package_home directory:

cd package_home
pytest --cov=package tests/
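
If pytest and pytest-cov aren't installed in your environment yet, both are available from PyPI; assuming you manage dependencies with pip, installing them is a one-liner:

pip install pytest pytest-cov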

Configure SonarCloud

As described in the SonarCloud documentation, you need to set up a separate monorepo project for each package within your repository.

(Note that selecting Setup a monorepo simply allows you to create more than one project associated with a single repository. It does not create an umbrella project, and you go through the same process every time you add a new package to your monorepo.)

Once you've set up the project within SonarCloud, create a sonar-project.properties file in the package_home directory.

This file contains the sonar.projectKey and sonar.organization associated with your new SonarCloud project, as well as the specific configuration for your package:

sonar.projectKey=abc123...
sonar.organization=abc123...

sonar.language=py
sonar.python.version=3

sonar.sources=package
sonar.tests=tests

sonar.python.coverage.reportPaths=coverage.xml

Configure pytest-cov and coverage.py

Let's move on to the tricky part: Creating a coverage report that SonarCloud processes correctly.

The main issue is the paths inside the coverage.xml file: they need to match SonarCloud's understanding of the file locations. In our setup, file paths within SonarCloud always appear to be relative to the package_home directory where the sonar-project.properties file is located.

First, we add a brief section for coverage.py to our pyproject.toml:

[tool.coverage.run]
relative_files = true
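
If you configure coverage.py through a .coveragerc file instead of pyproject.toml, the equivalent setting (under the [run] section) looks like this:

[run]
relative_files = True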

Then, we tweak the way we're calling pytest:

pytest --cov=. --cov-report=xml tests/

Together, these changes achieve two things: in addition to the default terminal output, we get a coverage.xml report, and the path of our main.py is recorded as package/main.py, relative to package_home, rather than as an absolute path.
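
For reference, the relevant part of the generated coverage.xml then contains relative filename attributes along these lines (an abbreviated, purely illustrative excerpt; the exact structure depends on your coverage.py version):

<coverage ...>
  ...
  <class name="main.py" filename="package/main.py" ...>
    ...
  </class>
  ...
</coverage>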

Set up a GitHub Actions job

Since SonarCloud's automatic scanning is not available for monorepos, we need to report data for all packages through the sonarsource/sonarcloud-github-action.

The relevant bits of the job look as follows:

      - name: Run tests
        working-directory: package_home
        run: |
          pytest --cov=. --cov-report=xml tests/

      - name: SonarCloud Scan
        uses: sonarsource/sonarcloud-github-action@master
        with:
          projectBaseDir: package_home
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}

That's it, you're ready to report your first coverage data to SonarCloud.

If your workflow collects coverage information in multiple places, you can use GitHub Actions' cache action to collect and submit this data.
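
A rough sketch of that approach (the cache key and path are placeholders to adapt to your workflow): each job that produces a coverage report stores it under a shared key, and the job running the SonarCloud scan restores it before the scan step.

      # In every job that produces coverage data:
      - name: Cache coverage report
        uses: actions/cache@v4
        with:
          path: package_home/coverage.xml
          key: coverage-${{ github.run_id }}

      # Add the same cache step to the job that runs the SonarCloud scan,
      # placed before the scan step, so the report is restored from the cache.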
