SBOM-Fridays: IV. Closed Circuit Setup: Vulnerability Scanner, Background Processor for SBOM Generation and Blocking Release Test

Mister P

An SBOM generated at runtime can already add significant value over SBOMs generated at build or compile time. But to truly beef up your security processes, here are some neat complementary building blocks that together form a sturdy trident of vulnerability mitigation. The three parts are:

  • Our SBOM artifacts for static and (multiplied by X releases) dynamic vulnerability assessment.

  • A vulnerability scanner that continuously checks for newly discovered vulnerabilities.

  • A blocking test for when vulnerabilities are discovered during the build/release process.

If you really want to go fully automated, you could, upon discovery of a vulnerability, set up a poller against the NuGet/npm repositories that pulls fresh data until a patch appears, and automatically create a PR for it (as Dependabot and similar tools do). I'm not saying it is a smart or wise thing to do, just saying you can.

Since the last post dealt extensively with the process of creating the SBOM, we'll keep that to a minimum here. We'll pick up with post-processing SBOMs in the next installment.

I already covered the blocking unit test in an older post on this blog. Here, I'll just show some possible extensions to it and assess the trade-offs they bring. Since the same flows can be reused for a continuously running scanner process, we'll discuss use cases and opportunities for both.

The test we will start from, as shown in the older blog post, looks like this:

[Fact]
public async Task CheckPackageVulnerabilities()
{
    // Arrange
    var vulnerableNuGetPackages = new Dictionary<NuGetInfo, List<PackageVulnerability>>();

    // Act
    foreach (var projectFile in _projectFiles)
    {
        foreach (var packageReference in MarkUp.Xml.ListNuGetPackages(projectFile))
        {
            var metadata = await _nuGetRegistryClient.GetPackageMetadataAsync(packageReference, new());
            var nuGetVulnerabilityInfo = metadata.Vulnerabilities ?? new List<PackageVulnerabilityMetadata>();
            var osvVulnerabilityInfo = await _osvRegistryClient.ListPackageVulnerabilitiesAsync("NuGet", packageReference.NuGetPackage, packageReference.Version, new());

            if (nuGetVulnerabilityInfo.Any() || osvVulnerabilityInfo.Count != 0)
            {
                var nugetInfo = new NuGetInfo
                {
                    Project = packageReference.Project,
                    NuGetPackage = packageReference.NuGetPackage,
                    Version = packageReference.Version
                };

                vulnerableNuGetPackages.Add(nugetInfo, osvVulnerabilityInfo.Concat(nuGetVulnerabilityInfo.Select(x => new PackageVulnerability
                {
                    Url = x.AdvisoryUrl?.AbsoluteUri ?? string.Empty,
                    Details = x.Severity.ToString()
                }).ToList()).ToList());
            }
        }
    }

    // Assert
    Assert.Empty(vulnerableNuGetPackages);
}

This test takes under 5 minutes to complete, which for me is acceptable for a release pipeline, so it can block potentially vulnerable releases. You can obviously run the same test for nested packages and their vulnerabilities, should your business processes require that. You again take the files we uploaded to the production server and chop them into package references to call the public repositories with.

In the front end, we can apply the same principle:

[Fact]
public async Task CheckPackageVulnerabilities()
{
    var uiProj = Architecture
        .ListSolutionProjectPaths()
        .SingleOrDefault(x => x.EndsWith($"{Application.UI}.csproj"));

    var distinctPackages = JsonContracts.Dependencies
        .ListPackageDependencies(Path.Combine(Path.GetDirectoryName(uiProj), "ClientApp", "package.json"));

    var vulnerabilitiesFound = new List<PackageVulnerability>();

    foreach (var package in distinctPackages)
    {
        var name = package.Key;
        var version = package.Value?.ToString();
        var metadata = await _npmRegistryClient.FetchFullPackageMetadataAsync(name, version, new());

        if (metadata?.Vulnerabilities is not null && metadata.Vulnerabilities.Any())
        {
            vulnerabilitiesFound.AddRange(metadata.Vulnerabilities);
        }
    }

    Assert.Empty(vulnerabilitiesFound); // strict zero-vulnerability policy (audit mode)
    Assert.DoesNotContain(vulnerabilitiesFound, v => v.Ratings.Any(r => r.Score > 5)); // more lenient alternative for releases: fail only on scores above 5
}

Notice the two Assert calls: you can use them to define and document your business rules, bringing flexible security to your organization. Here too you could fetch the same data for nested packages, but I personally wouldn't add that to the release pipeline. With easily over a thousand npm packages, at about 3 or 4 API calls apiece (if you add in the NVD info), you can imagine this won't be finished over a shot of espresso.

For the vulnerability scanner itself, I've added a .NET background service that runs every six hours. For me that's enough, but if your organization requires more assurance, you can obviously tend to those needs by increasing the frequency to your liking. The following snippet is set up for mono-repos with multiple back-end projects and a single front-end (no nested package.json files). You can easily adapt it to your architecture if needed.

public async Task DoWork(CancellationToken cancellationToken)
{
    try
    {
        while (!cancellationToken.IsCancellationRequested)
        {
            Dictionary<string, string> activityReport = new()
            {
                { "Scan Started", DateTime.UtcNow.ToString()}
            };

            var sourceFiles = Architecture.ListSolutionSourceFilesPaths(_env.IsProduction());
            var packageJson = sourceFiles.FirstOrDefault(x => x.EndsWith("package.json")); 
            var csprojs = sourceFiles.Where(x => x.EndsWith(".csproj"));
            var npmVulnerables = new List<NpmVulnerabilityInfo>();
            var nuGetVulnerables = new Dictionary<NuGetInfo, List<PackageVulnerability>>();
            var topLevelNpmPackagesCount = 0;
            var topLevelNuGetPackagesCount = 0;

            if (!string.IsNullOrWhiteSpace(packageJson) && File.Exists(packageJson))
            {
                var topLevelNpmPackages = Utilities.ClientContracts.JsonContracts
                    .Dependencies
                    .ListPackageDependencies(packageJson);

                topLevelNpmPackagesCount = topLevelNpmPackages.Count();

                foreach (var package in topLevelNpmPackages)
                {
                    var name = package.Key;
                    var version = package.Value?.ToString();

                    try
                    {
                        var metadata = await _npmRegistryClient.FetchFullPackageMetadataAsync(name, version, cancellationToken);

                        if (metadata?.Vulnerabilities is not null && metadata.Vulnerabilities.Any())
                        {
                            npmVulnerables.Add(new NpmVulnerabilityInfo
                            {
                                Project = Architecture.PetFaceUI,
                                NpmPackage = name,
                                Version = version,
                                Vulnerabilities = metadata.Vulnerabilities ?? new List<PackageVulnerability>()
                            });
                        }
                    }
                    catch (Exception ex)
                    {
                        _telemetry.TrackException(ex);
                    }
                }
            }

            activityReport.Add("Amount of NPM Packages Scanned", topLevelNpmPackagesCount.ToString());
            activityReport.Add("Amount of new NPM Vulnerabilities", npmVulnerables.Count.ToString());

            foreach (var csproj in csprojs)
            {
                foreach (var packageReference in MarkUp.Xml.ListNuGetPackages(csproj))
                {
                    try
                    {
                        topLevelNuGetPackagesCount++;
                        var metadata = await _nuGetRegistryClient.GetPackageMetadataAsync(packageReference, cancellationToken);
                        var vulnerabilities = metadata.Vulnerabilities ?? new List<PackageVulnerabilityMetadata>();
                        var osvVulnerabilityInfo = await _osvRegistryClient.ListPackageVulnerabilitiesAsync("NuGet", packageReference.NuGetPackage, packageReference.Version, cancellationToken);

                        if (vulnerabilities.Any() || osvVulnerabilityInfo.Count != 0)
                        {
                            var nugetInfo = new NuGetInfo
                            {
                                Project = packageReference.Project,
                                NuGetPackage = packageReference.NuGetPackage,
                                Version = packageReference.Version
                            };

                            nuGetVulnerables.Add(nugetInfo, osvVulnerabilityInfo.Concat(vulnerabilities.Select(x => new PackageVulnerability
                            {
                                Url = x.AdvisoryUrl?.AbsoluteUri ?? string.Empty,
                                Details = x.Severity.ToString()
                            }).ToList()).ToList());
                        }
                    }
                    catch (Exception ex)
                    {
                        _telemetry.TrackException(ex);
                    }
                }
            }

            activityReport.Add("Amount of csprojs Scanned", csprojs.Count().ToString());
            activityReport.Add("Amount of NuGet Packages Scanned", topLevelNuGetPackagesCount.ToString());
            activityReport.Add("Amount of new NuGet Vulnerabilities", nuGetVulnerables.Count.ToString());

            if (nuGetVulnerables.Any())
            {
                activityReport.Add("NuGet Vulnerabilities found in:", string.Join(", ", nuGetVulnerables.Select(x => $"{x.Key.Project}: {x.Key.NuGetPackage} - {x.Key.Version} (severity: {string.Join(", ", x.Value.Select(v => v.Details))})")));
            }

            if (npmVulnerables.Any())
            {
                activityReport.Add("NPM Vulnerabilities found in:", string.Join(", ", npmVulnerables.Select(x => $"{x.Project}: {x.NpmPackage} - {x.Version} ({string.Join(", ", x.Vulnerabilities.Select(v => v.Description))})")));
            }

            if (nuGetVulnerables.Any() || npmVulnerables.Any())
            {
                var alertJob = new AlertVulnerabilityJob
                {
                    NuGetVulnerabilities = nuGetVulnerables,
                    NpmVulnerabilities = npmVulnerables
                };

                await _generateSbomJobQueue.QueueAsync(new GenerateSbomJob
                {
                    Solution = "PetFace",
                    Version = ApplicationBuilder.Version,
                    Email = "pepijn@dogface.be",
                    Environment = _env.IsProduction() ? "Production" : _env.IsDevelopment() ? "Development" : "Staging"
                },
                cancellationToken);

                await _alertVulnerabilityJobQueue.QueueAsync(alertJob, cancellationToken);
            }

            activityReport.Add("Scan Finished", DateTime.UtcNow.ToString());

            _telemetry.TrackEvent("Package Vulnerability Scan", activityReport);

            await Task.Delay(Time.Hourly * 6, cancellationToken);
        }
    }
    catch (Exception ex)
    {
        _telemetry.TrackException(ex);
    }
}
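
As an aside, the `await Task.Delay(Time.Hourly * 6, cancellationToken)` at the bottom of the loop can also be expressed with `PeriodicTimer` (available since .NET 6), which fires on a fixed cadence regardless of how long each individual scan takes. A minimal sketch, with the loop body abstracted into a delegate (the `ScanScheduler` name is my own, not part of the actual service):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

public static class ScanScheduler
{
    // Runs scanOnce immediately, then again on every timer tick, until the
    // token is cancelled (WaitForNextTickAsync throws OperationCanceledException).
    public static async Task RunScanLoopAsync(
        Func<CancellationToken, Task> scanOnce,
        TimeSpan period,
        CancellationToken ct)
    {
        using var timer = new PeriodicTimer(period);
        do
        {
            await scanOnce(ct);
        }
        while (await timer.WaitForNextTickAsync(ct));
    }
}
```

With this shape, the six-hour cadence becomes a constructor-style parameter instead of a magic number buried at the end of the loop.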

For the implementations of the external API calls, see part III of this series. You might have to scroll a bit to find them though!

What does this method do? It iterates over frameworks and libraries, fetches NuGet and npm registry info along with any records in the OSV database, and distills this information into a dedicated VulnerabilityInfo class.
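
For reference, the OSV lookup behind `ListPackageVulnerabilitiesAsync` boils down to a single POST against the public OSV.dev query endpoint. A minimal sketch, assuming nothing about the actual client from part III (the `OsvQuery` wrapper and its types are my own):

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Text.Json;
using System.Text.Json.Serialization;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical minimal wrapper around the public OSV.dev query endpoint
// (https://api.osv.dev/v1/query); not the client used in this series.
public static class OsvQuery
{
    public record OsvPackage(
        [property: JsonPropertyName("name")] string Name,
        [property: JsonPropertyName("ecosystem")] string Ecosystem);

    public record OsvRequest(
        [property: JsonPropertyName("package")] OsvPackage Package,
        [property: JsonPropertyName("version")] string Version);

    // The OSV query payload: { "package": { "name", "ecosystem" }, "version" }.
    public static string BuildRequestJson(string ecosystem, string package, string version) =>
        JsonSerializer.Serialize(new OsvRequest(new OsvPackage(package, ecosystem), version));

    public static async Task<string> QueryAsync(
        HttpClient http, string ecosystem, string package, string version, CancellationToken ct)
    {
        using var content = new StringContent(
            BuildRequestJson(ecosystem, package, version), Encoding.UTF8, "application/json");
        using var response = await http.PostAsync("https://api.osv.dev/v1/query", content, ct);
        response.EnsureSuccessStatusCode();
        // The response JSON contains a "vulns" array when there are hits.
        return await response.Content.ReadAsStringAsync(ct);
    }
}
```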

When vulnerabilities are found, three things happen:

  • We create an AlertVulnerabilityJob, containing all discovered vulnerabilities, and put it on a queue for further actions. We use another background service to notify the stakeholders or engineering teams.

  • We create a GenerateSbomJob, briefly discussed in the previous installment. By working with queues, we effectively decouple our processes.

  • We track this process in telemetry/logging, to prove operational integrity and provide audit value.
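
The job queues referred to above can be as simple as a wrapper around `System.Threading.Channels`; a minimal sketch (the `JobQueue<T>` type is an assumption, not the actual implementation behind `_generateSbomJobQueue`):

```csharp
using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks;

// Hypothetical in-process job queue, illustrating the producer/consumer
// decoupling between the scanner and its follow-up background services.
public class JobQueue<T>
{
    private readonly Channel<T> _channel = Channel.CreateUnbounded<T>();

    // Called by the scanner when vulnerabilities are found.
    public ValueTask QueueAsync(T job, CancellationToken ct) =>
        _channel.Writer.WriteAsync(job, ct);

    // Called by the consuming background service (e.g. the alert processor).
    public ValueTask<T> DequeueAsync(CancellationToken ct) =>
        _channel.Reader.ReadAsync(ct);
}
```

The scanner only writes; the consumer awaits `DequeueAsync` in its own loop, so neither side blocks the other.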

Essentially, it’s a closed-circuit, automated vulnerability detection loop, designed to integrate with SBOM generation and notification mechanisms, providing a little peace of mind for your CISO.

Just like in the engine, we could choose to include NVD info as well, but since we’re mainly interested in detection, I’m opting to keep these loops lean. Once a vulnerability has been discovered, best practice is still to put some human intelligence on it. We achieve this by sending out mails, notifications, … from the AlertVulnerabilityJobProcessor.

This was the side quest I wanted to walk you through. Even though it’s very lightweight, the value it adds to your solution’s security and auditability is considerable. For me personally, this may even be the biggest advantage of putting your source files on the production server.

For CISO and compliance purposes, you could easily expand these scans to include license information and validate it against a list of allowed licenses, to get even more out of this approach. In the current software landscape, license changes are not that rare anymore.
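
As an illustration, such a license gate could look like this; a hedged sketch where the allowlist, the component tuple shape, and the `LicensePolicy` name are all assumptions:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical license allowlist check, assuming license IDs have already
// been extracted (e.g. from the SBOM's component metadata).
public static class LicensePolicy
{
    private static readonly HashSet<string> Allowed =
        new(StringComparer.OrdinalIgnoreCase) { "MIT", "Apache-2.0", "BSD-3-Clause" };

    // Returns every component whose declared license is not on the allowlist.
    public static List<(string Package, string License)> FindViolations(
        IEnumerable<(string Package, string License)> components) =>
        components.Where(c => !Allowed.Contains(c.License)).ToList();
}
```

The result could feed the same AlertVulnerabilityJob queue, so license violations surface through the identical notification path.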

Either way, I’m really interested in how you or your CISO look at this. Leave your insights and considerations in the comments below!

The next two parts of this series will be of similar depth and touch upon:

  • Post-processing your SBOM instance

  • Dashboard visualizations of our SBOM-information

Hope to see you there!

Thank you for reading.
