Creating a Jenkins pipeline for a .NET Core application

This post is intended to give a brief introduction to creating a Jenkins pipeline for deploying a .NET Core (or .NET) application contained in an MSIX package, including unit tests and code coverage. Since this isn't the most common workflow (because .NET has better integrations with other tools, like Azure DevOps Pipelines), the documentation is somewhat scattered, so we wrote this post to bring the pieces together.

First of all, we present a very brief description of what Jenkins is and what it does. We also want to mention the team that worked on this, contributing as well as giving feedback: Ignacio Boada, Matías Nicolás Gesualdi, Gabriela Gutierrez, Nicolas Bello Camilletti, Mauro Krikorian and Juan Pablo Tomasi.

What is Jenkins?

Taken directly from its official documentation, Jenkins is a self-contained, open-source automation server that you can use to automate all sorts of tasks related to building, testing, and delivering or deploying software.

How does Jenkins work?

Jenkins works by automatically executing certain scripts to generate files that are required for deployment. These scripts are called JenkinsFiles, and they are just text files that can contain declarative or scripted code. In this post we will be focusing on declarative pipelines.

There are several ways to automate Jenkins execution, for example, triggering it periodically, or when a developer commits to a branch or creates a Pull Request.

Jenkins Pipelines

Jenkins pipelines are a suite of plugins that supports implementing and integrating continuous delivery pipelines into Jenkins. Once triggered, a Jenkins pipeline will execute any code in its JenkinsFile, and generate the artifacts that are needed for deployment. To define a Jenkins Pipeline, you write a JenkinsFile, which in turn can be committed to the source control repository. This is the foundation of “Pipeline-as-code”; treating the CD pipeline as part of the application to be versioned, and reviewed like any other code.
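As a reference point for the sections below, a declarative JenkinsFile has roughly this shape (a minimal sketch; the stage names and steps here are placeholders, not the actual pipeline we will build):

```groovy
// Minimal declarative pipeline skeleton (illustrative only)
pipeline {
    agent any                      // run on any available agent
    stages {
        stage('Build') {
            steps {
                echo 'Building...' // real build steps go here
            }
        }
        stage('Test') {
            steps {
                echo 'Testing...'  // real test steps go here
            }
        }
    }
}
```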


Installing Java

First, Jenkins requires a Java 8 or 11 JDK. You can download it by clicking here (SDK version 8). You also have to set the Java environment variables, if you haven't done so already. You can do this by going into the Environment Variables settings (you can find it by pressing the Windows key and typing "environment"). Once there, create two new variables, JRE_HOME and JAVA_HOME, and set them to your JDK and JRE installation directories.

Other tools needed

Visual Studio 2019

The easiest way to get all the tools we need is to install Visual Studio Community 2019. You can download it from here.

Git for Windows

Jenkins will need Git to be able to pull the code from repositories. You can download it from here.

Getting Started: Installing Jenkins

To keep this post concise, we won't include Jenkins installation instructions here; you can find them in this tutorial, or on the official Jenkins page if you need something more elaborate, like installing it in a Docker container. During installation, you will be asked to define administrator credentials. Write these down, as you will need them to access Jenkins later.

Logging in to Jenkins

Once we have everything installed, we can start using Jenkins. After installation, Jenkins will automatically start running in its own process. You can access it through a web browser at http://localhost:8080. By default, Jenkins listens on port 8080 and is accessible only from localhost. You can change this later by modifying "Jenkins Location" in the settings menu.

Jenkins login page

Creating the pipeline

There are two pipeline types: Pipeline and Multi-branch pipeline.


Pipeline

A pipeline is intended to track a single branch (usually the master branch) and has a single JenkinsFile. By convention, this file is placed in the root directory of the repository, but you can move it somewhere else and specify its location when creating the pipeline.

Creating a pipeline in Jenkins

Multi-branch pipeline

A Multi-branch pipeline is intended to support multiple environments, so you can track several branches at the same time. Each of these branches must include a JenkinsFile, and in this case we can't change the location: each JenkinsFile must be in its branch's root directory.

The multi-branch pipeline will then look for JenkinsFiles in all branches, and trigger a deployment for each one that has changes.

If creating a Multi-branch pipeline, we suggest using the new Blue Ocean plugin.


Blue Ocean is a plugin that replaces most of the Jenkins GUI. It also comes with many other plugins integrated that make some workflows, like integration with GitHub, a lot easier. Creating a Multi-branch pipeline with Blue Ocean is much easier than using the standard Jenkins interface.

Creating a Multi-branch pipeline using Blue Ocean

To automatically integrate with GitHub or another version control provider, you need to provide credentials to Jenkins. The best way to do this (and the only way if you are using two-factor authentication) is to create a personal access token.

Creating a JenkinsFile

Once the pipeline has been created, we have to define at least one JenkinsFile. To help in the creation of declarative pipelines, Jenkins offers its Declarative Directive Generator, which you can access from the pipeline configuration page. This tool comes preloaded with many common pipeline actions. These actions can receive parameters, so if what you want to do is on the list, you just fill in the parameters and click "Generate Pipeline Script", and the tool will show the code to add to the JenkinsFile.

Pipeline steps for .NET Core applications

Pipelines are normally separated in “stages”. Each of these stages executes some actions that are important for the deployment. Up to this point, we have shown how to use Jenkins in a more or less generic way. Now we will show what specific steps we have to add in order to create a pipeline for deploying a .NET Core application.

  • Clean the workspace: It is always a good idea to erase any files left over from prior pipeline executions; otherwise we risk unexpected behavior. To execute this stage, we are using the workspace cleanup plugin.
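With that plugin installed, the stage can be a one-liner; a minimal sketch:

```groovy
stage('Clean workspace') {
    steps {
        cleanWs() // step provided by the workspace cleanup plugin
    }
}
```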

  • Get source code: After the workspace is clean, we have to retrieve the code from the repository. This step can be generated with the Declarative Directive Generator tool. At this point you will need to provide credentials. As mentioned before, we recommend creating a personal access token.
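Such a stage looks roughly like the following sketch; the branch name, credentials ID, and repository URL are placeholders for your own values:

```groovy
stage('Get source code') {
    steps {
        git branch: 'master',
            credentialsId: 'github-pat',   // ID of the stored personal access token
            url: 'https://github.com/your-org/your-repo.git'
    }
}
```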

  • Restore NuGet packages: .NET Core applications use the NuGet package manager. To ensure that our application has all the packages it needs to build, we have to run a "restore" operation. ${workspace} is a Jenkins variable that holds the current path to the workspace; Jenkins variables are accessed using the "${}" syntax.

One problem we had is that the Jenkins workspace is placed by default inside the Windows/System32 folder. On 64-bit Windows versions, the System32 folder is not accessible to 32-bit applications. Since Visual Studio and MSBuild.exe are 32-bit applications, they had problems recognizing files in the default Jenkins workspace. We recommend changing the default workspace to another location to avoid this issue.
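Putting the two previous points together, a restore stage can be sketched as follows (the solution name is hypothetical; WORKSPACE is the workspace variable mentioned above):

```groovy
stage('Restore NuGet packages') {
    steps {
        // WORKSPACE resolves to the path of the pipeline's workspace directory
        bat "dotnet restore \"${WORKSPACE}\\MyApp.sln\""
    }
}
```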

  • Clean the solution: in this case, we used the MSBuild.exe command line. For more information about the switches and parameters available for MSBuild.exe, click on this link. The clean is requested with the `/t:Clean` target. Normally, this step can be done together with the build, but we were getting errors doing it that way, so we had to do it separately instead. If you don't have MSBuild.exe in the Windows environment variables, you will need to add it, as we did before with the Java variables.
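A sketch of this stage, assuming MSBuild.exe is on the PATH and using a hypothetical solution name:

```groovy
stage('Clean solution') {
    steps {
        // /t:Clean removes the outputs of previous builds
        bat "MSBuild.exe \"${WORKSPACE}\\MyApp.sln\" /t:Clean /p:Configuration=Release"
    }
}
```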

  • Increase version: To distinguish the different builds, we increased the application's version. MSIX-packaged applications have their version in the package manifest. The easiest way we found to do this was using a PowerShell script to directly open and modify the manifest file. The new version comes from the BUILD_NUMBER Jenkins variable, which increases by 1 each time the pipeline runs.
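One way to sketch this step; the manifest path and the `1.0.x.0` version scheme are assumptions, not the original script:

```groovy
stage('Increase version') {
    steps {
        powershell '''
            # Load the MSIX package manifest and bump its version using BUILD_NUMBER
            $manifestPath = "$env:WORKSPACE\\MyApp.Package\\Package.appxmanifest"
            [xml]$manifest = Get-Content $manifestPath
            $manifest.Package.Identity.Version = "1.0.$env:BUILD_NUMBER.0"
            $manifest.Save($manifestPath)
        '''
    }
}
```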

  • Build the solution: This stage builds the solution, generating the package, which will be one of the artifacts resulting from the pipeline. Note the `/p:PackageCertificateKeyFile` parameter: it tells MSBuild where the certificate used to sign the package is located. If not specified, MSBuild will look for the certificate in the solution's root location.
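A sketch of the build stage; the solution name, certificate path, and the sideload packaging properties shown here are assumptions that may need adjusting for your project:

```groovy
stage('Build solution') {
    steps {
        // UapAppxPackageBuildMode=SideloadOnly produces an installable .msix package;
        // PackageCertificateKeyFile points MSBuild at the signing certificate
        bat "MSBuild.exe \"${WORKSPACE}\\MyApp.sln\" /t:Build /p:Configuration=Release " +
            "/p:AppxBundle=Never /p:UapAppxPackageBuildMode=SideloadOnly " +
            "/p:PackageCertificateKeyFile=\"${WORKSPACE}\\MyApp.Package\\MyCert.pfx\""
    }
}
```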

  • Running the unit tests and creating the test results: This step runs the tests and creates the results file. The first two lines just use a shell to run `dotnet test` on our application (you can use either "/" or "\\" in paths). The `--logger` option specifies the junit logger for the test results. The `--collect "Code coverage"` option tells dotnet to generate a ".coverage" file with the code coverage results. Then, a PowerShell script renames the coverage files and moves them to a folder we defined ($destinationFolder).
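A sketch of this stage; the test project path is hypothetical, and the junit logger assumes the JunitXml.TestLogger NuGet package is referenced by the test project:

```groovy
stage('Run unit tests') {
    steps {
        // Run the tests with a JUnit-format results file and code coverage collection
        bat "dotnet test \"${WORKSPACE}/MyApp.Tests/MyApp.Tests.csproj\" " +
            "--logger \"junit;LogFilePath=results.xml\" --collect \"Code coverage\""
        powershell '''
            # Move the generated .coverage file to a known folder with a fixed name
            $destinationFolder = "$env:WORKSPACE\\CoverageResults"
            New-Item -ItemType Directory -Force -Path $destinationFolder | Out-Null
            Get-ChildItem -Path $env:WORKSPACE -Recurse -Filter *.coverage |
                Select-Object -First 1 |
                Move-Item -Destination "$destinationFolder\\tests.coverage"
        '''
    }
}
```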

  • Generate the code coverage report: to generate the code coverage report, you need to perform three actions:

1. Convert the .coverage file: ".coverage" is a proprietary format from Microsoft. To be able to see the results in Jenkins, we have to convert the ".coverage" file into a ".coveragexml" file using the CodeCoverage.exe app.

2. Generate the report: by using the ReportGenerator.exe app.

3. Publish HTML report: To make the results from the tests and the code coverage available in the Jenkins pipeline, we use the HTML Publisher plugin. After installing it, you can add a publishHTML step to the pipeline; its parameters are self-explanatory.
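The three actions can be sketched in a single stage as follows; the CodeCoverage.exe path depends on your Visual Studio edition, and the folder names here are placeholders:

```groovy
stage('Code coverage report') {
    steps {
        // 1. Convert the binary .coverage file to XML
        bat "\"C:\\Program Files (x86)\\Microsoft Visual Studio\\2019\\Community\\" +
            "Team Tools\\Dynamic Code Coverage Tools\\CodeCoverage.exe\" analyze " +
            "/output:\"${WORKSPACE}\\CoverageResults\\tests.coveragexml\" " +
            "\"${WORKSPACE}\\CoverageResults\\tests.coverage\""
        // 2. Generate an HTML report from the XML file
        bat "ReportGenerator.exe \"-reports:${WORKSPACE}\\CoverageResults\\tests.coveragexml\" " +
            "\"-targetdir:${WORKSPACE}\\CoverageReport\""
        // 3. Publish the HTML report (HTML Publisher plugin)
        publishHTML(target: [
            reportDir: 'CoverageReport',
            reportFiles: 'index.html',
            reportName: 'Code Coverage Report',
            keepAll: true,
            alwaysLinkToLastBuild: true,
            allowMissing: false
        ])
    }
}
```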

Resulting tests HTML report

If your HTML report appears without any styles, follow the steps in this post.

  • Archive artifacts: to store the artifacts that the pipeline outputs, we added an archiving step inside the "post" directive. This directive executes after all the stages have run. The "always" condition means it will execute regardless of the success or failure of the pipeline. In our case, the artifacts are the ".msix" package and the results.xml file from the test results.
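A sketch of the post section; the artifact patterns are assumptions, and the junit line is optional (it requires the JUnit plugin):

```groovy
post {
    always {
        // Keep the MSIX package and the test results file as build artifacts
        archiveArtifacts artifacts: '**/*.msix, **/results.xml', allowEmptyArchive: true
        // Optionally also publish the test results to Jenkins
        junit allowEmptyResults: true, testResults: '**/results.xml'
    }
}
```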

Running the pipeline

Once we have the JenkinsFile ready, we can try running the pipeline directly from the pipeline's homepage in Jenkins. Jenkins will show which stages ran successfully and how long each took to complete. It will also provide a link to the artifacts, if they were generated.

Overview of pipeline stages execution
Pipeline execution results

Adding a GitHub Webhook to our pipeline

As we said before, one option for triggering Jenkins is polling for changes at fixed intervals. This approach is inefficient because it performs unnecessary polls when there are no changes, and may take too long to notice when there are. To solve this, we can use webhooks. In our case we used GitHub, so we will explain how to do this for a GitHub repository.

GitHub webhooks are a mechanism for sending a POST request to a URL when certain conditions are met. Jenkins then uses this request to know whether it must start executing a pipeline.

To create a webhook we must go to the Settings->Webhooks section of a GitHub repository.

Form for creating a webhook in GitHub

The Payload URL must have the form http://<jenkins-url>/github-webhook/. (Don’t forget the last “/”, otherwise it won’t work). The <Jenkins-url> can be configured in Jenkins by changing its “Jenkins URL” attribute.

Keep in mind:

  • <Jenkins-url> must be a public IP or hostname (including the port if necessary). Also, the port must be open to inbound connections from GitHub's webhook IPs. These can be retrieved from the meta API, in the "hooks" section of the JSON response.
  • The content type must be application/json.
  • The secret is not mandatory, but it can be used to secure the communication.
  • The "Just the push event" option only triggers pipelines on pushes to the corresponding branch. Other events (like PR creation) can be added by using the "Let me select individual events" option.

Once the webhook is created, GitHub will send a POST request to Jenkins every time that the selected events are raised, and this will trigger the Jenkins pipeline.

Summing Up

Jenkins is actually much more powerful than what we have shown in this post, but it requires a lot of expertise for complicated workflows. We have only shown how to solve a simple, not-so-common scenario, which can serve as an introduction. Thank you for reading.

Thanks to Nicolás Bello Camilletti

Originally published by Sebastian Rial for SOUTHWORKS on Medium 08 January 2021