Publishing Test Reports to Jira

Create Living Documentation within Jira by publishing test results

Written by Alan Parkinson

Note: Multiple test reports can be uploaded as part of the same git commit. For information on avoiding conflicts, see here.

Note: Test results need to be processed before they are displayed on your Behave Pro requirements page; this may take some time, especially for a large batch. If you do not see your test results immediately, wait a few minutes and refresh the page.

This guide uses Cucumber reports as its example; however, the process is the same for other test report types. Results can be visualized within Jira and Behave Pro to demonstrate development progress, release state, and Living Documentation of the software's functionality.

Creating an API key

To upload Cucumber test results, you first have to create an API key that will be used for authentication during the upload. To create an API key, navigate to the Behave Pro settings in your Jira project, then click the Configure button in the Living Documentation section.

Next, follow the API key setup. Make sure you copy your API key; you will not be able to see it again. After you have created it, you will see a list of your API keys as shown below.

If you want to delete your API key, click on the three dots icon and select Revoke. This will permanently delete your API key.

Uploading test results

You can upload test results via an HTTP request using any command-line tool (curl, wget, etc.). For example, you can use the following curl command.

curl -L -X PUT \
  -H "X-API-KEY: your_api_key" \
  -H "X-COMMIT-ID: your_commit_sha" \
  --data-binary @./target/cucumber-reports/cucumber.json \
  your_upload_url

If you are using PowerShell you can use the following command:

Invoke-WebRequest `
  -Method PUT `
  -Headers @{'X-COMMIT-ID'='your_commit_id';'X-API-KEY'='your_api_key'} `
  -InFile ./target/cucumber-reports/cucumber.json `
  -Uri your_upload_url

Note: Make sure the tool you use to send the test report follows HTTP 307 redirects. With curl this can be done via the -L option; Invoke-WebRequest follows redirects by default.

Note: If you are using Jira Server or Jira Data Center, the URL to upload the test results is the domain name of your Jira followed by /rest/behavepro/1.0/bdd

For example: https://jira.example.com/rest/behavepro/1.0/bdd

Note: If you get an error "Can not deserialize instance of out of START_ARRAY token" then you need to upload the report as a raw file and not a JSON request. Unset the "Content-Type" header in your request, or set it to "application/octet-stream".

You have to send the test report as the body of the request, along with the following headers:

  • X-API-KEY: This is the API key you have created in the previous step.

  • X-COMMIT-ID: This is the SHA of your commit. This must be the long version of the commit SHA (40 characters).
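Since a short SHA will not match, it can help to validate the commit id before sending it. A minimal POSIX shell sketch (the helper name is ours, not part of Behave Pro):

```shell
# Sketch of a pre-upload check: X-COMMIT-ID must be the full
# 40-character hex SHA, not a short SHA.
is_long_sha() {
  case "$1" in
    *[!0-9a-f]*) return 1 ;;   # reject anything that is not lowercase hex
  esac
  [ "${#1}" -eq 40 ]
}

is_long_sha "0123456789abcdef0123456789abcdef01234567" && echo "ok: long SHA"
is_long_sha "0123456" || echo "rejected: short SHA"
```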

The following headers are optional; they affect how the test results are displayed on the test results panel.

  • X-BUILD-ID: This is the build ID that will be shown on the user interface. If you are using matrix build then this build ID should be the same across all of your matrix jobs.

  • X-BUILD-URL: This will be shown on the user interface. This can be the URL that links to your build. If you are running this manually (not in CI/CD) this can be any URL.
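Putting the headers together, a CI upload step might look like the sketch below. The upload URL, the BEHAVE_API_KEY secret, and the BUILD_ID/BUILD_URL variables are assumptions; substitute the values from your own settings page and CI system.

```shell
# Sketch of a CI upload step. UPLOAD_URL, BEHAVE_API_KEY, BUILD_ID and
# BUILD_URL are placeholders / assumed CI variables; substitute your own.
UPLOAD_URL="${UPLOAD_URL:-https://jira.example.com/rest/behavepro/1.0/bdd}"
REPORT="./target/cucumber-reports/cucumber.json"
# Long 40-character SHA of the current commit (empty if not in a git repo).
COMMIT_SHA="${COMMIT_SHA:-$(git rev-parse HEAD 2>/dev/null || true)}"

if [ -f "$REPORT" ]; then
  curl -fL -X PUT \
    -H "X-API-KEY: $BEHAVE_API_KEY" \
    -H "X-COMMIT-ID: $COMMIT_SHA" \
    -H "X-BUILD-ID: $BUILD_ID" \
    -H "X-BUILD-URL: $BUILD_URL" \
    --data-binary @"$REPORT" \
    "$UPLOAD_URL"
else
  echo "report not found: $REPORT"
fi
```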

To learn more about uploading test results from matrix jobs, you can read about it here.

Viewing test results at the scenario level

To see the scenarios with test results, open the issue page, click on the three dots icon, and select Acceptance criteria from the dropdown to open the Acceptance criteria panel.

You will see the scenarios with the status of their test results. To see more details about a test result, click on the green tag (or red if the test has failed) as shown below.

After clicking on the test result, the details are displayed on the results panel.

In addition to test results, you will also see the failed results button just above the list of scenarios. With this button, you can quickly see test failures on an Issue, particularly if you're using feature branching.

By clicking on the failed results you can see whether a failing Scenario is associated with the current Issue, as well as failing Scenarios that are not (regression tests).

Viewing test results at the feature level

To see test results at the feature level, visit the Requirements page; feature test results are shown directly on each Requirements card.

You can also see a summary overview of the test results in the top right corner of the Requirements page. Clicking the Test Results button opens the test results summary panel, which displays result stats across the whole project, giving you the project state at a glance without having to locate individual Scenarios.

By clicking on a feature card you can navigate into that feature and view result stats for it.

Using CI/CD Pipelines

If you want to upload your test results as part of your pipeline, you can use the curl and Invoke-WebRequest commands above. We have guides for the formatting of some specific pipelines below:

Matrix Jobs

Behave Pro supports Matrix jobs, allowing you to visualise test results across multiple platforms, devices and configurations. For a guide on how to publish matrix test reports, see here.

Can't see your test results?

Two common problems to check:

1 - Is the commit SHA being uploaded the most relevant and latest one for your branch?

Test results are matched to Git commits, so the SHA provided needs to be the latest one represented in Git/Behave Pro.

This is usually achieved by incorporating the results upload stage into your CI pipeline, although if you are testing manually we recommend making sure you send the latest commit SHA with your HTTP request.

If you are using feature branching, the Issue will have its own feature branch, so the commit represented by the issue will differ from the one shown on the requirements page (the base branch). This means that for the test reports to be shown on the user story, you will need to upload a report for the head SHA of the branch the issue represents.
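For example, a sketch of resolving the head SHA of a feature branch before uploading (the branch name below is hypothetical):

```shell
# Resolve the head SHA of a feature branch; "feature/ABC-123" is a
# hypothetical branch name -- use the branch linked to your issue.
BRANCH="feature/ABC-123"
if HEAD_SHA=$(git rev-parse "origin/$BRANCH" 2>/dev/null); then
  echo "upload the report with X-COMMIT-ID: $HEAD_SHA"
else
  echo "branch $BRANCH not found in this checkout"
fi
```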

2 - Is the correct JSON file being uploaded?

Different test automation tools can generate different test results. Make sure you have a valid JSON file with the correct uri field; your reports won't be visible if the uri property in your JSON report is missing or wrong. It needs to match the path as represented in the Git repository, for example:

"uri": "Features/FeatureFileName.feature"

This is how Behave Pro is able to create a link between the Feature data and test reports.
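As a quick pre-upload check, you can list the uri values in a report and compare them against the paths in your repository. A sketch (the sample report written to /tmp is only for demonstration; point the grep at your real cucumber.json):

```shell
# Write a minimal sample report so the extraction can be demonstrated.
cat > /tmp/sample-report.json <<'EOF'
[{"uri": "Features/FeatureFileName.feature", "elements": []}]
EOF

# List every uri in the report; each should match a path in your Git repository.
grep -o '"uri": *"[^"]*"' /tmp/sample-report.json \
  | sed 's/.*"uri": *"\([^"]*\)"/\1/'
```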
