Choosing the Right Type of Reporting

Learn about the different reporting styles and choose one based on your needs

Written by Alan Parkinson

A key factor when building and running your automated acceptance tests is the feedback you get from them. The goal is to ensure you are given the right level of information to make further decisions and choices, without being overwhelmed. Too little detail forces teams to dig for more; too much slows them down as they parse out what is useful. Luckily, there are different options to suit your team's needs:

Pretty

The pretty format outputs each step that is run, along with any data tables and output from step definitions. Depending on the step's status, the formatter colours it green (pass), red (fail) or yellow (pending).
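For illustration, the pretty output for a scenario looks something like this (the feature and step names are invented for the example):

Scenario: Add two numbers          # features/addition.feature:3
  Given I have entered 50          # features/step_definitions/calculator_steps.rb:1
  And I have entered 70            # features/step_definitions/calculator_steps.rb:1
  When I press add                 # features/step_definitions/calculator_steps.rb:5
  Then the result should be 120    # features/step_definitions/calculator_steps.rb:9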

Pros

  • Outputs a lot of detail so you can debug issues in your acceptance tests

  • For local runs it integrates well with terminals and command lines to give you rich feedback

  • The colour coding makes it easy to quickly identify issues

Cons

  • When there is a large number of acceptance tests it can output too much information, making it hard to search for issues

  • Some CI tools that display the pretty output don't render the colours, making it harder to decipher issues

  • It's hard to parse the output into other reporting tools if you require further reporting

Java

mvn test -Dcucumber.options="--plugin pretty"

Ruby

cucumber --format pretty

JavaScript

npm i cucumber-pretty
cucumber-js -f node_modules/cucumber-pretty

HTML

The HTML format outputs a similar level of detail to the pretty format, but presents it as an easy-to-read HTML report with additional features, such as expanding and collapsing the details around each automated acceptance test.

Java

mvn test -Dcucumber.options="--plugin html:output/path"

Ruby

cucumber --format html > report.html

JavaScript

npm install cucumber-html-reporter
cucumber-js --format json:cucumber_report.json
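cucumber-html-reporter builds the HTML report from a Cucumber JSON file rather than directly from the test run, so the commands above write a JSON report first. A minimal generation script, assuming the JSON report was written to cucumber_report.json (the file names here are illustrative):

// generate-report.js
const reporter = require('cucumber-html-reporter');

reporter.generate({
  theme: 'bootstrap',                // one of the built-in themes
  jsonFile: 'cucumber_report.json',  // JSON report produced by the test run
  output: 'cucumber_report.html',    // where to write the HTML report
  launchReport: false                // set to true to open the report automatically
});

Run it with node generate-report.js after the tests finish.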

Pros

  • Outputs a lot of detail so you can debug issues in your acceptance tests

  • Rich features such as expanding and collapsing details make it easy to deep dive into failing tests

  • The colour coding makes it easy to quickly identify issues

Cons

  • If not stored properly the HTML report is overwritten each time a new run is executed

  • You have to wait until all tests have been executed before you can review the report

  • It's hard to parse the report into other reporting tools if you require further reporting

JSON

The JSON format outputs all the information from the test run into a standard JSON object that contains a large amount of data for you to process. The JSON format is typically used if you want to consume the data in a bespoke way.

Java

mvn test -Dcucumber.options="--plugin json:output/file.json"

Ruby

cucumber --format json > file.json

JavaScript

cucumber-js --format json:file.json
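To give a feel for bespoke consumption, the sketch below lists failing scenarios from a report. It assumes the report was written to file.json and relies on the standard Cucumber JSON structure (an array of features, each with scenarios under elements, each step carrying a result.status); the script itself is illustrative, not part of Cucumber:

// list-failures.js — a minimal sketch, not an official Cucumber tool
const fs = require('fs');

const features = JSON.parse(fs.readFileSync('file.json', 'utf8'));

for (const feature of features) {
  for (const scenario of feature.elements || []) {
    // A scenario fails if any of its steps reports a failed status
    const failed = (scenario.steps || []).some(
      (step) => step.result && step.result.status === 'failed'
    );
    if (failed) {
      console.log(`FAILED: ${feature.name} -> ${scenario.name}`);
    }
  }
}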

Pros

  • Outputs a lot of detail that can be consumed by third party reporting tools

  • Can be consumed by a bespoke tool that you or your team have created for your own needs

  • JSON files can be easily stored if teams want access to test runs from the past

Cons

  • If not stored properly the JSON report is overwritten each time a new run is executed

  • Requires up front work to find or build a tool to consume the JSON file

  • It's hard to read the JSON to discover and debug failing automated acceptance tests

JUnit

The JUnit format offers similar features to the JSON format, but uses the XML structure that JUnit uses for reporting test runs. This is particularly useful because many reporting tools support the JUnit report structure, so integrating these reports into third-party tools can be very easy.
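As a rough illustration, a JUnit report groups scenarios into test suites and test cases like this (the names and timings are invented for the example):

<testsuite name="Addition" tests="2" failures="1" time="0.52">
  <testcase classname="Addition" name="Add two numbers" time="0.31"/>
  <testcase classname="Addition" name="Add negative numbers" time="0.21">
    <failure message="expected 120 but got 110">...</failure>
  </testcase>
</testsuite>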

Java

mvn test -Dcucumber.options="--plugin junit:output/file.xml"

Ruby

cucumber --format junit --out report_directory

JavaScript

npm install cucumber-junit-reporter
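The exact commands depend on the reporter package you pick. One common pattern (an assumption about your setup, shown here with the separate cucumber-junit npm package) pipes the JSON formatter output through a converter:

cucumber-js --format json | node_modules/.bin/cucumber-junit > report.xml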

Pros

  • JUnit format is a standard supported by many third-party reporting tools and CI servers

  • Can be consumed by a bespoke tool that you or your team have created for your own needs

  • JUnit files can be easily stored if teams want access to test runs from the past

Cons

  • If not stored properly the JUnit report is overwritten each time a new run is executed

  • JUnit format doesn't report the same level of detail as other report styles; for example, it doesn't include output from step definitions

  • It's hard to read the JUnit XML to discover and debug failing automated acceptance tests
