Reporting
From the execution table view, clicking on an execution redirects you to the execution details, which are composed of the views described below.
Execution steps
This table displays the test plan execution steps in detail, by default in reverse chronological order (last executed step first).
As with other tables, you can use the table headers to filter the execution steps as well as change the chronological display order:

If you are using TestSet and TestCase objects, as in the example above, the execution steps are grouped by test case. You can open a test case's details by clicking on it:

In addition, you can re-execute only selected test cases instead of the whole test plan by ticking their checkboxes and clicking the execute button:

You can toggle the display of the step execution details as follows. For Keywords, this shows, for instance, the input and output of the keyword call as well as the details of the keyword measurements:

(1) Clicking on the icon to the right of a measurement brings you to the interactive performance analysis for this measurement.
Finally, you can open an execution step directly in the Execution tree (see below) by clicking on its associated button:

Execution tree
This view displays your test plan execution steps as a tree, in chronological order. Any node can be expanded to display its content:

Performance
The performance tab of the execution view provides the most frequently required metrics (keyword throughput, average response time, etc.) and filters out of the box. These metrics are available both as statistics over time and in a summary form covering the entire execution. Furthermore, the different charts and tables can be refreshed in real time or browsed in a stationary version of the view. Lastly, for more advanced queries, drill-downs based on custom dimensions, or raw measurement browsing, an Object Query Language (OQL) can be used.
Real-time monitoring
For an ongoing execution, the performance view automatically covers the test's current time range and is refreshed every 5 seconds, as illustrated in the screenshot below.

You are presented, from top to bottom and left to right, with the following information:
- a performance overview chart showing aggregated results over all measurements
- an aggregated statuses chart showing all calls grouped by status
- a response times chart by measurement, showing the average by default; the metric can be changed using the button at the top right of the chart
- a throughput chart by measurement; the total throughput is displayed as a bar chart against the right axis
- a summary table of all metrics applicable to the measurements (including percentiles, median, counts, etc.), aggregated over the selected time interval (see the sketch below)
- a chart showing the number of parallel threads over time
See more details on the Analytics Page Structure page.
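
To illustrate what these widgets compute, the sketch below derives the same kind of aggregates (average response time and throughput per time bucket, plus summary statistics over the whole interval) from raw measurements. The data shape, a list of timestamp/response-time pairs in milliseconds, and the bucket size are assumptions made for this example only.

    from collections import defaultdict
    from statistics import median, quantiles

    # Hypothetical raw measurements: (timestamp in ms, response time in ms).
    measurements = [
        (1_000, 120), (1_400, 150), (2_100, 90),
        (2_900, 200), (3_500, 110), (4_200, 95),
    ]

    BUCKET_MS = 1_000  # resolution of the time axis, e.g. one point per second

    by_bucket = defaultdict(list)
    for ts, rt in measurements:
        by_bucket[ts // BUCKET_MS].append(rt)

    # Response times and throughput charts: one value per time bucket.
    for bucket in sorted(by_bucket):
        values = by_bucket[bucket]
        avg = sum(values) / len(values)
        throughput = len(values) / (BUCKET_MS / 1000)  # calls per second
        print(f"t={bucket * BUCKET_MS}ms avg={avg:.1f}ms throughput={throughput:.1f}/s")

    # Summary table: aggregates over the entire selected time interval.
    all_values = [rt for _, rt in measurements]
    print("count :", len(all_values))
    print("median:", median(all_values))
    print("p90   :", quantiles(all_values, n=10)[-1])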
Interacting with the dashboard
- You can change the refresh rate for a running execution using the dedicated picker at the top right of the view.
- The most important filtering can be done directly using the top filter selections. The measurements to be displayed or hidden can be selected from the statistics table or configured in the chart options. See more about filters here.
- The time range selection can be done at different levels and is applied to all dashboard widgets. See more about time range controls here.
- The grouping option allows custom aggregations for the Response Times and Throughput charts, as well as for the table view. See more about grouping here.
- The dashboard allows some measurement settings, like custom metrics, as well as the ability to hide or show specific data. See more about visual settings here.
Exporting measurements
In case you’d like to use your own toolset to process the raw data, you can use the export button at the top right of the screen. The export will include the active filters and time range selection.
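
The exact format and columns of the export depend on your installation; as a minimal sketch, assuming a CSV export with one row per measurement and columns such as name and value (hypothetical names used here for illustration), the raw data could be post-processed with pandas as follows:

    import pandas as pd

    # Load the exported raw measurements.
    # The file name and the column names (name, value) are assumptions made
    # for this example; adapt them to the export you actually downloaded.
    df = pd.read_csv("execution_measurements.csv")

    # Per-measurement summary: count, average and 90th percentile of the
    # response times, similar to the dashboard's summary table.
    summary = (
        df.groupby("name")["value"]
          .agg(count="count", avg="mean", p90=lambda s: s.quantile(0.9))
    )
    print(summary)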
Error
The error view displays an overview of the errors (grouped by error type) that occurred during the execution:

You can jump directly to the nodes producing the error by clicking on the error type:

