Account-Level Performance Reports

How to generate reports across your account's projects and experiments


This article provides an overview of the performance reports you can generate for insights across your account's projects and experiments.

  • For more information about performance reporting on individual experiments that use a standard optimisation methodology, please refer to this article.

  • For more information about performance reporting on individual dynamically optimised experiments, check out this article.

What is an account-level performance report?

Jacquard's performance report feature allows you to create reports that examine and compare Jacquard's performance across your projects and individual experiments for different periods.

It adds more granular reporting and extra functionality on top of your account dashboard.

Performance reports are available for broadcast and trigger experiments across email, push, and SMS.

How can I create a report?

To run a performance report:

  1. Click on the Performance tab located under Reports in the top navigation bar.

  2. Select the filters you wish to apply to your report.

  3. Click Run report.

Report filter options

The filter options available to you depend on your account set-up and the types of messages available to you in Jacquard. These filters can include:

  • Message Type: Select the type of send that you'd like to run the report on (Broadcast or Triggered).

  • Channel: Select the channel you'd like to run the report on (Email, Push or SMS).

  • Projects: If you have multiple projects on your account, you can select up to four projects to compare.

  • Experiments (optional): The report includes all experiments by default, but you can use this dropdown to include or exclude specific experiments.

  • Date Range (broadcast only): By default, the report runs from the date of the first experiment to the date of the last experiment in the selected project(s). For Broadcast projects, you can run the report on a specific date range using this filter.

What does the report show?

The performance report will display a range of different widgets depending on your selected filters.


Performance data displayed for broadcast experiments (email, push and SMS)

1. Summary for selected projects - broadcast experiments

This first widget provides an overview of what's in the report and displays:

  • The total number of experiments included in the report.

  • An overall average open and click uplift across the report's experiments. Note: Exact metrics displayed depend on the channel selected.

  • Total estimated incremental opens and clicks across the experiments in the report (see the worked example below this list). Note: Exact metrics displayed depend on the channel selected.
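
As a rough illustration of these two summary metrics, here is a minimal Python sketch. It assumes uplift is the relative improvement of the optimised variant's rate over the control rate, and that incremental events scale that rate difference by audience size; the function names and example numbers are hypothetical, and Jacquard's actual calculation may differ.

```python
# Illustrative only: Jacquard's exact formulas are not documented in this
# article, so treat these definitions as assumptions.

def uplift(variant_rate: float, control_rate: float) -> float:
    """Relative uplift of the optimised variant's rate over the control rate."""
    return (variant_rate - control_rate) / control_rate

def incremental_events(audience: int, variant_rate: float, control_rate: float) -> float:
    """Estimated extra events (opens or clicks) versus sending only the control."""
    return audience * (variant_rate - control_rate)

# Hypothetical example: a 25% open rate for the winning variant vs a 20% control
print(f"{uplift(0.25, 0.20):.0%}")              # 25% uplift
print(incremental_events(100_000, 0.25, 0.20))  # 5000.0 estimated incremental opens
```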

2. Summary report for selected experiments - broadcast experiments

This widget breaks down all the experiments included in the report, based on your filter selections.

You can sort each column by clicking its header, and click an experiment's name to navigate to that individual experiment.

3. Average uplifts

Available for broadcast experiments.

The average uplift graphs plot the average uplift of experiments over a given time period. Each project included in the report is represented as a bar on the bar chart, with an additional column for the average. Periods are set automatically based on the selected date range (see the sketch after this list):

  • Less than 3 months - weekly

  • 3-12 months - monthly

  • 12+ months - yearly
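
As a rough sketch, that bucketing rule could be expressed as follows. The report_period helper is hypothetical, and how Jacquard handles the exact 3- and 12-month boundaries is an assumption.

```python
from datetime import date

def report_period(start: date, end: date) -> str:
    """Map a selected date range onto a chart granularity.

    Months are approximated as 30-day blocks; the behaviour at the exact
    3- and 12-month boundaries is an assumption.
    """
    months = (end - start).days / 30
    if months < 3:
        return "weekly"
    if months <= 12:
        return "monthly"
    return "yearly"

print(report_period(date(2024, 1, 1), date(2024, 2, 15)))  # weekly
print(report_period(date(2024, 1, 1), date(2024, 7, 1)))   # monthly
print(report_period(date(2023, 1, 1), date(2024, 6, 1)))   # yearly
```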

Below the chart, the overall average uplift metric is shown for each campaign. Note: A metric that is not the primary optimisation metric can show negative uplift for a given period.

4. Cumulative incremental events

Available for broadcast experiments.
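
The article doesn't describe this widget further, but a cumulative incremental events chart typically plots a running total of the per-period incremental events. A minimal sketch with hypothetical numbers:

```python
from itertools import accumulate

# Hypothetical per-period estimated incremental opens (one value per period)
per_period = [1200, 800, -150, 2100, 950]

# A cumulative chart would plot the running total of those values
cumulative = list(accumulate(per_period))
print(cumulative)  # [1200, 2000, 1850, 3950, 4900]
```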


Performance data displayed for trigger experiments (email, push and SMS)

1. Summary for selected projects - trigger experiments

This first widget provides an overview of what's in the report and displays:

  • The total number of experiments included in the report.

  • The number of Live variants across the project(s) and/or experiment(s) selected.

  • An overall average open and click uplift across the report's experiments. Note: Exact metrics displayed depend on the channel selected.

  • Total estimated incremental opens and clicks across the experiments in the report. Note: Exact metrics displayed depend on the channel selected.

2. Summary report for selected experiments - trigger experiments

This widget breaks down all the experiments included in the report, based on your filter selections.

You can sort each column by clicking its header, and click an experiment's name to navigate to that individual experiment.

3. Open and click rate performance

Open and click rate performance graphs are populated for dynamically optimised experiments. These graphs show how performance has changed for each metric over time.
