Use A/B testing to test response rates


Updated: November 1, 2016

Applies To: Dynamics Marketing

Important

Microsoft Dynamics Marketing is no longer available as of May 15, 2018. Your organization has until August 13, 2018 to retrieve your data. For advice about how to download data and assets, and how to clean up Dynamics 365 instances that were previously integrated with Dynamics Marketing, see Prepare for the scheduled sunset of Dynamics Marketing. For more information, see also the customer FAQ and read the blog post Microsoft Dynamics Marketing service will be discontinued, and learn what’s coming next. If you have additional questions, please reach out to MDMquest@microsoft.com.

If you’re considering two different strategies for your marketing email design or content but are unsure which to use, use the A/B testing feature to try each strategy on a smaller portion of your target audience. Microsoft Dynamics Marketing will send both versions of your email message, monitor the response each version receives, and identify a winner based on your criteria.

You can choose whether to run your test as a simple bulk emailing or to include it as part of a fully automated campaign. When the testing period is finished, the system can either automatically send the winning design to the rest of the list or wait for you to manually identify the winner based on the reported results. For example, if the mail was more often opened with subject A than with subject B, then the system could automatically send the rest of the messages using subject A.

This topic walks you through the steps of setting up and running an A/B test.

Important

Dynamics Marketing doesn’t check whether your A/B test results are statistically significant; it simply declares a winner based on the conditions you define for the test. (Statistical significance is a mathematical technique for estimating how likely it is that an experimental result reflects a real effect rather than a chance outcome. In general, a consistent, wide difference produced by a large sample is more likely to be statistically significant than a narrow difference found in a small, widely varying sample.)
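Dynamics Marketing won’t run this check for you, but you can run one yourself. A common approach for comparing two rates (such as open rates) is a two-proportion z-test. The following Python sketch uses only the standard library; the counts are made up and the 0.05 threshold is a common statistical convention, not a product setting:

```python
from math import erf, sqrt

def two_proportion_z_test(successes_a, total_a, successes_b, total_b):
    """Two-sided z-test for the difference between two rates (e.g., open rates)."""
    p_a = successes_a / total_a
    p_b = successes_b / total_b
    p_pool = (successes_a + successes_b) / (total_a + total_b)  # pooled rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / total_a + 1 / total_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided, normal CDF
    return z, p_value

# Hypothetical example: version A was opened 120 of 1,000 times, version B 90 of 1,000.
z, p = two_proportion_z_test(120, 1000, 90, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p below 0.05 suggests a real difference
```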

Start by going to Marketing Execution > Email Marketing > Email Marketing Messages and create a standard marketing email message, just as you would if you were not going to test it. As you work, be sure to set the following:

  • The Company must be the owner of the marketing contacts you will target with your email. This will either be the site company or a client company.

  • The Designation must be set to “A/B Testing”.

  • The message body text must not include segmented content (content that varies depending on which mailing list the recipient belongs to).

Decide which aspects of your email design, content, or sending schedule you would like to test. A good experiment usually tests just one variable at a time (for example, the subject line). Microsoft Dynamics Marketing enables you to test the following:

  • Text shown in the email “subject” field.

  • Name and email shown in the email “from” field.

  • Email content (could even be an entirely different marketing email design).

  • Email delivery date and time (only possible with campaign automation, not with standalone tests; see Automate campaigns with the campaign canvas for details).

You must also decide on criteria for choosing the winner of the test. Dynamics Marketing can evaluate any combination of the following (multiple criteria are weighted equally when finding the winner):

  • Number of opens

  • Number of total clicks

  • Number of unique clicks

  • Number of hard bounces

  • Number of soft bounces

  • Number of forwards

  • Number of unsubscribes

Positive actions (such as opens or clicks) increase the associated version’s score, while negative actions (such as unsubscribes) decrease it. The final score is a weighted average, which ensures, for example, that a large number of opens will outweigh the effect of a single unsubscribe.
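Dynamics Marketing does not publish the exact formula it uses, so the following Python sketch is only a plausible illustration of the behavior just described: each selected criterion contributes a per-recipient rate, positive criteria add to the score, negative criteria subtract, and the result is averaged over the selected criteria. All names and the normalization here are assumptions.

```python
# Illustrative only: not the product's actual scoring code.
POSITIVE = {"opens", "total_clicks", "unique_clicks", "forwards"}
NEGATIVE = {"hard_bounces", "soft_bounces", "unsubscribes"}

def version_score(counts: dict, recipients: int, selected: set) -> float:
    """Equal-weight average over the criteria selected for the test."""
    score = 0.0
    for criterion in selected:
        rate = counts.get(criterion, 0) / recipients  # normalize per recipient
        score += rate if criterion in POSITIVE else -rate
    return score / len(selected)

# A large number of opens easily outweighs a single unsubscribe:
counts = {"opens": 300, "unsubscribes": 1}
print(version_score(counts, recipients=500, selected={"opens", "unsubscribes"}))
```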

Once you have worked out your variables, test group, and winning criteria, you are ready to design your test. If you require an entirely different message, then create it now before going on to create your A/B testing setup.

Use an A/B testing setup to design your test. The setup enables you to establish the email or emails that you will compare, the variables that will change between them, the winning conditions, the test targets, and other aspects of your test.

To create or edit an A/B testing setup, do one of the following:

  • Go to Marketing Execution > Campaign Management > Campaigns and either open an existing campaign or create a new one. From here, go to the campaign-automation canvas and drag an A/B Testing activity onto the canvas. Double-click the new A/B Testing tile to either select an existing setup or create a new one. Note that when you create an A/B test from the campaign-automation canvas, a wizard guides you through the settings rather than the form described in the following subsections. Read those subsections for a description of each setting the wizard presents, and see “Integrate A/B testing into an automated campaign” later in this topic for additional notes about working with A/B testing on the canvas.

  • Go to Marketing Execution > A/B Testing > A/B Testing and either open an existing setup or create a new one.

  • Go to Marketing Execution > Email Marketing > Email Marketing Messages and either open an existing message or create a new one. Make sure your message has its Designation set to “A/B Testing” or “Commercial”, and then choose the Create A/B test button in the toolbar. This creates a new A/B testing setup with your selected message already configured as one of the two test messages; it also creates a copy of your selected message and configures that as the second test message for your setup.

Whichever way you create it, the test setup provides the following settings:

Name

An internal name for the test setup. This name appears on the list page and campaign canvas and will also be shown to identify your test setup elsewhere.

Designation

Choose one of the following:

  • Campaign Automation: To run the test from the campaign canvas. This will deactivate the Lists/Queries tab (the send list will come from the canvas). This designation is automatically applied to test setups created using the campaign canvas.

  • Singular: To run the test from the A/B Testing maintain page. This will activate the Lists/Queries tab so you can use it to define the send list, and will also enable the Activate button, which you can use to execute the test from this page.

Important

Once you have added a given A/B testing setup to a campaign canvas (or created one using the canvas), its designation will be locked to “Campaign Automation” and cannot be changed. However, if you copy the setup, you will be able to change the designation of the copy.

Company

Enter the name of the company for which you are creating the test. You cannot change this after saving the test setup. This company must also own the marketing contacts to which the test will be addressed. When you create a test setup from the campaign canvas, this value is automatically set to the company already associated with the campaign.

Test Recipients

Controls how the test subjects will be selected. Choose one of the following:

  • Use marketing list: Tests using all contacts from an existing list or query. Type the name of the list you want to use in the type-ahead field provided.

  • Use sample from lists: Tests using a portion of the lists and/or queries defined for your campaign or test setup. Enter the portion (in %) of contacts from your contact list(s) to include in the test.

Note that contacts will never receive both a test email and the winning email; all test recipients are automatically removed from the send list after the winner is declared, regardless of which option you choose here. The sketch following these settings illustrates this behavior.

Winning Criteria

Mark the check box for each criterion you would like the test to consider when choosing a winner. Each criterion you mark will be considered equally and combined into a single score calculated as a weighted average of results.
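To picture how the Use sample from lists option and the test-recipient exclusion fit together, here is a minimal Python sketch. It is not the product’s implementation; it only illustrates the contract: a percentage of the send list is sampled, split between versions A and B, and excluded from the list that later receives the winner.

```python
import random

def pick_test_groups(send_list, percent, seed=None):
    """Sample `percent`% of the send list, split it between versions A and B,
    and return the remainder, which will receive the winning version."""
    rng = random.Random(seed)
    sample_size = round(len(send_list) * percent / 100)
    test = rng.sample(send_list, sample_size)
    half = len(test) // 2
    excluded = set(test)  # test recipients never also receive the winner
    remainder = [c for c in send_list if c not in excluded]
    return test[:half], test[half:], remainder

contacts = [f"contact{i}@example.com" for i in range(1000)]
group_a, group_b, winner_list = pick_test_groups(contacts, percent=10, seed=42)
print(len(group_a), len(group_b), len(winner_list))  # 50 50 900
```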

The A and B versions of your email are summarized side-by-side under the Versions heading. It does not matter which version goes on which side; both columns behave the same. In this area, you must choose an email marketing message and associated settings for each version.

When no message is selected for a given column, that column shows a list of unused email marketing messages whose Designation is currently set to “A/B Testing”, plus a search field, the usual button for showing or hiding messages not created by you, and a button for creating a new message. Use these controls to find or create the message you wish to test. If you choose to create a new message, a fly-out opens, enabling you to create the message without leaving the A/B Testing page.

If just one column has a message selected, the other column shows a Copy button in its toolbar. Choose this to create a copy of the message and load it into the empty column, which gives you a starting point for designing the second version of your message. However, you might instead create or choose an entirely different message to test against.

Once you have selected a message, the column updates with a new toolbar and with details about your selected message. Some of the details shown here can be changed as part of your test. The following list describes each field and button.


Delete button

Removes a message from your test, but does not delete it from the database. Use this button if you want to change to a completely different message for one of the versions in your test.

Edit button

Opens a fly-out, which enables you to change any aspect of the message version itself. The settings you make here will be saved with the original message, so be careful. The controls are identical to those provided to edit a message.

Copy button

This button appears only when there is a test message in one column but not the other. It will copy the message to the empty column, thus creating a new email message at the same time. This gives a convenient starting point from which to begin your second test version.

Toggle view all button

This button works just like it does on standard list pages. It appears only when you have not yet chosen a message for one or both test versions. Choose it to toggle between showing all available marketing email messages and showing just those messages that you created. The button icon changes to indicate the toggle state.

Email winner icon

After the test is finished, this icon appears next to the winning version.

Declare winner button

This button appears after the test has started and remains while the test is running. Choose this to manually declare a winner, regardless of the test outcome.

Undeclare winner button

This button appears in the same column as the winning version only if you have declared a winner manually (using the Declare winner button) and there is still time before the test is finished. Choose this to manually remove winner status from the version.

Version name

Enter a descriptive name for each version. This name is internal, so contacts will never see it.

Email name

The name of the marketing email message, as it also appears under Marketing Execution > Email Marketing > Email Marketing Messages. The name will change in both places if you edit it here, so be careful.

Subject

Text that will appear in the “subject” field when contacts receive the message. Edit this value to change the subject for one or both email versions.

From

The Microsoft Dynamics Marketing contact that will be shown in the “from” field when contacts receive the message. The contact must already exist in the database and have a valid email address assigned; start typing a name to see type-ahead suggestions.

From email

Shows the “from” name and email address as recipients will see them (read-only).

Status

The current status of the message version.

Recipients

The number of contacts to whom the version was sent during the test (does not include recipients to whom the winner was sent after testing was completed).

Sent

The date and time at which the version was sent.

Once you have saved your test setup at least once, the Send winner section becomes available. Choose one of the following options to control what should happen once a winner has been found:

  • No scheduled delivery: The system waits until a user manually inspects the results and decides what to do. To choose the winner, select the Declare Winner button for the version you wish to declare as the winner, and then select one of the other Send winner options to decide when to send the winner to the remaining (untested) contacts.

  • Number of hours after both versions sent: After the last of the two versions has been sent, the system waits for the number of hours you specify here and then sends the winning version to all remaining contacts from the send list. If you choose this option, you must also specify the time to wait in the Hours field.

  • Specified time: The system waits until a specific date and time and then sends the winning version to all remaining contacts from the send list. If you choose this option, you must also specify a date, time and time zone in the fields provided.
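The three options amount to a simple scheduling rule. Here is a minimal Python sketch of how the winner’s send time could be resolved; the mode names and parameters are hypothetical, not the product’s API:

```python
from datetime import datetime, timedelta, timezone

def winner_send_time(mode, last_version_sent=None, hours=None, specified=None):
    """Resolve when the winning version goes out, mirroring the three options."""
    if mode == "no_scheduled_delivery":
        return None                      # wait for a user to declare the winner
    if mode == "hours_after_both_sent":
        return last_version_sent + timedelta(hours=hours)
    if mode == "specified_time":
        return specified                 # an explicit, zone-aware datetime
    raise ValueError(f"unknown mode: {mode}")

sent = datetime(2016, 11, 1, 9, 0, tzinfo=timezone.utc)
print(winner_send_time("hours_after_both_sent", last_version_sent=sent, hours=48))
# 2016-11-03 09:00:00+00:00
```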

Once you have saved your test setup at least once, the related-information tabs become available at the bottom of the page, below the dotted line. Use the drop-down here to change between tabs, which are briefly described below.


Email 

Enables you to send a simple email related to the current A/B test. All emails sent using this tab for the current test are also listed here. These emails will also be visible in the site-wide email list under Projects > Emails > Emails.

Lists/queries

This tab only has an effect for tests with Designation set to “Singular.” For tests that are part of an automated campaign, use the campaign canvas to establish the send list.

Use this tab to choose the marketing lists and/or queries to which you will send the winning message after the test is complete (when not using campaign automation). Depending on how you have set up your test, the test message may be sent to a random selection of contacts from these lists as part of the test, or you might use a separate list as your test group. Contacts who receive a test message will not also receive the winning message. You can add both send and suppression lists here.

Log

Displays any log messages associated with your test setup. These are generated automatically.

Notes

Enables you to view, add and reply to notes added to the test setup by you and other users.

Performance

Displays key performance indicators that can help you evaluate the success of your test, such as opens, clicks, forwards, unsubscribes, and bounces for each version. Here you can view the following:

  • A chart of overall performance for each available criterion

  • A detailed view of performance over time for a single criterion (use the drop-down list to select a criterion)

  • A summary of performance for each test version and the winner version

Team

Enables you to establish a team of staff contacts that are working on the test setup. This provides a convenient place to look up and contact relevant staff while also ensuring that all team members are able to view and edit the test setup.

The campaign-automation canvas includes an A/B Testing tile that enables you to integrate testing into your campaign flow. The canvas itself lets you create a new A/B test item if you wish, or you can start by setting up the test on the maintain page, as described previously, and then select that setup for use on the canvas. Be sure to set the Designation to “Campaign Automation”, which enables the setup for use with the canvas. When you create the setup using the canvas, both the Designation and Company are set automatically.

Note the following when setting up an A/B test for an automated campaign:

  1. You can create the test setup and both email versions while working on the campaign canvas if you wish. A convenient fly-out window enables you to complete all settings and design work without leaving the canvas.

  2. The recipient lists (to whom the winning version will be sent, and from whom the test population will be drawn if you are using a proportional test group) will be those defined on the campaign canvas, not those defined on the Lists/queries tab of the A/B test setup or marketing emails (if any). However, if you use a marketing list to define the test group in your A/B test setup, then that list will be used for testing (and added to the campaign; see the next point).

  3. If your A/B test setup uses a marketing list to define the test group (rather than a portion of the send list), then these contacts will also join the campaign for all stages that come after the A/B test tile.

  4. Both of your email versions must be activated before they will be usable by the automated campaign. It is possible to do this while working on the campaign canvas by choosing the Edit button for each email version. (All emails included in automated campaigns must be activated, whether they are tested or not.)

  5. It is possible to set different send times for each version when you use the campaign canvas to set up your A/B test. This can be an important variable to test, and is not possible with stand-alone tests.

See Automate campaigns with the campaign canvas for more information about how to work with the canvas.

By “stand-alone” test, we mean one that is not executed via the campaign-automation canvas. A stand-alone test is simpler to set up and run, but does not allow for the complex logic and additional automation provided by the canvas.

To run a stand-alone test:

  1. Create your initial marketing email design and decide on which parts of it you would like to test.

  2. Create your test setup under Marketing Execution > A/B Testing > A/B Testing.

  3. Set up the conditions of your test, as described previously, including the winning criteria, result action, the A and B versions of your design, and the source of the test group (a portion of your send list, or its own separate list). Be sure to set Designation to “Singular”, which enables the test setup for stand-alone operation.

  4. Use the Lists/queries tab of your test setup to establish the full list to which you will send the winning design. If you set the result action to send after a given number of hours, or at a specified time, then the system will automatically send the winning design to the remainder of the send list at the designated time.

  5. Choose Activate to start the test. The system checks all of your email designs, dates, and other settings to make sure they are valid; if everything passes, the test starts. Otherwise, you will get a message telling you what to fix before you can proceed.

You can see the status of your test both in the list view of A/B test setups and on the A/B Testing tile on the campaign canvas. A test passes through the following statuses:

  • Draft: This is the initial status for a new test. It means that the test has never been run. You can freely edit the test setup and both email versions while the test has this status.

  • Testing: The test is currently running and no winner has yet been declared. You cannot edit either email version while the test is in this status, and many of the test setup settings are also locked. However, you can still edit the winning criteria and the test end time, and you can declare a winner while the test has this status.

  • Winner declared: The test is over and you can no longer change the test setup. If you wish to run a similar test, you can create a copy of the existing test setup and modify it as needed to run a new test.
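The lifecycle is strictly one-way. As a summary, here is a small Python sketch of the transitions just described; the names are illustrative, not the product’s internals:

```python
from enum import Enum

class TestStatus(Enum):
    DRAFT = "Draft"                      # freely editable; test never run
    TESTING = "Testing"                  # versions locked; criteria still editable
    WINNER_DECLARED = "Winner declared"  # terminal; copy the setup to run again

# A test only moves forward, never back.
TRANSITIONS = {
    TestStatus.DRAFT: {TestStatus.TESTING},
    TestStatus.TESTING: {TestStatus.WINNER_DECLARED},
    TestStatus.WINNER_DECLARED: set(),
}

def advance(current: TestStatus, target: TestStatus) -> TestStatus:
    """Move to the next status, rejecting any backward or skipped transition."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"cannot move from {current.value} to {target.value}")
    return target
```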
