A/B Testing

A/B testing is a way to try out different tag configurations to see which one produces better results. For example, do visitors to your website click an ad more often when it appears on the left side or the right side of the page?

TDI's approach to A/B testing is more surgical than that of vendors who perform whole-page A/B or multivariate testing. By testing at the individual tag level, users avoid interfering with code elsewhere on the page and avoid the added latency that comes with full-page testing. It also provides a quicker route to putting test cases into full production: simply remove the A/B option, and the winning tag is promoted to 100 percent of traffic.


Examples

Imagine you want to test which ad banner design is more effective and on which side of the page it performs best.

Create a tag with four variants:

  • Variant 1: Displays banner A on the left side of the page.
  • Variant 2: Displays banner A on the right side of the page.
  • Variant 3: Displays banner B on the left side of the page.
  • Variant 4: Displays banner B on the right side of the page.
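The four variants above can be sketched as tag configurations in JavaScript. This is purely illustrative: the banner names, property keys, and `renderVariant` helper are assumptions for this example, not TDI's actual tag API.

```javascript
// Hypothetical sketch of the four variants as tag configurations.
// The banner names and property keys are illustrative, not TDI's API.
const variants = [
  { banner: "A", side: "left" },  // Variant 1
  { banner: "A", side: "right" }, // Variant 2
  { banner: "B", side: "left" },  // Variant 3
  { banner: "B", side: "right" }, // Variant 4
];

// Each variant's tag code would insert its banner markup into the
// container for the chosen side of the page.
function renderVariant(variant) {
  return '<div class="banner banner-' + variant.side + '">' +
         '<img src="/banners/' + variant.banner + '.png" alt="Banner ' +
         variant.banner + '"></div>';
}
```

In a real test, each variant's tag code would differ only in the banner asset and the target container, so results isolate the effect of design and placement.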


Setting up A/B testing

1. In the tag editor, choose an A/B testing option:

[Screenshot: A/B testing options]

    • Off: A/B testing is not enabled for the tag.
    • Tag code: A/B testing is enabled for a tag using code.
    • Visual placement: A/B testing is enabled for a tag using visual placement.
    • Both: A/B testing is enabled for a tag using code and visual placement.

 

2. Click the plus sign to add the number of variants you want to test.

[Screenshot: adding variants]

3. Type a percentage for how often each tag variant should be presented to a visitor. All values should add up to 100 percent. If the values do not add up to 100 percent when you save the tag, TDI will adjust the values to equal 100 percent.
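To make the percentage mechanics concrete, here is a minimal sketch of weighted variant selection. TDI's actual implementation is not published; the `normalize` and `pickVariant` functions below are illustrative only, showing how entered values could be scaled to 100 and used to choose a variant.

```javascript
// Scale the entered values so they sum to 100, mirroring the adjustment
// TDI makes when a saved tag's percentages don't add up to 100.
function normalize(weights) {
  const total = weights.reduce((sum, w) => sum + w, 0);
  return weights.map(w => (w / total) * 100);
}

// Pick a variant index by walking the cumulative distribution.
// `roll` is a number in [0, 100); it defaults to a random draw.
function pickVariant(weights, roll = Math.random() * 100) {
  const pcts = normalize(weights);
  let cumulative = 0;
  for (let i = 0; i < pcts.length; i++) {
    cumulative += pcts[i];
    if (roll < cumulative) return i;
  }
  return pcts.length - 1; // guard against floating-point edge cases
}
```

For example, with weights `[50, 50]`, rolls below 50 select variant 0 and rolls of 50 or above select variant 1.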

[Screenshot: variant percentages]

4. Select a variant to edit the tag code, visual placement, or both.

5. Repeat step 4 for each variant.

If you need to edit your tag later, you can edit each variant by selecting it.

6. Normally, every time a user encounters a tag that is subject to A/B testing, TDI re-randomizes the selection and may present a different tag variant. For instance, a user may visit Page 1 and see Tag Case A, but when she visits Page 2 on the same site, TDI may present Tag Case B. In some instances, you may want the same case presented throughout a single session. An example would be using TDI to test new site navigation: having the navigation flip between the old and new versions would be a poor user experience.

To address this, TDI lets you persist the A/B testing state for the session using this control.
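The persistence behavior can be sketched as follows. This is not TDI's internal code; it assumes a browser `sessionStorage`-style store and an invented key name, and simply shows the idea that the first page view picks a variant at random while later views in the same session reuse it.

```javascript
// Illustrative sketch of session persistence (not TDI's internal code).
// `storage` stands in for the browser's sessionStorage so the sketch
// also runs outside a browser.
function variantForSession(tagId, numVariants, storage) {
  const key = "ab_variant_" + tagId; // hypothetical key name
  let saved = storage.getItem(key);
  if (saved === null) {
    // First encounter this session: pick a variant and remember it.
    saved = String(Math.floor(Math.random() * numVariants));
    storage.setItem(key, saved);
  }
  return Number(saved);
}

// Minimal in-memory stand-in for sessionStorage.
function memoryStorage() {
  const data = {};
  return {
    getItem: k => (k in data ? data[k] : null),
    setItem: (k, v) => { data[k] = String(v); },
  };
}
```

With persistence enabled, repeated calls within one session return the same variant, which is what keeps a navigation test from flipping between old and new designs.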

[Screenshot: A/B persistence control]

 

Real world applications

Mezzobit's free A/B testing capabilities can solve a variety of business problems. In many cases, you may need to create custom variables that are inserted into the tag code to relay outcomes to your analytics platform for analysis. The variables used depend on your specific platform.

Some examples of A/B testing applications include:

  • You have two vendors for a specific service (content recommendation widget, social sharing widget, etc.) and want to do a bake-off between the two to see which yields better results.
  • You want to increase the conversion rate for a particular visual unit. You can place the same unit in two different locations on the page using A/B testing.
  • You want to test design changes on conversion variables. TDI can manage first-party JavaScript to do these tests. As noted above, session persistence should be used here.
  • You want to do some campaign testing for a house ad or a new editorial unit, but don't want to spend money on routing the content through your ad server.
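As a sketch of the custom-variable idea mentioned above, a variant's tag code could push the served variant to an analytics data layer. The `dataLayer` array follows the common Google Tag Manager convention, and the event and field names here are hypothetical; adapt them to whatever your analytics platform expects.

```javascript
// Fall back to a local array when no browser data layer exists, so the
// sketch also runs outside a browser.
const dataLayer =
  (typeof window !== "undefined" && window.dataLayer) || [];

// Record which variant of a test was served, for later analysis.
function reportVariant(testName, variantIndex) {
  dataLayer.push({
    event: "ab_variant_served", // hypothetical event name
    testName: testName,
    variant: variantIndex,
  });
}
```

Each variant's tag code would call `reportVariant` with its own index, letting your analytics platform segment conversions by variant.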

 
