Using the A/B Test Tool
You can run A/B tests to compare two different variations of a form (say, two different titles) in order to find the one that works best. The A/B test splits everyone who lands on your form into two groups. One group receives the normal form (the control) and one receives the altered form (the variable). Once the test has run its course, you can compare the two forms and see which one received more dollars and donations.
To set up an A/B test, click on the A/B Test tab of your contribution form and give your new test a descriptive name. The name will only be seen internally, so choose something that works best for you.
Check off the attribute that you would like to test; this will be your variable. You can test the form title, the contribution form blurb, the use of a fundraising thermometer, an embedded video, the pop-up recurring ask, or the pop-up recurring title.
Say you’ve chosen “Pop-up recurring ask.” Once you’ve checked it off, the page will show you the two variations you’ll be testing: “The first variation, your form’s current settings” and “A variation to test.” You can test multiple variations.
In this case, fill out the top section (your form’s current settings) with the pop-up recurring ask you’re already using.
In the bottom section, write a new pop-up recurring ask to test your donors’ response. Give this variation a title; like the test name, it will remain internal to your organization. You can test the content, the length, or the way you make your ask. Just make sure your donors know why they should become recurring donors after reading it.
Then click Create Test, and you’ll be brought to a page like the one below, which shows the breakdown of the two variations you’re comparing.
Now all you need to do is send the form out to your donors in a fundraising email. Each version of the form will be served an equal number of times to randomized groups of your supporters.
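As a rough sketch of how an even split like this can work (a hypothetical illustration, not a description of the tool’s actual implementation), hashing a visitor identifier assigns each supporter to one of the two forms and keeps a returning visitor on the same variation:

```python
import hashlib

def assign_variation(visitor_id: str) -> str:
    """Split visitors evenly between the two forms.

    Hashing the visitor ID (rather than flipping a coin on every
    page load) means a returning supporter always sees the same
    variation. Illustrative sketch only.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).digest()
    return "control" if digest[0] % 2 == 0 else "variable"

# Over many visitors, the split comes out close to 50/50.
counts = {"control": 0, "variable": 0}
for i in range(10_000):
    counts[assign_variation(f"visitor-{i}")] += 1
```

Because the assignment is deterministic, a supporter who opens the email twice is counted under one variation, which keeps the comparison clean.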
Thanks to our Multi-armed Bandit feature, as the test runs and one variation begins performing better, we’ll start sending more traffic to that form, roughly in proportion to how each is trending. If the early result was a false positive and the losing form starts doing better, the traffic allocation will begin to reverse. The test will continue to run until you click “Make Winner”; if you don’t declare a winner manually, the A/B testing tool will eventually send 100% of traffic to the better-performing variation.
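One common way to implement this kind of bandit behavior is Thompson sampling (an assumption for illustration; the article doesn’t specify the tool’s exact algorithm). Each variation’s conversion rate gets a Beta posterior, and the variation with the higher sampled rate is served next, so traffic drifts toward the leader while the trailing form still gets occasional visits and can recover from a false positive:

```python
import random

def thompson_allocate(stats):
    """Choose which variation to serve next via Thompson sampling.

    stats maps each variation name to (conversions, visits).
    We sample each variation's conversion rate from a Beta posterior
    and serve the variation with the higher draw. Illustrative only.
    """
    best_name, best_draw = None, -1.0
    for name, (conversions, visits) in stats.items():
        # Beta(successes + 1, failures + 1): the posterior under a flat prior
        draw = random.betavariate(conversions + 1, visits - conversions + 1)
        if draw > best_draw:
            best_name, best_draw = name, draw
    return best_name

# Hypothetical running totals: the variable form is converting better,
# so the sampler routes it the larger share of new visitors.
stats = {"control": (30, 1000), "variable": (55, 1000)}
```

With the totals above, most calls to `thompson_allocate(stats)` return `"variable"`, but not all of them, which is exactly the “roughly in proportion to how they’re trending” behavior described.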
You can continuously revisit the A/B Test tab and see your results, including the number of people who landed on each form, and the number of people who donated (conversions), as well as the average contribution and dollars per visit. Next to each conversion rate bar will be a symbol. If you see a checkmark next to the bar, that means that your result was statistically significant.
In layman’s terms, that means the test found a winner! There was a real difference between the two variations that you can trust. If the gray symbol at right remains, any differences could be due to normal variation.
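A standard way to check significance for conversion rates is a two-proportion z-test. This is a sketch of the idea (the tool’s exact statistical method isn’t documented here), using only the standard library:

```python
from math import erf, sqrt

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates.

    conv_a / n_a: conversions and visits for the control form.
    conv_b / n_b: conversions and visits for the variable form.
    Returns (z statistic, two-sided p-value). Illustrative sketch.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results: 30 gifts from 1,000 visits vs. 55 from 1,000.
z, p = two_proportion_z_test(30, 1000, 55, 1000)
```

A p-value below 0.05 is the conventional threshold for calling a result significant; with the hypothetical numbers above, the difference clears it, which is the situation where you’d expect to see the checkmark.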
Data on your A/B test will be included in your contributor CSV, which can be found under the Statistics tab of your page. If you've offered any additional incentives, like a bumper sticker on a pop-up recurring test, you can find contact information for all of the donors who signed up.
If you want useful results, we recommend changing only one attribute on the form at a time. That way you’ll know any variation in donations is due to a single factor.
If you have further questions about A/B tests or best practices, let us know.