Using the A/B Test Tool

People run A/B tests to compare two variations of a form (say, two different titles) in order to find the one that works best. The test splits everyone who lands on your form into two groups: one group receives the normal form (the control) and the other receives the altered form (the variable). Afterwards, you can analyze the results and see which version brought in more dollars and donations.
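
If you're curious how that kind of split works under the hood, here is a minimal, purely illustrative sketch of one common approach: hashing a visitor ID into one of two equally sized buckets so each person consistently sees the same version. The function and visitor ID are hypothetical, and this is not a description of the tool's internals.

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Illustrative only: hash the visitor ID so each person always
    lands in the same group, split roughly 50/50."""
    bucket = int(hashlib.sha256(visitor_id.encode()).hexdigest(), 16) % 2
    return "control" if bucket == 0 else "variable"

print(assign_variant("supporter-1234"))  # prints "control" or "variable"
```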

To set up an A/B test on your Contribution Form, click on the A/B Test tab and give your test a descriptive name. 

Check off the attribute that you would like to test. You can test the form title, contribution blurb, fundraising thermometer, embedded video, pop-up recurring ask, or pop-up recurring title. This will be your variable.

You'll see the page's current setting listed as your control. A second box will appear where you can fill in the information for your testing variable. Once you've filled out all of the fields, click "Create Test" to start running the A/B test. 

Data on your A/B test will be included in your contributor CSV, which can be found under the Contributors tab of your page. If you've offered any additional incentives, like a bumper sticker on a pop-up recurring test, you can find contact information for all of the donors that signed up. 
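
If you'd like to slice that export yourself, a short script like the one below can pull out the donors who saw a given version. The file name and column names (`ab_test_variant`, `email`, `amount`) are placeholders; check the header row of your own CSV for the actual names.

```python
import csv

# Hypothetical file and column names; check your own export's header row.
with open("contributors.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Donors who saw the altered (variable) version of the form
variable_donors = [r for r in rows if r.get("ab_test_variant") == "variable"]
for donor in variable_donors:
    print(donor.get("email"), donor.get("amount"))
```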

If you want useful results, we recommend changing only one attribute on the form. That way you'll know that any variation in donations is due to a single factor. All you need to do is send the form out to your donors in a fundraising email; each version of the form will be served up an equal number of times to a randomized group of your supporters.

Once your test is complete, you can revisit the A/B Test tab to see your results, including the number of people who landed on each form, the number of people who donated (conversions), the average contribution, and dollars per visit. Next to each conversion rate bar is a symbol. A checkmark means your result was statistically significant; in layman's terms, there was a real (positive) difference between the two versions that you can trust. If the gray symbol remains, any differences could be due to normal variation.
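
The numbers behind those results are easy to reproduce by hand. The sketch below uses made-up figures to show how conversion rate, average contribution, and dollars per visit are typically calculated, plus one common way (a two-proportion z-test) to check whether a difference in conversion rates is statistically significant. The tool may use a different test internally, so treat this as illustration only.

```python
from math import sqrt

def summarize(visits, donations, dollars):
    """Per-variant metrics like those on the results page (illustrative math)."""
    return {
        "conversion_rate": donations / visits,
        "average_contribution": dollars / donations,
        "dollars_per_visit": dollars / visits,
    }

def z_test(visits_a, conv_a, visits_b, conv_b):
    """Two-proportion z-test; |z| > 1.96 is roughly 95% significance."""
    p_a, p_b = conv_a / visits_a, conv_b / visits_b
    p = (conv_a + conv_b) / (visits_a + visits_b)
    se = sqrt(p * (1 - p) * (1 / visits_a + 1 / visits_b))
    return (p_b - p_a) / se

# Made-up numbers: control vs. variable
control = summarize(visits=1000, donations=40, dollars=1800.0)
variable = summarize(visits=1000, donations=62, dollars=2500.0)
z = z_test(1000, 40, 1000, 62)
print(control, variable, round(z, 2))  # z ≈ 2.24, above 1.96, so significant at ~95%
```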

If you have further questions about A/B tests or best practices, let us know.