Case Study: A/B Testing a Non-Profit Give Page

It’s really important to test every part of your website, especially if you work for a non-profit with a donation page. What if I told you that one change could bring in 72% more revenue? Or that another change could double the conversion rate of your Give page compared to the alternatives?

Possibilities like these made me curious at a non-profit where I worked. This is a case study of the A/B testing I did on our Give page there.

What is A/B testing?

A/B testing is when you create two or more nearly identical pages that differ in just one component (a phrase, an image, colors, etc.) you want to test. You then build a test in Google Analytics, which automatically splits incoming visitors between the variants. As the visits accumulate, you can see which page is performing better on the criteria (donations, purchases, clicks, etc.) you set.
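
Google Analytics handled those mechanics for us, but the underlying idea is simple enough to sketch. Here’s a minimal Python illustration of bucketing visitors into variants and tallying outcomes. All names and numbers are hypothetical, and this is not what Google Analytics actually runs, just the shape of the technique:

    import hashlib

    # Hypothetical page variants: the one component we change between them.
    VARIANTS = ["A", "B", "C"]

    def assign_variant(visitor_id: str) -> str:
        """Stable bucketing: the same visitor always sees the same page."""
        digest = hashlib.md5(visitor_id.encode()).hexdigest()
        return VARIANTS[int(digest, 16) % len(VARIANTS)]

    # Tally visits, donations, and revenue per variant as traffic accumulates.
    tallies = {v: {"visits": 0, "gifts": 0, "revenue": 0.0} for v in VARIANTS}

    def record_visit(visitor_id: str, donated: bool, amount: float = 0.0) -> None:
        v = assign_variant(visitor_id)
        tallies[v]["visits"] += 1
        if donated:
            tallies[v]["gifts"] += 1
            tallies[v]["revenue"] += amount

The hashing step matters more than it looks: a stable split keeps a repeat visitor from seeing a different page on every visit, which would muddy the results.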

For years, our Give page was largely unchanged. Even when we bought a whole new online giving system, we imported the same text and gift array from our previous pages. But I was already running a bevy of tests on the rest of our website, and it seemed like time to start testing the thing that mattered most: donations. We started with the text.

A/B Testing the Gift Text

For nearly a month, we tested the text around our giving options. Our giving options included four giving levels (a gift array), with donation amounts of $30, $50, $100, and $150. Each amount came with text explaining what it would do, and I wondered if we should simplify that component. So our three text variants looked like this:

  1. Provide food for 90 meals for $30.00
  2. Provide 90 meals for $30.00
  3. $30.00

One page served up this gift array:

  • Provide food for 90 meals for $30.00
  • Provide food for 150 meals for $50.00
  • Provide food for 300 meals for $100.00
  • Provide food for 450 meals for $150.00

The second served up:

  • Provide 90 meals for $30.00
  • Provide 150 meals for $50.00
  • Provide 300 meals for $100.00
  • Provide 450 meals for $150.00

And the third served up:

  • $30.00
  • $50.00
  • $100.00
  • $150.00

Want to guess which one performed best on our three metrics: revenue, average gift size, and conversion rate?

But there were two hitches in this test. First, we set Google Analytics to the lowest confidence threshold it allowed (95%), since we were a small website with relatively low traffic. Even so, Google Analytics never reached a solid conclusion about which variant was the winner. That turned out to be okay, because when we exported the data we realized we had serious outliers in each test. These were large or regular donors who would not have been influenced by the change in text. For example, one $1,000 donor gave every year around the time of the test. Another donor was being cultivated by our staff, so their gift was essentially predetermined. I worked with our Manager of Fundraising Systems and our Donor and Corporate Relations Manager to pull out the outliers and work with the raw, more accurate data.
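
The judgment calls were human, but the mechanical part of pulling outliers is easy to picture. A rough Python sketch, with made-up field names and a made-up cutoff (the real decisions were made gift by gift with our fundraising staff):

    # Hypothetical export rows: (variant, gift_amount, known_donor)
    gifts = [
        ("simple", 30.00, False),
        ("simple", 1000.00, True),   # annual $1,000 donor; unaffected by the text
        ("meals", 50.00, False),
        ("food", 100.00, False),
    ]

    # Drop gifts flagged by hand (known major or regular donors) plus anything
    # far above the rest. The cutoff is a judgment call, not a formula.
    CUTOFF = 500.00  # hypothetical

    clean = [(variant, amount) for variant, amount, known in gifts
             if not known and amount <= CUTOFF]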

We found that the simplest text (the third option, above) performed the best on two out of three metrics and came in second on the third. In fact, the simplest text brought in 28% more revenue, a 22% higher average gift, and a 77% higher conversion rate than the lowest performing text within each category.

(By the way, the other two nearly tied.)
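
If you export your own data, the three metrics boil down to a few lines of arithmetic. A hedged sketch (the helper names are mine, not from our actual tooling):

    def metrics(revenue: float, gifts: int, visits: int) -> dict:
        """Revenue, average gift, and conversion rate for one variant."""
        return {
            "revenue": revenue,
            "avg_gift": revenue / gifts if gifts else 0.0,
            "conversion": gifts / visits if visits else 0.0,
        }

    def lift(best: float, worst: float) -> float:
        """Percent improvement of the best performer over the worst."""
        return (best - worst) / worst * 100.0

    # Example of the arithmetic behind a figure like "28% more revenue":
    print(lift(12800.0, 10000.0))  # -> 28.0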

A/B Testing the Gift Array

We moved on to A/B testing the gift array, the amounts we asked for. I decided to tackle this component primarily because our gift array skewed well below our average online gift of $81. I wondered if we might even be holding our average gift down by offering such a low array, and I hoped we could nudge our average online gift up by offering a higher array, especially now that we had the best text in place.

We tested four arrays:

  • $30, $50, $100, $150
  • $25, $50, $100, $250
  • $35, $70, $140, $350
  • $50, $100, $200, $500

You might note some idiosyncrasies in these options. First, they’re not listed in order of value. That’s because the first one ($30, $50, $100, $150) was what we’d been working with for quite some time. And even though I wanted to nudge our numbers up, I wondered whether an array with an even lower starting point ($25, $50, $100, $250) might have some kind of positive psychological effect, even though it mirrored the first array for a few steps and then jumped to $250.

The fourth array ($50, $100, $200, $500) was the one I hoped would nudge our donors upward, since their average online gift of $81 fell between its first and second levels. Regardless, all of these arrays were designed with input from the full development team. We even placed bets (with a snack-sized candy bar as the prize) on the ones we thought would win.

Ready to take a guess at how it turned out?

We launched the tests and had a similar experience to the first test: results that wouldn’t reach the 95% confidence threshold for Google to declare a winner, plus a handful of outliers. Once again, we pulled those numbers out and examined the raw data.

(This, by the way, is why marketing can’t be entirely automated. There need to be informed humans behind the design of the gift array, the choice of text, the troubleshooting, and the data analysis.)
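
Google’s decision engine is a black box, but once you have exported raw counts you can ask the same basic question yourself: is the difference in conversion rates bigger than chance? Here’s one standard way to check, a two-proportion z-test, with hypothetical counts. This approximates, rather than reproduces, whatever Google Analytics computes internally:

    import math

    def two_proportion_z(gifts_a: int, visits_a: int,
                         gifts_b: int, visits_b: int) -> float:
        """z-statistic for the difference between two conversion rates."""
        p_a, p_b = gifts_a / visits_a, gifts_b / visits_b
        pooled = (gifts_a + gifts_b) / (visits_a + visits_b)
        se = math.sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
        return (p_a - p_b) / se

    # Hypothetical: 40 gifts from 1,000 visits vs. 28 gifts from 1,000 visits.
    z = two_proportion_z(40, 1000, 28, 1000)
    # |z| >= 1.96 is roughly 95% confidence (two-sided); here z is about 1.48,
    # so with traffic this low the result stays inconclusive.
    print(round(z, 2))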

This one was a tough call.

In terms of overall revenue, the $30 array came in first, delivering 72% more revenue than the lowest performer. For average gift size, the $25 array (surprise!) delivered gifts 33% larger than the lowest performer. I didn’t see that coming, but I’d definitely call it reverse psychology. And on conversion rate, the $35 array proved best, converting 100% more visitors than the lowest performer.

So I did a quick, simple breakdown, ranking each array on each metric. When I tallied the overall points, our tried-and-true $30 array won. Next came the $25 array, then the $35. The pricey $50 came in last.
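
The tally itself is simple to reproduce. A sketch with hypothetical numbers shaped to roughly mirror our outcomes (the real figures stayed in our spreadsheets):

    # Hypothetical per-array results: (revenue, avg_gift, conversion_rate)
    arrays = {
        "$30": (12000.0, 82.0, 0.040),
        "$25": (10500.0, 88.0, 0.035),
        "$35": (9800.0, 70.0, 0.044),
        "$50": (7000.0, 80.0, 0.022),
    }

    # Rank the arrays on each metric, best first; first place earns 0 points,
    # so the lowest total score wins overall.
    scores = {name: 0 for name in arrays}
    for metric in range(3):
        ranked = sorted(arrays, key=lambda name: arrays[name][metric], reverse=True)
        for points, name in enumerate(ranked):
            scores[name] += points

    print(min(scores, key=scores.get))  # -> "$30" with these numbers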

“Other” Amounts

There’s another interesting point I want to throw into the mix here. Throughout the test, we continued to offer the “Other” option, a blank box for any dollar amount, on each gift array test page. This was a little daring, but we monitored the data and found that the number and size of “Other” gifts stayed consistent with our previous data and the control page.

Once we declared the test complete, we decided to stick with the $30 gift array, which is what we’d been working with for years. Was that a disappointment? Not at all! It confirmed our good sense and gave us even better data. It also showed, through discussions about the data, that we valued overall revenue slightly more than average gift size or conversion rate. If our best-converting array ($35) had not performed so poorly on overall revenue, this might have been a tougher call, but I think we chose the best option.

And we continued testing many other components of the website, as should every non-profit with a serious call to action.

What do you think about the tests documented here? Would you like to start testing your website? I’d be happy to help, so contact me!