2 Testing Ideas for Reducing Checkout Abandonment

As a blogger, I love receiving emails from readers sharing the results they've achieved by testing ideas from Get Elastic on their own shops. The following is a message I received from Robert Hodgson of Bandanashop.com:

Bandanashop.com is a family-owned business operating in a niche, selling bandanas and headwear worldwide. We try to compete on responsive customer service, rapid shipping and technology, and have used several design elements featured in Get Elastic articles. After reading Linda's post on persistent cookies, we realised that although we store basket details (as cookies) for 30+ days, we weren't actually telling our customers that.

Using Google Website Optimiser to run an A/B test, we found that this one small change resulted in a 16.5% increase in sales from the MyBasket page. We used the text "Your Basket contents are stored here for 30+ days so you can return anytime." It seems that, for some customers, simply being told is enough for them to come back rather than abandon, confident that they can pick up where they left off.


[Screenshot: Google Website Optimiser results for the basket message test]

As an aside, on the same page we had a 'Jared Spool' moment, changing the checkout button from "Checkout" to "Start Checkout." This call to action gave us a 43% conversion improvement (at 97.5% confidence). This was also inspired by the 'brain food' from Get Elastic!


[Screenshot: Google Website Optimiser results for the checkout button test]

To sum up, Robert saw significant improvements by testing 2 simple things in the checkout:

1. Telling customers how long their cart contents will be held on the shopping cart summary page
2. Labeling the “Checkout” button “Start Checkout”

These 2 changes would be easy for any online store to try (they’re not industry or market specific) – I challenge you to consider these tactics in your next checkout test!
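
If you'd like to try them, here is a minimal browser-side sketch of both changes, written in TypeScript. The cookie name, the 35-day retention window and the DOM hooks are my own illustrative assumptions, not Bandanashop's actual implementation; treat it as a starting point for your own test variation.

// Persist the basket for 30+ days and tell the shopper about it.
// basket_id, RETENTION_DAYS and the DOM hooks below are assumed names for illustration.
const BASKET_COOKIE = "basket_id";
const RETENTION_DAYS = 35; // the "30+ days" promise

function persistBasket(basketId: string): void {
  const expires = new Date(Date.now() + RETENTION_DAYS * 24 * 60 * 60 * 1000);
  document.cookie =
    `${BASKET_COOKIE}=${encodeURIComponent(basketId)}; expires=${expires.toUTCString()}; path=/`;
}

function renderBasketReassurance(container: HTMLElement): void {
  // Change 1: the copy Robert tested on the MyBasket page.
  const note = document.createElement("p");
  note.textContent =
    "Your Basket contents are stored here for 30+ days so you can return anytime.";
  container.appendChild(note);
}

function relabelCheckoutButton(button: HTMLButtonElement): void {
  // Change 2: "Checkout" becomes "Start Checkout".
  button.textContent = "Start Checkout";
}

In practice you would serve these as the B variation of a split test (Google Website Optimiser in Robert's case, or whatever testing tool you use) rather than rolling them straight out to every visitor.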

Thanks to Robert for sharing his test results. If you would like to have your A/B or multivariate test results featured on Get Elastic, ping me at linda.bustos @ elasticpath.com.


18 Responses to “2 Testing Ideas for Reducing Checkout Abandonment”

  1. I am very impressed with the overall site at Bandanashop.com. They have done a really good job overall. I think I will run a similar test soon; I was thinking 'Continue to checkout' instead of 'Proceed to checkout'. I also like the cookie messaging. I can see I have a lot of testing and improving to do on my site. Plenty of great ideas to glean from their site alone; they are doing many things right.

    Thanks

    • Thanks for the feedback. We tried lots of combinations before 'Start Checkout' won. We have a multi-national audience, so simplicity of language may be a factor. We also tried different coloured buttons, but the branded colour won (marginally). I would strongly encourage you to try it, as such a small change had a major effect. The ">>" chevrons were also important; we did a split test with them and there was a big difference. It's crazy stuff, just keep an open mind.

  2. The ‘Gift Voucher Code’ in the Basket display was inspired by the article at
    http://www.getelastic.com/coupon-poaching/
    We previously used 'Promo Code', which tends to alienate customers who don't have a promotion code. If you call it 'Gift Voucher Code' it's not so contentious, which means a higher conversion rate.

    • After reading the Coupon Poaching post, I was also inspired to test the coupon field.

      We A/B tested “gift certificate.” Our customers apparently didn’t care because there was no significant change.

      On the same note, I created a coupon/promo code web page, which lists the coupons and promos we offer. My thought was that people are definitely searching for terms like "Brand + coupon code" and "Brand + promo code", and we need to rank well for these terms within the organic results; and we do. To complement this, I also added two new AdWords ad groups around these terms, and we're seeing good results: very inexpensive cost per click and CPA.

  3. mark says:

    Superb ideas. However, I don't know what a 'Jared Spool moment' is. Can you explain?

    To go with “Start Checkout,” could there be improvement to the other (AKA “continue”) buttons along the way through the checkout process?

  4. Ben says:

    “Your Basket contents are stored here for 30+ days so you can return anytime.”

    I would have thought that telling customers they can check out any time would have encouraged some to stop the checkout process to think about the purchase more or to discuss it with their spouse, which would ultimately lead to a drop in conversion rate. I was wrong! Interesting finding.

    • Ben – so did we, but by looking at the visitor records we could see that a certain percentage browse first and put stuff in the basket, only to return a couple of days later. The message seemed to reassure at least some of those hesitant shoppers and be ignored by the more committed.

  5. Good idea, especially the second – I’ve passed that onto our design team.

  6. Tony Smith says:

    I am sorry to be the one to call BS on this. I have worked in the industry for 12 years, and 5 of them were spent constantly analyzing web data for an e-commerce application. I can tell you with almost 100% confidence that changing the text on a button from "Checkout" to "Start Checkout", and nothing else, will not result in a 43% conversion improvement. God forbid we start paying armies of wordsmiths to A/B test every possible combination of button text hoping to strike it rich. Play the lottery. You really want results? Hire UX experts who rely on heuristic analysis and usability studies, or start yourself by reading authors such as Donald Norman, Tom Tullis or Bill Albert.

    • Hi Tony, thanks for your comment.

      I’m curious, would you also disagree with the impact that “Secure Checkout” has over “Checkout”?

      Also, would you advocate using heuristic analysis alone without A/B testing? I believe they both have their place, but heuristic analysis can veer into “gut feel” territory.

      • Tony Smith says:

        Hi Linda,
        I would not disagree with any scientific result and A/B testing has always played a role in our usability analysis. However, trying to tweak UI elements by A/B testing prior to doing basic heuristic analysis is “gut feel” territory. The attraction to A/B testing is that it is simple to implement and it is a game everyone can, and will want to, play. Before you know it, you have a queue of test cases from the marketing department, from the business department and even from the CEO himself. You end up with all these players because they are all after the same thing this article is selling: instant millions! The odds are against you and since development throughput has come to a grinding halt to avoid contaminating the variable-vacuum you wish existed, very little gets done.

        A/B testing is where I have found the most “gut feel” analysis being done. Nothing will tear down the wall of what you think users think like actually watching them. Our heuristic team consisted of a guide (greets the tester, sets them up on a machine, gives them basic instructions and gives them a goal), primary note taker and two backup note takers placed in the back of the room. The user is instructed to speak out-loud all thoughts and his/her screen is projected for the note takers. Ideally, roughly ten testers would be analyzed but as few as five can give you plenty of data.

        Whenever two teammates or departments argue over UX design elements, whether it is a designer and a developer or sales and marketing, and there is an obvious impasse, that item is added to the list of "find outs". Not only is the question answered scientifically, it also gets the meeting to move past the debate. Win-win!

        Good read to get you started:
        http://www.useit.com/papers/heuristic/heuristic_evaluation.html

        Of course, above all else, have fun with it! We sometimes just print a couple of screenshots and then walk around to different offices (coffee shops, bakeries) and get feedback.

        Tony

        • Hi Tony,

          Absolutely it’s a good idea to have heuristic evaluations before testing. In our consulting practice, the heuristic evaluation IS the springboard for testing. The cart does not come before the horse.

          This post was in no way saying to bypass this process. I was confused that you would “call BS” on the data that Robert shared.

          To be clear, there is a difference between heuristic evaluation and user testing, as you described above. User testing is a qualitative method; heuristic evaluation is a review by a usability expert using rules of thumb.

          Usability testing is valuable, but you cannot measure revenue impact or other quantitative metrics with usability tests.

          Heuristic evaluation alone is not good enough. Even the most seasoned usability gurus will share surprising test results from the times they challenged their own experience and gut feel.

          Just my dos centavos.

          Linda

    • I'm happy to post my Google A/B results, which I have sent as a .gif to Linda for her to consider publishing, or to email to you directly. The results speak for themselves: they show 97.5% confidence and a 43.4% improvement in sales conversions. I didn't let the test complete because it was such a clear-cut win from the first data point; however, you can see from the graphic that it was extremely close to finishing anyway. I have worked in this industry at least as long as you, but by keeping an open mind my company has benefited. Google Optimiser has proved to be a robust, cheap and empirical method of improving our website.

  7. Tony Smith says:

    Oh, I know this post wasn't about bypassing the process, and I apologize if my response veered off path. I just wanted to make sure that the audience knows that A/B testing is not the end-all, be-all solution to increasing revenue. I also don't mean to blanket all A/B initiatives under my criticism; I'm more trying to caution readers that they could end up spending the next three months A/B testing changes to button labels in an attempt to strike it rich and come out with nothing.

    The "BS" comment may have been out of line, and I apologize. However, in my opinion, any time someone publishes ecommerce metrics and conclusions, raw data (to some degree) should be included. I wish I had time to review your findings, Robert, but maybe you could throw out some more numbers, such as users per day and revenue per day, and I'll swing back later. When you did the test for the checkout label change, was anything else changed at all? Did Bob down in the basement finally remember to turn on the other two servers in the cluster? Solar flares accounted for?

  8. I know A/B testing is not the only tool for usability, but it's a damn good one, and it's free and fast. I don't have access to usability experts, but I know there are some good web-based ways of getting eyeballs on your site. Usability sounds simple because it's 'common sense', right? I am not threatened by that, because anybody reading Get Elastic knows that it's not that common. I think you SHOULD test the checkout button with different colours and sizes, and here's why: you can only have one test running on the checkout button at a time, so if the CEO or the mailroom intern puts a variation forward and the hypothesis is reasonable, why not run it – the customer decides the outcome, not the CEO. What's the worst that can happen? A test period of x days where 50% of the traffic converts less well than the control, and you kill it when the test completes. The upside is an increase in conversions from that day forward, and that is worth far more than x days of testing. A no-brainer.
    I will not be releasing revenue figures, for the same reasons you would face litigation if you released your customers' revenue. If you think the screenshot is BS, please take it up with the Google Optimiser team, because the raw data is implicit in the results, and my Photoshop skills are not good enough to create that from scratch! If the results are at the 95% confidence level, the bar goes green. For some sites that might take a matter of days; for others it's a few weeks. The results are the same.
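
For readers who want to sanity-check confidence figures like the ones Robert quotes, the usual approach is a two-proportion z-test: compare the conversion rates of control and variation and ask how likely it is that the observed lift is real rather than noise. The sketch below, again in TypeScript, uses hypothetical visitor and order counts chosen only to mirror a roughly 43% lift; it is not Robert's raw data, which wasn't published.

// Two-proportion z-test: how confident can we be that variation B beats control A?
function chanceToBeatOriginal(convA: number, visitsA: number, convB: number, visitsB: number): number {
  const pA = convA / visitsA;
  const pB = convB / visitsB;
  const pooled = (convA + convB) / (visitsA + visitsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitsA + 1 / visitsB));
  const z = (pB - pA) / se;
  return normalCdf(z); // one-sided probability that B's true rate is higher than A's
}

// Standard normal CDF via the Abramowitz–Stegun polynomial approximation.
function normalCdf(z: number): number {
  const t = 1 / (1 + 0.2316419 * Math.abs(z));
  const density = 0.3989423 * Math.exp(-z * z / 2);
  const tail = density * t * (0.3193815 + t * (-0.3565638 + t * (1.781478 + t * (-1.821256 + t * 1.330274))));
  return z >= 0 ? 1 - tail : tail;
}

// Hypothetical counts: 2,000 visitors per arm; the control converts 60 (3.0%),
// the variation 86 (4.3%), which is roughly a 43% relative lift.
const confidence = chanceToBeatOriginal(60, 2000, 86, 2000);
console.log(`Chance the variation beats the original: ${(confidence * 100).toFixed(1)}%`); // about 98.6% here

Tools like Google Website Optimiser report a similar "chance to beat original" figure for you; running the numbers by hand simply makes it clear how sample size and effect size drive that confidence, and why a large lift can reach significance on relatively little traffic.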
