Testing: How to figure out the why behind the what

Posted by Shana Braun | January 29, 2016 | Conversion Optimization
[Image: seat view component]

If your last conversion test had an unexpected outcome, no doubt you wanted to know why. Optimizely results are great for telling you exactly “what” your customer did, but the reasons “why” site visitors react in certain ways might remain a mystery to you.

If you are like us, you use UserTesting.com to better understand how your customers are thinking as they navigate through your site. This is a helpful practice in the discovery phase for designing new websites as well as preparation for setting your conversion optimization strategy.

But have you ever applied the method of user testing to conversion tests? Asking users to review multiple versions of a webpage to evaluate changes made through conversion tests is not a typical request, but we found it to be quite effective. Using tagged URLs and an adjusted script formulation process, we and our partner, Ticketmaster, made answering the “why” much easier.

Conversion Testing, meet User Testing

Conversion testing takes place online, and user testing is a tool commonly used to better understand online behavior. Put them together, and you have a practice that lets you see how customers respond to specific questions and targeted experiences while bucketed into a particular conversion test variation.

While qualitative results should never be used to determine the winner or loser of an A/B test, this pairing can explain why a variation performed the way it did, through specific customer interactions and both verbal and written feedback.

Applying our Approach

Here is an example of how we applied this approach for Ticketmaster:

The Ticketmaster team executed a conversion test to evaluate and measure the impact of integrating a new component into their interactive seat map to assist with ticket selection. The updated feature incorporated an actual view from a seat at the venue, presented when a customer hovered over that seat on the map. For example, individual customers could see the view of a football field from any available seat.

The hypothesis was that the seat views would increase confidence in purchase decisions and overall conversion rate by minimizing uncertainty. While the quantitative results did support this hypothesis, the team wanted to dig deeper to really understand the intentions and reactions of their customers.

By layering UserTesting.com over the live Optimizely experiment, the experience was evaluated by five individual user testers, who confirmed exactly what the quantitative Optimizely test results indicated. Technical execution was made possible with specific URL parameters that allowed us to force the test variations from within a well-crafted user testing script, while still presenting a smooth user experience.
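As a rough illustration of how those tagged URLs can be built, the sketch below appends Optimizely's forced-variation query parameter to a page URL so each tester reliably sees one specific experience. The parameter format shown (`optimizely_x<EXPERIMENT_ID>=<VARIATION_INDEX>`) follows Optimizely's documented query-string convention for forcing variations; the experiment ID and event URL here are hypothetical, not Ticketmaster's actual values.

```python
from urllib.parse import urlencode, urlparse, urlunparse, parse_qsl

def force_variation_url(page_url, experiment_id, variation_index):
    """Return page_url with a query parameter that forces one
    Optimizely variation, so a user tester always sees it."""
    parts = urlparse(page_url)
    query = dict(parse_qsl(parts.query))
    # Optimizely honors ?optimizely_x<EXPERIMENT_ID>=<VARIATION_INDEX>
    query["optimizely_x" + experiment_id] = str(variation_index)
    return urlunparse(parts._replace(query=urlencode(query)))

# Hypothetical experiment ID; variation 0 = control, 1 = seat-view test
control_url = force_variation_url(
    "https://www.ticketmaster.com/event/123", "2468013579", 0)
variation_url = force_variation_url(
    "https://www.ticketmaster.com/event/123", "2468013579", 1)
```

One such URL per variation can then be pasted directly into the user testing script, so the control group and the test group of testers each land on a consistent experience.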

The testers first evaluated the control without the visual view from a seat. Then, the testers evaluated the test variation, with the view from a seat displayed as the user hovered over a seat on the interactive map. The reactions to the added feature were overwhelmingly positive. Along with verbal feedback acquired from viewing the video results, the testers were also asked a series of written questions. In this case, both types of responses demonstrated that the added views would influence purchase decisions and a customer’s evaluation of his or her overall shopping experience.

User testing participants made a unique contribution by elaborating on why they preferred the visual view:

“I would recommend this to everyone I know who buys tickets to events. The map took the hassle out of the ticket-buying experience.”

“Selecting tickets was easy and worry-free with the images provided. I liked searching for tickets on my own and viewing the images prior to making a purchase.”

While it is always welcome to see a lift in conversions as the result of A/B testing, the incorporation of user testing can add an additional human layer to your results. This feedback illuminates the quantitative results and can help to drive further optimization strategy.

The New Norm

We are now regularly combining Optimizely and UserTesting.com for our clients to get the most insight out of our testing. If you’d like to learn more, contact our optimization team at scott.plumb@blueacorn.com.

About Shana Braun

Shana has worked in the field of data management and analytics for a decade, and she actually enjoys it. She got her start on the “big data” side of business, where she worked in data management for large CPG companies. She went on to spend a few years with an advertising agency reporting on all digital activities, including eCommerce analysis. As a Conversion Consultant at Blue Acorn, Shana now works with clients to create a data-driven approach and client-specific strategies to increase key eCommerce performance metrics.

One Comment

  • David says:

    Holy cow. UserTesting is incredible! I can’t believe I’d never heard of it before.

    Seems like a neat way for “normal” people to make a little money on the side, too 🙂
