Tag Archives: a/b testing case study

A Definitive Guide to Converting Failed A/B Tests Into Wins


As a marketer, there aren’t many things sweeter than running successful A/B tests and improving your website conversion rate.

It’s sweet because getting a winning A/B test is hard work.

To carry out a successful A/B test, marketers need to follow a robust process: develop data-driven hypotheses, create appropriate website variations, and test them on a targeted audience. And even with such a structured process, marketers tend to win just one out of three A/B tests.

What’s more worrying is that the percentage of winning A/B tests overall is only 14% (one out of seven). That’s largely because most marketers still don’t follow a documented process for A/B testing (and CRO as a whole). For instance, only 13% of eCommerce businesses base their testing on extensive historical data.

But here’s the good news: your failed A/B tests can still be of value.

By analyzing the A/B tests that didn’t win, you can highlight flaws in your approach, improve the tests, and even identify hidden winners.

This post talks about the key things you can do after encountering an unsuccessful test.

For convenience’s sake, we’ve split unsuccessful tests into two categories: inconclusive tests and tests with negative results.

When A/B Tests Give Inconclusive Results

An inconclusive result is when an A/B test is unable to declare a winner between variations. Here’s what you need to do with such a test:

Finding Hidden Winners

Even when your A/B test hasn’t found a winner among the variations, you may still uncover wins by slicing and dicing your test audience.

What if the A/B test produced results for specific segments of your traffic (segmented on the basis of traffic source, device type, etc.)?

This scenario is similar to Simpson’s paradox. Let’s understand it with a simple example.

A 1973 study of gender bias in UC Berkeley admissions showed that men had a higher chance of being admitted than women.

[Image: Simpson’s paradox example]

However, the department-specific data showed that women had a higher admission rate in most departments. In fact, a disproportionately large number of women had applied to departments with low admission rates, while relatively few men had.

[Image: Simpson’s paradox example, department-level data]

We can see how multiple micro-trends skewed the overall study result.
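
To make the reversal concrete, here’s a minimal Python sketch with made-up numbers (not the actual Berkeley figures): every department admits women at a higher rate, yet the aggregate favors men.

```python
# Made-up admissions data illustrating Simpson's paradox.
# department: (men_applied, men_admitted, women_applied, women_admitted)
admissions = {
    "A": (800, 480, 100, 65),   # men 60%, women 65%
    "B": (200, 20, 900, 135),   # men 10%, women 15%
}

men_app = sum(d[0] for d in admissions.values())
men_adm = sum(d[1] for d in admissions.values())
women_app = sum(d[2] for d in admissions.values())
women_adm = sum(d[3] for d in admissions.values())

# Aggregate: men 50%, women 20% -- the opposite of every department.
print(f"Overall: men {men_adm / men_app:.0%}, women {women_adm / women_app:.0%}")
for dept, (ma, mad, wa, wad) in admissions.items():
    print(f"Dept {dept}: men {mad / ma:.0%}, women {wad / wa:.0%}")
```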

Likewise, an A/B test can work for some traffic segments and not for others, leading to an inconclusive overall result.

You can reveal hidden winners (traffic segments where an A/B test delivered results) with post-result segmentation.

For instance, you can check whether your website conversion rate improved specifically for new or returning visitors, paid or organic traffic, or desktop or mobile traffic.

The analysis can help you identify the segments with the most potential. For example, your inconclusive A/B test might have increased conversions for “returning visitors.” You can then run a new (or the same) test targeting only returning visitors.

[Image: Post-result segmentation for an A/B test]

That said, it’s essential to check the number of visitors in each segment. The conversion rate and other data points for a segment can be trusted only if that segment’s traffic is large enough.
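
Here’s a rough Python sketch of what post-result segmentation with such a traffic guard can look like. The per-segment numbers are hypothetical, and the two-proportion z-test plus the 1,000-visitor floor are illustrative choices, not a prescription:

```python
from math import sqrt

# Hypothetical per-segment results of an overall-inconclusive A/B test:
# (visitors, conversions) for control (A) and variation (B).
segments = {
    "new visitors":       {"A": (4800, 192), "B": (4750, 247)},
    "returning visitors": {"A": (1200, 72),  "B": (1180, 66)},
    "mobile":             {"A": (150, 6),    "B": (140, 9)},   # tiny sample
}

MIN_VISITORS = 1000  # illustrative floor: don't trust smaller segments

for name, s in segments.items():
    (na, ca), (nb, cb) = s["A"], s["B"]
    if na < MIN_VISITORS or nb < MIN_VISITORS:
        print(f"{name}: too little traffic to judge")
        continue
    pa, pb = ca / na, cb / nb
    pooled = (ca + cb) / (na + nb)
    se = sqrt(pooled * (1 - pooled) * (1 / na + 1 / nb))
    z = (pb - pa) / se
    print(f"{name}: control {pa:.2%} vs variation {pb:.2%}, z = {z:.2f}")
```

A |z| above roughly 1.96 corresponds to significance at the usual 5% level; anything from an under-powered segment is noise, which is why the traffic guard comes first.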

Tracking the Right Metric(s)

How meaningful an A/B test’s result is depends largely on the metric you’re tracking.

A lot of times, A/B tests aim at improving only a website’s micro-conversions, mostly because the test is carried out at an early stage of the conversion funnel or on less-critical web pages. Such tests don’t track changes in the website’s macro conversions and fail to notice any rise in the bottom line (sales/revenue).

When your A/B test is inconclusive, you need to check if you’re optimizing for the correct metric. If multiple metrics are involved, you need to analyze all of them individually.

Let’s suppose you run an eCommerce store. You create a variation of your product description page that mentions “free shipping,” with the objective of increasing add-to-cart actions (a micro conversion). You A/B test the variation against the control page, which gives no information on shipping. To your surprise, the test can’t come up with a clear winner. Now you need to see whether the variation boosted your revenue (a macro conversion) or not. If it did, the reason can be simple: the “free shipping” variation might have led only users with high purchase intent to the checkout page, increasing completed purchases without moving add-to-cart numbers much.

If you realize you weren’t tracking the most relevant metric with your A/B test, edit the test with new goals. With the new metrics in place, you can run the test a while longer and look for improvements.

It’s advisable to keep your eyes on both micro and macro conversions.

[Image: Micro and macro conversions]
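
As a quick sketch of what watching both levels looks like for the hypothetical free-shipping test above (all numbers invented to show a flat micro metric next to moving macro metrics):

```python
# Invented results for the hypothetical "free shipping" test above:
# the micro metric barely moves while the macro metrics do.
control   = {"visitors": 10_000, "add_to_cart": 1500, "orders": 270, "revenue": 13_500.0}
variation = {"visitors": 10_000, "add_to_cart": 1520, "orders": 330, "revenue": 17_200.0}

def report(name, d):
    atc = d["add_to_cart"] / d["visitors"]   # micro conversion
    orders = d["orders"] / d["visitors"]     # macro conversion
    rpv = d["revenue"] / d["visitors"]       # revenue per visitor
    print(f"{name}: add-to-cart {atc:.1%}, orders {orders:.1%}, RPV ${rpv:.2f}")

report("control", control)
report("variation", variation)
```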

Analyzing Visitors’ Behavior

Using on-site analysis tools, you can uncover insights that plain numbers just can’t offer. With the help of heatmaps/scrollmaps and visitor recordings, you can observe the behavior of your users (the A/B test participants) and find probable causes of an inconclusive test.

Heatmaps can tell you if the element you’re testing is going unnoticed by most users. For instance, if you’re testing a variation of a CTA button that lies deep below the fold, heatmaps/scrollmaps can show how many users actually reach the button. An A/B test might be inconclusive simply because only a handful of users ever see the CTA.

Here’s how a scroll map looks:

[Image: Scroll map of the VWO pricing page]

In the same case, visitor recordings can show you how users interact with the content and elements above the CTA. With high engagement above the CTA, users might already have made up their minds about their next action (a conversion or an exit). Hence, changes to the CTA wouldn’t affect them and would result in an unsuccessful A/B test.
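
If your scrollmap tool lets you export raw scroll-depth data, the reach check is a few lines of Python. A minimal sketch with invented depths and an assumed CTA position:

```python
# Invented max scroll depths (in pixels) for ten visits, and an assumed
# CTA that sits 2,400px down the page.
scroll_depths = [400, 650, 900, 2600, 1200, 3100, 800, 500, 2500, 700]
CTA_POSITION_PX = 2400

reached = sum(depth >= CTA_POSITION_PX for depth in scroll_depths)
print(f"{reached / len(scroll_depths):.0%} of visitors ever reached the CTA")
# If only a sliver of traffic reaches the CTA, its variations are being
# tested on that sliver, which is a recipe for an inconclusive result.
```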

Apart from giving insights on specific pages, visitor recordings can help you understand user behavior across your entire website (or conversion funnel). You can learn how critical the page you’re testing on is to your conversion funnel. Consider a travel website where users can find holiday destinations using a search box and a drop-down navigation bar. An A/B test on the navigation bar will only work if users actually engage with it. Visitor recordings can reveal whether users find the bar friendly and engaging; if the bar itself is too complex, every variation of it may fail to influence users.

Double Checking Your Hypothesis

Whenever an A/B test fails to provide a result, fingers invariably point at the hypothesis behind it.

With an inconclusive A/B test, the first thing to check is the credibility of the test hypothesis.

Start by reviewing the basis of your hypothesis. Ideally, every test hypothesis should be backed by either website data analysis or user feedback. If that’s not the case, you need to backtrack and validate your hypothesis with one of those two methods.

When your hypothesis is, in fact, supported by website data or feedback, assess whether your variation closely reflects it. You can also use on-site analysis tools to find ways to improve your variations.

[Image: Funnel data analysis. Sample website data that can be used to create a hypothesis (Source)]

Here’s an example: Let’s suppose you have a form on your website, and data analysis tells you that a majority of users drop off on the form. You hypothesize that reducing friction on the form will increase submissions, so you cut down the number of form fields and run an A/B test. Now, if the test remains inconclusive, you need to check whether you actually removed the friction-inducing fields. Form analysis can pinpoint exactly which fields cause the majority of drop-offs.
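
Conceptually, form analysis boils down to a per-field drop-off table. A minimal sketch over hypothetical interaction and abandonment counts:

```python
# Hypothetical form-analysis data: users who interacted with each field
# vs. users who abandoned the form while on that field.
form_fields = [
    ("email",     1000, 40),
    ("phone",      960, 310),  # big drop-off: a friction candidate
    ("company",    650, 55),
    ("job_title",  595, 190),  # another friction candidate
]

for field, entered, dropped in form_fields:
    print(f"{field}: {dropped / entered:.0%} of users dropped off here")
```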

Reviewing the Variations

One of the biggest reasons A/B tests remain inconclusive is that the difference between test variations is minuscule.

Now, I know there are numerous case studies boasting double- or triple-digit improvements in conversion rate from just “changing the button color.” But what we don’t see are all the tests that failed to achieve the same feat. There are probably tens or hundreds of such failed tests for every single winning one.

For instance, Groove (a help desk software company) ran six different A/B tests with trivial changes. All of them proved inconclusive. Have a look:

[Image: CTA button color change A/B test]

[Image: CTA text change A/B test]

Keeping this in mind, you need to go through your test variations and see if they really have noticeable changes.

If you’re testing minor elements, you need to start being more radical. Radical or bold A/B tests are usually backed by strong hypotheses and tend to deliver results more often.

(Interestingly, testing radical changes is also advisable when you have a low traffic website.)

Deriving Further Learnings from the Tests

So you’ve finished a thorough analysis of your inconclusive A/B test using the above-mentioned points. You now know what went wrong and where you need to improve. But, there’s more.

You also learn which elements (possibly) don’t influence users toward conversion.

If your inconclusive test had no hidden winners, you tracked the correct metrics, your hypothesis was spot on, and your variations were distinct enough, you can safely assume that the tested element just didn’t matter to your users. You can mark it as low on your criticality list.

This will help you create a priority list of elements for your future A/B testing.

When A/B Tests Give Negative Results

A negative result for an A/B test means that the control beat the variation. Even with a failed test, you can gain insights and conduct future tests effectively.

Finding What Went Wrong

There could be many reasons why your A/B test returned a negative result; a flawed hypothesis and a poorly executed variation are among them.

A negative result will make you question the test hypothesis. Did you follow a data-driven approach to come up with the hypothesis? Did you blindly follow a “best practice?”

Unbounce highlights a few cases where A/B tests performed against “common expectations.”

Example: “Privacy assurance with form” best practice failed

These tests again emphasize the importance of a data-driven process behind A/B testing and CRO. A negative A/B test result can be a wake-up call to adopt one.

Knowing Your Users’ Preferences

Negative A/B test results let you understand your users’ preferences better. Specifically, you get to know your users’ dislikes (in the form of the changes you made to the losing variation).

Since you know what your users don’t like on your website, you can build hypotheses about what they might like. In other words, you can use negative test results to create better tests in the future.

Let’s return to the Unbounce example used in the point above. The A/B test was performed on a form, where the variation flaunted a privacy assurance: “100% privacy – we will never spam you.” The variation couldn’t beat the control; it reduced conversions by 17.80%. Analysis of the result suggested that users didn’t like the mention of the word “spam.” Knowing what users hated, the next test was run with a different variation. The form still carried a privacy assurance, but this time it read “We guarantee 100% privacy. Your information will not be shared.” (No mention of the dreaded “spam” word.) This time the result flipped: the variation increased signups by 19.47%.

[Image: Learning from a failed A/B test used for a win]

What’s Your Take?

How often do you encounter failed A/B tests? We’d love to know your thoughts on how to tackle them. Post them in the comments section below.



3 Ways Tinkoff Bank Optimized Credit Card Conversions – Case Study

Conversion Rate Optimization (CRO) is a process-oriented practice that essentially aims at enhancing user experience on a website.

It starts with proactively recognizing challenges faced by users across a conversion funnel, and addressing them through various tools and techniques.

Tinkoff Bank understands the need for a process-oriented approach to CRO and puts it into practice.

The following case study tells us more about Tinkoff’s CRO methodology — and how it delivers incredible results.

About the Client

Tinkoff Bank is a major online financial services provider in Russia, launched in 2006 by Oleg Tinkov. In a short span, the bank has grown into a leader in credit cards, becoming one of the top four credit card issuers in Russia.

Notably, the bank was named Russia’s Best Consumer Digital Bank in 2015 by Global Finance.

Tinkoff operates through a branchless digital platform and relies heavily on its website to find new customers. Like any other smart business, the bank constantly explores new ways to improve its website’s conversion rate. For this job, Tinkoff has a dedicated web analytics team that plans and executes CRO strategies on the website.

Context

Tinkoff Bank lets users apply for a credit card through an application form on its website. Users can fill out the application form and submit it for approval from the bank. Once the application is approved, users receive their credit card at home, with zero shipment cost.

This is the original application page:

[Image: Tinkoff’s application page]

The application page on the website is fairly elaborate, consisting of a multi-step form and details about the application process and the credit card plan. This page is where conversions (form-submits) happen for Tinkoff.

Since the form takes multiple steps to complete, Tinkoff tracks submits for each step of the form along with submits for the complete form. Tinkoff refers to these conversions as short-application submits and long-application submits, respectively.
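
A sketch of how such step-level tracking can be summarized; the visitor and per-step counts below are invented for illustration:

```python
# Invented step-level submits for a four-step application form, mirroring
# "short-application" (per-step) vs "long-application" (full-form) tracking.
visitors = 25_000
step_submits = [9200, 5400, 3100, 2300]  # users completing steps 1..4

print(f"short-application rate (step 1): {step_submits[0] / visitors:.1%}")
print(f"long-application rate (step 4):  {step_submits[-1] / visitors:.1%}")
for i in range(1, len(step_submits)):
    print(f"step {i} -> step {i + 1}: "
          f"{step_submits[i] / step_submits[i - 1]:.0%} carried over")
```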

The ultimate goal for Tinkoff is to increase these conversions.

The Case

The CRO team at Tinkoff was working on improving the website’s usability to get higher conversions. It began by identifying key pages on the website that could be optimized. For this purpose, the team analyzed the website’s user data with Adobe Site Catalyst and found that the credit card application page had a significant bounce rate.

Next, the team looked for ways to help users stay on the application page and complete the conversion. They zeroed in on three areas of the web page where they could introduce new features. The hypothesis was that these new features would improve the user experience on the page.

However, the team needed to be absolutely sure about the effectiveness of these new features before applying changes to the web page permanently. There was only one way to do it — through A/B testing!

Tinkoff used VWO to carry out A/B tests on the page, and determine whether it was beneficial to introduce new functions there.

Let’s look at the tests closely.

TEST #1: Providing an Additional Information Box

The Hypothesis

Offering additional details about the credit card above the form will increase the number of sign-ups.

The Test

Tinkoff created two variations of the original (control) page.

The first variation included a “More details” hyperlink underneath the “Fill out form” CTA button placed above the fold. When clicked, the hyperlink led to a new page that provided additional information about the credit card scheme.

Here is how it looked.

[Image: First variation, with the “More details” link]

The second variation had the same “More details” link below the CTA button. But this time, the link opened up a box right below. The box provided additional information — through text and graphics — about the credit card.

Here’s the second variation.

[Image: Second variation, with the expandable information box]

The test was run on more than 60,000 visitors for a period of 13 days.
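
As an aside on why tests like this need tens of thousands of visitors, here’s a back-of-the-envelope sample-size sketch using the standard two-proportion approximation (95% confidence, 80% power). The baseline rate and daily traffic are assumptions, not Tinkoff’s figures:

```python
from math import ceil

def visitors_per_variation(base_rate, relative_lift, z_alpha=1.96, z_beta=0.84):
    """Rough per-variation sample size at 95% confidence / 80% power."""
    delta = base_rate * relative_lift           # absolute effect to detect
    variance = 2 * base_rate * (1 - base_rate)  # pooled, approximated
    return ceil((z_alpha + z_beta) ** 2 * variance / delta ** 2)

# Assumed figures: a 4% baseline conversion rate, a 10% relative lift,
# and 2,000 visitors entering each variation per day.
n = visitors_per_variation(0.04, 0.10)
print(f"~{n} visitors per variation")            # ~38,000
print(f"~{ceil(n / 2000)} days at 2,000 visitors/variation/day")
```

Smaller baseline rates and smaller lifts both push the required sample up fast, which is why short tests on modest traffic so often end inconclusive.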

The Result

The first variation couldn’t outperform the control. It had an even lower conversion rate than the control.

The second variation, however, won against the control, and improved the page’s conversion rate by a handsome 15.5%. Moreover, it had a 100% chance of beating the control.
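
“Chance of beating the control” figures like this are typically Bayesian. The exact computation VWO uses isn’t described here, but one common way to produce such a number is Monte Carlo sampling from Beta posteriors; the conversion counts below are invented:

```python
import random

def prob_b_beats_a(conv_a, n_a, conv_b, n_b, draws=50_000):
    """Monte Carlo estimate of P(variation beats control), using
    Beta(1 + conversions, 1 + non-conversions) posteriors."""
    wins = 0
    for _ in range(draws):
        pa = random.betavariate(1 + conv_a, 1 + n_a - conv_a)
        pb = random.betavariate(1 + conv_b, 1 + n_b - conv_b)
        wins += pb > pa
    return wins / draws

# Invented counts roughly in this test's ballpark (~15% relative lift):
print(f"{prob_b_beats_a(900, 20_000, 1040, 20_000):.1%} chance to beat control")
```

When the two posteriors barely overlap, the estimate saturates at (or very near) 100%, which is how a result like this one gets reported.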

The Analysis

Displaying Key Differentiators:

Prominently placing key differentiators, the factors that make a brand superior to its competitors, on a web page is one of the leading best practices in CRO. Key differentiators enhance the brand’s image in users’ eyes, which nudges them toward conversion.

Tinkoff, too, wanted to place its differentiators on the application form page. In order to not clutter the page, Tinkoff decided to display these differentiators within a box, behind the “More details” link.

The box clearly illustrated Tinkoff’s key differentiators such as free shipping of the card, free card recharge, and cashback on all purchases made through the card.

Related Post: Optimize Your Marketing Efforts with a Killer Value Proposition

Emphasizing Free Shipping:

By now, we all know how free shipping influences customers’ minds. In fact, a lack of free shipping is the number one reason why people abandon their shopping carts!

Naturally, displaying “Free shipping” prominently on the application page worked well for Tinkoff.

[Image: Free shipping]

Note: Although free shipping was already mentioned in the original page’s top right corner, it didn’t have much contrast against the background, making it easy for visitors to miss. The variation, however, increased the chances of visitors spotting the much-loved free shipping offer.

Reassuring Users About Tinkoff’s Credibility:

Reassuring users at each step of a conversion process helps improve the conversion rate. This is the reason why trust badges, testimonials, and social proof work for so many websites.

Likewise, the features box on the application page reassured users about Tinkoff’s credibility. The box mentioned how Tinkoff is the leading internet bank, providing more than 300,000 recharge points, and how its service is completely digital: users never have to visit a bank branch. This helped users trust the bank’s services, thereby increasing form submits.

Related Resource: 32% Increase in Conversions by A/B Testing for The Right Reasons

Why Did The First Variation Fail?

The “More details” link on the first variation led users to a new page with additional information about the credit card. This, however, pulled some users away from the application form. And since web users have short attention spans, some probably never returned to complete the form, reducing the total number of conversions.

Furthermore, users had to make an effort: leave the application page, browse the content on the new one, and come back to submit the form. Because of this effort, many users probably never visited the “More details” page at all, nullifying any value it could have provided. And without enough information, many of them wouldn’t have converted.

Unsure users are the first to bounce off. Keep reassuring them about your credibility.

TEST #2: Gamifying the Form Using a Progress Bar

The Hypothesis

Providing a “progress bar” on top of the four-step application form will motivate users to fill the form completely, resulting in a higher conversion rate.

The Test

Here again, Tinkoff designed two variations of the original form page.

The first variation had a yellow, banner-like progress bar right above the form. The bar highlighted the step the user was on and displayed the user’s progress through the form graphically, using a black line at its bottom. It also mentioned the probability of the credit card being approved based on how far the user had filled out the form.

This is the first variation.

[Image: First variation’s progress bar]

The second variation also had a progress bar, but with a different design.

Similar to the first, the second variation’s progress bar displayed the form’s step number and the probability of the credit card being approved. But this progress bar was green, and it had no additional black line to show the user’s progress. Instead, the bar itself represented progress graphically: the green portion grew as users moved further through the form.

Take a look.

[Image: Second variation’s progress bar]

The test ran on more than 190,000 visitors for a period of 39 days.

The Result

Both the variations outperformed the control!

The first variation had a 6.9% higher conversion rate than the control.

However, the second variation was the overall winner. It improved the conversion rate of the page by a robust 12.8%.

Both variations had a 100% chance of beating the original page.

The Analysis

Curbing Users’ Anxiety:

Nobody likes filling out long forms on websites. Users do so only when they expect equal or higher value in return.

When users encounter lengthy forms, they often become anxious because they aren’t sure they’ll gain satisfactory value after completing the form. Many times, this anxiety leads them to bounce off the form (or the website altogether).

However, various website elements can reduce users’ anxiety, the progress bar being one of them.

[Image: Progress bar (Source)]

A progress bar helps curb anxiety of users by providing them a visual cue about the effort required to complete a process. It reassures users that the process will be completed in due time and effort, keeping them from bouncing off the page.

Various studies on website and application design support this.

‘Gamifying’ Users’ Experience:

Almost all web users today have played video games on one platform or another, so it’s safe to say most are familiar with the progress bars displayed in such games. Those bars usually track a player’s progress, showing how close they are to finishing the game’s objective (or beating a certain opponent).

[Image: Progress bars in games]

The progress bar on Tinkoff’s credit card application form introduced a similar gaming experience. The bar could only be completely filled when users completed the whole form, so whenever users saw a partially filled bar, they had extra motivation to finish and submit the form.

The fully filled progress bar, later, provided users with a sense of achievement.

‘Rewarding’ Users:

The progress bar deployed another gamification technique — reward.

On Tinkoff’s form page, the technique was implemented through text overlaid on the progress bar. For instance, when users were on the second step of the form, the text read “The probability of approval is 30%” and “Get 10% for Step 2 completion.” Since users were investing time and effort in applying for the credit card, they naturally wanted the highest possible probability of approval. Realizing how much each step mattered to their application’s approval, users were further motivated to complete them all.

Why Did The Second Variation Perform Better Than The First?

Because the second variation’s progress bar had greater visibility on the application page.

Providing contrast to your key elements on a web page is one of the fundamental principles of web design.

The first variation’s progress bar was a black line at the bottom of a yellow banner. Since the page’s overall color scheme included white, grey, and yellow, neither the progress bar nor the banner had much contrast; for some users, the bar could easily have blended into the page’s theme. Moreover, the bar was quite thin, making it even harder to notice.

[Image: Progress bar close-up]

The second variation’s progress bar, on the other hand, was green, giving it ample contrast and visibility on the page. The bar was also wide enough to be easily noticeable. And once users noticed the progress bar, its persuasive factors started to work on them.

Gamify your online forms to increase form-submits and conversions.

TEST #3: Letting Users Fill Their Details Later

The Hypothesis

Giving users the option to fill in their passport details later will increase the number of form submits.

The Test

This test involved only one variation that was pitted against the control.

On the form’s second step, users were required to submit passport-related information. The variation gave users the option to complete this step later, via a “Don’t remember passport details” checkbox. Clicking the checkbox opened a small window asking users to choose a medium, phone or email, through which to provide their details later. Users could then complete the form whenever they had their passport details handy.

Here are the screenshots of the checkbox and the pop-up window.

[Image: The “Don’t remember passport details” checkbox]

[Image: The pop-up window asking users to choose phone or email]

The test ran on over 265,000 visitors for a period of 23 days.

The Result

The variation won over the control page convincingly. It improved the conversion rate of the form by a whopping 35.8%. The after-filling conversion rate, too, increased by 10%.

The variation had a 100% chance to beat the control.

The Analysis

Acknowledging Users’ Issues:

The second step of the application form required detailed passport information: date of issue, series and number, division code, and more. Most users don’t remember these details off the top of their heads. To complete the form, they had no option but to take out their passports and look up the required information. But some users wouldn’t have their passports handy while filling out the form, forcing them to leave it.

Now, with the option to fill out the passport details on the form later, users didn’t have a reason to leave the application form in the middle.

Providing Freedom to Users:

Once users clicked the “Don’t remember passport details” checkbox, they were offered two options for completing the form later: they could have a link to the incomplete form emailed to them, or they could choose the phone option, which let them finish the form over a call with Tinkoff’s executives.

Completing the form over a telephone call, in particular, spared users a great deal of effort.

Virtually Shortening the Form Length:

Once users chose to fill in their passport details later, they were left with only two of the four steps to complete. Effectively, they had already covered half of the application form, and this was reinforced by the progress bar on top of the form.

Having breezed through the first half of the form, users looked forward to completing the second half equally quickly.


In addition, the option to provide passport data through a phone call effectively converted the form into a three-step process.

Addressing the convenience of your users should always be your top priority.

Conclusion

Conversion Rate Optimization is not about testing random ideas on your website. It is about improving your website’s user experience through a coherent process. This process involves identifying areas of improvement on your website and suggesting changes based on traffic data, user behavior, and best practices. That’s followed by A/B testing the changes and learning how effective they are. Only when the changes improve your website’s conversion rate do you apply them permanently.


Promo Code Box on your Shopping Cart Page could be Bleeding Dollars. A/B Test it.

The Company

Bionic Gloves is an online store that designs and sells a range of gloves, such as golf gloves, fitness gloves, and more. Their focus is to provide customers with gloves that have fine grip, comfort, and durability.

To increase sales from their eCommerce shop, they decided to optimize their website. The task was given to Sq1, a Portland-based marketing and conversion optimization agency.

The Test

Sq1 performed many tests on the Bionic Gloves website. In this case study, I’ll take you through an interesting test performed on one of the most important pages of any eCommerce website: the shopping cart page. In fact, one study by Surepayroll estimated that eCommerce websites lose a whopping $18 billion each year to shopping cart abandonment.

To test their hypothesis that removing the ‘special offer’ and ‘gift card’ code boxes from the shopping cart page would result in more sales and less cart abandonment, they set up an A/B test in VWO.

This is how the original shopping cart page looked:

[Image: Bionic Gloves shopping cart page, control]

The Result

The test was run on close to 1,400 visitors over 48 days. This is how the variation page (without the code fields) looked:

[Image: Bionic Gloves shopping cart page, variation]

The primary goal they were tracking was revenue. The variation won, increasing total revenue by 24.7% and revenue per visitor by 17.1%.
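
A side note on why the two uplift figures differ: the arms rarely receive exactly equal traffic, so total revenue and revenue per visitor move by different amounts. A sketch with hypothetical numbers in the ballpark of this test:

```python
# Hypothetical data in the ballpark of this test; visitor counts differ
# slightly between arms, so the two uplift figures differ too.
control   = {"visitors": 688, "revenue": 21_000.0}
variation = {"visitors": 712, "revenue": 26_200.0}

revenue_uplift = variation["revenue"] / control["revenue"] - 1
rpv_control = control["revenue"] / control["visitors"]
rpv_variation = variation["revenue"] / variation["visitors"]
rpv_uplift = rpv_variation / rpv_control - 1

print(f"total revenue uplift:       {revenue_uplift:+.1%}")  # ~+24.8%
print(f"revenue-per-visitor uplift: {rpv_uplift:+.1%}")      # ~+20.6%
```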

Why the Variation Won

In the words of David from Sq1: “Anytime you leave the door open for a user to leave the conversion funnel – even if it seems like they’d come right back – you risk losing sales. By showing the Promo Code field on the cart, users were enticed to leave the site in search of a promo code. At that point, the conversion process is interrupted and you are more likely to lose potential customers. As such, hiding it was a very logical test.”

A shopping freak myself, I won’t lie: I, too, have gone looking for coupon codes many times in the middle of a purchase. This, as David pointed out, carries a number of risks:

  • The sight of the coupon box triggers visitors to look for one on Google and elsewhere. I did a quick Google search for “Bionic Glove”, and look what I found in the auto-complete suggestions:
    [Image: Google auto-complete suggestions for “Bionic Glove”]
  • eCommerce websites also risk losing money to affiliates and websites offering deals, coupons, etc.
  • Many a time, visitors end up finding a better deal on another web store.

To avert this, many websites now show all available coupon codes right on the product page and the cart page. Not only does this help reduce cart abandonment, it also increases average order value, as many shoppers buy more items to cross the threshold at which a coupon applies.

See how Myntra, a fashion ecommerce website based out of India, does this beautifully:

[Image: Coupon codes displayed on Myntra’s page]

Let’s Talk

Tell me what you think about this case study in the comments section below. I’m also available for intellectual discussions on CRO and A/B testing that fit in fewer than 140 characters, on Twitter @taruna2309. See ya!

Related Post: 8 Checkout Optimization Lessons Based on 5+ Years of Testing


The Battle between Short and Long Pages Continues. Guess which Scored a Point.

I think I should make a series of all the A/B tests I’ve personally come across in which removing a certain element worked for one company while adding that same element worked for another. (To understand what I mean by element, you should read this post.) After all, every business is different, and so are their target audiences.

A few months back, I came across a wonderful test in which an SEO company went from a content-rich page to one with only a form and headline text, and improved their conversions. I was intrigued and curious to know the science behind why such pages work, and why even giants like Facebook, LinkedIn, and Quora have bare-minimum homepages. I’ve added my findings, about why they work and what the challenges of such a page could be, in the same post. Do give it a read.

In fact, we at VWO were so inspired by this test that we decided to give it a shot. Have you checked our homepage recently? And may I add, it’s working well for us too.

For today’s case study, I have a test that’s the exact opposite of this!

The Company

PayPanther is an all-in-one free online invoicing, CRM, time tracking, and project management solution for freelancers and businesses.

The Test

PayPanther wanted to test a long version of their ‘pricing and signup’ page against a short one. When they first made this page, they believed a shorter page would drive more signups, as there would be less distraction and content to read. In this test, they set up the original page to be pitted against a page with three more sections: FAQs about pricing, testimonials, and another call-to-action button asking people to sign up.

This is how the original looked:

[Image: Original pricing and signup page]

And this is how the new page looked:

[Image: New, longer pricing and signup page]

The test ran for a month on about 1,000 visitors, and the variation, containing FAQs and testimonials, won! It recorded a 372.62% increase in signups.

Thrilled by the results, PayPanther has implemented the longer page as their default “pricing and signup” page. They even plan further tests to find the optimal headlines and button texts.

Why the Variation Won

  1. The FAQs section answered common doubts and concerns website visitors had, creating a sense of credibility and trust.
  2. Adding testimonials always works. I’m yet to see a test in which adding testimonials hurt conversions. You can look at this, this, and this case study for examples. Of course, testimonials have their own rules; to use them effectively, I suggest you read this excellent post on getting the most benefit from them.

Let’s Talk!

Let me know what you think about this case study. Have you run a similar test on one of your web pages? Let’s talk about it in the comments section below.

Spread the awesomeness by sharing this post with your network on Twitter, Facebook and LinkedIn.


How Removing Cross-Selling Options on Product Page Resulted in a 5.6% Increase in Orders

The Company

Drukwerkdeal is an online printing shop based out of the Netherlands. They deal in a variety of photo products, ranging from clothing, corporate gifts, presentations, and cutlery to a whole range of Christmas-themed gifts like cards, posters, and calendars.

The website has a nice, warm feel to it, owing to all the colorful products on offer.

To push more sales from their product pages, Paul at Drukwerkdeal decided to optimize them using Visual Website Optimizer. He went through a number of product pages and realized that the cross-selling message on them was not very convincing, and could well be doing more harm than good.

This is how one of their product pages looked (notice the links in green under the product description, placed with an intent to cross-sell):

[Image: Drukwerkdeal product page, control]

The Test

Paul decided to test removing the links to see what effect the change would have on sales. Since this was going to be a big move, he first tested the change on only a few product categories, using pattern matching to include just the URLs they wanted to test.
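
The exact pattern syntax is tool-specific, but URL pattern matching generally reduces to translating a wildcard into a regex. A minimal Python sketch with hypothetical patterns (not Drukwerkdeal’s real URLs):

```python
import re

def url_matches(pattern, url):
    """Translate a simple '*' wildcard pattern into a regex and test a URL,
    roughly how testing tools scope an experiment to a subset of pages."""
    regex = "^" + re.escape(pattern).replace(r"\*", ".*") + "$"
    return re.match(regex, url) is not None

# Hypothetical patterns: include only two product categories in the test.
patterns = [
    "https://www.drukwerkdeal.nl/cards/*",
    "https://www.drukwerkdeal.nl/posters/*",
]
for url in [
    "https://www.drukwerkdeal.nl/cards/christmas-card-a5",
    "https://www.drukwerkdeal.nl/calendars/wall-calendar",
]:
    included = any(url_matches(p, url) for p in patterns)
    print(url, "->", "in test" if included else "excluded")
```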

After removing the cross-selling links, this is how the new product page looked:

[Image: Drukwerkdeal product page, variation]

More than 14,000 visitors became part of this test, which ran for two weeks. The metrics on which the test was judged were average order value (AOV), number of products purchased per order, add-to-cart conversions, and transactions. They pushed all the test data into GA (with a single click while setting up the test) and were able to draw holistic insights on a range of parameters beyond absolute conversions. They also found that the new page performed better for both new and returning visitors and across all traffic sources, particularly organic and direct.

The Result

The variation page recorded a 5.6% improvement in completed orders. Bolstered by this success, Drukwerkdeal implemented the variation style on all their product pages.

Quoting Paul: “We do a reasonable amount of testing on our website and try to be very curious about all the things we add to our site. I had a sense that it would distract visitors and would have a negative impact on conversions. Now we know that it did.”

Why didn’t cross-selling work for Drukwerkdeal?

Marketers around the globe swear by cross-selling and up-selling. And why not? In 2006, Amazon was reported to have earned a whopping 35% of its revenue from cross-selling. So do we conclude the days of cross-selling are over? Certainly not!

In Drukwerkdeal’s case, here are the things that might have worked against them:

  1. A few months back, I blogged about how colors can affect your website’s conversions. If you give the control page a quick 10-second look, you’ll notice that the orange buttons get the most attention, and the next most striking thing on the page is the green bar containing the links to other related products. Imagine finding these on every product page and getting distracted from the main goal. The main focus of a product page, second only to the CTA button, is to get visitors to notice the product: its images and description. The related-product links in the control design merge with the product description and can make for a poor visitor experience.
  2. As rightly mentioned in this excellent article at the-future-of-commerce, the question is not whether to offer cross-sells and up-sells; it’s how to do it. The way cross-selling was implemented on the original page was neither enticing nor positioned correctly.
    Some ways you can offer cross-sells that will actually sell:

  • Show related products on the shopping cart page or within the email confirming the order; those are the easiest places to start. See how Amazon does the same:
    [Image: Amazon’s related-products suggestions]
  • Products should be very relevant to the customer’s current order. An excellent study by Altman Dedicated Direct shows that, as long as you show relevant cross-selling products, customers are ready to buy even if the products are slightly on the expensive side. This opposes the marketing myth that cross-selling options should typically be 25-35% of the value of the current purchase.
  • Try bundling: phone cases with phone, mascara with eye-liner, assorted seasonings with sauces and so on.

Let’s Talk!

Being a die-hard shopaholic, I have to confess that I’ve often bought many more items than I intended, only because I kept following a loop of “you may like this too.”

What has your experience with cross-selling and up-selling been like? I would love to know your views, both from a customer perspective as well as a seller perspective. Let’s talk in the comments section below.


A/B Testing Case Study: Redoing Navigation Bar on Homepage Increased Sales By 15.68%

The Company

Harvard Business Services is a Delaware registered agent that helps people incorporate their companies in Delaware. They also help clients form LLCs and corporations and assist with filing their franchise taxes.

To encourage more people to buy their services, they decided to redo the navigation bar on their homepage. With that in mind, they tweaked certain tabs, did away with some, and introduced a new one.

The goal was to get more people to click on the tabs, engage them with the website and ultimately make them buy.

On the original homepage, there were 10 tabs: Home, Get Started Now, Our Services, Compare, Learning Center, Blog, Make a Payment, Videos, About Us, and Contact Us.

This is how it looked:

[Image: Original navigation bar]

In the variation, they made a few changes:

  1. The “Compare” tab was renamed to “Compare Prices”
  2. “Get Started Now” was renamed to “Form a Company Now”
  3. A new tab “How to Incorporate” was introduced, which is also present as a link in the left pane on the original homepage
  4. The tabs Blog, About Us and Contact Us were removed

Here’s how it looked:

[Image: Variation’s navigation bar]

The Test

The test was run on close to 32,000 visitors. The goals they were tracking were visits to the price comparison page and the “How to Incorporate” page, and, primarily, actual sales.

The variation emerged as the winner, recording a 15.68% increase in total orders completed. Visits to the price comparison page and the “How to Incorporate” page increased by 66.26% and 382.45%, respectively.

Here’s why I think the variation was able to increase engagement on their website and also deliver a whopping ~16% increase in sales:

  1. Renaming the “Compare” tab to “Compare Prices” made it absolutely unambiguous. The word “compare” alone didn’t give users a clear idea of what they would see if they clicked the tab.

    This was an important business change. As Korin, who set up this test, puts it, “This (visits to the comparison page) is especially important for us because we work in a competitive industry and our prices are an obvious way that we stand out from the competition. We’re thrilled that this small change has enticed visitors on our site to click through to a page that compares us with the competition, so that they can be more confident in their purchase.”

  2. Changing “Get Started Now” to “Form a Company Now” made the tone of the tab more authoritative. The new verbiage instilled a sense of confidence and made the mundane process of getting started sound more purposeful.
  3. The new tab “How to Incorporate”, which was originally present only as a link, got them an astounding 382% more visits to that page. This clearly proved to HBS that a large number of their visitors want to be educated before they make a purchase.

    Essentially, A/B testing let them hear their users speak: users needed to understand the process before incorporating their company, and they wanted that information upfront, not buried among multiple links in the left pane.

    This was an important business learning for HBS. Their analytics tool also showed that many people moved from this page to the final purchase page, bridging the gap between bouncing off and making an informed purchase.

Let’s Talk

Korin was thrilled with the results. She told us that she loves VWO and is constantly trying out new tests. Shout-out to Korin — we love power users like you too!

Let us and Korin know your views about this case study in the comments section below.


Adding Sign-up Form on the Homepage Increased Conversions by 43.85%

The Company

Tom’s Planner is web-based project planning software that allows users to create and share Gantt charts and project plans easily. Individuals can sign up for a free account on the website and begin using the planner right away.

The website’s homepage has two CTA buttons above the fold: one to sign up for an account and one to watch a demo of the software.

This is how the original home page of the website looked:

[Image: Tom’s Planner original homepage]

The Test

To improve the conversions of the homepage, Tom at Tom’s Planner decided to add a sign-up form, above the fold, on the homepage.

Using Visual Website Optimizer, he created a variation with the sign-up form and set it to test against the original homepage. The form had just four fields, and a directional cue was added to draw attention to it.

Here’s how the new homepage looked:

[Image: Variation of Tom’s Planner homepage]

Close to 3,000 visitors became part of this A/B test, and the result was in favor of the variation: the new homepage with the sign-up form recorded 43.85% more conversions.

Here’s a quick comparison image showing the original homepage and variation:

[Image: Comparison of the original homepage and the variation]

Why the Variation Won

1) The sign-up form was placed above the fold

On the original homepage, signing up for an account meant clicking a CTA button above the fold. Adding the form to the page increased the likelihood of visitors signing up, as they could see the form right on the homepage, above the fold, and didn’t have to click through a CTA button or land on a different page to create an account.

2) Short form with just 4 fields

Since the sign-up form had only four fields, it reduced the friction of signing up for an account. The form asked visitors for just one piece of personal information: an email address. Visitors didn’t have to shell out any other personal details to create an account.

3) Adding a directional cue pointing towards the form

Multiple eye-tracking studies have shown that directional cues get immediate attention; visitors can’t help but look in the direction they point. The arrow pointing toward the sign-up form increased the form’s visibility and gave users a clear path of action.

Let’s Talk!

If you visit the Tom’s Planner website, you’ll find a couple of other changes on the homepage. Tom is trying out even more tests on his website right now.

Let’s help him drive more conversions on the homepage by suggesting more ideas that he can test. Share your optimization tricks and tips in the comments section below!
