
What Science Can Teach Us About How to Create Viral Content

Think about the last thing you shared on the internet. Maybe it was an insightful video on the political turmoil in a far away country, or maybe it was a funny picture of a cat wearing a bow tie. Either way – you saw it, had an emotional reaction to it and decided to share it with others. But in the process of sharing the latest video, picture or article to your social media feeds – did you ever stop to think about why you shared it? What was your emotional response to the content? What about that response made…


Your mobile website optimization guide (or, how to stop pissing off your mobile users)

Reading Time: 15 minutes

One lazy Sunday evening, I decided to order Thai delivery for dinner. It was a Green-Curry-and-Crispy-Wonton kind of night.

A quick google search from my iPhone turned up an ad for a food delivery app. In that moment, I wanted to order food fast, without having to dial a phone number or speak to a human. So, I clicked.

From the ad, I was taken to the company’s mobile website. There was a call-to-action to “Get the App” below the fold, but I didn’t want to download a whole app for this one meal. I would just order from the mobile site.

Dun, dun, duuuun.

Over the next minute, I had one of the most frustrating ordering experiences of my life. Labeless hamburger menus, the inability to edit my order, and an overall lack of guidance through the ordering process led me to believe I would never be able to adjust my order from ‘Chicken Green Curry’ to ‘Prawn Green Curry’.

After 60 seconds of struggling, I gave up, utterly defeated.

I know this wasn’t a life-altering tragedy, but it sure was an awful mobile experience. And I bet you have had a similar experience in the last 24 hours.

Let’s think about this for a minute:

  1. This company paid good money for my click
  2. I was ready to order online: I was their customer to lose
  3. I struggled for about 30 seconds longer than most mobile users would have
  4. I gave up and got a mediocre burrito from the Mexican place across the street.

Not only was I frustrated, but I didn’t get my tasty Thai. The experience left a truly bitter taste in my mouth.

Why is mobile website optimization important?

In 2017, every marketer ‘knows’ the importance of the mobile shopping experience. Americans spend more time on mobile devices than on any other. But we are still failing to meet our users where they are on mobile.

Americans spend 54% of online time on mobile devices. Source: KPCB.

For most of us, it is becoming more and more important to provide a seamless mobile experience. But here’s where it gets a little tricky…

“Conversion optimization”, and the term “optimization” in general, often imply improving conversion rates. But a seamless mobile experience does not necessarily mean a high-converting mobile experience. It means one that meets your user’s needs and propels them along the buyer journey.

I am sure there are improvements you can test on your mobile experience that will lift your mobile conversion rates, but you shouldn’t hyper-focus on a single metric. Instead, keep in mind that mobile may just be a step within your user’s journey to purchase.

So, let’s get started! First, I’ll delve into your user’s mobile mindset, and look at how to optimize your mobile experience. For real.

You ready?

What’s different about mobile?

First things first: let’s acknowledge that your user is the same human being whether they are shopping on a mobile device, a desktop computer, a laptop, or in-store. Agreed?

So, what’s different about mobile? Well, back in 2013, Chris Goward said, “Mobile is a state of being, a context, a verb, not a device. When your users are on mobile, they are in a different context, a different environment, with different needs.”

Your user is the same person when she is shopping on her iPhone, but she is in a different context. She may be in a store comparing product reviews on her phone, or she may be on the go looking for a good cup of coffee, or she may be trying to order Thai delivery from her couch.

Your user is the same person on mobile, but in a different context, with different needs.

This is why many mobile optimization experts recommend having a mobile website versus using responsive design.

Responsive design is not an optimization strategy. We should stop treating mobile visitors as ‘mini-desktop visitors’. People don’t use mobile devices instead of desktop devices, they use it in addition to desktop in a whole different way.

– Talia Wolf, Founder & Chief Optimizer at GetUplift

Step one, then, is to understand who your target customer is, and what motivates them to act in any context. This should inform all of your marketing and the creation of your value proposition.

(If you don’t have a clear picture of your target customer, you should re-focus and tackle that question first.)

Step two is to understand how your user’s mobile context affects their existing motivation, and how to facilitate their needs on mobile to the best of your ability.

Understanding the mobile context

To understand the mobile context, let’s start with some stats and work backwards.

  • Americans spend more than half (54%) of their online time on mobile devices (Source: KPCB, 2016)
  • Mobile accounts for 60% of time spent shopping online, but only 16% of all retail dollars spent (Source: ComScore, 2015)

Insight: Americans are spending more than half of their online time on their mobile devices, but there is a huge gap between time spent ‘shopping’ online, and actually buying.

  • 29% of smartphone users will immediately switch to another site or app if the original site doesn’t satisfy their needs (Source: Google, 2015)
  • Of those, 70% switch because of lagging load times and 67% switch because it takes too many steps to purchase or get desired information (Source: Google, 2015)

Insight: Mobile users are hypersensitive to slow load times, and too many obstacles.

So, why the heck are our expectations for immediate gratification so high on mobile? I have a few theories.

We’re reward-hungry

Mobile devices provide constant access to the internet, which means a constant expectation for reward.

“The fact that we don’t know what we’ll find when we check our email, or visit our favorite social site, creates excitement and anticipation. This leads to a small burst of pleasure chemicals in our brains, which drives us to use our phones more and more.” – TIME, “You asked: Am I addicted to my phone?”

If non-stop access has us primed to expect non-stop reward, is it possible that having a negative mobile experience is even more detrimental to our motivation than a negative experience in another context?

When you tap into your Facebook app and see three new notifications, you get a burst of pleasure. And you do this over, and over, and over again.

So, when you tap into your Chrome browser and land on a mobile website that is difficult to navigate, it makes sense that you would be extra annoyed. (No burst of fun reward chemicals!)

A mobile device is a personal device

Another facet to mobile that we rarely discuss is the fact that mobile devices are personal devices. Because our smartphones and wearables are with us almost constantly, they often feel very intimate.

In fact, our smartphones are almost like another limb. According to research from dscout, the average cellphone user touches his or her phone 2,617 times per day. Our thumbprints are built into them, for goodness’ sake.

Just think about your instinctive reaction when someone grabs your phone and starts scrolling through your pictures…

It is possible, then, that our expectations are higher on mobile because the device itself feels like an extension of us. Any experience you have on mobile should speak to your personal situation. And if the experience is cumbersome or difficult, it may feel particularly dissonant because it’s happening on your mobile device.

User expectations on mobile are extremely high. And while you can argue that mobile apps are doing a great job of meeting those expectations, the mobile web is failing.

If yours is one of the millions of organizations without a mobile app, your mobile website has got to work harder. Because a negative experience with your brand on mobile may have a stronger effect than you can anticipate.

Even if you have a mobile app, you should recognize that not everyone is going to use it. You can’t completely disregard your mobile website. (As illustrated by my extremely negative experience trying to order food.)

You need to think about how to meet your users where they are in the buyer journey on your mobile website:

  1. What are your users actually doing on mobile?
  2. Are they just seeking information before purchasing from a computer?
  3. Are they seeking information on your mobile site while in your actual store?

The great thing about optimization is that you can test to pick off low-hanging fruit, while you are investigating more impactful questions like those above. For instance, while you are gathering data about how your users are using your mobile site, you can test usability improvements.

Usability on mobile websites

If you are looking to get a few quick wins to prove the importance of a mobile optimization program, usability is a good place to begin.

The mobile web presents unique usability challenges for marketers. And given your users’ ridiculously high expectations, your mobile experience must address these challenges.

mobile website optimization - usability
This image represents just a few mobile usability best practices.

Below are four of the core mobile limitations, along with recommendations from the WiderFunnel Strategy team around how to address (and test) them.

Note: For this section, I relied heavily on research from the Nielsen Norman Group. For more details, click here.

1. The small screen struggle

No surprise here. Compared to desktop and laptop screens, even the biggest smartphone screen is smaller―which means it displays less content.

“The content displayed above the fold on a 30-inch monitor requires 5 screenfuls on a small 4-inch screen. Thus mobile users must (1) incur a higher interaction cost in order to access the same amount of information; (2) rely on their short-term memory to refer to information that is not visible on the screen.” – Nielsen Norman Group, “Mobile User Experience: Limitations and Strengths”

Strategist recommendations:

Consider persistent navigation and calls-to-action. Because of the smaller screen size, your users often need to do a lot of scrolling. If your navigation and main call-to-action aren’t persistent, you are asking your users to scroll down for information, and scroll back up for relevant links.

Note: Anything persistent takes up screen space as well. Make sure to test this idea before implementing it to make sure you aren’t stealing too much focus from other important elements on your page.
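In most A/B testing tools, a variation like this is just a few lines of injected script. Here is a minimal sketch of what a persistent-CTA variation could look like, assuming a hypothetical “.main-cta” selector (your element and styling will differ):

```typescript
// Variation sketch: pin the primary call-to-action to the bottom of the viewport.
// ".main-cta" is a hypothetical selector; replace it with your page's own.
const cta = document.querySelector<HTMLElement>(".main-cta");
if (cta) {
  Object.assign(cta.style, {
    position: "fixed",
    bottom: "0",
    left: "0",
    right: "0",
    zIndex: "1000",
  });
}
```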

2. The touchy touchscreen

Two main issues with the touchscreen (an almost universal trait of today’s mobile devices) are typing and target size.

Typing on a soft keyboard, like the one on your user’s iPhone, requires them to constantly divide their attention between what they are typing, and the keypad area. Not to mention the small keypad and crowded keys…

Target size refers to a clickable target, which needs to be a lot larger on a touchscreen than it does when your user has a mouse.

So, you need to make space for larger targets (bigger call-to-action buttons) on a smaller screen.

Strategist recommendations:

Test increasing the size of your clickable elements. Google provides recommendations for target sizing:

You should ensure that the most important tap targets on your site—the ones users will be using the most often—are large enough to be easy to press, at least 48 CSS pixels tall/wide (assuming you have configured your viewport properly).

Less frequently-used links can be smaller, but should still have spacing between them and other links, so that a 10mm finger pad would not accidentally press both links at once.
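If you want a quick audit of your own pages against this guideline, a few lines in the browser console will flag offenders. A minimal sketch (the 48-pixel threshold comes from the Google guidance above; the selector list is an assumption about which elements count as tap targets):

```typescript
// Flag tap targets smaller than 48x48 CSS pixels.
const MIN_SIZE = 48;
document
  .querySelectorAll<HTMLElement>("a, button, input, [role='button']")
  .forEach((el) => {
    const { width, height } = el.getBoundingClientRect();
    if (width > 0 && height > 0 && (width < MIN_SIZE || height < MIN_SIZE)) {
      console.warn(`Small tap target (${Math.round(width)}x${Math.round(height)}px):`, el);
    }
  });
```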

You may also want to test improving the clarity around what is clickable and what isn’t. This can be achieved through styling, and is important for reducing ‘exploratory clicking’.

When a user has to click an element to 1) determine whether or not it is clickable, and 2) determine where it will lead, this eats away at their finite motivation.

Another simple tweak: Test your call-to-action placement. Does it match with the motion range of a user’s thumb?

3. Mobile shopping experience, interrupted

As the term mobile implies, mobile devices are portable. And because we can use ‘em in many settings, we are more likely to be interrupted.

“As a result, attention on mobile is often fragmented and sessions on mobile devices are short. In fact, the average session duration is 72 seconds […] versus the average desktop session of 150 seconds.” – Nielsen Norman Group

Strategist recommendations:

You should design your mobile experience for interruptions, prioritize essential information, and simplify tasks and interactions. This goes back to meeting your users where they are within the buyer journey.

According to research by SessionM (published in 2015), 90% of smartphone users surveyed used their phones while shopping in a physical store to 1) compare product prices, 2) look up product information, and 3) check product reviews online.

You should test adjusting your page length and messaging hierarchy to facilitate your user’s main goals. This may be browsing and information-seeking versus purchasing.

4. One window at a time

As I’m writing this post, I have 11 tabs open in Google Chrome, split between two screens. If I click on a link that takes me to a new website or page, it’s no big deal.

But on mobile, your user is most likely viewing one window at a time. They can’t split their screen to look at two windows simultaneously, so you shouldn’t ask them to. Mobile tasks should be easy to complete in one app or on one website.

The more your user has to jump from page to page, the more they have to rely on their memory. This increases cognitive load, and decreases the likelihood that they will complete an action.

Strategist recommendations:

Your navigation should be easy to find and it should contain links to your most relevant and important content. This way, if your user has to travel to a new page to access specific content, they can find their way back to other important pages quickly and easily.

In e-commerce, we often see people “pogo-sticking”—jumping from one page to another continuously—because they feel that they need to navigate to another page to confirm that the information they have provided is correct.

A great solution is to ensure that your users can view key information that they may want to confirm (prices / products / address) on any page. This way, they won’t have to jump around your website and remember these key pieces of information.

Implementing mobile website optimization

As I’m sure you’ve noticed by now, the phrase “you should test” is peppered throughout this post. That’s because understanding the mobile context and reviewing usability challenges and recommendations are only the first steps.

If you can, you should test any recommendation made in this post. Which brings us to mobile website optimization. At WiderFunnel, we approach mobile optimization just like we would desktop optimization: with process.

You should evaluate and prioritize mobile web optimization in the context of all of your marketing. If you can achieve greater Return on Investment by optimizing your desktop experience (or another element of your marketing), you should start there.

But assuming your mobile website ranks high within your priorities, you should start examining it from your user’s perspective. The WiderFunnel team uses the LIFT Model framework to identify problem areas.

The LIFT Model allows us to identify barriers to conversion, using the six factors of Value Proposition, Clarity, Relevance, Anxiety, Distraction, and Urgency. For more on the LIFT Model, check out this blog post.

A LIFT illustration

I asked the WiderFunnel Strategy team to do a LIFT analysis of the food delivery website that gave me so much grief that Sunday night. Here are some of the potential barriers they identified on the checkout page alone:

Mobile website LIFT analysis
This wireframe is based on the food delivery app’s checkout page. Each of the numbered LIFT points corresponds with the list below.
  1. Relevance: There is valuable page real estate dedicated to changing the language, when a smartphone will likely detect your language on its own.
  2. Anxiety: There are only 3 options available in the navigation: Log In, Sign Up, and Help. None of these are helpful when a user is trying to navigate between key pages.
  3. Clarity: Placing the call-to-action at the top of the page creates disjointed eyeflow. The user must scan the page from top to bottom to ensure their order is correct.
  4. Clarity: The “Order Now” call-to-action and “Allergy & dietary information links” are very close together. Users may accidentally tap one, when they want to tap the other.
  5. Anxiety: There is no confirmation of the delivery address.
  6. Anxiety: There is no way to edit an order within the checkout. A user has to delete items, return to the menu and add new items.
  7. Clarity: Font size is very small, making the content difficult to read.
  8. Clarity: The “Cash” and “Card” icons have no context. Is a user supposed to select one, or are these just the payment options available?
  9. Distraction: The dropdown menus in the footer include many links that might distract a user from completing their order.

Needless to say, my frustrations were confirmed. The WiderFunnel team ran into the same obstacles I had run into, and identified dozens of barriers that I hadn’t.

But what does this mean for you?

When you are first analyzing your mobile experience, you should try to step into your user’s shoes and actually use your experience. Give your team a task and a goal, and walk through the experience using a framework like LIFT. This will allow you to identify usability issues within your user’s mobile context.

Every LIFT point is a potential test idea that you can feed into your optimization program.

Case study examples

This wouldn’t be a WiderFunnel blog post without some case study examples.

This is where we put ‘best mobile practices’ to the test. Because the smallest usability tweak may make perfect sense to you, and be off-putting to your users.

In the following three examples, we put our recommendations to the test.

Mobile navigation optimization

In mobile design in particular, we tend to assume our users understand ‘universal’ symbols.

Aritzia Hamburger Menu
The ‘Hamburger Menu’ is a fixture on mobile websites. But does that mean it’s a universally understood symbol?

But, that isn’t always the case. And it is certainly worth testing to understand how you can make the navigation experience (often a huge pain point on mobile) easier.

You can’t just expect your users to know things. You have to make it as clear as possible. The more you ask your user to guess, the more frustrated they will become.

– Dennis Pavlina, Optimization Strategist, WiderFunnel

This example comes from an e-commerce client that sells artwork. In this experiment, we tested two variations against the original.

In the first, we increased font and icon size within the navigation and menu drop-down. This was a usability update meant to address the small, difficult to navigate menu. Remember the conversation about target size? We wanted to tackle the low-hanging fruit first.

With variation B, we dug a little deeper into the behavior of this client’s specific users.

Qualitative Hotjar recordings had shown that users were trying to navigate the mobile website using the homepage as a homebase. But this site actually has a powerful search functionality, and it is much easier to navigate using search. Of course, the search option was buried in the hamburger menu…

So, in the second variation (built on variation A), we removed Search from the menu and added it right into the main Nav.

Mobile website optimization - navigation
Wireframes of the control navigation versus our variations.

Results

Both variations beat the control. Variation A led to a 2.7% increase in transactions, and a 2.4% increase in revenue. Variation B decreased clicks to the menu icon by 24%, increased transactions by 8.1%, and lifted revenue by 9.5%.

Never underestimate the power of helping your users find their way on mobile. But be wary! Search worked for this client’s users, but it is not always the answer, particularly if what you are selling is complex, and your users need more guidance through the funnel.

Mobile product page optimization

Let’s look at another e-commerce example. This client is a large sporting goods store, and this experiment focused on their product detail pages.

On the original page, our Strategists noted a worst mobile practice: The buttons were small and arranged closely together, making them difficult to click.

There were also several optimization blunders:

  1. Two calls-to-action were given equal prominence: “Find in store” and “+ Add to cart”
  2. “Add to wishlist” was also competing with “Add to cart”
  3. Social icons were placed near the call-to-action, which could be distracting

We had evidence from an experiment on desktop that removing these distractions, and focusing on a single call-to-action, would increase transactions. (In that experiment, we saw transactions increase by 6.56%).

So, we tested addressing these issues in two variations.

In the first, we de-prioritized competing calls-to-action, and increased the ‘Size’ and ‘Qty’ fields. In the second, we wanted to address usability issues, making the color options, size options, and quantity field bigger and easier to click.

mobile website optimization - product page variations
The control page versus our variations.

Results

Both of our variations lost to the Control. I know what you’re thinking…what?!

Let’s dig deeper.

Looking at the numbers, users responded in the way we expected, with significant increases to the actions we wanted, and a significant reduction in the ones we did not.

Visits to “Reviews”, “Size”, “Quantity”, “Add to Cart” and the Cart page all increased. Visits to “Find in Store” decreased.

And yet, although the variations were more successful at moving users through to the next step, there was not a matching increase in motivation to actually complete a transaction.

It is hard to say for sure why this result happened without follow-up testing. However, it is possible that this client’s users have different intentions on mobile: Browsing and seeking product information vs. actually buying. Removing the “Find in Store” CTA may have caused anxiety.

This example brings us back to the mobile context. If an experiment wins within a desktop experience, this certainly doesn’t guarantee it will win on mobile.

I was shopping for shoes the other day, and was actually browsing the store’s mobile site while I was standing in the store. I was looking for product reviews. In that scenario, I was information-seeking on my phone, with every intention to buy…just not from my phone.

Are you paying attention to how your unique users use your mobile experience? It may be worthwhile to take the emphasis off of ‘increasing conversions on mobile’ in favor of researching user behavior on mobile, and providing your users with the mobile experience that best suits their needs.

Note: When you get a test result that contradicts usability best practices, it is important that you look carefully at your experiment design and secondary metrics. In this case, we have a potential theory, but would not recommend any large-scale changes without re-validating the result.

Mobile checkout optimization

This experiment was focused on one WiderFunnel client’s mobile checkout page. It was an insight-driving experiment, meaning the focus was on gathering insights about user behavior rather than on increasing conversion rates or revenue.

Evidence from this client’s business context suggested that users on mobile may prefer alternative payment methods, like Apple Pay and Google Wallet, to the standard credit card and PayPal options.

To make things even more interesting, this client wanted to determine the desire for alternative payment methods before implementing them.

The hypothesis: By adding alternative payment methods to the checkout page in an unobtrusive way, we can determine by the percent of clicks which new payment methods are most sought after by users.
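Measuring “the percent of clicks” is simple arithmetic once the click events are tracked. A minimal sketch of the tally, with hypothetical event names and counts (not WiderFunnel’s actual data or tooling):

```typescript
// Share of clicks per payment option (hypothetical counts).
const clicks: Record<string, number> = {
  creditCard: 530,
  payPal: 210,
  applePay: 180,
  amazonPay: 120,
};

const total = Object.values(clicks).reduce((sum, n) => sum + n, 0);
for (const [method, count] of Object.entries(clicks)) {
  console.log(`${method}: ${((count / total) * 100).toFixed(1)}% of clicks`);
}
```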

We tested two variations against the Control.

In variation A, we pulled the credit card fields and call-to-action higher on the page, and added four alternative payment methods just below the CTA: PayPal, Apple Pay, Amazon Payments, and Google Wallet.

If a user clicked on one of the four alternative payment methods, they would see a message:

“Google Wallet coming soon!
We apologize for any inconvenience. Please choose an available deposit method.
Credit Card | PayPal”

In variation B, we flipped the order. We featured the alternative payment methods above the credit card fields. The focus was on increasing engagement with the payment options to gain better insights about user preference.

mobile website optimization - checkout page
The control against variations testing alternative payment methods.

Note: For this experiment, iOS devices did not display the Google Wallet option, and Android devices did not display Apple Pay.

Results

On iOS devices, Apple Pay received 18% of clicks, and Amazon Pay received 12%. On Android devices, Google Wallet received 17% of clicks, and Amazon Pay also received 17%.

The client can use these insights to build the best experience for mobile users, offering Apple Pay and Google Wallet as alternative payment methods rather than PayPal or Amazon Pay.

Unexpectedly, both variations also increased transactions! Variation A led to an 11.3% increase in transactions, and variation B led to an 8.5% increase.

Because your user’s motivation is already limited on mobile, you should try to create an experience with the fewest possible steps.

You can ask someone to grab their wallet, decipher their credit card number, expiration date, and CVV code, and type it all into a small form field. Or, you can test leveraging the digital payment options that may already be integrated with their mobile devices.

The future of mobile website optimization

Imagine you are in your favorite outdoor goods store, and you are ready to buy a new tent.

You are standing in front of piles of tents: 2-person, 3-person, 4-person tents; 3-season and extreme-weather tents; affordable and pricey tents; light-weight and heavier tents…

You pull out your smartphone, and navigate to the store’s mobile website. You are looking for more in-depth product descriptions and user reviews to help you make your decision.

A few seconds later, a store employee asks if they can help you out. They seem to know exactly what you are searching for, and they help you choose the right tent for your needs within minutes.

Imagine that while you were browsing products on your phone, that store employee received a notification that you are 1) in the store, 2) looking at product descriptions for tent A and tent B, and 3) standing by the tents.

Mobile optimization in the modern era is not about increasing conversions on your mobile website. It is about providing a seamless user experience. In the scenario above, the in-store experience and the mobile experience are inter-connected. One informs the other. And a transaction happens because of each touch point.

Mobile experiences cannot live in a vacuum. Today’s buyer switches seamlessly between devices [and] your optimization efforts must reflect that.

Yonny Zafrani, Mobile Product Manager, Dynamic Yield

We wear the internet on our wrists. We communicate via chat bots and messaging apps. We spend our leisure time on our phones: streaming, gaming, reading, sharing.

And while I’m not encouraging you to shift your optimization efforts entirely to mobile, you must consider the role mobile plays in your customers’ lives. The online experience is mobile. And your mobile experience should be an intentional step within the buyer journey.

What does your ideal mobile shopping experience look like? Where do you think mobile websites can improve? Do you agree or disagree with the ideas in this post? Share your thoughts in the comments section below!


5 test results that made us say ‘A-ha!’ in 2016

Reading Time: 10 minutes

‘A-ha!’ moment (n.): An insight that leads to more substantial revenue lift and profitable growth for your company (i.e. the moment all Optimizers live for).

At WiderFunnel, our mission is to create profitable ‘A-ha!’ moments for our clients every day.

Last year, I created a five-part ‘A-ha!’ moments series: Five mini blog posts focused on five of our favorite insights from 2015. Well, turns out 2016 was also full of ‘A-ha!’ moments that were too good to keep to ourselves.

This post explores five of WiderFunnel’s favorite ‘A-ha!’s from the past year. I hope that they inspire you as you begin planning your 2017 experiments!

‘A-ha!’ #1: Using color psychology to increase conversions

If you follow WiderFunnel, you probably know that we are not big fans of conversion optimization ‘best practices’ like “all calls-to-action should be orange”.

Because, frankly, best practices may not be the best thing for your business. They must be proven in your business context, for your users.

That said, this first ‘A-ha!’ moment comes from a color isolation test. But, the ‘A-ha’ isn’t the result, it’s the why behind the hypothesis.

The strategy

One of our clients provides an online consumer information service — users type in a question and get an Expert answer. Once a user asks their question, they have entered a four-step funnel:

  • Step 1: Ask the question
  • Step 2: Add more information
  • Step 3: Pick an Expert
  • Step 4: Get an answer (aka the checkout page)

We have been testing on each step of this funnel, but this particular experiment was on the all-important checkout page, the final conversion.

What can the right color do?

For each WiderFunnel client, we create a customized growth program; however, each program is built with our proven Infinity Optimization Process™. The process cycles between two phases: Explore (information-gathering) and Validate (testing and proving).

Research on consumer behavior, psychological principles, and persuasion techniques is a huge part of the Explore phase. Our Strategists use this research, along with several other information touchpoints, when developing hypotheses.

This past year, one of WiderFunnel’s favorite bloggers and researchers, Nick Kolenda, published a giant piece on color psychology. Kolenda looked at 50 academic studies on color, and compiled his findings. According to him, certain colors can inspire certain actions.

Aha! #1 color spectrum
Can certain colors influence your users’ behavior?

In the case of this client, Optimization Strategist, Nick So, wanted to see if adding a subtle, subconscious visual cue to the checkout page would be more motivational for users. He was looking, specifically, at warm colors.

Persuasion principle: Warm colors (with high saturation and low brightness) increase arousal because they trigger impulsivity, and tend to increase behavioral responses.

The test: Isolation I and isolation II

In the first isolation, Nick decided to put warm colors to the test.

Hypothesis: Increasing prominence of the checkout area by using a color linked to increasing action and responses will improve visual clarity of the page and increase conversions.

Aha! #1 Control
The client’s original checkout page.
Aha! 1 VarA
Our variation, which emphasized the payment section with a warm color background.

In the variation, Nick removed all other background colors and added a warm orange background to the payment section. And it worked! This variation saw a statistically significant 2.82% increase in conversions.

We wanted to validate this insight across audiences, so Nick created a second isolation for this client’s mobile users.

Aha! #1 mobile
From right to left: the Control, VarA, and the winning VarB.

He tested the Control against two variations: Variation B (the warm color isolation) was built on variation A, so Nick was able to track the isolation properly. In this experiment, the color change was responsible for a 2.7% lift in conversions, almost the exact same increase as in the desktop test.

A-ha!

Nick So WiderFunnel

It’s always amazing how such seemingly subtle psychological cues and persuasion elements can have a big potential impact on user behavior. We are fortunate to be able to have a client that has the traffic, trusts us, and understands testing enough to allow us to run an isolation on such an interesting concept.

– Nick So

‘A-ha!’ #2: Sometimes, all your users need is a clear next step

You may have heard the phrase “if content is king, revenue is queen”…

WiderFunnel Founder & CEO, Chris Goward, wrote, “Content is important for getting people to your site, from search algorithms to social share to links to your site, but content alone doesn’t make you revenue. Content without conversions is just free publishing.”

Our second ‘A-ha!’ moment comes from testing we have been doing with one WiderFunnel client: A content site that provides information for the individual investor. This client offers a ton of free resources on its website to help users stay on top of their finances.

Of course, they also offer subscription services, such as their newsletter and professional advisor service, which provides premium stock-picking advice to users. Our goal is to help this client increase profitable conversions.

The strategy

When we began testing with this client, there were many different paths that users could take after landing on an investing article. And there was almost no indication that there were professional services available (which is how this client makes money!)

The WiderFunnel Strategy team did an initial LIFT analysis of the site-wide navigation, which revealed several problems, like:

  • There was not a clear, primary call-to-action in the nav (Clarity)
  • There was a general lack of urgency (Urgency)
  • The menu drop-down for “Stock Picks” had one ambiguous option (Anxiety)
  • If someone is ready to spend money, it is not clear how to do so (Clarity)
Aha! #2 Control
The original navigation didn’t have a clear call-to-action.

We wanted to test giving users a clear action to take in the site-wide navigation. This way, a user who wanted more would know which path to take.

We tested adding a “Latest Stock Picks” call-to-action in the nav (replacing the “Stock Picks” dropdown); the assumption was that users of this client’s site are looking for stock-picking advice, specifically.

Hypothesis: Creating a clear “Latest Stock Picks” CTA in the site-wide navigation will cause more users to enter a revenue-driving funnel from all parts of the site.

The variations

We tested two variations, each of which featured the “Latest Stock Picks” call-to-action. But, in each variation this CTA took the user to a different page. Our ultimate goal was to find out:

  1. If users were even aware that there are premium paid services offered, and
  2. Which funnel is best to help users make a decision and, ultimately, a purchase?

With variation A, we added the “Latest Stock Picks” CTA in the nav. This call-to-action sent users to the homepage and anchored them in the premium services section. (This is how the functionality of the original dropdown worked.)

This section provides a lot of detail about this client’s different offerings, along with a “Sign Up Today” call-to-action.

Aha! #2 VarA
The winning variation featured a very clear call-to-action, while maintaining the same functionality as the Control.

With variation B, we wanted to test limiting choice. Rather than showing users a bunch of product options, the “Latest Stock Picks” CTA sent them directly to the professional advisor sign up page (this client’s most popular product).

Aha! #2 VarB
In this variation, the CTA sent users to a product page.

A-ha!

Both variations beat the control, with variation A resulting in an 11.17% lift in transactions with 99% confidence and variation B resulting in a 7.9% increase in transactions with 97% confidence.

Interestingly, because variation B was built on variation A, we were able to see that variation B’s change actually decreased transactions by 3.3% relative to variation A.

So, what does this mean? Here are a few takeaways we plan to explore further in 2017:

  • Users may have been unsure of how to sign up (or that they could sign up) due to lack of CTA prominence on the original site-wide navigation
    • It is also possible that Urgency was a motivator for this client’s users: Changing the “Stock Picks” drop down to a “Latest Stock Picks” CTA increased urgency and led to more conversions. This wasn’t a clear isolation but it’s good evidence to follow-up with!
  • Users prefer some degree of choice over being sent to one product (as seen with the decrease in transactions caused by variation B)

But the main moral of this ‘A-ha!’? Make sure your users know exactly where to find what you’re selling. ‘Cause content without conversions is just free publishing.

‘A-ha!’ #3: The power of proper Design of Experiments

Earlier this year, I published a case study on WiderFunnel client, weBoost. WeBoost is an e-commerce retailer and manufacturer of cellular signal boosters.

This case study explored several tests that we had run on multiple areas of the weBoost site, including a series of design tests we ran on their product category page. Our third ‘A-ha!’ moment picks up where the case study left off in this series…

A quick refresher

Originally, the weBoost product category pages featured a non-traditional design layout. A large image in the top left corner, very tall product modules, and right-hand filters made these pages unique among e-commerce catalog pages.

Aha! #3 Original
The original product category page layout.

We decided to test displaying products in landscape versus the long, portrait-style modules. According to a Baymard study of e-commerce sites, technical products are easier to compare in a horizontal layout because there is more space to include specs. This was variation A.

Aha! #3 Horizontal
Variation A featured a simple change: vertical modules to horizontal.

In variation B, we wanted to explore the idea that users didn’t need to see a product details page at all. Maybe the information on the category page was all users needed to make a confident purchase.

Variation B was built on variation A, with one isolated change: We changed the primary visual call-to-action from “View Details” to “Add To Cart”.

Aha! #3 Add To Cart
Note the primary CTA in this variation: “Add To Cart”

In a backward ‘A-ha!’ moment, variation A (based on the Baymard study) decreased transactions by 9.6%. Despite our intentions, the horizontal layout might have made it more difficult for users to compare products.

But! Variation B, with the add-to-cart focus, saw a 16.4% increase in transactions against the control page. It turns out that many users are actually comfortable adding products to their cart right from the category page.

Variation B moved more users further through the funnel and ultimately resulted in a large uptick in transactions, despite the negative impact of the horizontal layout.

After comparing variation A to variation B, WiderFunnel Optimization Strategist, Michael St Laurent, estimated that the “Add To Cart” call-to-action was actually worth a lift of 28.7% in transactions.
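The post doesn’t show the arithmetic behind that estimate, but a plausible reconstruction, assuming the two relative effects combine multiplicatively, is:

\[ \frac{1 + 0.164}{1 - 0.096} - 1 = \frac{1.164}{0.904} - 1 \approx 0.288 \]

That is, stripping out the roughly 9.6% drag of the horizontal layout from the 16.4% combined result leaves an isolated CTA effect in the neighborhood of the quoted 28.7%.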

The follow-up (and subsequent ‘A-ha!’)

We knew that the horizontal layout led to a decrease in transactions and we knew that the horizontal layout plus the isolated CTA change led to a sizable increase in transactions.

So, we ran the obvious follow-up experiment: We tested a variation featuring the vertical module design with the add-to-cart focused call-to-action. We expected to see at least a 29% increase in transactions. We used variation B from the previous test as the Control, following proper Design of Experiments.

Aha! #3 Final
This variation reverted to the vertical modules from the original page, and featured the “Add To Cart” CTA.

As predicted, when we tested the “Add To Cart” call-to-action on the vertical modules, we saw a whopping 38.1% increase in transactions (more than double the 16.4% increase we observed with the horizontal layout, and 9 percentage points more than the estimate).

A-ha!

It never gets old to see isolations at work. The ‘A-ha!’ moment here is that no test ever has to be a ‘loser’. If you structure your tests using isolations, you will be able to track the potential impact of each change.

Michael St Laurent WiderFunnel

This entire time, we were assuming that users needed more information to make a technical product selection. We were focused on making the specs easier to compare, when there was an entire segment of the audience that was ready to put the product in their cart without more investigation. Sometimes you have to challenge your assumptions. In this case it paid off!

– Michael St Laurent, Optimization Strategist, WiderFunnel

‘A-ha!’ #4: De-emphasizing price reduces user anxiety

One of our clients is Vital Choice, a trusted source for fast home delivery of the world’s finest wild seafood and organic fare, harvested from healthy, well-managed wild fisheries and farms.

Our fourth ‘A-ha!’ moment from 2016 came out of the testing we did with Vital Choice on their product detail pages and revolves around de-emphasizing price, in favor of value proposition points.

While the results may not be surprising, the WiderFunnel Strategy team would not have prioritized this particular test if they hadn’t done extensive user research beforehand. Because we took the pulse of Vital Choice users, we were able to reduce anxiety and provide more motivation to purchase.

The strategy

Let’s say you wanted to order a few organic, grass-fed American Wagyu beef patties from the Vital Choice website. You would have eventually landed on a detail page that looked like this (the Control in this experiment):

Aha! #4 Control
Note the prominence of price on the original detail page.

As you can see, price is displayed prominently near the ‘Add To Cart’ call-to-action. But, during the Explore (information gathering) phase, WiderFunnel Optimization Strategist, Dennis Pavlina, identified several common themes of barriers to conversion in user survey responses, including:

  1. Price: Users love Vital Choice and the excellent quality of their products, but they often mention the premium they are paying. For many users, it is a ‘treat’ and a ‘luxury’ to buy from Vital Choice. Price-related themes, such as discount codes or coupons, also came up often in surveys.
  2. Shipping: Users often express concern about how frozen perishable items are shipped, particularly in warmer climates in the U.S.

If we could reduce user anxiety in these two areas, we believed Vital Choice would see a surge in conversions.

The test

Hypothesis: Adding relevant value proposition points that justify the price and quality of the product, and adding copy that reduces anxiety around shipping in close proximity to the order area on the product page, will increase conversions.

With our variation, we wanted to address the following barriers to conversion on the original page:

  • It was unclear what users would receive in their shipment, i.e. how it would be shipped to them, how long it would take, etc. (Anxiety)
  • There were no prominently displayed value proposition points to justify the price of the product. (Value Proposition)
  • There was a lot of emphasis on the price of the product. (Anxiety)
Aha! #4 VarA
This variation addressed user anxieties by de-emphasizing price, and reassuring users of shipping guarantees.

A-ha!

This variation led to a 3.3% increase in conversions and a 2.7% increase in average order value, resulting in almost $250,000 in estimated additional annual revenue.
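As a rough sanity check on that revenue figure, assume the two lifts compound into revenue per visitor:

\[ (1 + 0.033)(1 + 0.027) - 1 \approx 0.061 \]

A combined lift of about 6.1% matching a $250,000 annual estimate would imply baseline annual revenue through this flow on the order of $4 million; the exact model WiderFunnel used isn’t stated.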

Conversions were up for almost every goal we tracked: Visits to checkout (step 2), visits to checkout (step 3), visits to checkout (step 4), total visits to cart, and average order value. But unique visits to cart were down.

Dennis Pavlina WiderFunnel

The most interesting part of analyzing results was noticing that, although unique visits to cart were slightly down, there was a large increase in total visits to cart. It’s a surprising pattern. We hypothesize that users may have been more confident and willing to purchase more items at once, when anxiety was reduced.

– Dennis Pavlina, Optimization Strategist, WiderFunnel

The fact that de-emphasizing price worked for Vital Choice users isn’t what made us say ‘A-ha!’. But the proven power of listening to, and addressing, their users’ stated concerns did. When in doubt, ask your users.

A-ha! #5: Quick view, long delay

A-ha! number 5 comes from testing we did with another one of our clients, a large retailer of sports goods, footwear, and apparel. We have been working with this company for more than a year to optimize their e-commerce experiences, with the goal of increasing transactions.

Like on many e-commerce sites, users on this client’s site could view product details directly on the category page, using a Quick View functionality. When a user hovered over a product, they would see the product details in a Quick View window.

In our final ‘a-ha!’, we explore what (so often) happens when you test a common practice.

The strategy

Distraction is a very common barrier to conversion; often, there are elements on a client’s page that are diverting visitors away from the ultimate goal.

For Michael St Laurent, the Quick View option on this client’s category page was a potential distraction.

Michael St Laurent WiderFunnel

The more visual cues and action options your visitor has to process, the less likely they are to make a conversion decision. At WiderFunnel, we have found that minimizing distractions such as unnecessary product options, links, and extraneous information will increase your conversion rate.

– Michael St Laurent

So, he decided to put his theory that the Quick View is an unnecessary distraction to the test.

The test

Hypothesis: Disabling the Quick View functionality will result in reduced distraction and ultimately, more conversions.

The Control in this test was the client’s original category page, featuring the Quick View functionality.

Aha! #5 Control
The original Quick View functionality.

In the Quick View, users could quickly move from product to product on the category page without going to a product page itself.

We tested this control against a variation that removed the Quick View functionality completely.

Aha! #5 No Quick View
In our variation, we eliminated the Quick View functionality entirely.

A-ha!

It turns out the Quick View functionality was, indeed, distracting. Disabling it resulted in more product exploration as well as more transactions; transactions increased by 4% (a big lift for a high-traffic company like this one!)

If your site has a functionality, like Quick View or a rotating banner, you should probably test it! While ‘flashy’ functionalities are…well…flashy, they are rarely what your users want, and may be preventing your users from actually purchasing.

At the end of every month, the WiderFunnel Strategy team shares their favorite ‘A-ha!’ moments from the past four weeks. Sometimes, the ‘A-ha!’ is an exciting result and big lift for a client, sometimes it’s a surprising insight, and sometimes it’s a ‘losing’ test that inspired a winning test.

As Chris Goward explains,

There’s no downside to communicating what you’ve learned from every test. If you view your optimization program as a strategic method for learning about your customers and prospects – for truly understanding their mindset – rather than a tactical tweaking program, you can take a broader perspective and find the gains in every test.

I hope that these ‘A-ha!’ moments inspire you to do the work, structure your tests properly, and learn constantly in 2017. And I encourage you to share your favorite ‘A-ha!’ moments in the comments section below.


Micro and Macro Conversions | Choosing the Right CRO Metrics

Clearly defining the key performance indicators, or KPIs, is the first step to any Conversion Rate Optimization (CRO) campaign. It is only through tracking and measuring results on these KPIs that a business can optimize for growth.

The KPIs in CRO can be broadly divided into two categories: macro and micro conversions (or goals).

  • Macro conversions are the primary goals of a website. Examples of macro conversions for SaaS, eCommerce, or any other online enterprise include revenue, “contact us” submissions, quote requests, and free-trial sign-ups.
  • Micro conversions are steps or milestones that help you reach the end goal. Micro conversion examples include email clicks, white paper downloads, blog subscriptions, and so on.

Improving macro goals is imperative to the growth of any enterprise. However, it is equally important that enterprises measure micro goals so as to enhance overall website usability. Avinash Kaushik makes a similar point: “Focus on measuring your macro (overall) conversions, but for optimal awesomeness, identify and measure your micro conversions as well.”

In this blog post, we discuss why enterprises should:

  • Track Micro Conversions Alongside Macro Conversions
  • Optimize Micro Conversions That Impact Macro Conversions

Track Micro Conversions Alongside Macro Conversions

Each micro conversion acts as a process milestone in the conversion funnel and impacts the ultimate step, the macro conversion. The following example explains this in a simple manner. Let’s take a typical conversion funnel on a SaaS website: it starts at the home page and ends with a purchase.

The visits from the home page to the features page and from the features page to the pricing page are micro conversions in this example. These micro conversions have the same end goal, that is, “purchases”.

conversion funnel SaaS - Micro Goals and Macro Goals

If we were to double micro conversions from the home page to the “features” page, the results would scale as shown in the table below:

Increase in Micro Conversions Impacts Macro Conversions

The number of completed purchases, that is, the macro conversion, also doubled. This example illustrates how micro conversions can have an impact on the macro conversions in a funnel.
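Since the table above survives only as a caption, here is the same logic as a short sketch. The visit counts and step rates are hypothetical, and the key assumption is that the downstream rates stay constant when the first micro conversion doubles:

```typescript
// Hypothetical SaaS funnel: home -> features -> pricing -> purchase.
const purchases = (
  homeVisits: number,
  homeToFeatures: number,
  featuresToPricing: number,
  pricingToPurchase: number,
): number => homeVisits * homeToFeatures * featuresToPricing * pricingToPurchase;

console.log(purchases(10_000, 0.2, 0.5, 0.1)); // 100 purchases
console.log(purchases(10_000, 0.4, 0.5, 0.1)); // 200: doubling the micro conversion doubles the macro
```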

Dr. Flint McGlaughlin, Managing Director and CEO of MECLABS, shares the same thought: “The funnel represents and should be thought of as a representation of what is the heart of marketing, and that is a series of decisions. Those decisions are key transitions; I would call them micro-yeses. There are a series of micro-yeses necessary to help someone achieve an ultimate yes. The Ultimate Yes is the sale in most cases. At each of these junctures, we have to help people climb up the funnel.”

While there are a number of reasons why micro conversions should be tracked, here are the two main arguments:

  • Micro conversions help you assess buyer readiness, or intent.
  • Micro conversions help you assess points of friction in a buyer’s journey.

Micro Conversions Help You Assess Buyer Readiness or Intent

All visitors who land on your website don’t have the intent to make a purchase. Some of them could be doing quick comparative research, while others could be checking out your products or services during their first visit. Tracking micro conversions helps you understand whether a visitor could be a potential customer. For instance, tracking micro conversions such as downloading a product brochure or adding a product to a wishlist shows the future possibility of conversions on a macro goal.

NN Group defines micro conversions as follows: “These are not the primary goals of the site, but are desirable actions that are indicators of potential future macro conversions.”

These micro conversions, or secondary goals, are worth tracking as they clearly show that a visitor might have an interest in your business or product.

Here is an example:

PriceCharting conducted a preliminary A/B test to study whether their buyers intended to buy from them at higher prices, and used the learning from this preliminary test for future testing. The objective was to figure out how price sensitive their customers were. On the control, they used “Starts at $4” next to the “Price Guide” CTA. Two other variations were tested against the control: one stated a starting price of $2 next to the CTA, and the other mentioned $1 as the starting price.

micro conversions on CTA for Control
Control
Micro Conversions on CTA for Variations
Variations

The test results showed that the version stating the highest starting price won the most clicks on the “Price Guide” CTA. This implied that people visiting PriceCharting valued their products and showed readiness to buy even at higher prices. The learning for PriceCharting’s future tests was that price was not a major factor influencing their visitors.
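If you want to check that click-rate differences like these are signal rather than noise, a standard chi-square test of independence on the raw counts works. A sketch with made-up numbers (not PriceCharting’s data); 5.991 is the 5% critical value for two degrees of freedom:

```typescript
// Pearson chi-square test across three price framings: [clicks, no clicks].
const observed = [
  [120, 880], // "Starts at $1"
  [115, 885], // "Starts at $2"
  [150, 850], // "Starts at $4"
];

const rowTotals = observed.map((row) => row[0] + row[1]);
const colTotals = [0, 1].map((j) => observed.reduce((sum, row) => sum + row[j], 0));
const grand = rowTotals.reduce((sum, n) => sum + n, 0);

let chi2 = 0;
observed.forEach((row, i) =>
  row.forEach((obs, j) => {
    const expected = (rowTotals[i] * colTotals[j]) / grand;
    chi2 += (obs - expected) ** 2 / expected;
  }),
);

// df = (3 rows - 1) * (2 cols - 1) = 2; critical value at p = 0.05 is 5.991.
console.log(chi2 > 5.991 ? "click rates differ significantly" : "no significant difference");
```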

Micro Conversions Help You Assess Points of Friction in Your Buyer’s Journey

Along with providing a complete view of your buyer’s journey, tracking micro conversions also helps identify drop-offs in the conversion funnel. For example, on an eCommerce website, users frequently visiting the product page but not adding products to the cart implies something is deterring visitors from moving from “product” to “add-to-cart.” Optimizing the micro goal here, which is increasing “add-to-cart” actions, will ultimately result in increased revenue.

Here’s an example of a multi-step sign-up form on a SaaS website. Suppose many users do not complete the form. By tracking micro conversions on the form, you will be able to identify the friction points. Maybe the step that asks users for credit card information creates the most friction. With this knowledge, you can assess where users lie in their buyer journey and optimize the form accordingly. Optimizing each step in the form (each micro conversion) will help you improve your macro conversion, as the sketch below illustrates.
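A minimal sketch of that step-level tracking, with hypothetical step names and user counts; the biggest percentage drop between adjacent steps marks the friction to tackle first:

```typescript
// Users reaching each step of a hypothetical multi-step sign-up form.
const funnel: Array<[string, number]> = [
  ["account details", 1000],
  ["company profile", 820],
  ["credit card info", 780],
  ["confirmation", 310],
];

for (let i = 1; i < funnel.length; i++) {
  const dropOff = 1 - funnel[i][1] / funnel[i - 1][1];
  console.log(`${funnel[i - 1][0]} -> ${funnel[i][0]}: ${(dropOff * 100).toFixed(0)}% drop-off`);
}
// The credit-card step loses ~60% of users, flagging it as the place to optimize first.
```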

When testing, the primary goal in the above examples can be to improve micro conversions. A case study by VWO talks about how displaying a banner for top deals increased engagement by 105% for eCommerce client Bakker Hillegom. Ben Vooren, an online marketer at Bakker, realized that visitors would go to the information pages and read the information, but leave without buying from the website, which was the macro goal. This was the friction that Ben wanted to address. He hypothesized that adding commercially focused banners at the top of all the information pages (the micro goal) would help resolve this friction. The test was run for 12 days on 8,000 visitors. The winning variation led to a 104.99% increase in visits to the “top deals” page, at a statistical significance of 99.99%.

Optimize Micro Conversions That Impact Macro Conversions

While running an A/B test with multiple variations, studying micro conversions on each of those variations can provide valuable insights. It can show you which changes improved micro conversions in ways that also improved macro conversions, and which ones did not. As we mentioned, there are a number of micro conversions that you can look to optimize, but not all of them contribute equally to macro conversions.

For instance, to optimize an eCommerce product page with the macro goal of "increasing checkouts," there are a number of test variations you can run:

  • In Variation 1, the ‘add to wish-list’ CTA is made prominent
  • In Variation 2, the ‘save for later’ CTA is made prominent

These two variations will not have the same impact on the macro goal of increasing checkouts. You may find that making the "save for later" CTA more prominent yields a bigger increase in checkouts, in which case you would prioritize that micro conversion in subsequent tests.

That said, when running a conversion rate optimization program, the test goal should be set as close to revenue as possible. Here are two scenarios in which optimizing for micro conversions can backfire:

When Macro Conversions are Not Considered

Solely optimizing for micro conversions, without considering how they impact a major business goal, is a waste of time and effort.

Peep Laja from ConversionXL says, “If you don’t have enough transaction volume to measure final purchases or revenue, that sucks. But if you now optimize for micro conversions instead, you might be just wasting everyone’s time as you can’t really measure business impact.”

For example, an eCommerce website can increase micro conversions (visits from the home page to the product page) by making the menu bar prominent on the home page. This change might result in more visits to the product page. However, if you are not tracking the impact of this change on the macro conversion (checkouts), the whole optimization process lacks direction.

When the Focus is on Quick Results

A/B tests with macro conversions as the primary goal can take a long time to produce conclusive results. Certain tests that measure micro conversions, by contrast, require less testing time.

This happens because macro conversions are always fewer in number than micro conversions. Statistically significant results require a sufficient number of conversions on the goal being measured, and accumulating enough macro conversions takes considerably longer than accumulating micro conversions.

For example, on a SaaS website, a test whose primary goal is to increase visits from the home page to the products page will reach conclusive results faster (because that goal sees more traffic) than a test whose primary goal is "Request a Demo."
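The arithmetic behind this is worth seeing once. A standard two-proportion sample-size formula shows how sharply the required traffic grows as the baseline rate shrinks; the baseline rates below are assumptions chosen only to mirror the example, not real figures:

```python
# Rough sample size per variation for a two-sided two-proportion test.
# Baseline rates are assumed for illustration.
from math import sqrt
from scipy.stats import norm

def n_per_variation(p1, p2, alpha=0.05, power=0.80):
    z_a = norm.ppf(1 - alpha / 2)          # critical value for significance
    z_b = norm.ppf(power)                  # critical value for power
    p_bar = (p1 + p2) / 2
    numerator = (z_a * sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return numerator / (p1 - p2) ** 2

# Micro goal: 30% of home page visitors reach the products page;
# detecting a relative 10% lift (30% -> 33%) needs ~3,800 visitors per arm.
print(round(n_per_variation(0.30, 0.33)))

# Macro goal: 2% request a demo; the same relative 10% lift (2% -> 2.2%)
# needs ~81,000 visitors per arm, over 20x the traffic.
print(round(n_per_variation(0.020, 0.022)))
```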

However, testing micro conversions merely to finish an experiment faster can lead to failure. While you can track many different micro conversions, not every one of them will produce a winning variation, because not every micro conversion directly leads to a lift in conversion rates. The false "Micro-Conversion Testing Assumption" example explained in a post on WiderFunnel illustrates this: optimizing micro conversions on the assumption of an equal drop-off at each stage of the funnel ultimately led to a loss of revenue.

Conclusion

The success of a CRO program rests on how well you define your micro and macro goals. The closer your micro goals are to the end goal of the funnel, the higher your chances of finding a winning variation. At the same time, tracking and improving micro conversions can help you enhance the overall UX of your website.

What metrics are you tracking and optimizing for your conversion rate optimization program? Drop us a comment and let us know.


The post Micro and Macro Conversions | Choosing the Right CRO Metrics appeared first on VWO Blog.

This article – 

Micro and Macro Conversions | Choosing the Right CRO Metrics

How to A/B test for long-term success (don’t underestimate insights!)

Reading Time: 6 minutes

Imagine you’re a factory manager.

You’re under pressure from your new boss to produce big results this quarter. (Results were underwhelming last quarter). You have a good team with high-end equipment, and can meet her demands if you ramp up your production speed over the coming months.


You’re eager to impress her and you know if you reduce the time you spend on machine maintenance you can make up for the lacklustre results from last quarter.

Flash forward: The end of Q3 rolls around, and you've met your output goals! You were able to hit your production levels by continuing to run the equipment during scheduled down-time periods. You've achieved numbers that impress your boss…

…but in order to maintain this level of output you will have to continue to sacrifice maintenance.

In Q4, disaster strikes! One of your 3 machines breaks down, crippling your output and leaving you no way to move the needle forward for your department. Your boss gets on your back for your lack of foresight; eventually your job is given to the young hot-shot on your team, and you are left searching for a new gig.

A sad turn of events, right? Many people would label this a familiar tale of poor management (and correctly so!). Yet, when it comes to conversion optimization, there are many companies making the same mistake.

Optimizers are so often under pressure to satisfy the speed side of the equation that they are sacrificing its equally important counterpart…

Insights.

Consider the following graphic.

Growth-insights-spectrum
The spectrum ranges from straightforward growth-driving A/B tests to multivariate insight-driving tests.

If you’ve got Amazon-level traffic and proper Design of Experiments (DOE), you may not have to choose between growth and insights. But in smaller organizations this can be a zero-sum equation. If you want fast wins, you sacrifice insights, and if you want insights, you may have to sacrifice a win or two.

Sustainable, optimal progress for any organization will fall somewhere in the middle. Companies often put so much emphasis on reaching certain testing velocities that they shoot themselves in the foot for long-term success.

Maximum velocity does not equal maximum impact

Sacrificing insights in the short-term may lead to higher testing output this quarter, but it will leave you at a roadblock later. (Sound familiar?) One 10% win without insights may turn heads in your direction now, but a test that delivers insights can turn into five 10% wins down the line. It's similar to the compounding effect: collecting insights now can mean massive payouts over time.
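The compounding arithmetic is simple but striking. A quick sketch, assuming five sequential 10% wins on the same metric:

```python
# Five compounding 10% wins multiply rather than add:
# 1.10 ** 5 ~= 1.61, a cumulative lift of roughly 61%.
wins, lift_per_win = 5, 0.10
cumulative = (1 + lift_per_win) ** wins - 1
print(f"Cumulative lift after {wins} wins: {cumulative:.0%}")  # -> 61%
```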

As with factory production, the key to sustainable output is to find a balance between short-term (maximum testing speed) and long-term (data collection/insights).

Growth vs. Insights

Christopher Columbus had an exploration mindset.

He set sail looking for a better trade route to India. He had no expectation of what that would look like, but he was open to anything he discovered, and his sense of adventure rewarded him with what is arguably the largest geographical discovery in history.

insight-driving-mindset
Have a Christopher Columbus mindset: test in pursuit of unforeseeable insights.

Exploration often leads to the biggest discoveries. Yet this is not what most companies are doing when it comes to conversion optimization. Why not?

Organizations tend to view testing solely as a growth-driving process: a way of settling long-running discussions between two firmly held opinions. No doubt growth is an important part of testing, but you can't overlook exploration.

Exploratory testing is what will propel your business forward and lead to the kind of conversion rate lift you keep reading about in case studies. Those companies aren't achieving that level of lift on their first try; it's typically the result of a series of insight-driving experiments that help the tester land on the big insight.

At WiderFunnel we classify A/B tests into two buckets: growth-driving and insight-driving…and we consider them equally important!

Growth-driving experiments (Case study here)

During our partnership with Annie Selke, a retailer of home-ware goods, we ran a test featuring a round of insight-driving variations. We were testing different sections on the product category page for sensitivity: Were users sensitive to changes to the left-hand filter? How might users respond to new ‘Sort By’ functionality?

Insight-driving-test
Round I of testing for Annie Selke: Note the left-hand filter and ‘Sort By’ functionality.

Neither of our variations led to a conversion rate lift. In fact, both lost to the Control page. But the results of this first round of testing revealed key, actionable insights ― namely that the changes we had made to the left-hand filter might actually be worth significant lift, had they not been negatively impacted by other changes.

We took these insights and, combined with supplementary heatmap data, we designed a follow-up experiment. We knew exactly what to test and we knew what the projected lift would be. And we were right. In the end, we turned insights into results, getting a 23.6% lift in conversion rate for Annie Selke.

In Round II of testing, we reverted to the original ‘Sort By’ functionality.

For more on the testing we did with Annie Selke, you should read this post >> "A-ha! Isolations turn a losing experiment into a winner"

This follow-up test is what we call a growth-driving experiment. We were armed with compelling evidence and we had a strong hypothesis which proved to be true.

But as any optimizer knows, it can be tough to gather compelling evidence to inform every hypothesis. And this is where a tester must be brave and turn their attention to exploration. Be like Christopher.

Insight-driving experiments

The initial round of testing we did for Annie Selke, where we were looking for sensitivities, is a perfect example of an insight-driving experiment. In insight-driving experiments, the primary purpose of your test is to answer a question, and lifting conversion rates is a secondary goal.

This doesn’t mean that the two cannot go hand-in-hand. They can. But when you’re conducting insight-driving experiments, you should be asking “Did we learn what we wanted to?” before asking “What was the lift?”. This is your factory down-time, the time during which you restock the cupboard with ideas, and put those ideas into your testing piggy-bank.

We’ve seen entire organizations get totally caught up on the question “How is this test going to move the needle?”

But here’s the kicker: Often the right answer is “It’s not.”

At least not right away. This type of testing has a different purpose. With insight-driving experiments, you’re setting out on a quest for your unicorn insight.

unicorn insight
What’s your unicorn insight?

These are the ideas that aren’t applicable to any other business. You can’t borrow them from industry-leading websites, and they’re not ideas a competitor can steal.

Your unicorn insight is unique to your business. It could be finding that magic word that helps users convert all over your site, or discovering that key value proposition that keeps customers coming back. Every business has a unicorn insight, but you are not going to find it by testing in your regular wheelhouse. It’s important to think differently, and approach problem solving in new ways.

We sometimes run a test for our clients where we take the homepage and isolate, removing every section of that page individually. Are we expecting this test to deliver a big lift? Nope, but we are expecting this test to teach us something.

We know that this is the fastest possible way to answer the question “What do users care about most on this page?” After this type of experiment, we suddenly have a lot of answers to our questions.

That’s right: no lift, but we have insights and clear next steps. We can then rank the importance of every element on the page and start to leverage the things that seem to be important to users on the homepage on other areas of a site. Does this sound like a losing test to you?

Rather than guessing at what we think users are going to respond to best, we run an insight-driving test and let the users give us the insights that can then be applied all over a site.

The key is to manage your expectations during a test like this. This variation won’t be your homepage for eternity. Rather, it should be considered a temporary experiment to generate learning for your business. By definition it is an experiment.

Optimization is an infinite process, and what your page looks like today is not what it will look like in a few months.

Proper Design of Experiments (DOE)

It's important to note that the lines between these experimental categories can blur. With proper DOE and high enough traffic levels, both growth-driving and insight-driving strategies can be executed simultaneously. This is what we call "Factorial Design".

Factorial design
Factorial design allows you to test with both growth and insights in mind.

Factorial design allows you to test more than one element change within the same experiment, without forcing you to test every possible combination of changes.

Rather than creating a variation for every combination of changed elements (as you would with multivariate testing), you can design a test to focus on specific isolations that you hypothesize will have the biggest impact or drive insights.

How to get started with Factorial Design

Start by making a cluster of changes in one variation (producing variations that are significantly different from the control), and then isolate these changes within subsequent variations (to identify the elements that are having the greatest impact). This hybrid test, using "variable cluster" variations together with "isolation" variations, gives you the best of both worlds: radical change options and the ability to gain insights from the results.
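As a rough sketch of what such a test plan can look like (the element names here are invented for illustration), note how a hybrid design needs only one variation per isolation plus the cluster, instead of every possible combination:

```python
# Hybrid "cluster + isolations" plan for three hypothesized changes.
# A full multivariate design would need 2**3 = 8 combinations; this
# hybrid uses 5 variations (control, cluster, three isolations).
changes = ["new_cta_copy", "trust_badges", "simplified_nav"]

variations = {
    "control": [],        # the original page
    "cluster": changes,   # all changes at once: the radical option
    **{f"iso_{c}": [c] for c in changes},  # one change each: for attribution
}

for name, applied in variations.items():
    print(f"{name}: {applied or 'original page'}")
```

Comparing each isolation against the cluster shows which change carried (or dragged down) the combined variation's result.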

For more on proper Design of Experiments, you should read this post >> "Design your A/B tests to get consistently better results"

We see Optimization Managers make the same mistakes over and over again, discounting the future for results today. If you overlook testing “down-time” (those insight-driving experiments), you’ll prevent your testing program from reaching its full potential.

You wouldn't run a factory without down-time, and you wouldn't collect a paycheck without saving for the future. So why would you run a testing program without investing in insight exploration?

Rather, find the balance between speed and insights with proper factorial design that promises growth now as well as in the future.

How do you ensure your optimization program is testing for both growth and insights? Let us know in the comments!

The post How to A/B test for long-term success (don’t underestimate insights!) appeared first on WiderFunnel Conversion Optimization.

Continue reading here – 

How to A/B test for long-term success (don’t underestimate insights!)

3 Ways Tinkoff Bank Optimized Credit Card Conversions – Case Study

Conversion Rate Optimization (CRO) is a process-oriented practice that essentially aims at enhancing the user experience on a website.

It starts with proactively recognizing challenges faced by users across a conversion funnel, and addressing them through various tools and techniques.

Tinkoff Bank understands the need for a process-oriented approach to CRO and puts it into practice.

The following case study tells us more about Tinkoff’s CRO methodology — and how it delivers incredible results.

About the Client

Tinkoff Bank is a major online financial services provider in Russia, launched in 2006 by Oleg Tinkov. In a short span of time, the bank has grown into a leader in credit cards — becoming one of the top four credit card issuers in Russia.

Notably, the bank was named Russia’s Best Consumer Digital Bank in 2015 by Global Finance.

Tinkoff operates through a branchless digital platform and relies heavily on its website to find new customers. Like any other smart business, the bank constantly explores new ways to improve its website's conversion rate. For this job, Tinkoff has a dedicated web analytics team that plans and executes CRO strategies on the website.

Context

Tinkoff Bank lets users apply for a credit card through an application form on its website. Users can fill out the application form and submit it for approval from the bank. Once the application is approved, users receive their credit card at home — with zero shipping cost.

This is the original application page:

Tinkoff's Application Page

The application page on the website is fairly elaborate, consisting of a multi-step form and details about the application process and the credit card plan. This page is where conversions (form-submits) happen for Tinkoff.

Since the form involves multiple steps for completion, Tinkoff tracks submits for each step of the form along with submits for the complete form. Tinkoff refers to these conversions as short-application submits and long-application submits, respectively.

The ultimate goal for Tinkoff is to increase these conversions.

The Case

The CRO team at Tinkoff was working on improving their website’s usability to get higher conversions. It began with identifying key pages on the website that could be optimized. For this purpose, the team analyzed the website’s user data with Adobe Site Catalyst. It found that the credit-card application page had a significant bounce rate.

Next, the team looked for ways to help users stay on the application page and complete the conversion. They zeroed in on three areas of the web page where they could introduce new features. The hypothesis was that these new features would improve user experience on the page.

However, the team needed to be absolutely sure about the effectiveness of these new features before applying changes to the web page permanently. There was only one way to do it — through A/B testing!

Tinkoff used VWO to carry out A/B tests on the page, and determine whether it was beneficial to introduce new functions there.

Let’s look at the tests closely.

TEST #1: Providing an Additional Information Box

The Hypothesis

Offering additional details about the credit card above the form will increase the number of sign-ups.

The Test

Tinkoff created two variations of the original (control) page.

The first variation included a “More details” hyperlink underneath the “Fill out form” CTA button placed above the fold. When clicked, the hyperlink led to a new page which provided additional information about the credit card scheme.

Here is how it looked.

Variation 1

The second variation had the same “More details” link below the CTA button. But this time, the link opened up a box right below. The box provided additional information — through text and graphics — about the credit card.

Here’s the second variation.

Variation 2

The test was run on more than 60,000 visitors for a period of 13 days.

The Result

The first variation failed to outperform the control; in fact, it had a lower conversion rate than the control.

The second variation, however, won against the control, and improved the page’s conversion rate by a handsome 15.5%. Moreover, it had a 100% chance of beating the control.
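A "chance of beating the control" figure like this is typically a Bayesian probability. As a hedged sketch (the visitor and conversion counts below are invented, and this may not be the exact method VWO uses), you can estimate such a probability by sampling from Beta posteriors:

```python
# Estimate "chance to beat control" via Beta posteriors + Monte Carlo.
# Counts are hypothetical; the case study reports rates, not raw numbers.
import numpy as np

rng = np.random.default_rng(0)

def chance_to_beat(conv_c, n_c, conv_v, n_v, draws=200_000):
    # Beta(1 + successes, 1 + failures): posterior under a uniform prior
    control = rng.beta(1 + conv_c, 1 + n_c - conv_c, draws)
    variant = rng.beta(1 + conv_v, 1 + n_v - conv_v, draws)
    return (variant > control).mean()

# A 15.5% relative lift on an assumed 5% baseline, 20,000 visitors per arm:
print(chance_to_beat(conv_c=1_000, n_c=20_000, conv_v=1_155, n_v=20_000))
# -> ~0.999+, which reporting tools round up to "100%"
```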

The Analysis

Displaying Key Differentiators:

Prominently placing key differentiators — the factors that make a brand superior to its competitors — on a web page is one of the leading best practices in CRO. Key differentiators enhance the image of a brand in users' eyes, which nudges them toward converting.

Tinkoff, too, wanted to place its differentiators on the application form page. In order to not clutter the page, Tinkoff decided to display these differentiators within a box, behind the “More details” link.

The box clearly illustrated Tinkoff’s key differentiators such as free shipping of the card, free card recharge, and cashback on all purchases made through the card.

Related Post: Optimize Your Marketing Efforts with a Killer Value Proposition

Emphasizing Free Shipping:

By now, we all know how free shipping influences customers. In fact, lack of free shipping is the number one reason why people abandon their shopping carts!

Naturally, displaying “Free shipping” prominently on the application page worked well for Tinkoff.

free shipping

Note: Although free shipping was already mentioned on the original page’s top right corner, it didn’t have much contrast against the background — making it potentially unnoticeable to visitors. The variation, however, increased the chances of visitors spotting the much loved free shipping offer.

Reassuring Users About Tinkoff’s Credibility:

Reassuring users at each step of a conversion process helps improve the conversion rate. This is the reason why trust badges, testimonials, and social proof work for so many websites.

Likewise, the features box on the application page reassured users about Tinkoff's credibility. The box mentioned how Tinkoff is the leading internet bank, providing more than 300,000 points of recharge, and how its service is completely digital — users never have to visit a bank branch. This helped users trust the bank's services, thereby increasing form submits.

Related Resource: 32% Increase in Conversions by A/B Testing for The Right Reasons

Why Did The First Variation Fail?

The "More details" link on the first variation led users to a new page where additional information about the credit card was presented. This, however, distracted some users away from the application form. And since web users have a short attention span, some probably didn't return to complete the form — reducing the total number of conversions.

Furthermore, users had to expend effort leaving the application page, browsing through the content on the new one, and returning to submit the form. Because of this effort, many users wouldn't have visited the "More details" page at all — nullifying any value it could have provided them. And without enough information, many of them wouldn't have converted.

Unsure users are the first to bounce off. Keep reassuring them about your credibility.

TEST #2: Gamifying the Form Using a Progress Bar

The Hypothesis

Providing a “progress bar” on top of the four-step application form will motivate users to fill the form completely, resulting in a higher conversion rate.

The Test

Here again, Tinkoff designed two variations of the original form page.

The first variation had a yellow banner-like progress bar right above the form. The progress bar highlighted the step the user was on, and displayed the user's progression through the form graphically, using a black line at its bottom. The bar also mentioned the probability of the credit card being approved, based on how far the user had progressed through the form.

This is the first variation.

Variation 1

The second variation also had a progress bar, but with a different design.

Similar to the first variation, the second variation's progress bar displayed the form's step number and the probability of credit card approval. But the progress bar here was green, and it didn't have an additional black line to show the user's progress. Instead, the bar itself represented the user's progression graphically: the green portion of the bar grew as users moved through the form.

Take a look.

Variation 2

The test ran on more than 190,000 visitors for a period of 39 days.

The Result

Both the variations outperformed the control!

The first variation had a 6.9% higher conversion rate than the control.

However, the second variation was the overall winner. It improved the conversion rate of the page by a robust 12.8%.

Both the variations had a 100% chance to beat the original page.

The Analysis

Curbing Users’ Anxiety:

Nobody likes filling out long forms on websites. Users only do so when they expect equal or higher value in return.

When users encounter lengthy forms, they often become anxious, because they aren't sure of gaining satisfactory value after completing the form. Many times, this anxiety leads them to bounce off the form (or the website altogether).

However, various website elements can be used to reduce users' anxiety — a progress bar being one of them.

Progress bar

A progress bar helps curb anxiety of users by providing them a visual cue about the effort required to complete a process. It reassures users that the process will be completed in due time and effort, keeping them from bouncing off the page.

This effect has been confirmed by various studies on website and application design.

'Gamifying' Users' Experience:

Almost all web users today have played video games on one platform or another, so it's safe to say that most of them are familiar with the progress bars displayed within such games. Those progress bars are usually associated with the user's progress in the game, showing how far they've come toward finishing the game's objective (or beating a certain opponent).

progress bar in games

The progress bar on Tinkoff's credit card application form introduced a similar gaming experience to its users. The progress bar could only be completely filled when users completed the whole form. Whenever users saw a partially filled progress bar, they had additional motivation to finish and submit the form.

A fully filled progress bar then rewarded users with a sense of achievement.

‘Rewarding’ Users:

The progress bar deployed another gamification technique — reward.

On Tinkoff's form page, the technique was implemented using text overlaid on the progress bar. For instance, when users were on the second step of the form, the text read "The probability of approval is 30%" and "Get 10% for Step 2 completion." Since users were investing time and effort in applying for the credit card, they would want the highest possible probability of approval. By realizing how each step of the form mattered for their application's approval, users were further motivated to complete them.

Why Did The Second Variation Perform Better Than The First?

Because the second variation’s progress bar had greater visibility on the application page.

Providing contrast to your key elements on a web page is one of the fundamental principles of web design.

The first variation's progress bar was a thin black line at the bottom of a yellow banner. Since the color scheme of the overall page included white, grey, and yellow, the progress bar and the banner didn't have much contrast. For some users, the progress bar could have easily blended in with the page's theme. Moreover, the progress bar was quite thin, possibly making it even harder to notice.

progress bar close up

The second variation's progress bar, on the other hand, flaunted a green color — giving it ample contrast and visibility on the page. The bar, too, was wide enough to be noticeable. And once users noticed the progress bar, its persuasive elements started to work on them.

Gamify your online forms to increase form-submits and conversions.

TEST #3: Letting Users Fill Their Details Later

The Hypothesis

Giving users an option to fill in their passport details later will increase the number of form-submits.

The Test

This test involved only one variation that was pitted against the control.

On the form's second step, users were required to submit passport-related information. The variation gave users an option to complete this step later, via a "Don't remember passport details" checkbox. Upon clicking this checkbox, a small window appeared, asking users to choose a medium — phone or email — through which to provide their details later. Users could then complete the form whenever they had their passport details handy.

Here are the screenshots of the checkbox and the pop-up window.

fill details later - checkbox
Checkbox
Fill details later -- box
Pop-up

The test ran on over 265,000 visitors for a period of 23 days.

The Result

The variation won over the control page convincingly. It improved the conversion rate of the form by a whopping 35.8%. The after-filling conversion rate, too, increased by 10%.

The variation had a 100% chance to beat the control.

The Analysis

Acknowledging Users’ Issues:

The second step of the application form required detailed information about the user's passport: date of issue, series and number, division code, and more. Most users don't remember these details from memory. To complete the form, they had no option but to take out their passport and look up the required information. But some users wouldn't have their passport handy while completing the form, which would have forced them to abandon it.

Now, with the option to fill in their passport details later, users no longer had a reason to leave the application form midway.

Providing Freedom to Users:

Once users clicked on the “Don’t remember passport details” checkbox on the page, they were met with two options for filling up the form later. They could either have the incomplete form’s link emailed to them, or they could choose the ‘phone’ option. The latter option allowed users to fill up the form through a phone call with Tinkoff’s executives.

Completing the form through a telephone call, in particular, greatly reduced the effort users had to make.

Virtually Shortening the Form-length:

Once users chose to fill in their passport details later, they were left with only two of the four steps to complete. So effectively, they had already covered half of the application form, and this information was reinforced by the progress bar on top of the form.

Having breezed through the first half of the form, users looked forward to completing the second half just as quickly.


In addition, the option to provide passport data through a phone call effectively converted the form into a three-step process.

Addressing the convenience of your users should always be your top priority.

Conclusion

Conversion Rate Optimization is not about testing random ideas on your website. It is about improving your website's user experience through a coherent process. This process involves identifying areas of improvement on your website and suggesting changes based on traffic data, user behavior, and best practices. It's followed by A/B testing those changes and learning how effective they are. Only when the changes improve your website's conversion rate do you apply them permanently.

The post 3 Ways Tinkoff Bank Optimized Credit Card Conversions – Case Study appeared first on VWO Blog.

See original article – 

3 Ways Tinkoff Bank Optimized Credit Card Conversions – Case Study

Creating Cel Animations With SVG


What if I told you there was an image format like GIF, but it worked with vectors? What if I said it was possible to reverse the direction of its animation? What if you could take one base image and animate different parts of it separately, at different speeds? Well, the image format, SVG, already exists. It just needs a little gentle encouragement.

Creating Cel Animations With SVG

In this article, I’ll be mixing old with new, taking a somewhat primitive art and breathing new life into it. With the help of Sass, I’ll be streamlining the necessary workflow and hopefully demonstrating that automation can, sometimes, be a friend to creativity.

The post Creating Cel Animations With SVG appeared first on Smashing Magazine.

Originally posted here: 

Creating Cel Animations With SVG


Three Award Winning A/B Test Cases You Should Know About

(This is a guest post, authored by Danny de Vries, Senior CRO Consultant with Traffic4U)

Every year, Conversion Optimizers around the world vie for the annual WhichTestWon Online Testing Awards, which are awarded by an independent organization situated in the USA. Anyone can enter the competition by submitting their A/B and multivariate test cases which are then reviewed and judged on multiple factors. The most interesting and inspiring cases are then eligible to win either a Gold, Silver or Bronze badge across a range of categories.

This year, twelve of the thirty test case winners of the 6th annual international WhichTestWon Online Testing Awards are Dutch. With one Gold award, two Silver awards, and an honorable mention, Traffic4U emerged as one of the strong pillars of Dutch optimization prowess. This blog covers our three award-winning A/B test cases, starting with the Gold award winner.

De Hypothekers Associatie: Users Need Guidance

The test case of De Hypothekers Associatie, the biggest independent mortgage consultancy service in the Netherlands, received a Gold award in the category 'Form Elements'. As a consultancy firm, they rely on advising clients about mortgages and related financial decisions. However, before contacting a consultancy, users typically want to understand for themselves what their financial possibilities are regarding mortgages and buying property. So, a user who's just begun exploring options is unlikely to contact De Hypothekers Associatie or book an appointment.

Case Situation

In order to empower users to research the possibilities regarding mortgages, De Hypothekers Associatie created several pages on which users could calculate their maximum mortgage loan, monthly mortgage payments, etc. The experiment included the control page shown below, on which users could calculate their mortgage loan:

Translated Version Control

Hypothesis

Previous A/B tests of call-to-action buttons on the De Hypothekers Associatie website clearly showed users' need for guidance. For instance, a button that said 'Next step' significantly outperformed other CTAs with copy like 'Contact us' and 'Advise me'. This result implied two things:

  • Users want information in small digestible chunks
  • Users like to explore what lies ahead instead of being plainly told what the next step is

The follow-up action was to apply this insight to the calculation page, as the lack of guidance could potentially result in fewer mortgage appointments and paying clients.

The hypothesis was that users need to be guided through the process of calculating the maximum loan amount they could receive. The test variation of the "Loan calculation page" included a clear step-by-step flow guiding users through the calculation process, in stark contrast to the control, which had a more simplistic flow. The assumption was that guiding users through the calculation process would lead to more calculations and hence more appointments for the consultancy. The screenshot of the variant can be found below.

De Hypotheker Associatie - Variation for A/B test

Results

Guiding customers through the loan-calculation process resulted in a statistically significant uplift of more than 18% in the number of loan calculations on that page. Furthermore, the number of mortgage appointments also increased by more than 18%.

Why Do Users Need Guidance?

It goes without saying that mortgages are boring and complex, but a mortgage becomes a necessity when you are (or want to be) a homeowner. And taking out a mortgage is a high-stakes financial decision that isn't typically made in a day without sufficient information. Because of this, people need advice on where to begin, what steps to take, what the possibilities are, and which options suit their situation best. The test results show that including clear guidance on the steps to follow can result in a statistically significant uplift in conversion.

Fietsvoordeelshop: Display Customer Savings Prominently

In the category ‘Copy Test’, the A/B test of Fietsvoordeelshop received a Silver Award. Fietsvoordeelshop is one of the leading bike web-shops in the Netherlands offering an assortment of bikes from top brands for discounted prices.

Case Situation

The website lacked a prominently visible indication of the actual discount users would get on the different products. Discounts were displayed in an orange text right next to the big orange CTA button.

Control Image - for A/B Test

Hypothesis

It was hypothesized that Fietsvoordeelshop was losing potential sales by not showing customer savings effectively, and that making the savings prominently visible would increase click-through rate to the shopping cart. The discount, which was shown in orange text as "Uw voordeel: €550,00" ("Your savings: €550.00"), was changed to a more visible green badge that contrasted with the orange CTA button (here's more on the importance of contrast in design). See the variant below:

Variation Image - for A/B Test

Results

Results showed that the variation outperformed the control with a statistically significant 26.3% uplift in shopping cart entries. So it's one thing to offer discounts on products; unless the benefit clearly stands out, users are likely to miss it and never convert.

Follow-through and Stay Consistent

Although we found an increase in click-throughs to the shopping cart, we didn't see this effect (or anything similar) in the checkout steps that followed. The reason could be that the discount badge was only shown on the pages before "add to shopping cart" and not on the subsequent checkout pages. To sustain the positive influence, it might be a good idea to retain the badge all the way through checkout. However, it still has to be tested whether repeatedly showing the savings during the final steps of the checkout process leads to an increase in actual sales.

Omoda: Icons Perform Better (on mobile devices)

The second Silver Award Winning test case belongs to the Dutch shoe retailer Omoda. It came in second in the category ‘Images & Icons’. Omoda is one of the top shoe retailers in The Netherlands offering a range of shoes from world-class brands for women, men and kids. The case serves to show how important it is to segment your test results. Read more about visitor segmentation and how it can help increase website conversions.

Case Situation

Each Omoda product page features the shop's unique selling points. While these were placed near the "Plaats in shopping bag" call-to-action and were definitely visible, we believed they weren't visible enough. The reasons?

  • The USPs appeared in a bulleted list, but it blended too well with other text on the page and did not command attention.
  • The page also included a big black area for customer service elements. Because the page was largely white, the black areas would get more attention, distracting users from the primary goal of the page – viewing shoe details and adding the product to the shopping bag.

Below is an image of the control version:

Omoda Control for Multivariate Test

Hypothesis

The hypothesis was that addressing both these issues to make the USPs more visible would lead to an increase in sales. We created a multivariate test that allowed us to test both assumptions: that the USPs weren't visible enough and that the black area was too distracting. All variations are shown below:

Combination 2

Combination 2: changing the black color to a more neutral grey and moving the customer review rating to the top of the box

Combination 3

Combination 3: using icons and black text instead of grey text to let the USPs stand out better

Combination 4

Combination 4: combining elements from Combination 2 and Combination 3

Results

The overall results told us the hypothesis should be rejected; there was no convincing proof that any combination performed significantly better or worse than the control. But through segmentation, we found that the hypothesis did hold on mobile devices, where it produced a whopping 13.6% uplift in sales. The overall results had seemed inconclusive because of a 5.2% drop in sales on desktop and tablet devices.
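A quick back-of-the-envelope shows how a mobile win and a desktop loss can wash out in the blended number. The traffic split is an assumption for illustration, and weighting relative lifts by traffic share is itself a simplification:

```python
# Why the overall result looked flat: segment lifts can cancel out.
# The 45/55 device split is assumed, not taken from the case study.
mobile_share, desktop_share = 0.45, 0.55
mobile_lift, desktop_lift = 0.136, -0.052  # reported segment results

blended = mobile_share * mobile_lift + desktop_share * desktop_lift
print(f"Blended lift: {blended:+.1%}")  # -> roughly +3.3%, easily inconclusive
```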

Users Behave Differently on Different Devices

The results of this test show the device-dependency of hypotheses and the effectiveness of using icons to make USPs stand out better. On the basis of this test, we recommend that you always segment test results to observe the effect of a hypothesis across different dimensions, rather than making blind decisions.

In light of previous A/B tests, we believe icons perform better on mobile because desktop and tablet users are more likely to click on the prominent USPs — like terms of payment or delivery — to see more details. But since the USPs aren't clickable, desktop users would not be able to get any additional information. This could irk potential buyers and cause them to bounce. On a mobile device, however, with less screen real estate and the device being less suited to opening multiple tabs, users are less likely to search for additional information.

Understand What Drives Your Visitors And Keep Testing

The above cases have one thing in common. No, it’s not the awards. The commonality is that in each of these cases, we were able to successfully ‘assume’ what drove website visitors. Research using data and/or user feedback told us that a certain effect was occurring. We put this understanding in the required perspective (depending on the type of website and/or product, device, seasonality, user flow etc.) and made certain assumptions about the possible causes for these effects. Then we used A/B and multivariate testing to check if our assumptions were correct. Testing, in fact, is all about learning from your website visitors.

The post Three Award Winning A/B Test Cases You Should Know About appeared first on VWO Blog.

Link – 

Three Award Winning A/B Test Cases You Should Know About


32% Increase in Conversions by A/B Testing for The Right Reasons

Good design is good business, as Thomas J Watson so succinctly put it. Naturally then, the problem of business is discovering 'good' design. And the answer: ongoing testing. Offline businesses struggle at this because data is notoriously difficult to gather. Luckily for online businesses, gathering data has never been a problem.

Over the last few years, we've published case studies of over 150 successful conversion optimization tests. A lot of these case studies make for intriguing reading, and if you observe closely enough, you'll find a few tests and changes that have consistently delivered results. Today, we'll examine a case where a handful of such 'best practices' (scroll down to the bottom for my thoughts on what 'best practices' are) came together to deliver amazing results. Validated, all through, by data.

The Client

'White Card Courses' offers induction training for workers in the construction space across Australia. Aimed at replacing a range of other certification cards, it delivers standard, consistent training that complies with the National Code of Practice of Australia. The FAQs section on the site explains that a white card is mandatory for anyone who wishes to work in the construction industry Down Under.

Such validation from the industry helps www.whitecardcourses.com.au receive strong traffic. However, sales could always get better (wink). And for that, they looked in the direction of conversion optimization and found Conversion UP (a team of conversion specialists based out of Australia). Then, Grant Merriel at the agency turned to VWO.

The Hypothesis

Grant hypothesized that even though the home page contained the differentiating factors of the business (money-back guarantee and same-day dispatch of certificates), they were buried deep in the page (below the fold) where few visitors ever go. After doing his research, he concluded that these trust badges were competitive advantages that deserved better visibility to create the desired impact – sales.

Three changes were proposed to the original home page:

  • Change CTA text from “Click to Purchase” to “Start Now”
  • Change CTA and subheading background color
  • Add trust and guarantee badges below the hero image

How Was the Hypothesis Arrived At?

Grant explains, "Before going into a lot of data, the test idea came from solving the most common questions that customers were asking, as it suggests that there was a large disconnect between what the business offered and the website."

To understand the pulse of customers and their concerns about the site, Grant and team went through support tickets filed by users, sat down with the client for a one-on-one discussion, and went through site analytics.

At this point it's worth noting that Conversion UP didn't test these changes because they were 'best practices', but because they had gathered customer insights pointing toward them. Sadly, a lot of us A/B test certain changes simply because they worked for another business.

The A/B Test

Control vs Variation

The Result

The test ran on 6,585 visitors over a period of 3 weeks. The winning variation recorded a robust 32% increase in conversions (visits to the payment page) from the main homepage, and a 20.9% increase in clicks on the payment page.

To put that in perspective, the control gave 21.79% conversions (visits to the payment page via the homepage CTA) and 10.13% on the payment page CTA. The variation trumped the control with 28.76% conversions from the homepage and 12.25% on the payment page. Both results had a confidence level of 99.9%, meaning it is extremely unlikely that the observed differences were due to random chance.
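As a sanity check, the reported rates can be run through a standard two-proportion z-test. The 50/50 traffic split below is our assumption (the article reports only the 6,585-visitor total), so treat this as a sketch rather than a reproduction of VWO's calculation:

```python
# Two-proportion z-test on the homepage result.
# Assumes an even split of the 6,585 visitors between control and variation.
from math import sqrt
from scipy.stats import norm

n_c, n_v = 3_292, 3_293
conv_c = round(n_c * 0.2179)   # reported control conversion rate
conv_v = round(n_v * 0.2876)   # reported variation conversion rate

p_c, p_v = conv_c / n_c, conv_v / n_v
p_pool = (conv_c + conv_v) / (n_c + n_v)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_c + 1 / n_v))
z = (p_v - p_c) / se
print(f"z = {z:.1f}, two-sided p = {2 * norm.sf(abs(z)):.1e}")
# -> z ~ 6.5, p far below 0.001: consistent with the 99.9% confidence claim
```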

Why did the Variation Win?

To arrive at a plausible explanation, we’ll need to understand the typical user.

The average user reads about 28% of the text on a webpage, spending less than a minute doing so. We don’t have much time to impress the user and nudge him/her towards engaging with the site.

How does the learning from above tie in with web design?

The control used one color theme for its header, hero message, and CTA button. All three are important parts of the user experience, and each serves a different function: the header lays out the site structure in a palatable format for a quick browse-through, the hero message communicates the core value proposition, and the CTA button exists to nudge a visitor toward completing a particular action. By keeping a similar color theme across these three key elements, the control created a visual barrier against the user taking any meaningful action.

It all comes down to contrast. Our eyes are led by colors, and the perceived contrast among different elements on a page. Zero contrast equates to zero attention.

By representing the three page elements in different colors, the variation succeeded in effectively leading the visitor’s attention from one element to another, all separately perceptible.

Messaging

Let me now tell you about Joe. He is hoping to get into the Australian construction industry. He read up on the prerequisites for a job in the industry and is convinced that he needs to get the training done and receive the white card.

Joe comes to know about White Card Courses, immediately looks it up on the internet, and lands on the homepage.

He’s greeted with this hero message:

Get White Card Online $50.00” and below it,

Click here to Purchase

But Joe hadn't gone there to purchase anything. Joe only wanted to do the training and get his white card. By asking him to 'purchase', the CTA button tells Joe that he needs to finish another action first (in fact, there is no mention of beginning a course at all). And what's the action Joe is asked to take? Part with hard-earned money, without any evidence or guarantee that he'd get his training.

Joe hesitates. Joe leaves.

And really who could blame Joe?

(except perhaps, Bad Design, The Evil)

With the variation, however, the messaging was changed to something more appropriate, something more in-line with the immediate intent of the visitor. If Joe visited the site after the change, Joe would find the below message instead:

Get White Card Online $50.00

Start Now

Joe is told exactly what he'd hoped to hear. He could start with his course right away. Clicking would still lead Joe to a payment page, but Joe doesn't mind so much anymore, because he knows he clicked on "Start Now" and is headed in the right direction.

Right direction, that’s all Joe ever needed out of life, and design. But I digress, as always.

Check out this kickass post from WordStream on creating effective call-to-action messaging.

And here’s another of those excellent belling-the-CTA (Oh, is that an almost-pun?) case studies from our archives.

Trust Badges

They work. They do,

especially on eCommerce sites that accept payment first and deliver later. The control page had no trust or guarantee badges, leaving the credibility score pretty low for 'White Card Courses'. In human-to-human interactions, our brains labor away, crunching verbal and non-verbal cues to create a measure of credibility or trust. But on a web page, in the absence of any human element, visitors require clear reasons to trust.

The variation carried three badges, one each for guarantee (money-back), trust (recognized Australia-wide), and an assurance of quick turnaround time.

Here are some more case studies that show why badges are a relatively safe bet in conversion optimization – Bag Servant (conversions up by 72%), Horloges (sales up by 41%), House of Kids (32% increase in conversions). Want more? Here’s a link to our resources section, simply filter by “Trust Badge” in the ‘element’ drop down box and select ‘case studies’ from the ‘resources’ box.

Wait, what about the 20.9% increase in conversion on the payments page? There was never a mention of any change to that page. True, and yet, the page converted 21% more than the control. Trust badges to the rescue again; more importantly, they appeared above the fold, ensuring that most visitors saw the badges before they clicked through to the payments page.

Grant concurs, on being asked why he thinks the variation won:

1. Change from a confusing heading to an actionable button

2. Easy to understand what the ‘Next Step’ is for the user (and above the fold)

3. Prominent supporting sales propositions just below the fold

So, Should You Too Test These Same Changes?

You could, and you should, if you see clear commonalities between your business and the ones that have already tried it and succeeded. But not because x, y, z businesses tested it and profited from it. I think, and practically believe, that there's no such thing as a best practice. 'Best' practices followed over time become common practices, and sooner rather than later, we'll need better practices. Why not come up with them right away, instead of waiting for 'best practices' to go stale? There's room for many new discoveries in the conversion optimization space.

Keep testing. Keep discovering.

Do you have conversion optimization ideas in mind that you’d want to test on your site, but feel that it could do with some brain-storming? Head over to the comments section and let us know!

You can engage with me @SharanTheSuresh, and connect with us @wingify

We’re listening!

The post 32% Increase in Conversions by A/B Testing for The Right Reasons appeared first on VWO Blog.

Original post:  

32% Increase in Conversions by A/B Testing for The Right Reasons


How a Dutch Major Achieved 7.8% Increase in Conversion, by Simply Removing a line

The Client

VVAA, an association of over 75,000 Dutch healthcare professionals, specializes in providing quality advice to its members on areas ranging from setting up and managing a practice, to portfolio management and mortgages.

As a market leader in the healthcare industry and a pioneer in the area of medical liability insurance, VVAA attracts very healthy traffic to its site. Visitors are greeted with a fairly big header image with a list of benefits and a CTA button. Things were good.

Then the VVAA corporate communication team had a design idea. A horizontal line representing a "lifeline" was added right at the bottom of the header image. The lifeline (that's what I'm going to call it henceforth), it seemed, would be a good addition since it ties in directly with the industry that VVAA operates in. See 'The Test' section for relevant images of the page.

The Hypothesis

Alwyn de Bruijn, the webmaster over at VVAA, felt that the lifeline could be a distraction, leading visitors' attention away from the CTA button. So, VVAA decided to A/B test the design against a variation that was similar in all respects, except that it didn't contain the 'lifeline'. The hypothesis was that the variation would convert more visitors (measured as clicks on the CTA).

The Test

The Control

A/B test control

VS

The Variation

Variation

The A/B test ran for 20 days, on 7885 visitors who were randomly shown one of the two versions – the control and the variation.

The Result

How often does it happen that a lifeline gets discarded and the patient gets better?

The variation without the horizontal line effected a 7.8% increase in CTA clicks, with a 99.93% confidence level. In other words, VVAA can be 99.93% confident that the variation (without the 'lifeline') genuinely outperforms the control on clicks on the main CTA.

Thanks to the A/B test, VVAA now have an objective basis to form a decision, and optimize its page for better conversions.

Now that we’ve got the facts out of our way, let’s look at even more interesting things, like,

Why did the Variation Win?

In de Bruijn's own words, the variation (without the horizontal line) won "because the life line affects where visitors eyeline is placed on the page and might miss the CTA at the beginning."

Let's break de Bruijn's analysis down further.

Eye-Tracking

Web page visitors largely read along an F-shaped pattern:

  • first, horizontally across the top of the page,
  • then vertically down the page a bit, and across the page horizontally again,
  • before settling into a quick vertical scan of the rest of the page.

Hold on to that information; there's another piece to this puzzle. In this study by the Nielsen Norman Group, it emerged that visitors spend as much as 69% of their viewing time on the left half of the screen.

Now, keeping both these insights in mind, let’s try and track the eye movement of Bob, a visitor on the VVAA page.

He scans across the page horizontally, covering most of the header elements. Then he moves down the largely empty left section of the page. That’s when the bright orange “lifeline” disrupts his flow. It acts as a leading line, guiding Bob horizontally along the line. By this time, Bob has already moved past the CTA button. Bob might then scan the rest of the page. But, rest assured, VVAA has already lost Bob by then.

But, it’s right there, the CTA button, you might argue. Why wouldn’t Bob just notice it, and go right back to the real CTA?

Consider this.

What if Bob couldn’t notice the actual CTA button? (no, Bob is not blind)

The Curious Case of ‘False Button’

The 'lifeline' carries a bubble right below the intended CTA button. The bubble also has a text element below it that says "Zelf uw zorg kiezen", which translates to "Choose your care".

Could the bubble pass for a prominent element – a false CTA, perhaps?

False Button

There's more. The dialogue box that holds the actual CTA button has an appendage pointing right at the center of the bubble. That's another cue telling the visitor that the bubble is of some importance, diluting the attention the actual CTA should otherwise receive. Also, the fact that the lifeline has the same color as the CTA doesn't help with creating any discernible contrast, or conversion.

But, the CTA is still there! Can't you expect some Bobs to realize this? Yes, you could expect that, but it might not be reasonable. Why?

Fitts’s Law!

A simplistic explanation of a key tenet of the law is this – the closer and larger a target (a clickable element on a page), the faster it is to click on. This awesome Smashing Magazine article points out that the larger the absolute or relative size of a target button, the higher the probability of it being clicked.

With the false 'bubble' button right below the actual CTA, and in the same color, the relative size (and visibility) of the CTA diminishes in Bob's eyes, killing that teeny-weeny chance of the CTA being noticed.
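For the curious, Fitts's law in its common Shannon formulation predicts movement time as MT = a + b * log2(D/W + 1), where D is the distance to the target and W its effective size along the approach axis. The constants and pixel values below are illustrative, and treating the crowded CTA as having a smaller 'effective' width is our reading of the argument, not a measurement:

```python
# Fitts's law (Shannon formulation): MT = a + b * log2(D/W + 1).
# The a/b constants and pixel values are illustrative assumptions.
from math import log2

def movement_time_ms(distance_px, width_px, a=50.0, b=150.0):
    return a + b * log2(distance_px / width_px + 1)

print(movement_time_ms(400, 120))  # prominent CTA: ~367 ms
print(movement_time_ms(400, 40))   # CTA crowded by a look-alike: ~569 ms
```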

Interesting? We’re not done yet.

Directional Cue


The VVAA header image plays on an important psychological cue – directional prompts. By nature, our sight locks onto the eyes of a human or human-like subject and then follows that subject's line of sight. In this case, the woman's eyes lead toward the part of the image with quite a few elements – the actual CTA button, the false button, and the pointy bit of the dialogue box appendage. With these elements clustered together, the actual CTA button loses its prominence. There goes your Big Orange Button for a toss. And Bob, too.

By taking the lifeline away, and with it the false CTA, the actual CTA gets ample white space around it. This accentuates the importance of the button, improving conversions – useful clicks on the CTA.

What Do You Think?

Should such seemingly trivial changes (removing the lifeline) be implemented based simply on intuition? Or would it be prudent to have changes, however minor, tested first?

It would also be interesting to know if you are able to identify any potential for such minor-but-major changes on your current page.

Let us know in the comments section right below. You can also reach me on twitter @SharanTheSuresh or hit us with your thoughts on twitter @wingify.

We’re listening.


The post How a Dutch Major Achieved 7.8% Increase in Conversion, by Simply Removing a line appeared first on VWO Blog.

View article:  How a Dutch Major Achieved 7.8% Increase in Conversion, by Simply Removing a line