Tag Archives: conversion

Got A Conversion Question for Neil Patel? Ask Him During The Crazy Egg Conversion Talks


Got conversion rate optimization questions? Wanna ask Neil Patel for help? Then join us for our weekly Crazy Egg Conversion Talks. Anyone can attend and ask as many questions as they’d like! Our first talk will be on May 5th, 2017 at 10 AM Pacific. Here are a few more details about the talks: Frequency: weekly. Duration: 1 hour. No sign-up required. Learn From Neil Patel – Ask Him Direct Questions! Neil has been continually optimizing websites for over a decade and is an industry leader when it comes to traffic generation, growth hacks and conversion funnel improvements. Plus, he’s…

The post Got A Conversion Question for Neil Patel? Ask Him During The Crazy Egg Conversion Talks appeared first on The Daily Egg.

See original article here:  

Got A Conversion Question for Neil Patel? Ask Him During The Crazy Egg Conversion Talks

How to Convert More Customers by Adding Perceived Value

What separates a Hermès Birkin bag from a high-quality leather handbag you could buy anywhere? A label, a fancy charm, and about $22,000. But unlike the tangible qualities of a purchase, like the grade of leather used or the fact that the utterly useless bag charm is 14 karat gold, the perception of value is what really separates one bag from the other. One bag carries social cachet and the ability to draw envy from other women – intangible benefits so valuable that they justify the higher price. The nameless bag, however, has its own set of benefits for a different…

The post How to Convert More Customers by Adding Perceived Value appeared first on The Daily Egg.

Link: 

How to Convert More Customers by Adding Perceived Value

12 Eye-Opening Video Marketing Stats to Help Boost Your Landing Page Conversions


Video marketing has been on the rise for more than a decade now. Consumers are getting more and more used to consuming video content wherever they go, be it on Facebook or on a product page. Which may make one think: Isn’t video content expected by now? Shouldn’t we produce a video every chance we get? However, the real question is: Will videos be a conversion ignitor or a conversion killer? Let’s find out! First, Some Tempting Stats… There are plenty of case studies and reports claiming that using a video on a landing page is a great idea for…

The post 12 Eye-Opening Video Marketing Stats to Help Boost Your Landing Page Conversions appeared first on The Daily Egg.

Link: 

12 Eye-Opening Video Marketing Stats to Help Boost Your Landing Page Conversions

How to get evergreen results from your landing page optimization

Reading Time: 7 minutes

Landing page optimization is old news.

Seriously. A quick Google search will show you that Unbounce, QuickSprout, Moz, Qualaroo, Hubspot, Wordstream, Optimizely, CrazyEgg, VWO (and countless others) have been writing tips and guides on how to optimize your landing pages for years.

Not to mention the several posts we have already published on the WiderFunnel blog since 2008.

And yet. This conversation is so not over.

Warning: If your landing page optimization goals are short-term, or completely focused on conversion rate lift, this post may be a waste of your time. If your goal is to continuously have the best-performing landing pages on the internet, keep reading.



Marketers are funnelling more and more money into paid advertising, especially as Google allocates more and more SERP space to ads.

In fact, as an industry, we are spending upwards of $92 billion annually on paid search advertising alone.

[Image: The prime real estate on a Google search results page often goes to paid.]

And it’s not just search advertising that is seeing an uptick in spend, but social media advertising too.

It makes sense that marketers are still obsessing over their landing page conversion rates: this traffic is costly and curated. These are visitors that you have sought out, that share characteristics with your target market. It is extremely important that these visitors convert!

But there comes a time in every optimizer’s life when they face the cruel reality of diminishing returns. You’ve tested your landing page hero image. You’ve tested your value proposition. You’ve tested your form placement. And now, you’ve hit a plateau.

So, what next? What’s beyond the tips and guides? What is beyond the optimization basics?

1) Put on your customer’s shoes.

First things first: Let’s do a quick sanity check.

When you test your hero image, or your form placement, are you testing based on tips and recommended best practices? Or, are you testing based on a specific theory you have about your page visitors?

[Image: Put on your customer’s shoes.]

Tips and best practices are a fine place to start, but the insight behind why those tactics work (or don’t work) for your visitors is where you find longevity.

The best way to improve experiences for your visitors is to think from their perspective. And the best way to do that is to use frameworks, and framework thinking, to get robust insights about your customers.

– Chris Goward, Founder & CEO, WiderFunnel

Laying the foundation

It’s very difficult to think from a different perspective. This is true in marketing as much as it is in life. And it’s why conversion optimization and A/B testing have become so vital: We no longer have to guess at what our visitors want, but can test instead!

That said, a test requires a hypothesis. And a legitimate hypothesis requires a legitimate attempt to understand your visitor’s unique perspective.

To respond to this need for understanding, WiderFunnel developed the LIFT Model® in 2008: our foundational framework for identifying potential barriers to conversion on a page from the perspective of the page visitor.

Get optimization ideas with the LIFT poster!

Get the LIFT Model poster, and challenge yourself to keep your visitor’s perspective in mind at all times. Use the six conversion factors to analyze your pages, and get optimization ideas!



By entering your email, you’ll receive bi-weekly WiderFunnel Blog updates and other resources to help you become an optimization champion.


The LIFT Model attempts to capture the idea of competing forces in communication, narrowing them down to the most salient aspects that marketers should consider.

I wanted to apply the principles of Relevance, Clarity, Distraction, Urgency and Anxiety to what we were delivering to the industry and not just to our clients. And the LIFT Model is a part of that: making something as simple as possible but no simpler.

– Chris Goward

When you look at your page through a lens like the LIFT Model, you are forced to question your assumptions about what your visitors want when they land on your page.

[Image: View your landing pages through a framework lens.]

You may love an interactive element, but is it distracting your visitors? You may think that your copy creates urgency, but is it really creating anxiety?

If you are an experienced optimizer, you may have already incorporated a framework like the LIFT Model into your optimization program. But, after you have analyzed the same page multiple times, how do you continue to come up with new ideas?

Here are a few tips from the WiderFunnel Strategy team:

  1. Bring in fresh eyes from another team to look at and use your page
  2. User test, to watch and record how actual users are using your page
  3. Sneak a peek at your competitors’ landing pages: Is there something they’re doing that might be worth testing on your site?
  4. Do your page analyses as a team: many heads are better than one
  5. Brainstorm totally new, outside-the-box ideas…and test one!

You should always err on the side of “This customer experience could be better.” After all, it’s a customer-centric world, and we’re just marketing in it.

2) Look past the conversion rate.

“Landing page optimization”, like “conversion rate optimization”, is a limiting term. Yes, on-page optimization is key, but mature organizations view “landing page optimization” as the optimization of the entire experience, from first to last customer touchpoint.

Landing pages are only one element of a stellar, high-converting marketing campaign. And focusing all of your attention on optimizing only one element is just foolish.

From testing your featured ads, to tracking click-through rates of Thank You emails, to tracking returns and refunds, to tracking leads through the rest of the funnel, a better-performing landing page is about much more than on-page conversion rate lift.

[Image: On-page optimization is just one part of the whole picture.]

An example is worth 1,000 words

One of our clients is a company that provides an online consumer information service—visitors type in a question and get an Expert answer. One of the first zones (areas on their website) that we focused on was a particular landing page funnel.

Visitors come from an ad and land on a page where they can ask their question. They then enter a 4-step funnel: Step 1: Ask the question > Step 2: Add more information > Step 3: Pick an Expert > Step 4: Get an answer (aka the checkout page)

Our primary goal was to increase transactions, meaning we had to move visitors all the way through the funnel. But we were also tracking refunds and chargebacks, as well as revenue per visitor.

More than pushing a visitor to ‘convert’, we wanted to make sure those visitors went on to be happy, satisfied customers.

In this experiment, we focused on the value proposition statements. The control landing page exclaimed, “A new question is answered every 9 seconds!”. Our Strategy team had determined (through user testing) that “speed of answers” was the 8th most valuable element of the service for customers, and that “peace of mind / reassurance” was the most important.

So, they tested two variations, featuring two different value proposition statements meant to create more peace of mind for visitors:

  • “Join 6,152,585 satisfied customers who got professional answers…”
  • “Connect One on One with an Expert who will answer your question”

Both of these variations ultimately increased transactions, by 6% and 9.4%, respectively. But! We also saw large decreases in refunds and chargebacks with both variations, and large increases in net revenue per visitor.

By following visitors past the actual conversion, we were able to confirm that these initial statements set an impactful tone: visitors were more satisfied with their purchases, and comfortable investing more in their expert responses.

3) Consider the big picture.

As you think of landing page optimization as the optimization of a complete digital experience, you should also think of landing page optimization as part of your overall digital optimization strategy.

When you discover an insight about visitors to your product page, feed it into a test on your landing page. When you discover an insight about visitor behavior on your landing page, feed it into a test on your website.

It’s true that your landing pages most likely cater to specific visitor segments, who may behave totally differently than your organic visitors. But, it is also true that landing page wins may be overall wins.

Plus, landing page insights can be very valuable, because they are often new visitor insights. And now, a little more advice from Chris Goward, optimization guru:

“Your best opportunities for testing your value proposition are with first impression visitors. These are usually new visitors to your high traffic landing pages or your home page […]

By split testing your alternative value propositions with new visitors, you’ll reduce your exposure to existing customers or prospects who are already in the consideration phase. New prospects have a blank canvas for you to present your message variations and see what sticks.

Then, from the learning gained on landing pages, you can validate insights with other target audience groups and with your customers to leverage the learning company-wide.

Landing page testing can do more than just improve conversion rates on landing pages. When done strategically, it can deliver powerful, high-leverage marketing insights.”



Just because your landing pages are separate from your website, does not mean that your landing page optimization should be separate from your other optimization efforts. A landing page is just another zone, and you are free to (and should) use insights from one zone when testing on another zone.

4) Go deeper, explore further.

A lot of marketers talk about landing page design: how to build the right landing page, where to position each element, what color scheme and imagery to use, etc.

But when you dig into the why behind your test results, it’s like breaking into a piñata of possibilities, or opening a box of idea confetti.

[Image: Discovering the reason behind the result is like opening a box of idea confetti!]

Why do your 16-to-25-year-old mobile users respond so favorably to a one-minute video testimonial from a past purchaser? Do they respond better to this indicator of social proof than to another?

Why do your visitors prefer one landing page under normal circumstances, and a different version when external factors change (like a holiday, or a crisis)? Can you leverage this insight throughout your website?

Why does one type of urgency phrasing work, while slightly different wording decreases conversions on your page? Are your visitors sensitive to overly salesy copy? Why or why not?

Not only are there hundreds of psychological principles to explore within your landing page testing, but landing page optimization is also intertwined with your personalization strategy.

For many marketers, personalized landing pages are becoming the norm. And personalization opens the door to even more potential customer insights. Assuming you already have visitor segments, you should test the personalized experiences on your landing pages.

For example, imagine you have started using your visitors’ first names in the hero banner of your landing page. Have you validated that this personalized experience is more effective than another, like moving a social proof indicator above the fold? Both can be deemed personalization, but they tap into very different motivations.
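To make the example concrete, here is roughly what that first-name treatment could look like. Everything in this sketch, from the headline copy to the query-parameter data source and helper name, is invented for illustration:

```typescript
// Hypothetical personalization: greet known visitors by first name in the
// hero banner, fall back to the generic headline otherwise.
function heroHeadline(firstName: string | null): string {
  return firstName
    ? `${firstName}, see how top marketers boost conversions`
    : "See how top marketers boost conversions";
}

// In an A/B test, this variation would run against the *other* candidate
// personalization (e.g., a social proof badge moved above the fold),
// not just against the unpersonalized control.
const banner = document.querySelector("h1");
if (banner) {
  banner.textContent = heroHeadline(getVisitorFirstName()); // assumed helper
}

// Stub for the assumed helper: in practice this might read a CRM field,
// a query parameter, or a cookie set at signup.
function getVisitorFirstName(): string | null {
  return new URLSearchParams(window.location.search).get("fname");
}
```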

From psychological principles, to validating your personalized experiences, the possibilities for testing on your landing pages are endless.

Just keep testing, Dory-style

Your landing page(s) will never be “optimized”. That is the beauty and cruelty of optimization: we are always chasing unattainable perfection.

But your landing pages can definitely be better than they are now. Even if you have a high-converting page, even if your page is listed by Hubspot as one of the 16 best designed landing pages, even if you’ve followed all of the rules…your landing page can be better.

Because I’m not just talking about conversions, I’m talking about your entire customer experience. If you give them the opportunity, your new users will tell you what’s wrong with your page.

They’ll tell you where it is unclear and where it is distracting.

They’ll tell you what motivates them.

They’ll tell you how personal you should get.

They’ll tell you how to set expectations so that they can become satisfied customers or clients.

A well-designed landing page is just the beginning of landing page optimization.

The post How to get evergreen results from your landing page optimization appeared first on WiderFunnel Conversion Optimization.

More: 

How to get evergreen results from your landing page optimization

The Unbounce Conversion Benchmark Report – Average Conversion Rates by Industry and Expert Recommendations

[Image: The Unbounce Conversion Benchmark Report is filled with industry-specific data, graphs and actionable takeaways.]

What is a good conversion rate for my landing page?

If you knew the answer to this question, you could get more out of every optimization dollar you spend. Armed with this one “little” data point, you could double down on pages with lots of growth potential or confidently switch gears to focus on other projects because you know your page is performing well.

But here’s the problem:

  • Companies in different industries use a wide range of landing page copy, traffic generation strategies and product offers. Because of this, average conversion rates across industries vary dramatically.
  • This one “little” data point isn’t really so “little.” Conversion rate benchmarks are a function of a ton of variables, requiring access to hoards of data (not to mention people with the skills to mine and interpret that data).

Scientifically grounded answers to “What is a good conversion rate?” just haven’t been available… until now.

We created the Unbounce Conversion Benchmark Report by analyzing the behavior of 74,551,421 visitors to 64,284 lead generation landing pages created in the Unbounce platform over the last quarter, using a rigorous scientific methodology and our proprietary machine learning technology.

How do your landing page conversion rates compare against your industry competitors?

Get the Unbounce Conversion Benchmark Report and find out.
By entering your email you’ll receive other resources to help you improve your conversion rates.

For 10 popular industries, we’ll share:

  1. An overview of average conversion rates per industry (the graph on the left).
  2. A summary of how marketers in that industry are performing (right):
[Image: The Unbounce Conversion Benchmark Report is filled with charts, graphs and actionable takeaways for 10 of our customers’ most popular industries.]
  3. Industry-specific copy recommendations from our team of data scientists and conversion marketing experts (who have interpreted the data in the report for you). For each industry, we explore how the following factors could be impacting your conversion rates:

    • Reading ease (a rough scoring sketch follows this list)
    • Page length
    • Emotion and sentiment
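Of these factors, reading ease is one you can approximate yourself. The report doesn’t disclose its exact scoring method, so treat the following as a generic sketch using the classic Flesch Reading Ease formula with a crude syllable heuristic:

```typescript
// Flesch Reading Ease: 206.835 - 1.015 (words/sentence) - 84.6 (syllables/word).
// Higher scores read easier. The syllable counter is a rough heuristic
// (vowel groups), good enough for comparing your own drafts.
function countSyllables(word: string): number {
  const groups = word.toLowerCase().match(/[aeiouy]+/g);
  return Math.max(1, groups ? groups.length : 1);
}

function fleschReadingEase(text: string): number {
  const sentences = Math.max(1, (text.match(/[.!?]+/g) ?? []).length);
  const words = text.trim().split(/\s+/).filter(Boolean);
  const syllables = words.reduce((sum, w) => sum + countSyllables(w), 0);
  return (
    206.835 -
    1.015 * (words.length / sentences) -
    84.6 * (syllables / words.length)
  );
}

console.log(fleschReadingEase("Get your free quote. It takes two minutes."));
// ≈ 76 with this heuristic; short sentences and short words score as easy to read
```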

Our goal isn’t just to help you answer the question, “What is a good conversion rate for my landing page?”

Our goal is to eliminate some of the guesswork so you can build higher-converting landing pages, better prioritize your work and get back to the strategy and creativity that drives your business.

Download the Unbounce Conversion Benchmark Report (FREE)

Data-driven insights on average conversion rates per industry (+ expert copywriting advice)
By entering your email you’ll receive other resources to help you improve your conversion rates.

Taken from:  

The Unbounce Conversion Benchmark Report – Average Conversion Rates by Industry and Expert Recommendations

Lessons Learned From 2,345,864 Exit Overlay Visitors


Back in 2015, Unbounce launched its first ever exit overlay on this very blog.

Did it send our signup rate skyrocketing 4,000%? Nope.

Did it turn our blog into a conversion factory for new leads? Not even close — our initial conversion rate was barely over 1.25%.

But what it did do was start us down the path of exploring the best ways to use this technology; of furthering our goals by finding ways to offer visitors relevant, valuable content through overlays.

Overlays are modal lightboxes that launch within a webpage and focus attention on a single offer.

In this post, we’ll break down all the wins, losses and “holy smokes!” moments from our first 2,345,864 exit overlay viewers.

Psst: Towards the end of these experiments, Unbounce launched Convertables, and with it a whole toolbox of advanced triggers and targeting options for overlays.

Goals, tools and testing conditions

Our goal for this project was simple: Get more people to consume more Unbounce content — whether it be blog posts, ebooks, videos, you name it.

We invest a lot in our content, and we want it read by as many marketers as possible. All our research — everything we know about that elusive thing called conversion — exists in our content.

Our content also allows readers to find out whether Unbounce is a tool that can help them. We want more customers, but only if they can truly benefit from our product. Those who experience ‘lightbulb’ moments when reading our content definitely fit the bill.

As for tools, the first four experiments were conducted using Rooster (an exit-intent tool purchased by Unbounce in June 2015). It was a far less sophisticated version of what is now Unbounce Convertables, which we used in the final experiment.

Testing conditions were as follows:

  1. All overlays were triggered on exit, meaning they launched only when abandoning visitors were detected.
  2. For the first three experiments, we compared sequential periods to measure results. For the final two, we ran makeshift A/B tests.
  3. When comparing sequential periods, testing conditions were isolated by excluding new blog posts from showing any overlays.
  4. A “conversion” was defined as either a completed form (lead gen overlay) or a click (clickthrough overlay).
  5. All experiments were conducted between January 2015 and November 2016.

Experiment #1: Content Offer vs. Generic Signup

Our first exit overlay had a simple goal: Get more blog subscribers. It looked like this.

[Image: blog subscriber overlay]

It was viewed by 558,488 unique visitors over 170 days, of whom 1.27% converted into new blog subscribers. Decent start, but not good enough.

To improve the conversion rate, we posed the following hypothesis.

HYPOTHESIS
Because online marketing offers typically convert better when a specific, tangible offer is made (versus a generic signup), we expect that by offering a free ebook to abandoning visitors, we will improve our conversion rate beyond the current 1.27% baseline.

Whereas the original overlay asked visitors to subscribe to the blog for “tips”, the challenger overlay offered visitors The 23 Principles of Attention-Driven Design.

[Image: ebook offer overlay]

After 96 days and over 260,000 visitors, we had enough conversions to call this experiment a success. The overlay converted at 2.65%, and captured 7,126 new blog subscribers.

[Image: overlay experiment #1 results]

Since we didn’t A/B test these overlays, our results were merely observations. Seasonality is one of many factors that can sway the numbers.

We couldn’t take it as gospel, but we were seeing double the subscribers we had previously.

Observations

  • Offering tangible resources (versus non-specific promises, like a blog signup) can positively affect conversion rates.

Stay in the loop and get all the juicy test results from our upcoming overlay experiments

Learn from our overlay wins, losses and everything in between.
By entering your email you’ll receive weekly Unbounce Blog updates and other resources to help you become a marketing genius.

Experiment #2: Four-field vs. Single-field Overlays

Data people always spoil the party.

The early success of our first experiment caught the attention of Judi, our resident marketing automation whiz, who wisely reminded us that collecting only an email address on a large-scale campaign was a missed opportunity.

For us to fully leverage this campaign, we needed to find out more about the individuals (and organizations) who were consuming our content.

Translation: We needed to add three more form fields to the overlay.

[Image: four-field overlay (experiment #2)]

Since filling out forms is a universal bummer, we safely assumed our conversion rate would take a dive.

But something else happened that we didn’t predict. Notice a difference (besides the form fields) between the two overlays above? Yup, the new version was larger: 900x700px vs. 750x450px.

Adding three form fields made our original 750x450px design feel too cramped, so we arbitrarily increased the size — never thinking there might be consequences. More on that later.

Anyway, we launched the new version and, as expected, the results sucked.

[Image: Things weren’t looking good after 30 days.]

For business reasons, we decided to end the test after 30 days, even though we didn’t run the challenger overlay for an equal time period (96 days).

Overall, the conversion rate for the 30-day period was 48% lower than the previous 96-day period. I knew it was for good reason: Building our data warehouse is important. Still, a small part of me died that day.

Then it got worse.

It occurred to us that the sample size of viewers for the new overlay (53,460) looked awfully small for a 30-day period.

A closer inspection revealed that our previous overlay averaged 2,792 views per day, while this new version was averaging 1,782. So basically our 48% conversion drop was served a la carte with a 36% plunge in overall views. Fun!

But why?

It turns out increasing the size of the overlay wasn’t so harmless. The size was too large for many people’s browser windows, so the overlay only fired on two out of every three visits, even when targeting rules matched.
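A simple guard would have caught this before launch. The sketch below is generic exit-intent logic, not Rooster’s actual code; the dimensions, margin and showOverlay stub are illustrative:

```typescript
// Only fire the exit overlay if it actually fits the visitor's viewport.
const OVERLAY_WIDTH = 800;
const OVERLAY_HEIGHT = 500;
const MARGIN = 40; // breathing room so the modal isn't flush to the edges

function overlayFits(): boolean {
  return (
    window.innerWidth >= OVERLAY_WIDTH + MARGIN &&
    window.innerHeight >= OVERLAY_HEIGHT + MARGIN
  );
}

function showOverlay(): void {
  /* render the modal lightbox */
}

// Classic exit-intent trigger: the cursor leaves through the top of the page.
document.addEventListener("mouseout", (e: MouseEvent) => {
  const exitingViaTop = e.relatedTarget === null && e.clientY <= 0;
  if (exitingViaTop && overlayFits()) {
    showOverlay();
  }
});
```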

We conceded, and redesigned the overlay in 800x500px format.

[Image: redesigned 800x500px overlay]

Daily views rose back to their normal numbers, and our new baseline conversion rate of 1.25% remained basically unchanged.

[Image: Large gap between “loads” and “views” on June 4th; narrower gap on June 5th.]

Observations

  • Increasing the number of form fields in overlays can cause friction that reduces conversion rates.
  • Overlay sizes exceeding 800×500 can be too large for some browsers and reduce load:view ratio (and overall impressions).

Experiment #3: One Overlay vs. 10 Overlays

It seemed like such a great idea at the time…

Why not get hyper relevant and build a different exit overlay for each of our blog categories?

With our new baseline conversion rate reduced to 1.25%, we needed an improvement that would help us overcome “form friction” and get us back to that healthy 2%+ range we enjoyed before.

So with little supporting data, we hypothesized that increasing “relevance” was the magic bullet we needed. It works on landing pages, so why not overlays?

HYPOTHESIS  
Since “relevance” is key to driving conversions, we expect that by running a unique exit overlay on each of our blog categories — whereby the free resource is specific to the category — we will improve our conversion rate beyond the current 1.25% baseline.

[Image: blog categories]

We divide our blog into categories according to the marketing topic they cover (e.g., landing pages, copywriting, design, UX, conversion optimization). Each post is tagged by category.

So to increase relevance, we created a total of 10 exit overlays (each offering a different resource) and assigned each overlay to one or two categories, like this:

[Image: category-specific overlay assignments]

Creating all the new overlays would take some time (approximately three hours), but since we already had a deep backlog of resources on all things online marketing, finding a relevant ebook, course or video to offer in each category wasn’t difficult.

And since our URLs contain category tags (e.g., all posts on “design” start with root domain unbounce.com/design), making sure the right overlay ran on the right post was easy.

[Image: URL Targeting rule for our Design category; the “include” rule automatically excludes the overlay from running in other categories.]
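In code terms, that targeting rule boils down to a prefix match on the URL path. A minimal sketch (the category paths and offer IDs here are examples, not Unbounce’s actual configuration):

```typescript
// Map each blog category's URL prefix to the overlay offer for that category.
const overlayByCategory: Record<string, string> = {
  "/design": "attention-driven-design-ebook",
  "/landing-pages": "landing-page-course",
  "/email-marketing": "email-course",
};

// "Include" rule: a visitor on /design/... gets the design overlay, and by
// construction no other category's overlay can match the same URL.
function pickOverlay(pathname: string): string | null {
  for (const [prefix, offerId] of Object.entries(overlayByCategory)) {
    if (pathname === prefix || pathname.startsWith(prefix + "/")) {
      return offerId;
    }
  }
  return null; // uncategorized page: show nothing
}

console.log(pickOverlay("/design/ugly-pages-that-convert")); // "attention-driven-design-ebook"
```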

But there was a problem: We’d established a strict rule that our readers would only ever see one exit overlay… no matter how many blog categories they browsed. It’s part of our philosophy on using overlays in a way that respects the user experience.

When we were just using one overlay, that was easy — a simple “Frequency” setting was all we needed.

[Image: frequency setting]

…but not so easy with 10 overlays running on the same blog.

We needed a way to exclude anyone who saw one overlay from seeing any of the other nine.

Cookies were the obvious answer, so we asked our developers to build a temporary solution that could:

  • Pass a cookie from an overlay to the visitor’s browser
  • Exclude that cookie in our targeting settings

They obliged.

[Image: advanced targeting settings with cookie exclusion]

We used “incognito mode” to repeatedly test the functionality, and after that we were go for launch.
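The temporary solution amounts to a first-party cookie gate. Here is a sketch of the idea; the cookie name and one-year lifetime are assumptions for illustration:

```typescript
// One-overlay-per-visitor rule via a first-party cookie.
const SEEN_COOKIE = "seen_exit_overlay";

function hasSeenAnyOverlay(): boolean {
  return document.cookie
    .split("; ")
    .some((pair) => pair.startsWith(`${SEEN_COOKIE}=`));
}

function markOverlaySeen(): void {
  const oneYearSeconds = 60 * 60 * 24 * 365;
  document.cookie = `${SEEN_COOKIE}=1; max-age=${oneYearSeconds}; path=/`;
}

// Gate shared by all 10 category overlays: whichever fires first sets the
// cookie, and the targeting check excludes everyone who already has it.
function maybeShowOverlay(show: () => void): void {
  if (hasSeenAnyOverlay()) return;
  show();
  markOverlaySeen();
}
```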

Then this happened.

[Image: Rooster dashboard. Ignore the layout… the Convertables dashboard is much prettier now :)]

After 10 days of data, our conversion rate was a combined 1.36%, 8.8% higher than the baseline. It eventually crept its way to 1.42% after an additional 250,000 views. Still nowhere near what we’d hoped.

So what went wrong?

We surmised that just because an offer is “relevant” doesn’t mean it’s compelling. Admittedly, not all of the 10 resources were on par with The 23 Principles of Attention-Driven Design, the ebook we originally offered in all categories.

That said, this experiment provided an unexpected benefit: we could now see our conversion rates by category instead of just one big number for the whole blog. This would serve us well on future tests.

Observations

  • Just because an offer is relevant doesn’t mean it’s good.
  • Conversion rates vary considerably between categories.

Experiment #4: Resource vs. Resource

“Just because it’s relevant doesn’t mean it’s good.”

This lesson inspired a simple objective for our next task: Improve the offers in our underperforming categories.

We decided to test new offers across five categories that had low conversion rates and high traffic volume:

  1. A/B Testing and CRO (0.57%)
  2. Email (1.24%)
  3. Lead Gen and Content Marketing (0.55%)
Note: We used the same overlay for the A/B Testing and CRO categories, as well as the Lead Gen and Content Marketing Categories.

Hypothesis
Since we believe the resources we’re offering in the categories of A/B testing, CRO, Email, Lead Gen and Content Marketing are less compelling than resources we offer in other categories, we expect to see increased conversion rates when we test new resources in these categories.

For the previous experiments mentioned in this post, we compared sequential periods. For this one, we took things a step further and jury-rigged an A/B testing system using Visual Website Optimizer and two Unbounce accounts.

And after finding what we believed to be more compelling resources to offer, the new test was launched.

[Image: topic experiment overlays]

We saw slightly improved results in the A/B Testing and CRO categories, although the improvement was not statistically significant. For the Email category, we saw a large drop-off.

In the Lead Gen and Content Marketing categories, however, there was a dramatic uptick in conversions and the results were statistically significant. Progress!

Observations

  • Not all content is created equal; some resources are more desirable to our audience.

Experiment #5: Clickthrough vs. Lead Gen Overlays

Although progress was made in our previous test, we still hadn’t solved the problem from our second experiment.

While having the four fields made each conversion more valuable to us, it still cut our conversion rate roughly in half (from 2.65% to 1.25% back in experiment #2).

We’d now worked our way up to a baseline of 1.75%, but still needed a strategy for reducing form friction.

The answer lay in a new tactic for using overlays that we dubbed traffic shaping.

Traffic Shaping: Using clickthrough overlays to incentivize visitors to move from low-converting to high-converting pages.

Here’s a quick illustration:

[Image: traffic shaping diagram]

Converting to this format would require us to:

  1. Redesign our exit overlays
  2. Build a dedicated landing page for each overlay
  3. Collect leads via the landing pages

Basically, we’d be using the overlays as a bridge to move readers from “ungated” content (a blog post) to “gated” content (a free video that required a form submission to view). Kinda like playing ‘form field hot potato’ in a modern day version of Pipe Dream.

Hypothesis
Because “form friction” reduces conversions, we expect that removing form fields from our overlays will increase engagement (enough to offset the drop off we expect from adding an extra step). To do this, we will redesign our overlays to clickthrough (no fields), create a dedicated landing page for each overlay and add the four-field form to the landing page. We’ll measure results in Unbounce.

By this point, we were using Unbounce to build the entire campaign. The overlays were built in Convertables, and the landing pages were created with the Unbounce landing page builder.

We decided to test this out in our A/B Testing and CRO as well as Lead Gen and Content Marketing categories.

[Image: clickthrough overlays]

After filling out the form, visitors would either be given a secure link for download (PDF) or taken to a resource page where their video would play.

Again, for this to be successful the conversion rate on the overlays would need to increase enough to offset the drop off we expected by adding the extra landing page step.

These were our results after 21 days.

[Image: clickthrough overlay results after 21 days]

Not surprisingly, engagement with the overlays increased significantly. I stress the word “engagement” and not “conversion,” because our goal had changed from a form submission to a clickthrough.

In order to see a conversion increase, we needed to factor in the percentage of visitors who would drop off once they reached the landing page.

A quick check in Unbounce showed us landing page drop-off rates of 57.7% (A/B Testing/CRO) and 25.33% (Lead Gen/Content Marketing). Time for some grade 6 math…

[Image: net lead calculations]

Even with significant drop-off in the landing page step, overall net leads still increased.
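That grade 6 math generalizes to a one-line check: the two-step flow wins whenever overlay clickthrough rate × (1 − landing page drop-off) beats the old on-overlay form conversion rate. The drop-off rates below come from the post; the clickthrough rates are hypothetical stand-ins for the figures in the screenshot:

```typescript
// Net lead rate for the two-step flow: overlay clickthrough × landing page
// completion. Drop-off rates are from the post; CTRs are hypothetical.
function netLeadRate(overlayCtr: number, landingPageDropoff: number): number {
  return overlayCtr * (1 - landingPageDropoff);
}

// A/B Testing / CRO: 57.7% of clickers abandoned the landing page
const croNet = netLeadRate(0.05, 0.577); // ≈ 0.0212, i.e. ~2.1% net

// Lead Gen / Content Marketing: only 25.33% abandoned
const leadGenNet = netLeadRate(0.05, 0.2533); // ≈ 0.0373, ~3.7% net

// The two-step flow wins whenever the net rate beats the old single-step
// form conversion rate (the 1.75% baseline at this point in the story).
console.log(croNet > 0.0175, leadGenNet > 0.0175); // true true
```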

Our next step would be applying the same format to all blog categories, and then measuring overall results.

Onward!

All observations

  • Offering specific, tangible resources (vs. non-specific promises) can positively affect conversion rates.
  • Increasing the number of form fields in overlays can cause friction that reduces conversion rates.
  • Overlay sizes exceeding 800×500 can be too large for some browsers and reduce load:view ratio (and overall impressions).
  • Just because an offer is relevant doesn’t mean it’s good.
  • Conversion rates vary considerably between blog categories.
  • Not all content is created equal; some resources are more desirable to our audience.
  • “Form friction” can vary significantly depending on where your form fields appear.

Stay tuned…

We’re continuing to test new triggers and targeting options for overlays, and we want to tell you all about it.

So what’s in store for next time?

  1. The Trigger Test — What happens when we test our “on exit” trigger against a 15-second time delay?
  2. The Referral Test — What happens when we show different overlays to users from different traffic sources (e.g., social vs. organic)?
  3. New vs. Returning Visitors — Do returning blog visitors convert better than first-time visitors?


More: 

Lessons Learned From 2,345,864 Exit Overlay Visitors

First CRO Certification Course in Italy – An Initiative Supported by VWO

[Video: http://feedproxy.google.com/~r/ILoveSplitTesting/~5/7jRxHo7WIRI/madri_ok.mp4]

How can you learn Conversion Rate Optimization in a way that you can apply it easily to any project? How can you turn a low-performing website into a highly remunerative one without redesigning it from scratch?

Those are just two of the questions that Luca Catania, Director of Madri Internet Marketing & Head of Marketing of Catchi, answered during the first CRO Certification Course in Italy, an initiative supported by VWO.

The course targeted a wide audience, from people with no experience in CRO to experts in the field. Attendees included C-suite executives, entrepreneurs, heads of marketing, managing directors and consultants from more than 20 different industries.

The objective of the training was to teach participants an innovative step-by-step approach to CRO: a system they can apply to any business to increase conversion rates, leads and online sales.

Participants got the chance to learn how to optimize their websites in a real-time setup. Using the VWO platform live in the course allowed the participants to understand and experience how the software can help optimize websites and achieve better conversions.

Do you want to improve your CRO skills?

You can read interesting case studies and find the dates of upcoming courses in Europe/Australasia by following Luca Catania on LinkedIn.

The post First CRO Certification Course in Italy – An Initiative Supported by VWO appeared first on VWO Blog.

Jump to original:

First CRO Certification Course in Italy – An Initiative Supported by VWO

The Beginner’s Guide To Making Facebook Advertising Convert


Most people use Facebook ads to pump up their visitor numbers. Wait, what? Let me be clear. They don’t wake up with that goal in mind, but it’s usually what ends up happening. I see this all the time with clients I work with. The problem isn’t that they can’t set up the Facebook ad campaigns or get visitors to their websites. It’s the next step where things go haywire. The visitors they’re paying for don’t download ebooks, sign up for accounts or buy products. The reason is simply that the thing they offer doesn’t appeal to those…

The post The Beginner’s Guide To Making Facebook Advertising Convert appeared first on The Daily Egg.

See the original article here: 

The Beginner’s Guide To Making Facebook Advertising Convert

Infographic: How To Turn Browsers Into Buyers


98% of website visitors don’t convert. That’s right. On average, only 2% of your website visitors convert and drive your online revenue. Imagine if you could increase that number just a little bit. You could potentially increase your online revenue by 50 to 100 percent or more. But too many people think conversion rate optimization is testing different button colors on landing pages. In reality, it’s so much bigger than that. If you want to better understand the powerful process that it takes to increase your conversion rate, then you’ll love this infographic. It’s a simple visual that will…

The post Infographic: How To Turn Browsers Into Buyers appeared first on The Daily Egg.

Excerpt from: 

Infographic: How To Turn Browsers Into Buyers

“The more tests, the better!” and other A/B testing myths, debunked

Reading Time: 8 minutes

Will the real A/B testing success metrics please stand up?

It’s 2017, and most marketers understand the importance of A/B testing. The strategy of applying the scientific method to marketing to prove whether an idea will have a positive impact on your bottom line is no longer novel.

But, while the practice of A/B testing has become more and more common, too many marketers still buy into pervasive A/B testing myths. #AlternativeFacts.

This has been going on for years, but the myths continue to evolve. Other bloggers have already addressed myths like “A/B testing and conversion optimization are the same thing”, and “you should A/B test everything”.

As more A/B testing ‘experts’ pop up, A/B testing myths have become more specific. Driven by best practices and tips and tricks, these myths represent ideas about A/B testing that will derail your marketing optimization efforts if left unaddressed.

Avoid the pitfalls of ad-hoc A/B testing…

Get this guide, and learn how to build an optimization machine at your company. Discover how to use A/B testing as part of your bigger marketing optimization strategy!



By entering your email, you’ll receive bi-weekly WiderFunnel Blog updates and other resources to help you become an optimization champion.



But never fear! With the help of WiderFunnel Optimization Strategist Dennis Pavlina, I’m going to rebut four A/B testing myths that we hear over and over again. Because there is such a thing as a successful, sustainable A/B testing program…

Into the light, we go!

Myth #1: The more tests, the better!

A lot of marketers equate A/B testing success with A/B testing velocity. And I get it. The more tests you run, the faster you run them, the more likely you are to get a win, and prove the value of A/B testing in general…right?

Not so much. Obsessing over velocity is not going to get you the wins you’re hoping for in the long run.


The key to sustainable A/B testing output is to find a balance between short-term (maximum testing speed) and long-term (testing for data collection and insights).

– Michael St Laurent, Senior Optimization Strategist, WiderFunnel

When you focus solely on speed, you spend less time structuring your tests, and you will miss out on insights.

With every experiment, you must ensure that it directly addresses the hypothesis. You must track all of the most relevant goals to generate maximum insights, and QA all variations to ensure bugs won’t skew your data.


An emphasis on velocity can create mistakes that are easily avoided when you spend more time on preparation.

– Dennis Pavlina, Optimization Strategist, WiderFunnel

Another problem: If you decide to test many ideas quickly, you are sacrificing your ability to really validate and leverage an idea. One winning A/B test may mean quick conversion rate lift, but it doesn’t mean you’ve explored the full potential of that idea.

You can often apply the insights gained from one experiment when building out the strategy for another. Plus, those insights provide additional evidence for testing a particular concept. Lining up a huge list of experiments at once, without taking these past insights into account, can leave your testing program more scattershot than evidence-based.

While you can make some noise with an ‘as-many-tests-as-possible’ strategy, you won’t see the big business impact that comes from a properly structured A/B testing strategy.

Myth #2: Statistical significance is the end-all, be-all

A quick definition

Statistical significance: The probability that a certain result is not due to chance. At WiderFunnel, we use a 95% confidence level. In other words, we can say that there is a 95% chance that the observed result is because of changes in our variation (and a 5% chance it is due to…well…chance).

If a test has a confidence level of less than 95% (positive or negative), it is inconclusive and does not have our official recommendation. The insights are deemed directional and subject to change.
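For the curious, a confidence number like this typically comes from a two-proportion z-test along the following lines. This is a generic sketch of the math, not WiderFunnel’s actual tooling, and the visitor counts in the example are made up:

```typescript
// Two-sided two-proportion z-test: the math behind a "95% confidence" readout.
// Returns the probability that the observed difference is not due to chance.
function confidenceLevel(
  conversionsA: number, visitorsA: number,
  conversionsB: number, visitorsB: number,
): number {
  const pA = conversionsA / visitorsA;
  const pB = conversionsB / visitorsB;
  const pooled = (conversionsA + conversionsB) / (visitorsA + visitorsB);
  const stdErr = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  const z = Math.abs(pB - pA) / stdErr;
  return erf(z / Math.SQRT2); // two-sided normal probability
}

// Abramowitz & Stegun approximation of the error function (max error ~1.5e-7).
function erf(x: number): number {
  const t = 1 / (1 + 0.3275911 * Math.abs(x));
  const poly =
    ((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t - 0.284496736) * t +
      0.254829592) * t;
  const y = 1 - poly * Math.exp(-x * x);
  return x < 0 ? -y : y;
}

// Made-up example: 200 vs. 235 conversions from 10,000 visitors each.
console.log(confidenceLevel(200, 10_000, 235, 10_000));
// ≈ 0.91: inconclusive at the 95% bar, despite a 17.5% observed lift
```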

Ok, here’s the thing about statistical significance: It is important, but marketers often talk about it as if it were the only determinant for completing an A/B test. In actuality, you cannot view it in a silo.

For example, a recent experiment we ran reached statistical significance three hours after it went live. Because statistical significance is viewed as the end-all, be-all, a result like this can be exciting! But, in three hours, we had not gathered a representative sample size.

You should not wait for a test to be significant (because it may never happen) or stop a test as soon as it is significant. Instead, you need to wait for the calculated sample size to be reached before stopping a test. Use a test duration calculator to better understand when to stop a test.

– Claire Vignon Keser

After 24 hours, the same experiment had dropped to a confidence level of 88%, meaning that there was now only an 88% likelihood that the difference in conversion rates was not due to chance – well below the threshold for statistical significance.

Traffic behaves differently over time for all businesses, so you should always run a test for full business cycles, even if you have reached statistical significance. This way, your experiment has taken into account all of the regular fluctuations in traffic that impact your business.

For an e-commerce business, a full business cycle is typically a one-week period; for subscription-based businesses, this might be one month or longer.
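To make that advice concrete, here is the standard pre-test sample size approximation. The 95% confidence level matches WiderFunnel’s stated threshold; the 80% power figure is an assumption, since the post doesn’t state one:

```typescript
// Visitors needed per variation to detect a given relative lift with a
// two-sided test at 95% confidence and 80% power (the power level is assumed).
function sampleSizePerVariation(baselineRate: number, relativeLift: number): number {
  const zAlpha = 1.96; // 95% confidence, two-sided
  const zBeta = 0.84;  // 80% power
  const p1 = baselineRate;
  const p2 = baselineRate * (1 + relativeLift);
  const pBar = (p1 + p2) / 2;
  const numerator = Math.pow(
    zAlpha * Math.sqrt(2 * pBar * (1 - pBar)) +
      zBeta * Math.sqrt(p1 * (1 - p1) + p2 * (1 - p2)),
    2,
  );
  return Math.ceil(numerator / Math.pow(p2 - p1, 2));
}

// Example: a 1.75% baseline conversion rate, hoping to detect a +20% lift
console.log(sampleSizePerVariation(0.0175, 0.2)); // ≈ 24,200 visitors per arm
```

Whichever is longer, the calculated sample size or a full business cycle, should set the minimum duration of your test.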

Myth #2, Part II: You have to run a test until it reaches statistical significance

As Claire pointed out, this may never happen. And it doesn’t mean you should walk away from an A/B test, completely.

As I said above, anything below 95% confidence is deemed subject to change. But, with testing experience, an expert understanding of your testing tool, and by observing the factors I’m about to outline, you can discover actionable insights that are directional (directionally true or false).

  • Results stability: Is the conversion rate difference stable over time, or does it fluctuate? Stability is a positive indicator (a rough stability check is sketched below).
[Image: Check your graphs! Are conversion rates crossing? Are the lines smooth and flat, or are there spikes and valleys?]
  • Experiment timeline: Did I run this experiment for at least a full business cycle? Did conversion rate stability last throughout that cycle?
  • Relativity: If my testing tool uses a t-test to determine significance, am I looking at the hard numbers of actual conversions in addition to conversion rate? Does the calculated lift make sense?
  • LIFT & ROI: Is there still potential for the experiment to achieve X% lift? If so, you should let it run as long as it is viable, especially when considering the ROI.
  • Impact on other elements: If elements outside the experiment are unstable (social shares, average order value, etc.) the observed conversion rate may also be unstable.

You can use these factors to make the decision that makes the most sense for your business: implement the variation based on the observed trends, abandon the variation based on observed trends, and/or create a follow-up test!
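Of the factors above, results stability is the easiest to automate. A minimal sketch, assuming you can export equal-length arrays of daily visitor and conversion counts per variation (the seven-day window and one-point tolerance are arbitrary choices):

```typescript
// Results-stability check: does the cumulative lift still move near the end?
interface DailyStats {
  visitors: number;
  conversions: number;
}

// Cumulative relative lift of variation B over control A after each day.
function cumulativeLift(a: DailyStats[], b: DailyStats[]): number[] {
  let va = 0, ca = 0, vb = 0, cb = 0;
  return a.map((dayA, i) => {
    va += dayA.visitors;
    ca += dayA.conversions;
    vb += b[i].visitors;
    cb += b[i].conversions;
    return (cb / vb) / (ca / va) - 1;
  });
}

// "Stable" if the cumulative lift barely moved over the last business cycle.
function isStable(lifts: number[], windowDays = 7, tolerance = 0.01): boolean {
  const tail = lifts.slice(-windowDays);
  return Math.max(...tail) - Math.min(...tail) <= tolerance;
}
```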

Myth #3: An A/B test is only as good as its effect on conversion rates

Well, if conversion rate is the only success metric you are tracking, this may be true. But you’re underestimating the true growth potential of A/B testing if that’s how you structure your tests!

To clarify: Your main success metric should always be linked to your biggest revenue driver.

But, that doesn’t mean you shouldn’t track other relevant metrics! At WiderFunnel, we set up as many relevant secondary goals (clicks, visits, field completions, etc.) as possible for each experiment.


This ensures that we aren’t just gaining insights about the impact a variation has on conversion rate, but also the impact it’s having on visitor behavior.

– Dennis Pavlina

When you observe secondary goal metrics, your A/B testing becomes exponentially more valuable because every experiment generates a wide range of secondary insights. These can be used to create follow up experiments, identify pain points, and create a better understanding of how visitors move through your site.

An example

One of our clients provides an online consumer information service — users type in a question and get an Expert answer. This client has a 4-step funnel. With every test we run, we aim to increase transactions: the final, and most important conversion.

But, we also track secondary goals, like click-through-rates, and refunds/chargebacks, so that we can observe how a variation influences visitor behavior.

In one experiment, we made a change to step one of the funnel (the landing page). Our goal was to set clearer visitor expectations at the beginning of the purchasing experience. We tested 3 variations against the original, and all 3 won, resulting in increased transactions (hooray!).

The secondary goals revealed important insights about visitor behavior, though! Firstly, each variation resulted in substantial drop-offs from step 1 to step 2…fewer people were entering the funnel. But, from there, we saw gradual increases in clicks to steps 3 and 4.

Our variations seemed to be filtering out visitors without strong purchasing intent. We also saw an interesting pattern with one of our variations: It increased clicks from step 3 to step 4 by almost 12% (a huge increase), but decreased actual conversions by 1.6%. This result was evidence that the call-to-action on step 4 was extremely weak (which led to a follow-up test!)

[Image: You can see how each variation fared against the Control in this funnel analysis.]

We also saw large decreases in refunds and chargebacks for this client, which further supported the idea that the visitors who dropped off were the “wrong” visitors – the ones unlikely to become satisfied customers.

This is just a taste of what every A/B test could be worth to your business. The right goal tracking can unlock piles of insights about your target visitors.

Myth #4: A/B testing takes little to no thought or planning

Believe it or not, marketers still think this way. They still view A/B testing on a small scale, in simple terms.

But A/B testing is part of a greater whole—it’s one piece of your marketing optimization program—and you must build your tests accordingly. A one-off, ad-hoc test may yield short-term results, but the power of A/B testing lies in iteration, and in planning.

[Image: A/B testing is just a part of the marketing optimization machine.]

At WiderFunnel, a significant amount of research goes into developing ideas for a single A/B test. Even tests that may seem intuitive, or common-sensical, are the result of research.

[Image: The WiderFunnel strategy team gathers to share and discuss A/B testing insights.]

Because, with any test, you want to make sure that you are addressing areas within your digital experiences that are the most in need of improvement. And you should always have evidence to support your use of resources when you decide to test an idea. Any idea.

So, what does a revenue-driving A/B testing program actually look like?

Today, tools and technology allow you to track almost any marketing metric. Meaning, you have an endless sea of evidence that you can use to generate ideas on how to improve your digital experiences.

Which makes A/B testing more important than ever.

An A/B test shows you, objectively, whether or not one of your many ideas will actually increase conversion rates and revenue. And, it shows you when an idea doesn’t align with your user expectations and will hurt your conversion rates.

And marketers recognize the value of A/B testing. We are firmly in the era of the data-driven CMO: Marketing ideas must be proven, and backed by sound data.

But results-driving A/B testing happens when you acknowledge that it is just one piece of a much larger puzzle.

One of our favorite A/B testing success stories is that of DMV.org, a non-government content website. If you want to see what a truly successful A/B testing strategy looks like, check out this case study. Here are the high level details:

We’ve been testing with DMV.org for almost four years. In fact, we just launched our 100th test with them. For DMV.org, A/B testing is a step within their optimization program.

Continuous user research and data gathering informs hypotheses that are prioritized and created into A/B tests (that are structured using proper Design of Experiments). Each A/B test delivers business growth and/or insights, and these insights are fed back into the data gathering. It’s a cycle of continuous improvement.

And here’s the kicker: Since DMV.org began A/B testing strategically, they have doubled their revenue year over year, and have seen an over 280% conversion rate increase. Those numbers kinda speak for themselves, huh?

What do you think?

Do you agree with the myths above? What are some misconceptions around A/B testing that you would like to see debunked? Let us know in the comments!

The post “The more tests, the better!” and other A/B testing myths, debunked appeared first on WiderFunnel Conversion Optimization.

Excerpt from:

“The more tests, the better!” and other A/B testing myths, debunked