Seven Answers Conversion Optimizers Want to Know

Have you been to Bucharest?

I hadn’t until last week, and I have to say, I loved it!

The GPeC eCommerce Summit organizers invited me to keynote at their 500+ attendee conference and, due to popular demand, I also did a workshop the following day with 200+ registrants.

I wasn’t sure what to expect, but was pleasantly surprised by the quality of content, merchants and technology environment there. They’re also a very warm, inviting and open community of marketers. If you get a chance, I totally recommend checking out Romania on your next European vacation.

During the event, one of the organizers, Raluca Georgescu, interviewed me and asked me seven very insightful questions that you might find interesting.

In this 8-minute interview, we discuss:

  1. Why should conversion optimization not be approached as a project?
  2. What are typical mistakes online shops make?
  3. What are the best types of tips for conversion optimization?
  4. At which stage of the conversion funnel should marketers focus their effort if they have limited resources?
  5. How should you prioritize optimization?
  6. How is the conversion optimization process for mobile different?
  7. What trends will online marketers benefit from this year? Social media, content marketing? (Where I reveal the biggest content marketing problem and opportunity)

Take 8 min to check it out.

The post Seven Answers Conversion Optimizers Want to Know appeared first on WiderFunnel Marketing Conversion Optimization.


An A/B Test That Will Shatter Your Notions About Privacy Policies

The most exciting A/B tests are the ones that bust myths, defy general best practices, and make jaws drop.


The following case study is in the league of such intuition-defying A/B tests.

Test Information
Company: The Solution For Diabetes
Category: Health
Goal: Sign-ups
Conversion Rate: -24%

The Company

TheSolutionForDiabetes.com, as the name suggests, is a website that offers natural remedies for reversing diabetes.

Among the various solutions it offers is a free downloadable e-book on ways to control Type 2 diabetes.

The Goal

The website had a landing page for the e-book with an opt-in box asking for visitors’ e-mail addresses. Here’s how the landing page originally looked.

Control

The Test

Martin Malmberg of TheSolutionForDiabetes.com wanted to see if adding a line about respecting the privacy of the visitors would affect the sign-ups. So he used Visual Website Optimizer to set up a simple A/B test. He created a variation which had the text — “We respect your privacy” — just below the call to action (CTA) button.

Variation

A/B Test Result

Martin’s hunch was correct: adding the text about privacy did affect conversions, but in a surprisingly negative way. The version with the privacy text registered a 24.41% drop in conversions. The test ran for over a month, with the original recording a 99% chance to beat the variation.
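
The raw visitor and conversion counts behind this test aren’t published, so the numbers in the sketch below are hypothetical placeholders. It simply shows one common way to arrive at a “chance to beat original” figure like the 99% reported above, by comparing Beta posteriors for the two conversion rates; this is not necessarily the exact calculation Visual Website Optimizer performs.

```python
import numpy as np

def chance_to_beat(conv_a, n_a, conv_b, n_b, samples=200_000, seed=0):
    """Estimate P(conversion rate A > conversion rate B) from raw counts."""
    rng = np.random.default_rng(seed)
    post_a = rng.beta(conv_a + 1, n_a - conv_a + 1, samples)  # Beta posterior for A
    post_b = rng.beta(conv_b + 1, n_b - conv_b + 1, samples)  # Beta posterior for B
    return (post_a > post_b).mean()

# Hypothetical placeholder counts: original page vs. "We respect your privacy" variation
control = {"conversions": 260, "visitors": 2000}
variation = {"conversions": 196, "visitors": 2000}

p_beat = chance_to_beat(control["conversions"], control["visitors"],
                        variation["conversions"], variation["visitors"])
ctrl_rate = control["conversions"] / control["visitors"]
var_rate = variation["conversions"] / variation["visitors"]
relative_change = (var_rate / ctrl_rate) - 1

print(f"Chance the original beats the variation: {p_beat:.1%}")
print(f"Relative change in conversion rate: {relative_change:+.1%}")
```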

Comparison image

What really happened here?

You might be wondering what really happened here. Using a trust indicator or copy that addresses visitors’ fears is usually considered a best practice. And why not? According to a survey, 36% of online customers consider ‘identity fraud’ their most prominent worry.

Then why did adding “We respect your privacy” right below the call to action (CTA) button decrease conversions in this case?

Possible reasons why the Variation failed

1) The privacy text introduced fear

When asked why he thought the variation failed, Martin said, “I guess people started to worry about spam and privacy concerns after reading such a text. Whereas, the thought doesn’t even enter their heads without that text. So, at least in my niche, with my traffic sources, it seems best not to use such a text, even when worded positively.”

So instead of assuaging visitors’ apprehensions, the text had the opposite effect: it planted seeds of distrust and worry in the minds of visitors.

ContentVerve ran similar privacy policy experiments on a sign-up form last year. In one of the experiments, it was found that the version with a privacy policy saying “100% privacy – We will never spam you” decreased sign-ups by 18% when pitted against a version with no privacy text.

2) The text acts as a distraction

This is just an extension of the first point. The privacy text right below the CTA might have distracted visitors and taken the focus away from the offer. It’s usually considered good practice to keep the landing page as focused on the offer as possible.

Your views

Why do you think the version with the privacy policy failed? We would love to hear your views.

The post An A/B Test That Will Shatter Your Notions About Privacy Policies appeared first on Visual Website Optimizer Blog.


Your Intuition is Wrong – A/B Testing Results that Surprised the Experts

One of the things I absolutely love about split testing is its inherent ability to humble the opinions of even the most seasoned testers. Sometimes even the most well-researched hypotheses fail.

This is the strongest reason why companies must test everything from offer copy to page design, instead of relying on gut instinct or personal preferences. In this post, I’ll share some case studies that either saw huge opinion disparity among the WhichTestWon community, or whose results absolutely shocked our editorial team and Testing Awards’ judges.

Social Proof Is Not the End-All-Be-All

Can you guess which of these two versions of an in-line form generated more opt-ins for a well-known Web design blog?

With social proof vs. without social proof

As of today, 71% of the WhichTestWon community picked the version with the social proof copy as the winner. When I present this test to live audiences at conferences, usually 90-95% of attendees pick the social proof version.

However, they are all wrong. The variation without the subscriber count saw 122% more newsletter opt-ins than its social proof counterpart. In a world where we are jaded by the user counts of Facebook, Twitter, and the like, it seems that 14k wasn’t a compelling enough number to get prospects to act.

Normally we see companies just add social proof without testing it because virtually every blog post and ‘social media expert’ has told them it will help conversions. Thankfully the team at this site tested this before implementation, or they would have missed out on a lot of opt-ins – more than double the opt-ins in fact.

Don’t get me wrong, there is a ton of value in social proof. For one, it is great at reducing a visitor’s anxiety and can give your brand a sense of authority. The trick is finding out where it is best to publish your social stats and what exactly is worth sharing. Should you post newsletter subscribers, Facebook likes, awards, or all of the above? Simply put: test it out. Never add something blindly; you don’t know how it will impact the bottom line.

Still not convinced? A VWO customer saw a similar result recently on its product page. Read the whole story here.

Icons May Be Trending, but They Might Hurt Conversions

Icons have been making a major comeback in web design. Overall, icons have been useful, especially when they are used as a replacement for bullets in stylized lists. The team at Build.com wanted to find out whether icons would be a useful navigational tool…the results surprised them.

Here are the two versions of the header they tested, one with icons and one without:

Without icons

With icons

The challenger variation included icons that represented different category pages on the site. The team believed that an increased focus on navigation with their most visited categories would increase interactions and sales. However, the version without the icons saw 21% more product purchases.

Why? We suspect that although the icons provided a sleek navigation pane, overall they likely added more clutter that confused the visitor.

Security Seal on a Lead Gen Form Test

The highest point of friction on any lead generation page is the form itself. You need to identify the optimal number of form fields, choose an intuitive design, and add visible privacy policies and/or security icons to reduce anxiety.

These are well-known best practices that all lead gen marketers understand… and that’s probably why 74% of the WhichTestWon community guessed the wrong winner for this A/B test.

Without trust seal vs. with trust seal

The variation without the TRUSTe logo next to the button got 12.6% more completed forms. Yes, the ‘submit’ button did shrink to accommodate the TRUSTe logo, but we strongly suspect the primary cause of this lift was the logo itself.

Trust seals can be essential to your conversion rate; the real trick is knowing where and when to place them.

In this particular circumstance, the TRUSTe logo was the wrong security seal at the wrong time. Visitors are used to seeing this seal, and others like it, directly in a shopping cart, not on a top-of-funnel lead generation form. It’s quite likely that many of them suspected a payment transaction when they saw the trust seal here.

Instead of using a security seal, the team could have tested providing assurance by adding a simple fine-print text link to the privacy policy.

Remember, context is the key!

If Your Conversion Rates are Falling, Put on a Happy Face?

CRO specialists and designers love using faces! I get it; faces are the first thing the eye identifies when it looks at a web page. Numerous eye tracking studies support this claim.

However, sometimes a human face can be too much of a distraction. So, whenever you add faces to your page, you need to test to make sure they aren’t competing with your important headlines and calls to action (CTAs).

Here’s an example:

Without human face

With human face

The version without the image won 24% more form completions. This wasn’t a perfectly clean test. There were some slight alterations in copy but nothing too dramatic. To tell you the truth, I’m more of a fan of the story behind this test than the actual unexpected results. However, it’s not the first or the last test we’ve seen where removing a face increased conversions.

By the way: I am happy that the team used the photo of an actual employee rather than a stock photo. Models and stock photos tend to get even worse conversions than “real” people.

What’s perhaps most amazing is that, at the time of this split test, HubSpot was about to make it a mandatory practice to include a person’s image on each of its landing pages. On some level this makes sense: they had found that some of their pages performed better with people’s pictures. However, what’s true for some pages may not be true for all pages. Luckily, this test cast a seed of doubt and the company changed its design mandate.

Before you create any new landing page or design policies, please test beforehand…you have no idea just how many conversions you could leave on the table.

Video Icons – Product-Centric or Person-Centric?

Here is another case that tests whether using a face is appropriate.

Person-centric video thumbnails

Product-centric video thumbnails

The version of this Autodesk product page that used faces got 50% fewer video clicks. Nothing else on this page changed except for the video preview image. I am not anti-faces on websites; I simply want you to test before you implement!

Needless to say, the testing team was surprised by the results, so they ran a user survey to try to figure it out. The responses showed that Autodesk’s prospective buyers were more interested in seeing how the product worked than in hearing individuals talk about the product.

This comes down to knowing your audience and remembering that best practices are not one-size-fits-all!

In Summary

Leaders in the testing field have all been stumped by unexpected results before, and will be stumped again. The trick is to understand what to do after your test goes counter to your hypothesis or flat-lines.

Your next steps may include evaluating a litany of things such as your hypothesis, technology, source traffic, device…the list goes on. You need to learn if the test itself was flawed – or if your understanding of what your visitors really want from the page was flawed. Either way, you’ve learned something valuable.

Remember, testing is an evolving process; future iterations are born from our successes and our failures.

Keep testing, my friends! There are so many variables to consider while running a test that it is no wonder we often see lifts or losses where we least expect them.

The post Your Intuition is Wrong – A/B Testing Results that Surprised the Experts appeared first on Visual Website Optimizer Blog.


Measuring an Inbound Campaign through the Conversion Funnel

If you thought measuring the success of your inbound campaign was a tedious job, don’t worry, you are not alone. According to a study conducted by HubSpot, 25% of marketing professionals named ‘proving ROI’ as the biggest challenge they face.

Inbound marketing challenges

For measuring success, it is important to understand that the effectiveness of an inbound campaign is the collective result of various activities. Let’s have a look at how we can evaluate the success of an inbound campaign across various stages of the conversion funnel.

Inbound Funnel

Measuring the Inbound Campaign at Top of the Funnel (ToFU)

The primary objective of your activities at ToFU is to attract a greater share of the target market. Hence, all metrics at this stage must focus on what percentage of the audience you are reaching. Here’s a list of what you should measure:

  • Growth in Traffic
    You know your inbound campaign is doing well if the number of visitors to your website increases during the campaign period. A simple way to keep track of this is through Google Analytics. Follow this path in Google Analytics to see the growth in your visitor traffic: Reporting -> Audience -> Overview

    Website traffic

    Another important factor you should keep in mind while analyzing website traffic is the percentage of new vs. returning visitors. Repeat visitors indicate loyalty to your website. A low rate of repeat visitors means that your inbound campaign does not offer long-term benefit to users.

    Visitor comparison

    In order to keep activities in line with the objectives, you must focus on maintaining a higher percentage of new visitors at ToFU. You can focus on generating greater visitor loyalty during later stages of the funnel.

  • Sources of Traffic
    Besides growth in traffic, you must also keep in mind where your visitors are coming from. Analyzing traffic sources tells you whether your SEO efforts are bearing fruit. A good chunk of organic traffic is indicative of well-performing keywords. On the other hand, referral traffic helps you gauge the effectiveness of your link-building efforts. You should keep an eye on the ‘referring URLs’ to develop a greater understanding of your sources of traffic. Follow this path in Google Analytics to have a look at the referral traffic on your website: Reporting -> Audience -> Overview -> Referral Traffic

    Referral traffic
  • Social Reach
    The most popular social metric to track is the reach of your social channels. This can be easily assessed by tracking the number of ‘likes’ on your Facebook page or the number of ‘followers’ on Twitter or LinkedIn. However, these numbers in absolute terms do not say much about success. So instead of simply looking at the number of ‘likes’ on your Facebook page, try to analyse how those ‘likes’ have grown over a period of time. Have your likes seen a sudden upward trend during a certain campaign? A comparison of growth trends will help you understand performance better. Follow this path on your Facebook page to view the ‘likes’ trend for your page: See Insights -> Likes -> Net Likes

    Social reach

    You can go through this article for tips on how you can drive your social leads through the sales funnel.

  • Blog Views and Social Shares
    Analyzing individual blog posts helps you differentiate between a good post and a bad one. By continuously monitoring the views and shares of individual posts over a period of time, you will be able to identify patterns and decide what kind of posts work best for you. Keep in mind that a highly viewed post might not result in good engagement (comments) and shares. Have a look at how our post on ‘Snackable Content’ gathered popularity on various social channels.

    Blog shares

  • Email Click Through Rate
    CTR is the most important metric for analysing e-mail marketing campaigns. A high CTR indicates that your message is clear and relevant to the target audience. However, the ideal CTR varies from one type of message to another. For example, newsletter e-mails sent to an opt-in list will have a higher CTR than a promotional message sent to the same set of customers. Hence, you should define a target CTR for each form of e-mail and try to achieve it in each campaign (a minimal calculation sketch follows this list).

    Email metrics
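
To make the idea of a per-message-type target CTR concrete, here is a minimal sketch; the campaign names, delivery counts, click counts, and target values are all invented for illustration, and the real numbers would come from your e-mail tool’s reports.

```python
# Hypothetical e-mail stats: compare each campaign's CTR against the target
# defined for its message type (newsletter vs. promotional). All values are made up.
targets = {"newsletter": 0.05, "promotional": 0.02}  # assumed target CTR per message type

campaigns = [
    {"name": "May newsletter", "type": "newsletter",  "delivered": 12000, "clicks": 684},
    {"name": "Summer promo",   "type": "promotional", "delivered": 9500,  "clicks": 152},
]

for campaign in campaigns:
    ctr = campaign["clicks"] / campaign["delivered"]
    target = targets[campaign["type"]]
    status = "on target" if ctr >= target else "below target"
    print(f"{campaign['name']}: CTR {ctr:.2%} vs. target {target:.2%} -> {status}")
```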

Measuring the Inbound Campaign at Middle of the Funnel (MoFU)

Once you have attracted a large chunk of your target audience to your offering, the next important step is to keep the audience hooked until they make the final decision to buy. Here’s how to analyse whether your inbound campaign will help generate qualified leads:

  • Social Engagement
    Analyse your social media properties to see if your audience is engaging with you or not. Facebook provides its users with a ‘Talking About This’ score which measures the level of engagement on your page.

    Facebook page engagement

    In addition, you can also measure engagement on individual messages on your Facebook page using Facebook Insights. Follow this path to view engagement on individual posts: See Insights -> Posts -> All Published Posts

    Social engagement on Facebook

  • Lead Generation and Conversion
    Conversion rates need to be tracked for various channels of your inbound campaign. At a broad level, you must link all your campaigns to a pre-defined goal (based on your conversion objective) and see which inbound campaign is performing the best according to your goals. Follow this path in Google Analytics to see the goal conversion rate for your campaigns: Reporting -> Acquisition -> Campaigns -> Conversions (All Goals)

    Goal conversion

    If you are using your blog as an inbound channel, the call-to-action (CTA) on your blog becomes an important metric to measure success. The CTA helps drive viewers of your blog to take the required action.

    Blog CTA

  • Visitor to lead ratio
    Attracting visitors from a channel is of no use unless those visitors take the required action on your landing page. The visitor-to-lead ratio is defined as the percentage of visitors who converted to leads. You can calculate this percentage for all your inbound channels and thus analyse which one is giving the best results (see the sketch after this list).

    Visitor-to-Lead Conversion = Leads Generated / Total Visitors
  • Bounce Rate and Time on Page
    You can measure the level of engagement on your blog by tracking the average time spent on the blog and the bounce rate. A high bounce rate indicates that you are not attracting the right kind of audience to your blog.

    Bounce rate
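
As promised above, here is a minimal sketch of the visitor-to-lead calculation applied per channel; the channel names and counts are made-up placeholders, and in practice the figures would come from your analytics and lead-capture tools.

```python
# Visitor-to-Lead Conversion = Leads Generated / Total Visitors, computed per channel.
# Channel names and counts are placeholders for illustration.
channels = {
    "organic search": {"visitors": 8200, "leads": 164},
    "referral":       {"visitors": 2100, "leads": 63},
    "social":         {"visitors": 3400, "leads": 34},
}

# Rank channels by their visitor-to-lead ratio, best first
ranked = sorted(channels.items(),
                key=lambda item: item[1]["leads"] / item[1]["visitors"],
                reverse=True)

for name, stats in ranked:
    ratio = stats["leads"] / stats["visitors"]
    print(f"{name}: {ratio:.1%} visitor-to-lead conversion")
```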

All said and done, we would all agree that measuring the performance of your inbound campaign is probably as important as executing it in the first place. The key to success lies in finding the best way to do it, because if you don’t measure it, you will never be able to say that it works! Have you defined measurement criteria for your inbound campaign? Do share your insights with us.

Image Credits
Impulse Creative

The post Measuring an Inbound Campaign through the Conversion Funnel appeared first on Visual Website Optimizer Blog.


New to marketing apps? Stay informed with ion interactive’s blog!

We were so thrilled to announce our brand new website earlier this month. Not only did our website have a complete redesign, but so did our blog!

If you’ve noticed that your RSS feed or subscription to the ion interactive blog has stopped updating, we’ve made it simple to get back on track. Head on over to www.ioninteractive.com/blog-summary.

If you would like to subscribe to the blog for emails on the new and exciting announcements and updates in the industry, simply enter your information in the form on the right hand side. If a blog RSS feed is more of what you’re looking for, simply click “Blog RSS” underneath the form on the right side to add ion to your RSS reader for your source of Marketing Apps news.

New to marketing apps? Want to learn more about how you can make your digital experience useful and engaging? Check our blog for the latest news, trends, and ideas in the digital marketing industry.


Testimonial Messaging A/B Test Increases Conversion Rate by 22%

This case study is Part 1 of a two-part series covering A/B tests conducted by Buildium, a property management software company whose product is used to manage more than 600,000 residential units in 31 countries worldwide. The aim of these tests was to market the product to a wider range of property managers.

Background

Buildium’s competitors have pigeonholed us by claiming we are only appropriate for small property managers (50 units or fewer)*. To combat this perception, we decided to change a few elements on our website by A/B testing different variations of our messaging.

Test Information
Company: Buildium
Category: Real Estate
Goal: Form Submission / Free Trial
Conversion Rate: +22%

We ran two separate A/B tests to see how we could improve the messaging on both the homepage and pricing page. The aim of these tests was to increase conversions by more effectively advertising the number of units that Buildium supports. In this case study, we’ll take a look at how improving our homepage testimonials improved our conversions.

Hypothesis

The test on the homepage was to see whether we could increase conversions by displaying testimonials from small, medium, and large property managers, communicating that our software works great no matter how many units you manage.

Test

The control on our website had the heading “Proof, meet pudding. We can help you too”, with a subhead that read: “We have over 8,000 happy customers across the world.” This was followed by testimonials from our customers with no mention of the customer’s unit size. It failed to bring out the fact that we serve customers ranging from small to large property managers. This is what the control looked like:

Buildium control: testimonials

In an attempt to boost free trial sign-ups, we decided to test a new presentation of our testimonials that more clearly articulated the scalability of our product. Each testimonial displayed the company’s unit count for context, and the testimonial copy itself spoke to the specific benefits that Buildium provides to a company of that size. We also introduced a new headline (Whatever your size, Buildium is the perfect fit) to echo this messaging. Though we understand that the cleanest tests change only one element on the page, for the sake of this test we chose to introduce these three changes, which all work together to form the new experience. We tested three variations of this new experience against the control, each with a different subhead. Here’s how the winning variation read:

Buildium variation: testimonials

Results

For the home page, the winning variation was the one that specifically mentioned our ability to handle small, medium, and large companies. The new headline, testimonial copy, and the addition of the unit count had a positive effect on conversions, as all of the new variations converted at a higher rate than the control. I believe the subhead of the variation above was the most successful because it set the expectation that the three testimonials below would be from companies of those three sizes. And since we followed this by mentioning the unit counts of the property managers, we were successful in getting across the message that Buildium serves more than just small businesses.

Buildium comparison

Our new marketing site has a trial form embedded directly at the bottom of the homepage, which means that we needed to track two separate conversion goals: submitting the form, or clicking the CTA to go to the free trial. Below you will find the results for each goal, with the sum of the two used to calculate the total success of each variation.
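
To illustrate the arithmetic of rolling the two goals into a single success metric, here is a minimal sketch; the visitor and conversion counts are hypothetical placeholders (Buildium’s actual figures are in the results table below).

```python
# Combine two conversion goals (form submissions + free-trial CTA clicks) into one
# conversion rate per variation. Counts are hypothetical; real figures are in the table.
variations = {
    "control":     {"visitors": 5000, "form_submits": 110, "cta_clicks": 90},
    "variation_1": {"visitors": 5000, "form_submits": 135, "cta_clicks": 109},
}

rates = {}
for name, data in variations.items():
    rates[name] = (data["form_submits"] + data["cta_clicks"]) / data["visitors"]
    print(f"{name}: combined conversion rate {rates[name]:.2%}")

lift = rates["variation_1"] / rates["control"] - 1
print(f"Relative lift over control: {lift:+.1%}")
```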

Buildium results table

Based on the VWO results, all of the variations had conversion rates higher than the control. This leads me to believe that regardless of the subhead, the addition of the new headline, testimonials, and unit count all created an experience that more effectively communicated the scalability of our product, attracting a wider range of customers.

Note*: In our industry, small property management companies generally manage up to 100 units, medium ones manage 100 to 600 units, and large ones manage more than 600 units. Our largest customer uses our software to manage over 7,000 units. There are also “extra-large” companies who might manage tens of thousands of units, but we do not market to these companies as they would be better suited with an enterprise solution.

The post Testimonial Messaging A/B Test Increases Conversion Rate by 22% appeared first on Visual Website Optimizer Blog.
