
Eating Our Own Dogfood – How To Optimize For Revenue As A SaaS Business

It wouldn’t be an exaggeration to say that we at VWO are very passionate about experimentation.

Not only have we built a product around A/B testing and conversion optimization, but we are also always looking for ways to run experiments on our own website.

Recently, we got our entire team to actively research and contribute optimization ideas for our website, and we ran multiple tests. This post is a narrative of what we did next.

Who Is This Post for?

This post will help SaaS growth hackers, marketers, and optimization experts predict the business value of a test.

The aim of this post is not only to share the tests we ran on our website, but also to introduce a revenue-based framework that predicts the business impact of an A/B test and helps you prioritize on that basis.

Revenue-Based Optimization

Need for a Model

After we prompted our team to suggest testing ideas, we had more than 30 hypotheses looking at us, but no clear way of knowing which of these to take up first. Of course, there is a range of prioritization frameworks available, but we particularly wanted one that looked at how each test would directly impact our revenue.

This framework helped us project the potential revenue impact of each test. Here’s what we did:

Step 1

We decided to identify high-impact pages and winnow out the pages that were less important for our business, that is, pages where no goal conversions take place. We looked at Google Analytics for pages with the:

  • Highest Amount of Traffic
    (We used “New Users” to exclude visits by existing customers.)
  • Highest Number of Goal Conversions
    (A goal conversion is the main conversion on your website, the one that contributes to your overall business goal. In our case, this meant all qualified lead-generating forms. A free-trial signup or a demo request qualifies a visitor as a lead with genuine interest in our product; or, as the industry popularly refers to it, a Marketing Qualified Lead.)

This gave us a list of pages that were high value in terms of either traffic generation or last touch before conversion.

We identified the following key pages:

  • Free-trial page
  • Request-a-demo page
  • Homepage
  • Pricing page
  • Features page
  • Blog pages (All)
  • Contact-us page

Step 2

Our main objective was to project the estimated increase in revenue due to a particular test. If your test increases the conversion rate by, say, 20%, what would this mean for your business and, in turn, your revenue?

This is what our marketing funnel looked like:

VWO Marketing Funnel

Note: You should use data from the most recent 3–6 months, and the average (mean) of each step. This ensures the model accurately reflects what to expect from your testing and stays relevant to your business.

For each of the “Key Pages” we identified in the first step, we also dug out the corresponding numbers at each funnel stage. We’ve explained each stage of the funnel and how it is calculated:

a) Key Page Traffic: The total number of pageviews per Key Page (new users in our case). You can find the data in Google Analytics.

b) Total Conversions: The total number of leads generated from each particular page. If there is an additional qualification your company follows, source this data from your preferred CRM or Marketing Automation software. For example, at VWO, we use Clearbit to qualify our leads in Salesforce.

c) Opportunities: The total number of opportunities generated for your sales team. This data will be available in your CRM; make sure to count qualified opportunities only.

d) Customers: The total number of new customers created in a month.

e) MRR (New): New monthly recurring revenue, that is, revenue booked on a monthly basis from new customers; you can use this to estimate annual recurring revenue (ARR) as well.

Step 3

Now that we had all the numbers we needed in our arsenal, I decided to calculate some more internal benchmarks. These gave us the performance of our marketing and sales funnel.

  1. We computed the conversion rate of a particular page, using the following formula:
    Existing conversion rate = (Total Conversions ÷ Key Page Traffic) × 100, represented as %
  2. The conversion rate of your leads into opportunities:
    (Opportunities ÷ Total Conversions) × 100, represented as %
  3. The conversion rate of opportunities into customers:
    (Customers ÷ Opportunities) × 100, represented as %
  4. The average revenue per user, or ARPU:
    Total MRR ÷ Total number of paying customers

Now all you have to do is plug these numbers into this template.
Revenue-based Testing Model
The model takes all of that data and projects how much revenue increase or decrease you can expect based on your test results. This estimate can give you a good idea of where to begin or what to prioritize in your testing.
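If you prefer code to a spreadsheet, here is a minimal sketch of the same projection, assuming the benchmarks computed in Step 3. Every number below is invented purely for illustration:

```javascript
// Minimal sketch of the revenue-based model (illustrative numbers only).
function projectMonthlyRevenue({ traffic, pageCvr, leadToOpp, oppToCustomer, arpu }) {
  const leads = traffic * pageCvr;                 // Key Page Traffic x page conversion rate
  const opportunities = leads * leadToOpp;         // leads x lead-to-opportunity rate
  const customers = opportunities * oppToCustomer; // opportunities x opportunity-to-customer rate
  return customers * arpu;                         // customers x ARPU = projected new MRR
}

const funnel = { traffic: 10000, pageCvr: 0.02, leadToOpp: 0.3, oppToCustomer: 0.2, arpu: 300 };
const baseline = projectMonthlyRevenue(funnel);

// Simulate a test that lifts the page conversion rate by 20% (0.02 -> 0.024).
const uplifted = projectMonthlyRevenue({ ...funnel, pageCvr: funnel.pageCvr * 1.2 });

console.log(uplifted - baseline); // projected incremental MRR (720 with these numbers)
```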

Step 4 (Optional)

This is where it may get tricky. At VWO, we sell both Enterprise and Standard plans. So, to be accurate, we had to model each cohort with separate data and its own conversion rates.

For example, the opportunity-creation rate for an Enterprise plan may be lower, while a Standard plan is easier to convert. You may want to decide what type of plan you want to focus on.

We, for instance, used website traffic and Alexa rank as the benchmark for lead qualification. We attributed more value to the leads that came in through key pages and prioritized them.

This led us to the next step: the qualification rate of these high-value leads. This rate may fall in the range of 30–50%, depending on your definition.

It was interesting to note that each page had a different qualification rate. For example, we get better-quality leads from our Request a Demo page than we do from our free-trial page or blog posts.

Tests Conducted:

After we had the model in place, we played around with increases and decreases in our conversion rates. This was to identify our best optimization opportunities.

The free trial pages and the homepage were among the high-priority pages in terms of impact on revenue. (Unfortunately, I can’t share the exact numbers with you.) We first looked at the hypotheses for the free trial page:

Test 1 – Free Trial Page

Our hypothesis was “Illustrating VWO features and social proof on the free trial page will compel users to sign up for the free trial.”

Here is a screenshot of what it looks like in VWO.
hypothesis-free-trial

Bonus tip: VWO has recently launched a new capability called PLAN that lets you manage and prioritize your testing hypotheses. To learn more about this capability, visit the VWO evolution page.

This is what the control looked like:

Free Trial Control

Our heatmap data also showed many users clicking through to the features page after reaching the free trial page.

Screenshot of heatmap data:

Heatmap Screenshot for test

To solve this, we created a variation that included the features we offer. Here’s a screenshot of the same.

This is our current free trial page:

Free Trial Page(New)(Variation)

We ran the test for over 2 months. The result was a 6% increase in our conversion rate, which led to increased revenue.

Test 2 – Request a Demo CTA (A/B Test)

The main CTA on the homepage had been the free trial CTA. The headline on the homepage was “A/B Testing Software for Marketers.”

The hypothesis for the test was “We will get more qualified leads through a Request a Demo CTA on the homepage.”

This is what the control looked like:

Homepage Control

We came up with more targeted copy and changed the existing CTA to Request a Demo. Here is what the variation looked like:

Homepage variation

We also wanted to change our positioning because of our foray into conversion optimization. The result: our variation beat the control with more than a 31% improvement in conversion rate.

Based on the first test, we have made the new free-trial page our main free-trial page. Based on the second, we updated our homepage.

All in all, this model helped us correctly predict the best optimization opportunities and made our testing sharper and more strategically aligned with business goals.

Let me know your experience with this model and how you go about testing.

Would love to hear your feedback on this!


How to do server-side testing for single page app optimization


Gettin’ technical.

We talk a lot about marketing strategy on this blog. But today, we are getting technical.

In this post, I team up with WiderFunnel front-end developer, Thomas Davis, to cover the basics of server-side testing from a web development perspective.

The alternative to server-side testing is client-side testing, which has arguably been the dominant testing method for many marketing teams, due to ease and speed.

But modern web applications are becoming more dynamic and technically complex. And testing within these applications is becoming more technically complex.

Server-side testing is a solution to this increased complexity. It also allows you to test much deeper. Rather than being limited to testing images or buttons on your website, you can test algorithms, architectures, and re-brands.

Simply put: If you want to test on an application, you should consider server-side testing.

Let’s dig in!

Note: Server-side testing is a tactic closely linked to single page applications (SPAs). Throughout this post, I will refer to web pages and web content within the context of a SPA. Applications such as Facebook, Airbnb, Slack, BBC, Codecademy, eBay, and Instagram are SPAs.


Defining server-side and client-side rendering

In web development terms, “server-side” refers to “occurring on the server side of a client-server system.”

The client refers to the browser, and client-side rendering occurs when:

  1. A user requests a web page,
  2. The server finds the page and sends it to the user’s browser,
  3. The page is rendered on the user’s browser, and any scripts run during or after the page is displayed.
Static app server
A basic representation of server-client communication.

The server is where the web page and other content live. With server-side rendering, the requested web page is sent to the user’s browser in final form:

  1. A user requests a web page,
  2. The server interprets the script in the page, and creates or changes the page content to suit the situation,
  3. The page is sent to the user in final form and then cannot be changed using server-side scripting.
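To make the contrast concrete, here is a bare-bones sketch of server-side rendering with Express. This is our illustration, with an assumed route and greeting content; the point is simply that the HTML is complete when it leaves the server:

```javascript
// Minimal server-side rendering sketch: the page is final before it reaches the browser.
const express = require('express');
const app = express();

app.get('/greeting', (req, res) => {
  // The server interprets the request and builds the finished page content.
  const name = req.query.name || 'visitor';
  res.send(`<html><body><h1>Hello, ${name}!</h1></body></html>`);
});

app.listen(3000);
```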

To talk about server-side rendering, we also have to talk a little bit about JavaScript. JavaScript is a scripting language that adds functionality to web pages, such as a drop-down menu or an image carousel.

Traditionally, JavaScript has been executed on the client side, within the user’s browser. However, with the emergence of Node.js, JavaScript can also be run on the server side; in practice, JavaScript executing on the server runs through Node.js.*

*Node.js is an open-source, cross-platform JavaScript runtime environment, used to execute JavaScript code server-side. It uses the Chrome V8 JavaScript engine.

In layman’s (ish) terms:

When you visit a SPA web application, the content you are seeing is either being rendered in your browser (client-side), or on the server (server-side).

If the content is rendered client-side, JavaScript builds the application HTML content within the browser, and requests any missing data from the server to fill in the blanks.

Basically, the page is incomplete upon arrival, and is completed within the browser.

If the content is being rendered server-side, your browser receives the application HTML, pre-built by the server. It doesn’t have to fill in any blanks.

Why do SPAs use server-side rendering?

There are benefits to both client-side rendering and server-side rendering, but render performance and page load time are two huge pros of the server side.

(A 1 second delay in page load time can result in a 7% reduction in conversions, according to Kissmetrics.)

Server-side rendering also enables search engine crawlers to find web content, improving SEO; and social crawlers (like the crawlers used by Facebook) do not evaluate JavaScript, making server-side rendering beneficial for social searching.

With client-side rendering, the user’s browser must download all of the application JavaScript, and wait for a response from the server with all of the application data. Then, it has to build the application, and finally, show the complete HTML content to the user.

All of which is to say: with a complex application, client-side rendering can lead to sloooow initial load times. And because client-side rendering relies on each individual user’s browser, the developer only has so much control over load time.

Which explains why some developers are choosing to render their SPAs on the server side.

But server-side rendering can disrupt your testing efforts if you are using a framework like Angular or React. (And the majority of SPAs use these frameworks.)

The disruption occurs because the version of your application that exists on the server becomes out of sync with the changes being made by your test scripts in the browser.

NOTE: If your web application uses Angular, React, or a similar framework, you may have already run into client-side testing obstacles. For more on how to overcome these obstacles, and successfully test on AngularJS apps, read this blog post.


Testing on the server side vs. the client side

Client-side testing involves making changes (the variation) within the browser by injecting JavaScript after the original page has already loaded.

The original page loads, the content is hidden, the necessary elements are changed in the background, and the ‘new’ version is shown to the user post-change. (Because the page is hidden while these changes are being made, the user is none the wiser.)
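As a generic illustration (a sketch of the pattern, not any vendor’s actual snippet, with an assumed `.signup-button` selector):

```javascript
// Generic client-side testing pattern: hide, change, reveal.
document.documentElement.style.visibility = 'hidden'; // anti-flicker: hide the page

document.addEventListener('DOMContentLoaded', function () {
  // Apply the variation change while the page is hidden.
  var cta = document.querySelector('.signup-button'); // assumed selector
  if (cta) {
    cta.textContent = 'Request a Demo';
  }
  // Reveal the modified page; the user is none the wiser.
  document.documentElement.style.visibility = '';
});
```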

As I mentioned earlier, the advantages of client-side testing are ease and speed. With a client-side testing tool like VWO, a marketer can set up and execute a simple test using a WYSIWYG editor without involving a developer.

But for complex applications, client-side testing may not be the best option: Layering more JavaScript on top of an already-bulky application means even slower load time, and an even more cumbersome user experience.

A Quick Hack

There is a workaround if you are determined to do client-side testing on a SPA. Web developers can take advantage of features like Optimizely’s conditional activation mode to make sure that testing scripts are executed only when the application reaches a desired state.

However, this can be difficult, as developers will have to take many variables into account, like location changes performed by the $routeProvider, or triggering interaction-based goals.

To avoid flicker, you may need to hide content until the front-end application has initialized in the browser, voiding the performance benefits of using server-side rendering in the first place.

WiderFunnel - client side testing activation mode
Activation Mode waits until the framework has loaded before executing your test.



When you do server-side testing, no modifications are made at the browser level. Rather, the parameters of the experiment variation (‘User 1 sees Variation A’) are determined at the server route level and hooked straight into the JavaScript application through a service provider.

Here is an example where we are testing a pricing change:
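(A hypothetical sketch of the idea: `getVariationFor` stands in for whatever your testing tool’s SDK provides, and the route assumes Express with cookie middleware.)

```javascript
// Hypothetical server-side pricing test: the variation is decided at the route level.
app.get('/pricing', (req, res) => {
  const userId = req.cookies.userId;                               // assumes cookie middleware has run
  const variation = getVariationFor('pricing_experiment', userId); // assumed SDK wrapper

  // Hook the assigned price straight into the server-rendered application.
  const monthlyPrice = variation === 'variation_b' ? 59 : 49;
  res.render('pricing', { monthlyPrice });
});
```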

“Ok, so, if I want to do server-side testing, do I have to involve my web development team?”

Yep.

But this means that testing gets folded into your development team’s workflow. And it means that it will be easier to integrate winning variations into your code base in the end.

If yours is a SPA, server-side testing may be the better choice, despite the work involved. Not only does server-side testing embed testing into your development workflow, it also broadens the scope of what you can actually test.

Rather than being limited to testing page elements, you can begin testing core components of your application’s usability like search algorithms and pricing changes.

A server-side test example!

For web developers who want to do server-side testing on a SPA, Tom has put together a basic example using the Optimizely SDK. This example is an illustration, and is not functional.

In it, we are running a simple experiment that changes the color of a button. The example is built using Angular Universal and Express. A global service provider is used to fetch the user variation from the Optimizely SDK.
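Since the original gist is not reproduced here, the following is a simplified, Express-only sketch of the same idea. It assumes the classic Optimizely Node SDK’s `createInstance`/`activate` interface, a local datafile, and a hypothetical experiment key:

```javascript
// Simplified sketch (Express only, no Angular Universal) of a server-side button-color test.
const express = require('express');
const optimizelySdk = require('@optimizely/optimizely-sdk'); // assumed package/interface

const app = express();

// In a real app, you would fetch the datafile from Optimizely's CDN.
const optimizelyClient = optimizelySdk.createInstance({
  datafile: require('./optimizely-datafile.json'),
});

app.get('/', (req, res) => {
  const userId = 'user-123'; // hard-coded for illustration (see note below)

  // activate() buckets the user into the experiment and returns the variation key.
  const variation = optimizelyClient.activate('button_color_experiment', userId);

  // Render the page server-side with the assigned button color.
  const buttonColor = variation === 'blue_button' ? '#1e6fd9' : '#3cb878';
  res.send(`<button style="background:${buttonColor}">Start free trial</button>`);
});

app.listen(3000);
```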

Here, we have simply hard-coded the user ID. However, Optimizely requires that each user have a unique ID. Therefore, you may want to use the user ID that already exists in your database, or store a cookie through Express’s cookie middleware.

Are you currently doing server-side testing?

Or, are you client-side testing on a SPA? What challenges (if any) have you faced? How have you handled them? Do you have any specific questions? Let us know in the comments!


Are Your Keyword Rankings You See On Google Correct?

Google Search Results Differ

Have you ever doubted Google? When it comes to keyword ranking accuracy, we can be skeptical about the rank tracker tools we use or the SEOs we’ve hired. But when we check rankings manually, we trust our eyes and Google. You shouldn’t be so careless, though. Google is clever and agile. They have a massive list of factors that affect the search results they display for you. Even if you see your website in the Number 1 position, it doesn’t mean you really are on top of the world. Your customers may see a very different Top 10. Fortunately, you can…


12 Eye-Opening Video Marketing Stats to Help Boost Your Landing Page Conversions

12 video marketing stats

Video marketing has been on the rise for more than a decade now. Consumers are getting more and more used to consuming video content wherever they go, be it on Facebook or on a product page. Which may make one think: Isn’t video content expected by now? Shouldn’t we produce a video every chance we get? However, the real question is: Will videos be a conversion ignitor or a conversion killer? Let’s find out! First, Some Tempting Stats… There are plenty of case studies and reports claiming that using a video on a landing page is a great idea for…


Lessons Learned From 2,345,864 Exit Overlay Visitors


Back in 2015, Unbounce launched its first ever exit overlay on this very blog.

Did it send our signup rate skyrocketing 4,000%? Nope.

Did it turn our blog into a conversion factory for new leads? Not even close — our initial conversion rate was barely over 1.25%.

But what it did do was start us down the path of exploring the best ways to use this technology; of furthering our goals by finding ways to offer visitors relevant, valuable content through overlays.

Overlays are modal lightboxes that launch within a webpage and focus attention on a single offer.

In this post, we’ll break down all the wins, losses and “holy smokes!” moments from our first 2,345,864 exit overlay viewers.

Psst: Towards the end of these experiments, Unbounce launched Convertables, and with it a whole toolbox of advanced triggers and targeting options for overlays.

Goals, tools and testing conditions

Our goal for this project was simple: Get more people to consume more Unbounce content — whether it be blog posts, ebooks, videos, you name it.

We invest a lot in our content, and we want it read by as many marketers as possible. All our research — everything we know about that elusive thing called conversion, exists in our content.

Our content also allows readers to find out whether Unbounce is a tool that can help them. We want more customers, but only if they can truly benefit from our product. Those who experience ‘lightbulb’ moments when reading our content definitely fit the bill.

As for tools, the first four experiments were conducted using Rooster (an exit-intent tool purchased by Unbounce in June 2015). It was a far less sophisticated version of what is now Unbounce Convertables, which we used in the final experiment.

Testing conditions were as follows:

  1. All overlays were triggered on exit, meaning they launched only when abandoning visitors were detected.
  2. For the first three experiments, we compared sequential periods to measure results. For the final two, we ran makeshift A/B tests.
  3. When comparing sequential periods, testing conditions were isolated by excluding new blog posts from showing any overlays.
  4. A “conversion” was defined as either a completed form (lead gen overlay) or a click (clickthrough overlay).
  5. All experiments were conducted between January 2015 and November 2016.

Experiment #1: Content Offer vs. Generic Signup

Our first exit overlay had a simple goal: Get more blog subscribers. It looked like this.

blog-subscriber-overlay

It was viewed by 558,488 unique visitors over 170 days, 1.27% of which converted to new blog subscribers. Decent start, but not good enough.

To improve the conversion rate, we posed the following hypothesis.

HYPOTHESIS
Because online marketing offers typically convert better when a specific, tangible offer is made (versus a generic signup), we expect that by offering a free ebook to abandoning visitors, we will improve our conversion rate beyond the current 1.27% baseline.

Whereas the original overlay asked visitors to subscribe to the blog for “tips”, the challenger overlay offered visitors The 23 Principles of Attention-Driven Design.

add-overlay

After 96 days and over 260,000 visitors, we had enough conversions to call this experiment a success. The overlay converted at 2.65%, and captured 7,126 new blog subscribers.

overlay-experiment-1-results

Since we didn’t A/B test these overlays, our results were merely observations. Seasonality is one of many factors that can sway the numbers.

We couldn’t take it as gospel, but we were seeing double the subscribers we had previously.

Observations

  • Offering tangible resources (versus non-specific promises, like a blog signup) can positively affect conversion rates.


Experiment #2: Four-field vs. Single-field Overlays

Data people always spoil the party.

The early success of our first experiment caught the attention of Judi, our resident marketing automation whiz, who wisely reminded us that collecting only an email address on a large-scale campaign was a missed opportunity.

For us to fully leverage this campaign, we needed to find out more about the individuals (and organizations) who were consuming our content.

Translation: We needed to add three more form fields to the overlay.

overlay-experiment-2

Since filling out forms is a universal bummer, we safely assumed our conversion rate would take a dive.

But something else happened that we didn’t predict. Notice a difference (besides the form fields) between the two overlays above? Yup, the new version was larger: 900x700px vs. 750x450px.

Adding three form fields made our original 750x450px design feel too cramped, so we arbitrarily increased the size — never thinking there may be consequences. More on that later.

Anyways, we launched the new version, and as expected the results sucked.

overlay-experiment-2-results
Things weren’t looking good after 30 days.

For business reasons, we decided to end the test after 30 days, even though we didn’t run the challenger overlay for an equal time period (96 days).

Overall, the conversion rate for the 30-day period was 48% lower than for the previous 96-day period. I knew it was for good reason: building our data warehouse is important. Still, a small part of me died that day.

Then it got worse.

It occurred to us that for a 30-day period, the sample size of viewers for the new overlay (53,460) looked awfully small.

A closer inspection revealed that our previous overlay averaged 2,792 views per day, while this new version was averaging 1,782. So basically our 48% conversion drop was served a la carte with a 36% plunge in overall views. Fun!

But why?

It turns out increasing the size of the overlay wasn’t so harmless. The overlay was too large for many people’s browser windows, so it fired on only two out of every three visits, even when targeting rules matched.

We conceded, and redesigned the overlay in 800x500px format.

overlay-experiment-redesign

Daily views rose back to their normal numbers, and our new baseline conversion rate of 1.25% remained basically unchanged.

loads-vs-views

Large gap between “loads” and “views” on June 4th; narrower gap on June 5th.

Observations

  • Increasing the number of form fields in overlays can cause friction that reduces conversion rates.
  • Overlay sizes exceeding 800×500 can be too large for some browsers and reduce load:view ratio (and overall impressions).

Experiment #3: One Overlay vs. 10 Overlays

It seemed like such a great idea at the time…

Why not get hyper relevant and build a different exit overlay to each of our blog categories?

With our new baseline conversion rate reduced to 1.25%, we needed an improvement that would help us overcome “form friction” and get us back to that healthy 2%+ range we enjoyed before.

So, with little supporting data, we hypothesized that increasing “relevance” was the magic bullet we needed. It works on landing pages, so why not overlays?

HYPOTHESIS  
Since “relevance” is key to driving conversions, we expect that by running a unique exit overlay on each of our blog categories — whereby the free resource is specific to the category — we will improve our conversion rate beyond the current 1.25% baseline.

blog-categories

We divide our blog into categories according to the marketing topic they cover (e.g., landing pages, copywriting, design, UX, conversion optimization). Each post is tagged by category.

So to increase relevance, we created a total of 10 exit overlays (each offering a different resource) and assigned each overlay to one or two categories, like this:

category-specific-overlays

Creating all the new overlays would take some time (approximately three hours), but since we already had a deep backlog of resources on all things online marketing, finding a relevant ebook, course or video to offer in each category wasn’t difficult.

And since our URLs contain category tags (e.g., all posts on “design” start with root domain unbounce.com/design), making sure the right overlay ran on the right post was easy.

unbounce-targeting

URL Targeting rule for our Design category; the “include” rule automatically excludes the overlay from running in other categories.

But there was a problem: We’d established a strict rule that our readers would only ever see one exit overlay… no matter how many blog categories they browsed. It’s part of our philosophy on using overlays in a way that respects the user experience.

When we were just using one overlay, that was easy — a simple “Frequency” setting was all we needed.

unbounce-frequency

…but not so easy with 10 overlays running on the same blog.

We needed a way to exclude anyone who saw one overlay from seeing any of the other nine.

Cookies were the obvious answer, so we asked our developers to build a temporary solution that could:

  • Pass a cookie from an overlay to the visitor’s browser
  • Exclude that cookie in our targeting settings

They obliged.

unbounce-advanced-targeting
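The actual implementation lived inside the tool, but mechanically it boils down to something like this (a generic sketch with an assumed cookie name):

```javascript
// 1. When any of the 10 overlays fires: mark the visitor with a year-long cookie.
document.cookie = 'seen_exit_overlay=1; path=/; max-age=' + 60 * 60 * 24 * 365;

// 2. Before launching an overlay: bail out if the visitor has already seen one.
function shouldShowOverlay() {
  return document.cookie.indexOf('seen_exit_overlay=1') === -1;
}
```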

We used “incognito mode” to repeatedly test the functionality, and after that we were go for launch.

Then this happened.

rooster-dashboard
Ignore the layout… the Convertables dashboard is much prettier now :)

After 10 days of data, our conversion rate was a combined 1.36%, 8.8% higher than the baseline. It eventually crept its way to 1.42% after an additional 250,000 views. Still nowhere near what we’d hoped.

So what went wrong?

We surmised that just because an offer is “relevant” doesn’t mean it’s compelling. Admittedly, not all of the 10 resources were on par with The 23 Principles of Attention-Driven Design, the ebook we originally offered in all categories.

That said, this experiment provided an unexpected benefit: we could now see our conversion rates by category instead of just one big number for the whole blog. This would serve us well on future tests.

Observations

  • Just because an offer is relevant doesn’t mean it’s good.
  • Conversion rates vary considerably between categories.

Experiment #4: Resource vs. Resource

“Just because it’s relevant doesn’t mean it’s good.”

This lesson inspired a simple objective for our next task: Improve the offers in our underperforming categories.

We decided to test new offers across five categories that had low conversion rates and high traffic volume:

  1. A/B Testing and CRO (0.57%)
  2. Email (1.24%)
  3. Lead Gen and Content Marketing (0.55%)
Note: We used the same overlay for the A/B Testing and CRO categories, as well as for the Lead Gen and Content Marketing categories.

Hypothesis
Since we believe the resources we’re offering in the categories of A/B testing, CRO, Email, Lead Gen and Content Marketing are less compelling than resources we offer in other categories, we expect to see increased conversion rates when we test new resources in these categories.

For the previous experiments mentioned in this post, we compared sequential periods. For this one, we took things a step further and jury-rigged an A/B testing system using Visual Website Optimizer and two Unbounce accounts.

And after finding what we believed to be more compelling resources to offer, the new test was launched.

topic-experiment

We saw slightly improved results in the A/B Testing and CRO categories, although the lift was not statistically significant. For the Email category, we saw a large drop-off.

In the Lead Gen and Content Marketing categories however, there was a dramatic uptick in conversions and the results were statistically significant. Progress!

Observations

  • Not all content is created equal; some resources are more desirable to our audience.

Experiment #5: Clickthrough vs. Lead Gen Overlays

Although progress was made in our previous test, we still hadn’t solved the problem from our second experiment.

While having the four fields made each conversion more valuable to us, it still reduced our conversion rate by a relative 48% (from 2.65% to 1.25%, back in experiment #2).

We’d now worked our way up to a baseline of 1.75%, but still needed a strategy for reducing form friction.

The answer lay in a new tactic for using overlays that we dubbed traffic shaping.

Traffic Shaping: Using clickthrough overlays to incentivize visitors to move from low-converting to high-converting pages.

Here’s a quick illustration:

traffic-shaping-diagram

Converting to this format would require us to:

  1. Redesign our exit overlays
  2. Build a dedicated landing page for each overlay
  3. Collect leads via the landing pages

Basically, we’d be using the overlays as a bridge to move readers from “ungated” content (a blog post) to “gated” content (a free video that required a form submission to view). Kinda like playing ‘form field hot potato’ in a modern-day version of Pipe Dream.

Hypothesis
Because “form friction” reduces conversions, we expect that removing form fields from our overlays will increase engagement (enough to offset the drop off we expect from adding an extra step). To do this, we will redesign our overlays to clickthrough (no fields), create a dedicated landing page for each overlay and add the four-field form to the landing page. We’ll measure results in Unbounce.

By this point, we were using Unbounce to build the entire campaign. The overlays were built in Convertables, and the landing pages were created with the Unbounce landing page builder.

We decided to test this out in our A/B Testing and CRO as well as Lead Gen and Content Marketing categories.

clickthrough-overlays

After filling out the form, visitors would either be given a secure link for download (PDF) or taken to a resource page where their video would play.

Again, for this to be successful, the conversion rate on the overlays would need to increase enough to offset the drop-off we expected from adding the extra landing page step.

These were our results after 21 days.

clickthrough-overlays-results

Not surprisingly, engagement with the overlays increased significantly. I stress the word “engagement” and not “conversion,” because our goal had changed from a form submission to a clickthrough.

In order to see a conversion increase, we needed to factor in the percentage of visitors who would drop off once they reached the landing page.

A quick check in Unbounce showed us landing page drop-off rates of 57.7% (A/B Testing/CRO) and 25.33% (Lead Gen/Content Marketing). Time for some grade 6 math…

clickthrough-overlays-results-2

Even with significant drop-off in the landing page step, overall net leads still increased.
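Spelled out with one assumed number (the overlay clickthrough rate below is invented; the drop-off is the figure we measured for A/B Testing/CRO):

```javascript
// Net lead rate = overlay clickthrough rate x (1 - landing page drop-off).
const overlayClickthroughRate = 0.05; // assumed: 5% of overlay viewers click through
const landingPageDropOff = 0.577;     // measured drop-off on the A/B Testing/CRO landing page

const netLeadRate = overlayClickthroughRate * (1 - landingPageDropOff);
console.log((netLeadRate * 100).toFixed(2) + '%'); // ~2.1%, above a 1.75% form baseline
```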

Our next step would be applying the same format to all blog categories, and then measuring overall results.

Onward!

All observations

  • Offering specific, tangible resources (vs. non-specific promises) can positively affect conversion rates.
  • Increasing the number of form fields in overlays can cause friction that reduces conversion rates.
  • Overlay sizes exceeding 800×500 can be too large for some browsers and reduce load:view ratio (and overall impressions).
  • Just because an offer is relevant doesn’t mean it’s good.
  • Conversion rates vary considerably between blog categories.
  • Not all content is created equal; some resources are more desirable to our audience.
  • “Form friction” can vary significantly depending on where your form fields appear.

Stay tuned…

We’re continuing to test new triggers and targeting options for overlays, and we want to tell you all about it.

So what’s in store for next time?

  1. The Trigger Test — What happens when we test our “on exit” trigger against a 15-second time delay?
  2. The Referral Test — What happens when we show different overlays to users from different traffic sources (e.g., social vs. organic)?
  3. New vs. Returning Visitors — Do returning blog visitors convert better than first-time visitors?



Building an App or Online Business in 2017? Here’s A DIY Resource Kit of Free Tools & Tips!

building an app

Last year, I started working on an idea for a platform, called Counsell, currently available as an app on iOS and Android devices, that lets all professionals give and get paid advice. As a designer, I was fortunate to be working with an incredible developer from the very start so we knew we could turn the idea into a working product. However, it was only when I, bolstered by my marketing background, decided to build a business around the app that I realized how haphazard and unsystematic the realities of setting up a new online business could be. Thanks to…


Infographic: How to Avoid a Google SEO Penalty

focus on relevance

“Google Penalty.” Those two words are all it takes to make all SEOs (search engine optimizers) and internet marketers cringe. But honestly, you shouldn’t really fear these words. There is an easy recipe to follow so that you’ll never have to worry about it: Make Sure Your On-Page SEO is Tip-Top – Many websites suffer from poor on-page SEO issues. Have an SEO professional audit your website and implement their recommendations. This means get your title tags in order, use meta descriptions, and stay away from keyword cannibalization. Create Useful Content – I purposely stayed away from using the word…


How to Optimize User Experience & Conversion Paths in Google Analytics

how-to-optimize-user-experience-and-conversion-paths-in-google-analytics

If you’ve been keeping up with any thought leaders over the last few months, you know that we are all talking about user experience. The more you can customize, personalize, optimize, target, adapt, and segment your individual user experiences, the more success you’ll see in 2017. That’s a nice thought, but not a very easily implemented practice. The truth is that tracking user experience is no easy feat. Especially in Google Analytics. Vague attribution models and skewed conversion paths make reporting on your user experience frustrating, to say the least. So, to make things easier, start combing through your…


Optimizing Mobile Home Page Increases Conversions for Wedding Shoes Website

Elegant Steps offers a large selection of wedding shoes in the UK, both online and in store. More than 50% of its users are new female users discovering the website organically through mobile. The bulk of them are brides-to-be looking for wedding shoes.

Problem

Looking at Elegant Steps’ Google Analytics (GA) data, the team found that while the desktop website was converting at 2%, the mobile version was converting at a much lower 0.6%.

Observations

Hit Search, a digital marketing agency, used VWO to help Elegant Steps dig deep into the problem. They used GA, heuristic analysis, and VWO’s scrollmaps and heatmaps capabilities to find that:

  • Hardly any visitors were scrolling enough to reach the Shop by Brand section on the home page.
  • Elegant Steps’ 3 main USPs, including free shipping, weren’t appearing above the fold on mobile.
  • The text on the home page image was hard to read because it was the same color as the background.

This is how the home page looked on mobile:

elegant_control_jpg

Hypothesis

Armed with these observations, Niall Brooke from Hit Search set about optimizing the mobile home page to fix the problems. It was decided to:

  • Introduce the Shop by Brand section higher up on the page, as the presence of an established name is known to help instill trust and assuage fears.
  • Display “Free Shipping” above the fold; many studies have found that unexpected shipping cost is the biggest reason for cart abandonment, and it was hypothesized that this would help reduce bounces and encourage users to continue down the conversion funnel.
  • Change the CTA copy from the generic “Shop Wedding Shoes” to the possessive “Find my new wedding shoes.”
  • Change the text color on the image so that the text is readable.

This is how the variation looked:

elegant_variation_jpg

Test

Hit Search ran the new version of the home page against the original only for mobile visitors, using VWO’s targeting capability. Niall set VWO’s Bayesian-powered statistics engine to “High-Certainty” mode, and the results kicked in within a month.

Results

“The results were positive with almost a threefold increase in conversions and almost a 50% drop in bounce rate,” said Niall.

In his closing thoughts, Niall added, “VWO is a brilliant all-round conversion optimization platform which we use on a daily basis to perform user analysis, A/B and split tests.”

Mobile an afterthought?

According to a 2015 report, the average conversion rate for mobile websites in the US was 1.32%, significantly lower than the desktop figure (3.82%). Though studies have suggested that visitors mostly use mobile for research and make the actual purchase through the desktop website, there’s no denying that online retailers are still leaving money on the table. We would love to hear your thoughts about optimizing mobile websites. When does it become important for you to start looking at mobile optimization? Just hit us up in the comments section below.



Why a Killer UX Doesn’t Always Translate into Conversions

killer ux does not translate into conversions

One person’s killer UX is another’s UX killer. Why not copy the killer user experience of a famous site in your industry? The short answer? You’re not them. The longer answer is that unless you’re copying a fully optimized site with all the same variables, targets, and exact same audience, you’re likely setting yourself up for failure. Take Target for example. Following the success of Amazon’s review software, Target purchased it and sought to implement it on their own site. Even after copying the software and interface, the engagement with their reviews suffered. In fact, in the first month after…
