Tag Archives: pdf

[Case Study] Ecwid sees 21% lift in paid plan upgrades in one month

Reading Time: 2 minutes

What would you do with 21% more sales this month?

I bet you’d walk into your next meeting with your boss with an extra spring in your step, right?

Well, when you implement a strategic marketing optimization program, results like this are not only possible, they are probable.

In this new case study, you’ll discover how e-commerce software supplier Ecwid ran one experiment for four weeks and saw a 21% increase in paid upgrades.

Get the full Ecwid case study now!

Download a PDF version of the Ecwid case study, featuring experiment details, supplementary takeaways and insights, and a testimonial from Ecwid’s Sr. Director, Digital Marketing.




A little bit about Ecwid

Ecwid provides easy-to-use online store setup, management, and payment solutions. The company was founded in 2009, with the goal of enabling business-owners to add online stores to their existing websites, quickly and without hassle.

The company has a freemium business model: Users can sign up for free, and unlock more features as they upgrade to paid packages.

Ecwid’s partnership with WiderFunnel

In November 2016, Ecwid partnered with WiderFunnel with two primary goals:

  1. To increase initial signups for their free plan through marketing optimization, and
  2. To increase the rate of paid upgrades, through platform optimization

This case study focuses on a particular experiment cycle that ran on Ecwid’s step-by-step onboarding wizard.

The methodology

Last winter, the WiderFunnel Strategy team did an initial LIFT Analysis of the onboarding wizard, and identified several potential barriers to conversion, both in terms of completing the steps to set up a new store, and in terms of upgrading to a paid plan.

The lead WiderFunnel Strategist for Ecwid, Dennis Pavlina, decided to create an A/B cluster test to 1) address the major barriers simultaneously, and 2) get a major lift for Ecwid, quickly.

The overarching goal was to make the onboarding process smoother. The WiderFunnel and Ecwid optimization teams hoped that enhancing the initial user experience, and exposing users to the wide range of Ecwid’s features, would result in more users upgrading to paid plans.


Ecwid’s two objectives ended up coming together in this test. We thought that if more new users interacted with the wizard and were shown the whole ‘Ecwid world’ with all the integrations and potential it has, they would be more open to upgrading. People needed to be able to see its potential before they would want to pay for it.

Dennis Pavlina, Optimization Strategist, WiderFunnel

The Results

This experiment ran for four weeks, at which point the variation was determined to be the winner with 98% confidence. The variation resulted in a 21.3% increase in successful paid account upgrades for Ecwid.

Read the full case study for:

  • The details on the initial barriers to conversion
  • How this test was structured
  • Which secondary metrics we tracked, and
  • The supplementary takeaways and customer insights that came from this test


Lessons Learned From 2,345,864 Exit Overlay Visitors


Back in 2015, Unbounce launched its first ever exit overlay on this very blog.

Did it send our signup rate skyrocketing 4,000%? Nope.

Did it turn our blog into a conversion factory for new leads? Not even close — our initial conversion rate was barely over 1.25%.

But what it did do was start us down the path of exploring the best ways to use this technology; of furthering our goals by finding ways to offer visitors relevant, valuable content through overlays.

Overlays are modal lightboxes that launch within a webpage and focus attention on a single offer.

In this post, we’ll break down all the wins, losses and “holy smokes!” moments from our first 2,345,864 exit overlay viewers.

Psst: Towards the end of these experiments, Unbounce launched Convertables, and with it a whole toolbox of advanced triggers and targeting options for overlays.

Goals, tools and testing conditions

Our goal for this project was simple: Get more people to consume more Unbounce content — whether it be blog posts, ebooks, videos, you name it.

We invest a lot in our content, and we want it read by as many marketers as possible. All of our research, everything we know about that elusive thing called conversion, lives in our content.

Our content also allows readers to find out whether Unbounce is a tool that can help them. We want more customers, but only if they can truly benefit from our product. Those who experience ‘lightbulb’ moments when reading our content definitely fit the bill.

As for tools, the first four experiments were conducted using Rooster (an exit-intent tool purchased by Unbounce in June 2015). It was a far less sophisticated version of what is now Unbounce Convertables, which we used in the final experiment.

Testing conditions were as follows:

  1. All overlays were triggered on exit, meaning they launched only when abandoning visitors were detected (see the sketch just after this list).
  2. For the first three experiments, we compared sequential periods to measure results. For the final two, we ran makeshift A/B tests.
  3. When comparing sequential periods, testing conditions were isolated by excluding new blog posts from showing any overlays.
  4. A “conversion” was defined as either a completed form (lead gen overlay) or a click (clickthrough overlay).
  5. All experiments were conducted between January 2015 and November 2016.
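For readers curious how an “on exit” trigger works under the hood, here is a minimal sketch of the common mouse-out approach. It is illustrative only, not Unbounce’s or Rooster’s actual code; the threshold and the showOverlay function are assumptions.

```typescript
// Illustrative exit-intent trigger (not actual Unbounce/Rooster code).
// Fire once when the cursor leaves through the top of the viewport,
// which usually means the visitor is reaching for the tab bar or back button.
let overlayShown = false;

function showOverlay(): void {
  // Hypothetical launcher: reveal or inject the overlay markup here.
  console.log("Exit overlay launched");
}

document.addEventListener("mouseout", (event: MouseEvent) => {
  const leavingViewport = event.relatedTarget === null && event.clientY <= 0;
  if (leavingViewport && !overlayShown) {
    overlayShown = true;
    showOverlay();
  }
});
```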

Experiment #1: Content Offer vs. Generic Signup

Our first exit overlay had a simple goal: Get more blog subscribers. It looked like this.

[Image: the blog subscriber exit overlay]

It was viewed by 558,488 unique visitors over 170 days, 1.27% of whom converted to new blog subscribers. Decent start, but not good enough.

To improve the conversion rate, we posed the following hypothesis.

HYPOTHESIS
Because online marketing offers typically convert better when a specific, tangible offer is made (versus a generic signup), we expect that by offering a free ebook to abandoning visitors, we will improve our conversion rate beyond the current 1.27% baseline.

Whereas the original overlay asked visitors to subscribe to the blog for “tips”, the challenger overlay offered visitors The 23 Principles of Attention-Driven Design.

[Image: the Attention-Driven Design ebook overlay]

After 96 days and over 260,000 visitors, we had enough conversions to call this experiment a success. The overlay converted at 2.65%, and captured 7,126 new blog subscribers.

[Image: experiment #1 results]

Since we didn’t A/B test these overlays, our results were merely observations. Seasonality is one of many factors that can sway the numbers.

We couldn’t take it as gospel, but we were seeing double the subscribers we had previously.

Observations

  • Offering tangible resources (versus non-specific promises, like a blog signup) can positively affect conversion rates.


Experiment #2: Four-field vs. Single-field Overlays

Data people always spoil the party.

The early success of our first experiment caught the attention of Judi, our resident marketing automation whiz, who wisely reminded us that collecting only an email address on a large-scale campaign was a missed opportunity.

For us to fully leverage this campaign, we needed to find out more about the individuals (and organizations) who were consuming our content.

Translation: We needed to add three more form fields to the overlay.

[Image: the original single-field overlay vs. the new four-field overlay]

Since filling out forms is a universal bummer, we safely assumed our conversion rate would take a dive.

But something else happened that we didn’t predict. Notice a difference (besides the form fields) between the two overlays above? Yup, the new version was larger: 900x700px vs. 750x450px.

Adding three form fields made our original 750x450px design feel too cramped, so we arbitrarily increased the size, never thinking there might be consequences. More on that later.

Anyway, we launched the new version, and as expected, the results sucked.

[Image: experiment #2 results]
Things weren’t looking good after 30 days.

For business reasons, we decided to end the test after 30 days, even though that meant the challenger overlay didn’t run for a period equal to the original’s (96 days).

Overall, the conversion rate for the 30-day period was 48% lower than the previous 96-day period. I knew it was for good reason: Building our data warehouse is important. Still, a small part of me died that day.

Then it got worse.

It occurred to us that, for a 30-day period, the sample of viewers for the new overlay (53,460) looked awfully small.

A closer inspection revealed that our previous overlay averaged 2,792 views per day, while this new version was averaging 1,782. So basically our 48% conversion drop was served a la carte with a 36% plunge in overall views. Fun!

But why?

It turns out increasing the size of the overlay wasn’t so harmless. The new size was too large for many people’s browser windows, so the overlay only fired on two out of every three visits, even when targeting rules matched.
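We don’t know exactly how the tool decided when to suppress the overlay, but a viewport guard along these lines (purely a sketch, with assumed dimensions matching our oversized 900x700px design) illustrates the effect we were seeing:

```typescript
// Sketch of a viewport guard (assumed behaviour, not the tool's real logic):
// skip the launch if the creative won't fit in the visitor's browser window.
const OVERLAY_WIDTH = 900;  // the oversized design described above
const OVERLAY_HEIGHT = 700;

function viewportFitsOverlay(): boolean {
  return window.innerWidth >= OVERLAY_WIDTH && window.innerHeight >= OVERLAY_HEIGHT;
}

// A 1366x768 laptop loses ~100px of height to browser chrome, so it would
// fail this check and the overlay would load but never be viewed.
if (!viewportFitsOverlay()) {
  console.log("Overlay suppressed: viewport too small");
}
```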

We conceded, and redesigned the overlay in 800x500px format.

[Image: the redesigned 800x500px overlay]

Daily views rose back to their normal numbers, and our new baseline conversion rate of 1.25% remained basically unchanged.

[Image: loads vs. views chart]

Large gap between “loads” and “views” on June 4th; narrower gap on June 5th.

Observations

  • Increasing the number of form fields in overlays can cause friction that reduces conversion rates.
  • Overlay sizes exceeding 800×500 can be too large for some browsers and reduce load:view ratio (and overall impressions).

Experiment #3: One Overlay vs. 10 Overlays

It seemed like such a great idea at the time…

Why not get hyper relevant and build a different exit overlay to each of our blog categories?

With our new baseline conversion rate reduced to 1.25%, we needed an improvement that would help us overcome “form friction” and get us back to that healthy 2%+ range we enjoyed before.

So with little supporting data, we hypothesized that increasing “relevance” was the magic bullet we needed. It works on landing pages, so why not overlays?

HYPOTHESIS  
Since “relevance” is key to driving conversions, we expect that by running a unique exit overlay on each of our blog categories — whereby the free resource is specific to the category — we will improve our conversion rate beyond the current 1.25% baseline.

[Image: blog categories]

We divide our blog into categories according to the marketing topic they cover (e.g., landing pages, copywriting, design, UX, conversion optimization). Each post is tagged by category.

So to increase relevance, we created a total of 10 exit overlays (each offering a different resource) and assigned each overlay to one or two categories, like this:

[Image: category-specific overlay assignments]

Creating all the new overlays would take some time (approximately three hours), but since we already had a deep backlog of resources on all things online marketing, finding a relevant ebook, course or video to offer in each category wasn’t difficult.

And since our URLs contain category tags (e.g., all posts on “design” live under unbounce.com/design), making sure the right overlay ran on the right post was easy.
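The targeting itself happens in the overlay tool’s URL rules (shown below), but the underlying logic is simple enough to sketch. This is an illustration, not Unbounce’s implementation; the category-to-resource mapping is made up.

```typescript
// Illustrative URL-prefix targeting (not Unbounce's implementation):
// pick the overlay for the category encoded in the first path segment.
const overlayByCategory: Record<string, string> = {
  design: "attention-driven-design-ebook", // hypothetical mapping
  email: "email-marketing-course",
  "conversion-optimization": "cro-video-series",
};

function overlayForUrl(url: string): string | undefined {
  const firstSegment = new URL(url).pathname.split("/")[1];
  return overlayByCategory[firstSegment];
}

// overlayForUrl("https://unbounce.com/design/some-post")
//   -> "attention-driven-design-ebook"
```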

[Image: URL targeting settings]

URL Targeting rule for our Design category; the “include” rule automatically excludes the overlay from running in other categories.

But there was a problem: We’d established a strict rule that our readers would only ever see one exit overlay… no matter how many blog categories they browsed. It’s part of our philosophy on using overlays in a way that respects the user experience.

When we were just using one overlay, that was easy — a simple “Frequency” setting was all we needed.

[Image: the overlay frequency setting]

…but not so easy with 10 overlays running on the same blog.

We needed a way to exclude anyone who saw one overlay from seeing any of the other nine.

Cookies were the obvious answer, so we asked our developers to build a temporary solution that could:

  • Pass a cookie from an overlay to the visitor’s browser
  • Exclude that cookie in our targeting settings

They obliged.
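In spirit, the solution looks something like the sketch below. This is our reconstruction of the idea rather than the code our developers shipped; the cookie name and lifetime are assumptions.

```typescript
// Sketch of the one-overlay-per-visitor rule (illustrative, not the shipped code):
// drop a cookie when any overlay is shown, and bail out if it is already present.
const SEEN_COOKIE = "exit_overlay_seen"; // hypothetical cookie name

function hasSeenAnyOverlay(): boolean {
  return document.cookie
    .split("; ")
    .some((pair) => pair.startsWith(`${SEEN_COOKIE}=`));
}

function markOverlaySeen(): void {
  const oneYear = 60 * 60 * 24 * 365; // assumed lifetime, in seconds
  document.cookie = `${SEEN_COOKIE}=1; max-age=${oneYear}; path=/`;
}

function maybeShowOverlay(launch: () => void): void {
  if (hasSeenAnyOverlay()) return; // visitor already saw one of the ten
  markOverlaySeen();
  launch();
}
```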

[Image: advanced targeting settings with the cookie exclusion rule]

We used “incognito mode” to repeatedly test the functionality, and after that we were go for launch.

Then this happened.

[Image: the Rooster dashboard]
Ignore the layout… the Convertables dashboard is much prettier now :)

After 10 days of data, our conversion rate was a combined 1.36%, 8.8% higher than the baseline. It eventually crept its way to 1.42% after an additional 250,000 views. Still nowhere near what we’d hoped.

So what went wrong?

We surmised that just because an offer is “relevant” doesn’t mean it’s compelling. Admittedly, not all of the 10 resources were on par with The 23 Principles of Attention-Driven Design, the ebook we originally offered in all categories.

That said, this experiment provided an unexpected benefit: we could now see our conversion rates by category instead of just one big number for the whole blog. This would serve us well on future tests.

Observations

  • Just because an offer is relevant doesn’t mean it’s good.
  • Conversion rates vary considerably between categories.

Experiment #4: Resource vs. Resource

“Just because it’s relevant doesn’t mean it’s good.”

This lesson inspired a simple objective for our next task: Improve the offers in our underperforming categories.

We decided to test new offers across five categories that had low conversion rates and high traffic volume:

  1. A/B Testing and CRO (0.57%)
  2. Email (1.24%)
  3. Lead Gen and Content Marketing (0.55%)

Note: We used the same overlay for the A/B Testing and CRO categories, as well as the Lead Gen and Content Marketing categories.

HYPOTHESIS
Since we believe the resources we’re offering in the categories of A/B testing, CRO, Email, Lead Gen and Content Marketing are less compelling than resources we offer in other categories, we expect to see increased conversion rates when we test new resources in these categories.

For the previous experiments in this post, we compared sequential periods. For this one, we took things a step further and jury-rigged an A/B testing system using Visual Website Optimizer and two Unbounce accounts.

And after finding what we believed to be more compelling resources to offer, the new test was launched.

[Image: the new resource offers tested in each category]

We saw slightly improved results in the A/B Testing and CRO categories, although the lift was not statistically significant. For the Email category, we saw a large drop-off.

In the Lead Gen and Content Marketing categories, however, there was a dramatic uptick in conversions and the results were statistically significant. Progress!

Observations

  • Not all content is created equal; some resources are more desirable to our audience.

Experiment #5: Clickthrough vs. Lead Gen Overlays

Although progress was made in our previous test, we still hadn’t solved the problem from our second experiment.

While having the four fields made each conversion more valuable to us, it still reduced our conversion rate a relative 48% (from 2.65% to 1.25% back in experiment #2).

We’d now worked our way up to a baseline of 1.75%, but still needed a strategy for reducing form friction.

The answer lay in a new tactic for using overlays that we dubbed traffic shaping.

Traffic Shaping: Using clickthrough overlays to incentivize visitors to move from low-converting to high-converting pages.

Here’s a quick illustration:

[Image: traffic shaping diagram]

Converting to this format would require us to:

  1. Redesign our exit overlays
  2. Build a dedicated landing page for each overlay
  3. Collect leads via the landing pages

Basically, we’d be using the overlays as a bridge to move readers from “ungated” content (a blog post) to “gated” content (a free video that required a form submission to view). Kinda like playing ‘form field hot potato’ in a modern day version of Pipe Dream.

HYPOTHESIS
Because “form friction” reduces conversions, we expect that removing form fields from our overlays will increase engagement (enough to offset the drop off we expect from adding an extra step). To do this, we will redesign our overlays to clickthrough (no fields), create a dedicated landing page for each overlay and add the four-field form to the landing page. We’ll measure results in Unbounce.

By this point, we were using Unbounce to build the entire campaign. The overlays were built in Convertables, and the landing pages were created with the Unbounce landing page builder.

We decided to test this out in our A/B Testing and CRO as well as Lead Gen and Content Marketing categories.

[Image: the clickthrough overlays]

After filling out the form, visitors would either be given a secure link for download (PDF) or taken to a resource page where their video would play.

Again, for this to be successful the conversion rate on the overlays would need to increase enough to offset the drop off we expected by adding the extra landing page step.

These were our results after 21 days.

[Image: clickthrough overlay results after 21 days]

Not surprisingly, engagement with the overlays increased significantly. I stress the word “engagement” and not “conversion,” because our goal had changed from a form submission to a clickthrough.

In order to see a conversion increase, we needed to factor in the percentage of visitors who would drop off once they reached the landing page.

A quick check in Unbounce showed us landing page drop-off rates of 57.7% (A/B Testing/CRO) and 25.33% (Lead Gen/Content Marketing). Time for some grade 6 math…
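The arithmetic is simply: net lead rate = overlay clickthrough rate × (1 − landing page drop-off rate). A quick sketch with the drop-off rates above (the 6% clickthrough rate is a placeholder, not one of our measured numbers):

```typescript
// Net lead rate = overlay clickthrough rate * share of visitors who
// complete the landing page form. Drop-off rates are the ones measured
// above; the 6% clickthrough rate is a placeholder, not our real number.
function netLeadRate(overlayClickRate: number, landingPageDropOff: number): number {
  return overlayClickRate * (1 - landingPageDropOff);
}

console.log(netLeadRate(0.06, 0.577));  // A/B Testing/CRO: ~2.5% net lead rate
console.log(netLeadRate(0.06, 0.2533)); // Lead Gen/Content Marketing: ~4.5%
```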

[Image: clickthrough overlay results, factoring in landing page drop-off]

Even with significant drop-off in the landing page step, overall net leads still increased.

Our next step would be applying the same format to all blog categories, and then measuring overall results.

Onward!

All observations

  • Offering specific, tangible resources (vs. non-specific promises) can positively affect conversion rates.
  • Increasing the number of form fields in overlays can cause friction that reduces conversion rates.
  • Overlay sizes exceeding 800×500 can be too large for some browsers and reduce load:view ratio (and overall impressions).
  • Just because an offer is relevant doesn’t mean it’s good.
  • Conversion rates vary considerably between blog categories.
  • Not all content is created equal; some resources are more desirable to our audience.
  • “Form friction” can vary significantly depending on where your form fields appear.

Stay tuned…

We’re continuing to test new triggers and targeting options for overlays, and we want to tell you all about it.

So what’s in store for next time?

  1. The Trigger Test — What happens when we test our “on exit” trigger against a 15-second time delay?
  2. The Referral Test — What happens when we show different overlays to users from different traffic sources (e.g., social vs. organic)?
  3. New vs. Returning Visitors — Do returning blog visitors convert better than first-time visitors?



A day in the life of an optimization champion

Reading Time: 9 minutes

How do you make conversion optimization a priority within a global organization?

Especially when there are so many other things you could spend your marketing dollars on?

And how do you keep multiple marketing teams aligned when it comes to your optimization efforts?

These are some of the challenges facing Jose Uzcategui, Global Analytics and Ecommerce Conversion Lead at ASICS, and Sarah Breen, Global Ecommerce Product Lead at ASICS.

ASICS, a global sporting goods retailer, is a giant company with multiple websites and marketing teams in multiple regions.

For an organization like this, deciding to pursue conversion optimization (CRO) as a marketing strategy is one thing, but actually implementing a successful, cohesive conversion optimization program is an entirely different thing.

Related: Get WiderFunnel’s free Optimization Champion’s Handbook for tips on how to be the Optimization Champion your company needs.

We started working with ASICS several months ago to help them with this rather daunting task.

A few weeks ago, I sat down with Jose and Sarah to discuss what it’s like to be an Optimization Champion within a company like ASICS.

Let’s start at the very beginning with a few introductions.

For almost eight years, Jose has been involved in different areas of online marketing, but Analytics has always been a core part of his career. About five years ago, he began to move away from paid marketing and SEO and started focusing on analysis and conversion optimization.

He was brought in to lead the conversion optimization program at ASICS, but it became obvious that proper conversion optimization wouldn’t be possible without putting the company’s Analytics in order first.

“For my first year at ASICS, I was focused on getting our Analytics where they need to be. Right now, we have a good Analytics foundation and that’s why we’re getting momentum on conversion optimization. We’re building our teams internally and externally and my role, right now, is both execution and strategy on these two fronts,” explains Jose.

Sarah has been with ASICS for a little over a year as the Ecommerce Global Product Lead. She hadn’t really been involved with testing until she started working more closely with WiderFunnel and Optimizely (a testing tool).

She started working with Nick So, WiderFunnel Optimization Strategist, and Aswin Kumar, WiderFunnel Optimization Coordinator, to try to figure out what experiments would make the biggest impact in the shortest amount of time on ASICS’ sites.

“I sometimes work with our designers to decide what a test should look like from the front end and how many variations we want to test, based on Nick and Aswin’s recommendations. I provide WiderFunnel the necessary assets, as well as a timeline and final approvals.

“Once a test is launched, I work with WiderFunnel and with Jose to figure out what the results mean, and whether or not the change is something we want to roll out globally and when we’ll be able to do that (considering how many other things we have in our queue that are required development work),” explains Sarah.

But optimization is just a part of Sarah’s role at ASICS: she works with a number of vendors to try to get third-party solutions on their sites globally, and she works with ASICS’ regional teams to determine new product features and functionality.

Despite the fact that they wear many hats, Jose and Sarah are both heavily involved in ASICS’ conversion optimization efforts, and I wanted to know what drew them to CRO.

Q: What do each of you find exciting about conversion optimization?

“Conversion optimization gives immediate results and that’s a great feeling,” says Jose. “Particularly with e-commerce, if you have an idea, you test it, and you know you’re about to see what that idea is worth in monetary value.”

Sarah loves the certainty.

We’re proving our assumptions with data. Testing allows me to say, ‘This is why we took this direction. We’re not just doing what our competitors do, it’s not just doing something that we saw on a site that sells used cars. This is something that’s been proven to work on our site and we’re going to move forward with it.’

Of course, it’s not all highs when you’re an Optimization Champion at an enterprise company, which led me to my next question…

Q: What are the biggest challenges you face as an Optimization Champion within a company like ASICS?

For Sarah, the biggest challenge is one of prioritization. “We have so many things we want to do: how do we prioritize? I want to do more and more testing. It’s just about picking our battles and deciding what the best investment will be,” she explains.

“When it comes to global teams, aligning the regions on initiatives you may want to test can be challenging,” adds Jose. “If a region doesn’t plan for testing at the beginning of their campaign planning process, for instance, it becomes very difficult to test something more dramatic like a new value proposition or personalization experiences.”

Despite the challenges, Sarah and Jose believe in conversion optimization. Of course, it’s a lot easier to sell the idea of CRO if there’s already a data-driven, testing culture within a company.

Q: Was there a testing culture at ASICS before your partnership with WiderFunnel?

“We had a process in place. We had introduced the LIFT Model®, actually. The LIFT Model is an easy framework to work with, it’s easy to communicate. But there wasn’t enough momentum, or resources, or time put into testing for us to say, ‘We have a testing culture and everybody is on board.’ Before WiderFunnel, there were a few seeds planted, but not a lot of soil or water for them to grow,” says Jose.

[Image: the LIFT Model]
WiderFunnel’s LIFT Model details the 6 conversion factors.

Q: So, there wasn’t necessarily a solid testing culture at ASICS – how, then, did you go about convincing your team to invest in CRO versus another marketing strategy?

“Education. For everything in enterprise, education is the most important thing you can do. As soon as people understand that they can translate a campaign into a certain amount of money or ROI, then it becomes easy to say ‘Ok, let’s try something else that can tie to the money,’” says Jose, firmly.

“A different strategy is just downplaying the impact of testing. ‘It’s just a test, it’s just temporary for a couple of weeks,’ I might say. Either people understand the value of testing, or I diminish the impact that a test has on the site.”

“Until it’s a huge winner!” I interject.

“Yes! Obviously, if it’s a huge winner, I can say, ‘Oh, look at that! Let’s try another,’” chuckles Jose.

Jose and Sarah focused on education and, with a bit of luck and good timing, they convinced ASICS to invest in conversion optimization.

Q: Has it been a good investment?

“Everybody goes into this kind of investment hoping that there will be a test that will knock it out of the park. You know, a really clear, black and white winner that shows: we invested this amount in this test and in a year it will mean 5x that amount.

“We had a few tests that pointed in that direction, but we didn’t have that black and white winner. For some people, they have that black and white mentality and they might ask if it was worth it.

“I think it was a wise investment. It’s a matter of time before we run that test that proves that everything is worthwhile or the team as a whole realizes that things that we’re learning, even if they’re not at this moment translating into dollars, are worthwhile because we’re learning how our users think, what they do, etc.”

Having established that ASICS is satisfied with the investment, I wanted to move on to the logistics of managing a conversion optimization program, both internally and in conjunction with a partner. First things first: successful relationships are all about communication.

Q: How do you communicate, share ideas, and implement experiments both between your internal teams and WiderFunnel? How do you keep everyone aligned and on the same page?

Sarah explains, “We’ve tried a few different management tools. Right now, JIRA seems to be working well for us. I can add people to an already existing ticket and I don’t have to add a lot of explanation. I can just say, Aswin and Nick came up with this idea, it’s approved, here’s a mock up. Everything is documented in one place and it’s searchable.

“I don’t necessarily think JIRA is the best tool for what we’re doing, but it allows us to have a whole history in a system that our development team is already using. And they know how to use it and check off a ticket and that’s helpful.

Related: Get organized with Liftmap. This free management tool makes it easy for teams to analyze web experiences, then present findings to stakeholders.

“I also send emails with recaps, because digging through those long JIRA discussions is kind of rough.”

Q: How do you share what you’re working on with other teams within ASICS?

“There are two parts to sharing our work: what’s going on and what’s coming,” explains Jose.

“You can see what’s coming in JIRA: tests that are coming and ideas that are being developed.

“Once we have results from a test and a write up, we’ll put a one-pager in a blog style report. When we have a new update, we send an email with the link to the one-pager and I also attach it as a PDF so that anyone who may not have access can still see the results.”

Sarah adds, “They’re very clear, paragraph form explanations with images of everything we’re doing. It’s less technical, more ‘this is what we tried, these are our assumptions, these are our results, this is what we’re going to do.’

“This gives the Execs who aren’t on the day-to-day a snapshot showing that we’ve made progress, what the next steps are, and that we’re doing something good.”

Q: How do you engage your co-workers and get them excited about conversion optimization?

Jose says, “I’ve gotten some comments and questions [on our one-page reports]. Obviously, I would like to get more. Once we have more resources, we’ll be able to put different strategies in place to get more engagement from the team. Lately, I’ve been trying to at least give credit to the region that came up with whatever idea we tested.

“I would like to get even more specific as we get more momentum, being able to say things like ‘Pete came up with this idea…and actually it didn’t work out, though we did learn insight X or insight Y.’ or ‘Pete came up with his third winning idea in a row—he gets a prize!’

“There’s a level of fun that we can activate. We have some engagement, but I’m hoping for more.”

Q: Ok, you’ve concluded a test, analyzed and shared your results — what’s your process for actually implementing changes based on testing?

Jose is quick to respond to this question, giving credit to Sarah: “Sarah’s involvement in our conversion optimization program has been great. Ultimately, Sarah is the one who gets things onto the site. And that’s half of the equation when it comes to testing. It’s so necessary having someone like Sarah invested in this. Without her, the tests might die in development.”

Sarah laughs and thanks Jose. “A lot of my job is managing expectations with our regions,” she explains. “Some regions want to test everything, and they want to do it now, and we have to tell them ‘That’s great, but we can’t give you all of our attention.’ Whereas some regions barely talk to us and have a lot of missed opportunities, so we have to manage the testing and implementation on their site.

“For less engaged regions, we try to communicate, ‘Hey, we have evidence that this change really helped — look at all the sales you got and all of the clicks you got. We’d like you to have this on your page.’

“Testing also takes a lot of the back and forth and Q & A out of implementation because we already have something that works. And, unless there’s some weird situation, we can roll a change out globally and say, ‘This is where the idea came from, it came from so-and-so, it’s pushed all the way through and now it’s a global change.’

“We can invite the regions to think of all of the awesome things we can do as a global team whenever we work together and go through this process. And other people can say ‘Hey, we did this! I have some more ideas.’ And the circle continues. It’s really great.”

You’ve both spent a lot of time working with WiderFunnel to build up ASICS’ conversion optimization program, so I’ve got to ask…

Q: What are the biggest challenges and benefits of working with a partner like WiderFunnel?

“The biggest challenge in working with any partner is response time: me responding in time, them responding in time. I’m also the middle man for a lot of things, so maintaining alignment can be tough,” says Sarah.

“But as far as benefits go, it’s hard to choose one. One of the biggest has been WiderFunnel’s ability to take the debate out of a testing decision. You’re able to evaluate testing ideas with a points structure, saying, ‘We think this would be the most valuable for you, for your industry, for what we’ve seen with your competitors, this is the site you should run it on, we think it would be best on mobile or desktop, etc.’

“And we can rely on WiderFunnel’s expertise and say, ‘Let’s do it.’ We just have to figure out if there’s anything that might really ruffle feathers, like making changes to our homepage. We have to be careful with that because it’s prime real estate.

“But if it’s a change to a cart page, I can say, ‘Yes, let’s go ahead and do that, get that in the queue!’ It’s all about getting those recommendations. And once we have a few smaller wins, we can move up to the homepage because we’ve built that trust.

“Another benefit is the thorough results analysis. The summary of assumptions, findings, charts, data, graphs, next steps and opportunities. That’s huge. We can look at the data quickly and identify what’s obvious, by ourselves, but it takes time for us to collate and collect and really break down the results into very clear terms. That’s been hugely helpful,” she adds.

For Jose, the benefit is simple: “Getting tests concluded and getting ideas tested has been the most helpful. Yes or no, next. Yes or no, next. Yes or no, next. That’s created the visibility that I’ve been hoping for — getting visibility across the organization and getting everybody fired up about testing. That’s been the best aspect for me.”

Are you your organization’s Optimization Champion? How do you spread the gospel of testing within your organization? Let us know in the comments!


Freebie: Olympics Sports Icon Set (45 Icons, EPS, PDF, PNG, SVG)

This year, there will be 42 different sports and over 300 events taking place at the Olympics. Perhaps you have a project related to these upcoming games, or maybe you’ll be working on one soon? Wouldn’t it be great to have a set of consistent icons for all sports-related activities, just in case? Well, that’s just what we thought.


This set of 45 icons was created by the design team at Icons8. Please note that this icon set is licensed under a Creative Commons Attribution 3.0 Unported license. You may modify the size, color or shape of the icons. No attribution is required; however, reselling bundles or individual pictograms is not cool. Please provide credits to the creators and link to the article in which this freebie was released if you would like to spread the word in blog posts or anywhere else.


Finally, CSS In JavaScript! Meet CSSX


JavaScript is a wonderful language. It’s rich, it’s dynamic, and it’s so tightly coupled to the web nowadays. The concept of writing everything in JavaScript doesn’t sound so crazy anymore. First, we started writing our back end in JavaScript, and then Facebook introduced JSX, in which we mix HTML markup with JavaScript. Why not do the same for CSS?


Imagine a web component distributed as a single .js file and containing everything — markup, logic and styles. We would still have our basic style sheets, but the dynamic CSS would be a part of JavaScript. Now this is possible, and one way to achieve it is with CSSX. CSSX is a project that swallowed my spare time for a month. It was challenging and interesting, and it definitely pushed me to learn a lot of new stuff. The result is a set of tools that allows you to write vanilla CSS in JavaScript.
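This excerpt doesn’t show CSSX’s own API, but the underlying idea, generating and injecting CSS from JavaScript at runtime, can be sketched with plain DOM calls. The snippet below is a generic CSS-in-JS illustration, not CSSX itself; the selector and declarations are made up.

```typescript
// Generic CSS-in-JS illustration (not the CSSX API): build a rule from
// JavaScript values and inject it as a <style> element at runtime.
function injectRule(selector: string, declarations: Record<string, string>): void {
  const body = Object.entries(declarations)
    .map(([property, value]) => `${property}: ${value};`)
    .join(" ");

  const styleEl = document.createElement("style");
  styleEl.textContent = `${selector} { ${body} }`;
  document.head.appendChild(styleEl);
}

// Styles driven by runtime state, e.g. a theme color picked by the user:
injectRule(".cta-button", { background: "#ff5722", "border-radius": "4px" });
```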


Freebie: World Landmark Icons (AI, EPS, PDF, PNG and PSD)


Today we’re happy to release a new Smashing freebie: 18 lovely world landmark icons such as the London Eye, the Eiffel Tower or the Empire State Building. The icons are detailed enough to show architectural elegance but without adding chaos. They’re designed to work best in both digital and print media.

