20 Conversion Optimization Tips for Zooming Past Your Competition

Conversion optimization (CRO) is one of the most impactful things you can do as a marketer.

I mean, bringing traffic to a website is important (because without traffic you’re designing for an audience of crickets). But without a cursory understanding of conversion optimization—including research, data-driven hypotheses, A/B tests, and analytical capabilities—you risk making decisions for your website traffic using only gut feel.

CRO can give your marketing team ideas for what you can be doing better to convert visitors into leads or customers, and it can help you discover which experiences are truly optimal, using A/B tests.

However, as with many marketing disciplines, conversion optimization is constantly misunderstood. It’s definitely not about testing button colors, and it’s not about proving to your colleagues that you’re right.

I’ve learned a lot about how to do CRO properly over the years, and below I’ve compiled 20 conversion optimization tips to help you do it well, too.

Conversion Optimization Tip 1:
Learn how to run an A/B test properly

Running an A/B test (an online controlled experiment) is one of the core practices of conversion optimization.

Testing two or more variations of a given page to see which performs best can seem easy, thanks to increasingly simple testing software. However, it’s still a methodology that uses statistical inference to decide which variant should be served to your audience. And there are a lot of nuances that can throw things off.

What is A/B Testing?

There are many nuances we could get into here—Bayesian vs. frequentist statistics, one-tailed vs. two-tailed tests, etc.—but to make things simple, here are a few testing rules that should help you breeze past most common testing mistakes:

  • Always determine a sample size in advance and wait until your experiment is over before looking at “statistical significance.” You can use one of several online sample size calculators to get yours figured out, or see the rough sketch just after this list.
  • Run your experiment for a few full business cycles (usually weekly cycles). A normal experiment may run for three or four weeks before you call your result.
  • Choose an overall evaluation criterion (or north star metric) that you’ll use to determine the success of an experiment. We’ll get into this more in Tip 4.
  • Before running the experiment, clearly write your hypothesis (here’s a good article on writing a true hypothesis) and how you plan to follow up on the experiment, whether it wins or loses.
  • Make sure your data tracking is implemented correctly so you’ll be able to pull the right numbers after the experiment ends.
  • Avoid interaction effects if you’re running multiple concurrent experiments.
  • QA your test setup and watch the early numbers for any wonky technical mistakes.
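
To make the sample size bullet concrete, here’s a minimal sketch of the math most online calculators run for a two-proportion A/B test. The 3% baseline rate, 10% relative lift, significance level, and power below are illustrative assumptions, not recommendations.

  from statistics import NormalDist

  def sample_size_per_variant(baseline, min_lift, alpha=0.05, power=0.80):
      # Two-tailed test for a difference between two conversion rates.
      p1 = baseline
      p2 = baseline * (1 + min_lift)  # e.g. a 10% relative lift
      z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
      z_beta = NormalDist().inv_cdf(power)
      variance = p1 * (1 - p1) + p2 * (1 - p2)
      return int(((z_alpha + z_beta) ** 2 * variance) / (p1 - p2) ** 2) + 1

  # A 3% baseline and a 10% relative lift need roughly 53,000 visitors per variant.
  print(sample_size_per_variant(0.03, 0.10))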

I like to put all of the above fine details in an experiment document with a unique ID so that it can be reviewed later—and so the process can be improved upon with time.

An example of experiment documentation using a unique ID.
Tip 1: Ensure you take the time to set up the parameters of your A/B test properly before you begin. Early mistakes and careless testing can compromise the results.

Conversion Optimization Tip 2:
Learn how to analyze an A/B test

The ability to analyze your test after it has run is obviously important as well (and can be pretty nuanced depending on how detailed you want to get).

For instance, do you call a test a winner if it’s above 95% statistical significance? Well, that’s a good place to begin, but there are a few other considerations as you develop your conversion optimization chops:

  • Does your experiment have a sample ratio mismatch?
    Basically, if your test was set up so that 50% of traffic goes to the control and 50% goes to the variant, your end results should reflect this ratio. If the ratio is pretty far off, you may have had a buggy experiment. (Here’s a good calculator to help you determine this, or see the sketch just after this list.)
  • Bring your data outside of your testing tool.
    It’s nice to see your aggregate data trends in your tool’s dashboard, and their math is a good first look, but I personally like to have access to the raw data. This way you can analyze it in Excel and really trust it. You can also import your data to Google Analytics to view the effects on key segments.
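
Here’s a minimal sketch of the sample ratio mismatch check from the first bullet, assuming a 50/50 split was intended. It’s a one-degree-of-freedom chi-square test; a very small p-value (say, below 0.001) suggests the traffic split itself was buggy and the results shouldn’t be trusted.

  from math import sqrt
  from statistics import NormalDist

  def srm_p_value(control_visitors, variant_visitors, expected_ratio=0.5):
      total = control_visitors + variant_visitors
      expected_control = total * expected_ratio
      expected_variant = total - expected_control
      chi2 = ((control_visitors - expected_control) ** 2 / expected_control
              + (variant_visitors - expected_variant) ** 2 / expected_variant)
      # With one degree of freedom, chi-square is the square of a standard normal.
      return 2 * (1 - NormalDist().cdf(sqrt(chi2)))

  print(srm_p_value(50_421, 49_151))  # ~0.00006, so investigate before analyzing further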

Viewing the effects on key segments can also open up the opportunity for further insights-driven experiments and personalization. Does one segment react overwhelmingly positively to a test you’ve run? That might be a good opportunity to implement personalization.

Checking your overall success metric first (winner, loser, inconclusive) and then moving to a more granular analysis of segments and secondary effects is common practice among CRO practitioners.

Here’s how Chris McCormick from PRWD explains the process:

Once we have a high level understanding of how the test has performed, we start to dig below the surface to understand if there are any patterns or trends occurring. Examples of this would be: the day of the week, different product sets, new vs returning users, desktop vs mobile etc.

Also, there are tons of great A/B test analysis tools out there, like this one from CXL:

AB Test Calculator
Tip 2: Analyze your data carefully by ensuring that your sample ratio is correct. Then export it to a spreadsheet where you can check your overall success metric before moving on to more granular indicators.

Conversion Optimization Tip 3:
Learn how to design your experiments

At the beginning, it’s important to consider the kind of experiment you want to run. There are a few options in terms of experimental design (at least, these are the most common ones online):

  1. A/B/n test
  2. Multivariate test
  3. Bandit test

A/B/n test

An A/B/n test is what you’re probably most used to.

It splits traffic equally among two or more variants and you determine which test won based on its effect size (assuming that other factors like sample size and test duration were sufficient).

ABCD Test Example
An A/B test with four variants.

Multivariate test

In a multivariate test, on the other hand, you can test several variables on a page and hope to learn what the interaction effects are among elements.

In other words, if you were changing a headline, a feature image, and a CTA button, in a multivariate test you’d hope to learn which is the optimal combination of all of these elements and how they affect each other when grouped together.
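
To see why multivariate tests demand much more traffic, here’s a small illustration: a full-factorial test has to fill every combination of elements with enough visitors. The element variations below are hypothetical.

  from itertools import product

  headlines = ["Original", "Benefit-led"]
  hero_images = ["Product shot", "Customer photo", "Illustration"]
  cta_buttons = ["Start free trial", "Get a demo"]

  combinations = list(product(headlines, hero_images, cta_buttons))
  print(len(combinations), "combinations to fill with traffic")  # 2 x 3 x 2 = 12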

A Multivariate Test

Generally speaking, it seems like experts run about ten A/B tests for every multivariate test. The strategy I go by is:

  • Use A/B testing to determine best layouts at a more macro-level.
  • Use MVT to polish the layouts to make sure all the elements interact with each other in the best possible way.

Bandit test

Bandits are a bit different. They are algorithms that seek to automatically update their traffic distribution based on indications of which result is best. Instead of waiting for four weeks to test something and then exposing the winner to 100% traffic, a bandit shifts its distribution in real time.

Experimental Design: Bandits

Bandits are great for campaigns where you’re looking to minimize regret, such as short-term holiday campaigns and headline tests. They’re also good for automation at scale and targeting, specifically when you have lots of traffic and targeting rules and it’s tough to manage them all manually.

Unfortunately, while they are simpler from an experimental design perspective, they are much harder for engineers to implement technically. This is probably why they’re less common in the general marketing space, but an interesting topic nonetheless. If you want to learn more about bandits, read this article I wrote on the topic a few years ago.
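
As a hedged illustration (the article doesn’t prescribe a specific algorithm), here’s epsilon-greedy, one of the simplest bandit strategies: most traffic goes to the best-performing variant so far, while a small share keeps exploring the others.

  import random

  def pick_variant(stats, epsilon=0.10):
      """stats maps variant name -> [conversions, visitors]."""
      if random.random() < epsilon:
          return random.choice(list(stats))  # explore: serve a random variant
      # exploit: serve the variant with the best conversion rate so far
      return max(stats, key=lambda v: stats[v][0] / max(stats[v][1], 1))

  stats = {"headline_a": [30, 1000], "headline_b": [42, 1000]}
  for _ in range(100):
      variant = pick_variant(stats)
      stats[variant][1] += 1  # record the visit; increment stats[variant][0] on conversion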

Tip 3: Consider the kind of experiment you want to run. Depending on your needs, you might run an A/B/n test, a multivariate test, a bandit test, or some other form of experimental design.

Conversion Optimization Tip 4:
Choose your OEC

Returning to a point made earlier, it’s important to choose which north star metric you care about: this is your OEC (Overall Evaluation Criterion). If you don’t state this and agree upon it up front as stakeholders in an experiment, you’re welcoming the opportunity for ambiguous results and cherry-picked data.

Basically, we want to avoid the problem of HARKing: hypothesizing after results are known.

Twitter, for example, wrote on their engineering blog that they solve this by stating their overall evaluation criterion up front:

One way we guide experimenters away from cherry-picking is by requiring them to explicitly specify the metrics they expect to move during the set-up phase….An experimenter is free to explore all the other collected data and make new hypotheses, but the initial claim is set and can be easily examined.

The term OEC was popularized by Ronny Kohavi at Microsoft, and he’s written many papers that include the topic, but the sentiment is widely known by people who run lots of experiments. You need to choose which metric really matters, and which metric you’ll make decisions with.

Tip 4: In order to avoid ambiguous or compromised data, state your OEC (Overall Evaluation Criterion) before you begin and hold yourself to it. And never hypothesize after results are known.

Conversion Optimization Tip 5:
Some companies shouldn’t A/B test

You can still do optimization without A/B testing, but not every company can or should run A/B tests.

It’s a simple mathematical limitation:

Some businesses just don’t have the volume of traffic or discrete conversion events to make it worth running experiments.

Getting an adequate amount of traffic to a test ultimately helps ensure its validity: you need enough visitors to reach your predetermined sample size before a test is fully baked.

In addition, even if you could possibly squeeze out a valid test here and there, the marginal gains may not justify the costs when you compare it to other marketing activities in which you could engage.
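
As a quick back-of-the-envelope feasibility check (with illustrative numbers, reusing the per-variant figure from the Tip 1 sketch):

  weekly_visitors_to_page = 4_000
  visitors_needed_per_variant = 53_000  # from the Tip 1 sketch: 3% baseline, 10% relative lift
  weeks_needed = (2 * visitors_needed_per_variant) / weekly_visitors_to_page
  print(f"~{weeks_needed:.0f} weeks to finish one A/B test")  # ~27 weeks: probably not worth it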

That said, if you’re in this boat, you can still optimize. You can still set up adequate analytics, run user tests on prototypes and new designs, watch session replays, and fix bugs.

Running experiments is a ton of fun, but not every business can or should run them (at least not until they bring some traffic and demand through the door first).

Tip 5: Determine whether your company can or even should run A/B tests. Consider both your volume of traffic and the resources you’ll need to allocate before investing the time.

Conversion Optimization Tip 6:
Landing pages help you accelerate and simplify testing

Using landing pages is correlated with greater conversions, largely because using them makes it easier to do a few things:

  • Measure discrete transitions through your funnel/customer journey.
  • Run controlled experiments (reducing confounding variables and wonky traffic mixes).
  • Test changes across templates to more easily reach a large enough sample size to get valid results.

To the first point, having a distinct landing page (i.e. something separate and easier to update than your website) gives you an easy tracking implementation, no matter what your user journey is.

For example, if you have a sidebar call to action that brings someone to a landing page, and then when they convert, they are brought to a “Thank You” page, it’s very easy to track each step of this and set up a funnel in Google Analytics to visualize the journey.

Google Analytics Funnel

Landing pages also help you scale your testing results while minimizing the resource cost of running the experiment. Ryan Farley, co-founder and head of growth at LawnStarter, puts it this way:

At LawnStarter, we have a variety of landing pages….SEO pages, Facebook landing pages, etc. We try to keep as many of the design elements such as the hero and explainer as similar as possible, so that way when we run a test, we can run it sitewide.

That is, if you find something that works on one landing page, you can apply it to several you have up and running.

Tip 6: Use landing pages to make it easier to test. Unbounce lets you build landing pages in hours—no coding required—and conduct unlimited A/B tests to maximize conversions.

Conversion Optimization Tip 7:
Build a growth model for your conversion funnel

Creating a model like this requires stepping back and asking, “how do we get customers?” From there, you can model out a funnel that best represents this journey.

Most of the time, marketers set up simple goal funnel visualization in Google Analytics to see this:

Google Analytics Funnel Visualization

This gives you a lot of leverage for future analysis and optimization.

For example, if one of the steps in your funnel is to land on a landing page, and your landing pages all have a similar format (e.g. offers.site.com), then you can see the aggregate conversion rate of that step in the funnel.

More importantly, you can run interesting analyses, such as path analysis and landing page comparison. Doing so, you can compare apples to apples with your landing pages and see which ones are underperforming:

Landing Page Comparison
The bar graph on the right allows you to quickly see how landing pages are performing compared to the site average.

I talk more about the process of finding underperforming landing pages in my piece on content optimization if you want to learn step-by-step how to do that.
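
If you want to sanity-check a funnel outside of Google Analytics, a tiny model like this is often enough to make the weak step obvious (the step names and counts below are made up for illustration):

  funnel = [
      ("Landing page view", 12_000),
      ("Form submission",    1_450),
      ("Thank-you page",     1_390),
      ("Became customer",      210),
  ]

  for (name, count), (_, prev) in zip(funnel[1:], funnel[:-1]):
      print(f"{name:<18} {count / prev:6.1%} of previous step")
  print(f"Overall: {funnel[-1][1] / funnel[0][1]:.2%} of visitors became customers")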

Tip 7: Model out a funnel that represents the customer journey so that you can more easily target underperforming landing pages and run instructive analyses focused on growth.

Conversion Optimization Tip 8:
Pick low hanging fruit in the beginning

This is mostly advice from personal experience, so it’s anecdotal: when you first start working on a project or in an optimization role, pick off the low hanging fruit. By that, I mean over-index on the “ease” side of things and get some points on the board.

It may be more impactful to set up and run complex experiments that require many resources, but you’ll never muster the political influence necessary to set these up without first building confidence in your ability to get results, and in the CRO process in general.

To inspire trust and to be able to command more resources and confidence, look for the easiest possible implementations and fixes before moving onto the complicated or risky stuff.

And fix bugs and clearly broken things first! Persuasive copywriting is pretty useless if your site takes days to load or pages are broken on certain browsers.

Tip 8: Score some easy wins by targeting low hanging fruit before you move on to more complex optimization tasks. Early wins give you the clout to drive bigger experiments later on.

Conversion Optimization Tip 9:
Where possible, reduce friction

Most conversion optimization falls under two categories (this is simplified, but mostly true):

  • Increasing motivation
  • Decreasing friction

Friction occurs when visitors become distracted, when they can’t accomplish a task, or simply when a task is arduous to accomplish. Generally speaking, the more “nice to have” your product is, the more friction matters to the conversion. This is reflected in BJ Fogg’s behavior model:

BJ Fogg's Behavior Model

In other words, if you need to get a driver’s license, you’ll put up with pure hell at the DMV to get it, but you’ll drop out of the funnel at the most innocent error message if you’re only trying to buy something silly on drunkmall.com.

A few things that cut down on friction:

  • Make your site faster.
  • Trim needless form fields.
  • Cut down the number of steps in your checkout or signup flow.

For an example on the last one, I like how Wordable designed their signup flow. You start out on the homepage:

Wordable

Click “Try It Free” and get a Google OAuth screen:

Wordable OAuth

Give permissions:

Wordable permissions

And voila! You’re in:

Wordable Dashboard

You can decrease friction by reducing feelings of uncertainty as well. Most of the time, this is done with copywriting or reassuring design elements.

An example is HubSpot’s form builder. We emphasize that it’s “effortless” and that there is “no technical expertise required” to set it up:

Hubspot Form Builder

(And here’s a little reminder that HubSpot integrates beautifully with Unbounce, so you’ll be able to automatically populate your account with lead info collected on your Unbounce landing pages.)

Tip 9: Cut down on anything that makes it harder for users to convert. This includes making sure your site is fast and trimming any forms or steps that aren’t necessary for checkout or signup.

Conversion Optimization Tip 10:
Help increase motivation

The second side of the conversion equation, as I mentioned, is motivation.

An excellent way to increase the motivation of a visitor is simply to make the process of conversion…fun. Most tasks online don’t need to be arduous or frustrating; we’ve just made them that way through apathy and error.

Take, for example, your standard form or survey. Pretty boring, right?

Well, today, enough technological solutions exist to implement interactive or conversational forms and surveys.

One such solution is Survey Anyplace. I asked their founder and CEO, Stefan Debois, about how their product helps motivate people to convert, and here’s what he said:

An effective and original way to increase conversion is to use an interactive quiz on your website. Compared to a static form, people are more likely to engage in a quiz, because they get back something useful. An example is Eneco, a Dutch Utility company: in just 6 weeks, they converted more than 1000 website visitors with a single quiz.

Entire companies have been built on the premise that the typical form is boring and could be made more fun and pleasant to complete (e.g. Typeform). Just think, “How can I compel more people to move through this process?”

Other ways to do this that are quite commonplace involve invoking certain psychological triggers to compel forward momentum:

  • Implement social proof on your landing pages.
  • Use urgency to compel users to act more quickly.
  • Build out testimonials with well-known users to showcase authority.

There are many more ways to use psychological triggers to motivate conversions. Check out Robert Cialdini’s classic book, Influence, to learn more. Also, check out The Wheel of Persuasion for inspiration on persuasive triggers.

Tip 10: Make your conversion process fun in order to compel your visitors to keep moving forward. Increased interactivity, social proof, urgency, and testimonials that showcase authority can all help you here too.

Conversion Optimization Tip 11:
Clarity > Persuasion

While persuasion and motivation are really important, often the best way to convert visitors is to ensure they understand what you’re selling.

Stated differently, clarity trumps persuasion.

Use a five-second test to find out how clear your messaging is.

Conversion Optimization Tip 12:
Consider the “Pre-Click” Experience

People forget the pre-click experience. What does a user do before they hit your landing pages? What ad did they click? What did they search in Google to get to your blog post?

Knowing this stuff can help you create strong message match between your pre-click experience and your landing page.

Sergiu Iacob, SEO Manager at Bannersnack, explains their process for factoring in keywords:

When it comes to organic traffic, we establish the user intent by analyzing all the keywords a specific landing page ranks for. After we determine what the end result should look like, we adjust both our landing page and our in app user journey. The same process is used in the optimization of landing pages for search campaigns.

I’ve recommended the same thing before when it comes to capturing email leads. If you can’t figure out why people aren’t converting, figure out what keywords are bringing them to your site.

Usually, this results in a sort of passive “voice of customer” mining, where you can message match the keywords you’re ranking for with the offer on that page.

It makes it much easier to predict what messages your visitors will respond to. And it is, in fact, one of the cheapest forms of user research you can conduct.

AHRefs Keywords
Using Ahrefs to determine what keywords brought traffic to a page.
Tip 12: Don’t forget the pre-click experience. What do your users do before they hit your landing page? Make sure you have a strong message match between your ads (or emails) and the pages they link to.

Conversion Optimization Tip 13:
Build a repeatable CRO process

Despite some popular blog posts, conversion optimization isn’t about a series of “conversion tactics” or “growth hacks.” It’s about a process and a mindset.

Here’s how Peep Laja, founder of CXL, put it:

The quickest way to figure out whether someone is an amateur or a pro is this: amateurs focus on tactics (make the button bigger, write a better headline, give out coupons etc) while pros have a process they follow.

And, ideally, the CRO process is a never-ending one:

CRO Process

Conversion Optimization Tip 14:
Invest in education for your team

CRO people have to know a lot about a lot:

  • Statistics
  • UX design
  • User research
  • Front-end technology
  • Copywriting

No one comes out of the gate as a 10 out of 10 in all of those areas (most never end up there either). You, as an optimizer, need to be continuously learning and growing. If you’re a manager, you need to make sure your team is continuously learning and growing.

Conversion Optimization Tip 15:
Share insights

The fastest way to scale and leverage experimentation is to share your insights and learnings among the organization.

This becomes more and more valuable the larger your company grows. It also becomes harder and harder the more you grow.

Essentially, by sharing you can avoid reinventing the wheel, you can bring new teammates up to speed faster, and you can scale and spread winning insights to teams who then shorten their time to testing. Invest in some sort of insights management system, no matter how basic.

Entire products have been built around this, such as GrowthHackers’ North Star and Effective Experiments.

Effective Experiments
Tip 15: Share what you learn within your organization. The bigger your company grows, the more important information sharing becomes—but the more difficult it will become as well.

Conversion Optimization Tip 16:
Keep your cognitive biases in check

As the great Richard Feynman once said, “The first principle is that you must not fool yourself and you are the easiest person to fool.”

We’re all afflicted by cognitive biases, ranging from confirmation bias to the availability heuristic. Some of these can really impact our testing programs, specifically confirmation bias (and its close cousin, the Texas Sharpshooter Fallacy) where you only seek out pieces of data that confirm your previous beliefs and throw out those that go against them.

Experimenter Bias

It may be worthwhile (and entertaining) simply to run down Wikipedia’s giant list of cognitive biases and gauge where you may currently be running blind or biased.

Tip 16: Be cognizant of your own cognitive biases. If you’re not careful, they can influence the outcome of your experiments and cause you to miss (or misinterpret) key insights in your data.

Conversion Optimization Tip 17:
Evangelize CRO to your greater org

Having a dedicated CRO team is great. Evangelizing the work you’re doing to the rest of the organization? Even better.

Evangelize your CRO
Spread the word about the importance of CRO within your org.

When an entire organization buys into the value of data-informed decision making and experimentation, magical things can happen. Ideas burst forth, and innovation becomes easy. Annoying roadblocks are deconstructed. HiPPO-driven decision making is deprioritized behind proper experiments.

Things you can do to evangelize CRO and experimentation:

  • Write down your learnings each week on a company wiki.
  • Send out a newsletter with live experiments and experiment results each week to interested parties.
  • Recruit an executive sponsor with lots of internal influence.
  • Sing your praises when you get big wins. Sing it loud.
  • Make testing fun, and make it easier for others to join in and pitch ideas.
  • Make it easier for people outside of the CRO team to sponsor tests.
  • Say the word “hypothesis” a lot (who knows, it might work).

This is all a kind of art; there are no universal methods for spreading the good gospel of CRO. But it’s important that you know it’s probably going to be something of an uphill battle, depending on how big your company is and what the culture has traditionally been like.

Tip 17: Spread the gospel of CRO across your organization in order to ensure others buy into the value of data-driven decision making and experimentation.

Conversion Optimization Tip 18:
Be skeptical with CRO case studies

This isn’t so much a conversion optimization tip as it is life advice: be skeptical, especially when marketing is involved.

I say this as a marketer. Marketers exaggerate stuff. Some marketers omit important details that would derail a narrative. Sometimes, they don’t understand p-values, or how to set up a proper test (maybe they haven’t read Tip 1 in this article).

In short, especially in content marketing, marketers are incentivized to publish sensational case studies regardless of their statistical merit.

All of that results in a pretty grim standard for the current CRO case study.

Don’t get me wrong, some case studies are excellent, and you can learn a lot from them. Digital Marketer lays out a few rules for detecting quality case studies:

  • Did they publish total visitors?
  • Did they share the lift percentage correctly?
  • Did they share the raw conversions? (Does the lack of raw conversions hurt my case study?)
  • Did they identify the primary conversion metric?
  • Did they publish the confidence rate? Is it >90%?
  • Did they share the test procedure?
  • Did they only use data to justify the conclusion?
  • Did they share the test timeline and date?

Without context or knowledge of the underlying data, a case study might be a whole lot of nonsense. And if you want a good cathartic rant on bad case studies, then Andrew Anderson’s essay is a must-read.

According to a study...
Tip 18: Approach existing material on CRO with a skeptical mindset. Marketers are often incentivized to publish case studies with sensational results, regardless of the quality of the data that supports them.

Conversion Optimization Tip 19:
Calculate the cost of additional research vs. just running it

Matt Gershoff, CEO of Conductrics, is one of the smartest people I know regarding statistics, experimentation, machine learning, and general decision theory. He has stated some version of the following on a few occasions:

  • Marketing is about decision-making under uncertainty.
  • It’s about assessing how much uncertainty is reduced with additional data.
  • It must consider, “What is the value in that reduction of uncertainty?”
  • And it must consider, “Is that value greater than the cost of the data/time/opportunity costs?”

Yes, conversion research is good. No, you shouldn’t run blind and just test random things.

But at the end of the day, we need to calculate how much additional value a reduction in uncertainty via additional research gives us.

If you can run a cheap A/B test that takes almost no time to set up? And it doesn’t interfere with any other tests or present an opportunity cost? Ship it. Because why not?

But if you’re changing an element of your checkout funnel that could prove to be disastrous to your bottom line, well, you probably want to mitigate any possible downside. Bring out the heavy guns—user testing, prototyping, focus groups, whatever—because this is a case where you want to reduce as much uncertainty as possible.
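
As a back-of-the-envelope sketch of that trade-off (every number below is made up purely for illustration):

  monthly_revenue_at_risk = 80_000   # revenue flowing through the checkout step
  chance_change_backfires = 0.20     # your prior that the redesign hurts conversions
  expected_drop_if_it_does = 0.15    # how badly it could hurt
  months_until_youd_notice = 3

  expected_loss_prevented = (monthly_revenue_at_risk * chance_change_backfires
                             * expected_drop_if_it_does * months_until_youd_notice)
  research_cost = 5_000              # user testing, prototyping, focus groups

  print(f"Expected loss prevented: ${expected_loss_prevented:,.0f}")
  print("Do the research" if expected_loss_prevented > research_cost else "Just run the test")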

Tip 19: Balance the value of doing more research with the costs (including opportunity costs) associated with it. Sometimes running a quick and dirty A/B test will be sufficient for your needs.

Conversion Optimization Tip 20:
CRO never ends

You can’t just run a few tests and call it quits.

The big wins from the early days of working on a relatively unoptimized site may taper off, but CRO never ends. Times change. Competitors and technologies come and go. Your traffic mix changes. Hopefully, your business changes as well.

As such, even the best test results are perishable, given enough time. So plan to stick it out for the long run and keep experimenting and growing.

Think Kaizen.

Kaizen

Conclusion

There you go, 20 conversion optimization tips. That’s not all there is to know; this is a never-ending journey, just like the process of growth and optimization itself. But these tips should get you started and moving in the right direction.

A Guide To Embracing Challenges And Excelling At Your UX Design Internship

Erica Chen



This is the story of my UX design internship. I’m not saying that your internship is going to be anything like mine. In fact, if there’s one thing I can say to shape your expectations, it would be this: be ready to put them all aside. Above all else, remember to give yourself space and time to learn. I share my story as a reminder of how much I struggled and how well everything went despite my difficulties, so that I’ll never stop trying and you won’t either.

It all started in May 2018, when I stepped off the plane in Granada, Spain, with my luggage at my side, laptop on my back, and some very rusty Spanish in my head. It was my first time in Europe and I would be here for the next three months doing an internship in UX design at Badger Maps. I was still pretty green in UX, having been learning about it for barely a year at that point, but I felt ready and eager to gain experience in a professional setting.

Follow along as I learned how to apply technical knowledge to complete the practical design tasks assigned to me:

  • Create a design system for our iOS app using Sketch;
  • Design a new feature that would display errors occurring in data imports;
  • Learn the basics of HTML, CSS, and Flexbox to implement my design;
  • Create animations with Adobe Illustrator and After Effects.

This article is intended for beginners like me. If you are new to UX design and looking to explore the field — read on to learn whether a UX design internship is the right thing for you! For me, the work I ended up completing went well beyond my expectations. I learned how to create a design system, how to compromise between design and user needs, the challenges of implementing a new design, and how to create some “moments of delight.” Every day at the internship presented something new and unpredictable. At the conclusion of my internship, I realized I had created something real, something tangible, and it was like everything I had struggled with suddenly fell into place.

Recommended reading: How To Land A First-Rate Graphic Design Internship

Chapter 1: Legos

My first task was to create a design system for our existing iOS app. I had created design systems in the past for my own projects and applications, but I had never done them retrospectively and never for a design that wasn’t my own. To complete the assignment, I needed to reverse engineer the mockups in Sketch; I would first need to update and optimize the file in order to create the design system.


Organizing the Sketch file to create a design system.

It was also at this opportune moment that I learned the version of Sketch on my computer was about a year and a half out of date. I didn’t know about any of the symbols, overrides, and other features in the newer versions. Lesson learned: keep your software updated.


Creating footers and working with overrides in Sketch.

Before worrying about the symbols page, I went through the mockups artboard by artboard, making sure they were updated and true to the current released version of the application. Once that was done, I began creating symbols and overrides for different elements. I started with the header and footer and moved on from there.

As a rule of thumb, if an element showed up in more than one page, I would make it a symbol. I added different icons to the design system as I went, building up the library. However, it quickly became clear that the design system was evolving and changing faster than I could try to organize it. Halfway through, I stopped trying to keep the symbols organized, opting instead to go back and reorganize them once I had finished recreating each page. When I stopped going back and forth between mockups and symbols and worrying about the organization for both, I could work more efficiently.

It was easy to come to appreciate the overrides and symbols in Sketch. The features made the program much more powerful than what I was used to and increased the workability of the file for future designs. The task of creating the design system itself challenged me to dive deep into the program as well as understand all the details of the design of our application. I began to notice small inconsistencies in spacing, icon size, or font sizes that I was able to correct as I worked.



The final step was to go back into the symbols page and organize everything. I weeded through all the symbols, deleted those not in use and any replicas. Despite being a little tedious, this was a very valuable step in the process. Going through the symbols after working through the document gave me a chance to reevaluate how I had created the symbols for each page. Grouping them together forced me to consider how they were related throughout the app.

By going through this thought process, I realized how challenging it was to create a naming system. I needed to create a system broad enough to encompass enough elements, specific enough to avoid being vague, and that could easily be understood by another designer. It took me a few tries before I landed upon a workable system that I was happy with. Ultimately, I organized elements according to where they were used in the application, grouping pieces like lists together. It worked well for an application like Badger that had distinct designs for different features in the app. The final product was a more organized file that would be a lot easier to work with for any future design iterations.


Modernizing the design with new header designs, inspired by native Apple apps.

As a capstone to this project, I experimented with modernizing the design. I redesigned the headers throughout the app, drawing on native Apple apps for inspiration. Happily, the team was excited about it as well and is considering implementing the changes in future updates to the app.

Overall, working on a Sketch file in such detail was an unexpectedly helpful experience. I left with a much greater fundamental understanding of things like font size, color, and spacing by virtue of redoing every page. The exercise of copying an existing design required a minute attention to detail that was very satisfying. It was like putting together a Lego model: I had all the pieces and knew what the end product needed to look like. I just needed to organize everything and put it together to create the finished product. This is one of the reasons why I enjoy doing UX design. It’s about problem solving and piecing together a puzzle to create something that everyone can appreciate.


Dashboard design for the Badger web application.

Chapter 2: The Design

The next part of my internship allowed me to get into the weeds with some design work. The task: to design a new import page for the Badger web application.

The team was working on redesigning the Badger-to-CRM integration to create a system that allowed users to view any data syncs and manage their accounts themselves. The current connection involves a lot of hands-on work from Badger CSAs and AEs to set up and maintain. By providing an interface for users to directly interact with the data imports, we wanted to improve the user experience for our CRM integration.


Existing process: Users currently integrating Badger with their Salesforce accounts can’t manage the flow of information between the two. They can’t view any errors in data being imported to Badger or easily see the status of their import. To the right is the existing errors view for users importing via spreadsheets. We want to improve this user experience and make it accessible to Salesforce-integrated users as well.

My goal was to design a page that would display errors occurring in any data imports and communicate to users how and where to make the necessary changes to their data. If there were many errors associated with a single import, or if users wanted to view all errors at once, they should be able to download an Excel file of all that information.

Objectives

  1. Create an import page that informs the user on the status of an import in process;
  2. Provide a historical record of account syncs between Badger and the CRM with detailed errors associated with each import;
  3. Provide links to the CRM for each account that has an import error in Badger;
  4. Allow users to download an Excel file of all outstanding errors.

User Stories

Badger customer with CRM account:
As a customer with a CRM, I want to be able to connect my CRM to my Badger account and visualize all data syncs so that I’m aware of all errors in the process and can make changes as necessary.

Badger:
As Badger, I want users to be able to manage and view the status of their CRM integration so that I can save time and manual work helping and troubleshooting users syncing their Badger accounts with their CRMs.

Before I really delved into the design, we needed to go through some thinking to decide what information to show and how:

  1. Bulk versus continuous imports
    Depending on the type of user, there are two ways to import data to Badger. If done through spreadsheets, the imports would be batched and we would be able to visualize the imports in groups. Users integrated with their CRMs, however, would need to have their Badger data updated constantly as they made changes within their CRM. The design needed to be able to handle both use cases.
  2. Import records
    Because this was a new feature, we weren’t absolutely sure of the user behavior. Thus, deciding how to organize the information was challenging. Should we allow users to scroll infinitely through their import history? How would they search for a specific import? Should they be able to? Should we show the activity day by day or month by month?

Ultimately, we were only able to make a best guess for each of these issues — knowing that we could make appropriate adjustments in the future once users began using the feature. After thinking these issues out, I moved into wireframing. I had the opportunity to design something completely different and this was both liberating and challenging. The final design was a culmination of individual elements from various designs that were created along the way.

Design Process

The hardest part of this process was learning to start over. I eventually learned that forcing something into my design for purely aesthetic reasons was not ideal. Understanding this and letting my ideas go was key to arriving at a better design. I needed to learn how to start over again and again to explore different ideas.


First few iterations: Experimenting with the placement of the header, buttons, and list design. Feedback at this point and for the next few days was consistently, as it should be: ‘let’s see what else we can do.’ But the advantage of running around like a headless chicken was that I occasionally stumbled upon some kernels of gold that I used in the final design.


One design exploration that stretched a little too far from the Badger application. After this, I circled back a little, but the final design really benefited from exploring such different ideas.

Challenges

1. Using white space

Right off the bat, I needed to explore what information we wanted to show on the page. There were many details we could include — and definitely the room to do it.


Initially, I was very intimidated by the prospect of having a lot of white space and a minimalistic design, so I tried really hard to come up with filler information, 75% of which our users wouldn’t really need. Then I crammed it all into my design, permitting minimal breathing room. A very good attitude for a city planner in San Francisco; not so much for creating user-centric design.

All the unnecessary information added way too much cognitive load and took away from what the user was actually concerned about. Instead of trying to get rid of all the white space, I needed to work with it. With this in mind, I eventually chucked out all the irrelevant information to show only what we expect our users to be most concerned about: the errors associated with data imports.

This was the final version:


Imports organized according to day and month. This was a more logical organization for our purposes, especially because synchronizations between the CRM and Badger were continuous, not just on demand.

2. Navigation

The next challenge was deciding between a sidebar and a header for displaying information. The advantage of the sidebar was that the information would remain visible as the user scrolled. But we also had to ensure that the information contained in the sidebar was logically related to what was going on in the rest of the page.

The header offered the advantage of a clean, single column design. The downside was that it took up a lot of vertical real estate depending on how much information was contained in the header. It also visually prioritized the contents of the header over what was below it for the user.


Iteration exploring the top navigation. Cons: users would scroll through the list of imports to view errors and have to scroll back up to see the summary. The contents and location of the two cells to the right were also confusing. It didn’t make sense for the two cells to scroll with the rest of the page, because they were a summary of all the information to their left, but it would have made for a confusing user experience if they didn’t scroll. Overall, the organization of the information on the page was misaligned with the design.

Once I worked out what information to display where, the sidebar navigation became the more logical decision. We expect users to be primarily concerned with the errors associated with their imports and with a large header, too much of that information would fall below the fold. The sidebar could then be a container for an import and activity summary that would be visible as the user scrolled.

Sidebar design: After I decided on having a sidebar, it came down to deciding what information to include and how to display it.


Different sidebar design explorations. The design became increasingly simple as I narrowed in on the information the users wanted to see.

I struggled to create a design that was visually interesting because there was little information to show. For this reason, I once again found myself adding in unnecessary elements to fill up the space, although I wanted to prioritize the user. I experimented with different content and color combinations, trying to find the compromise between design and usability. The more I worked with it, the more I was able to pare down the design to the bare bones. It became easier to differentiate useful information from filler. The final product is a streamlined design with just a few summary statistics. It also offers great flexibility to include more information in the future.


Final design: Subtext beneath the buttons was removed, and the accounts created/accounts updated information was placed in its own container and shifted down to add visual interest.

Import process: The import progress page was created after the design for the import page was finalized. The biggest design challenge here was deciding how to display the in-progress import sync. I tried different solutions, from pop-ups to overlays, but ultimately settled on showing the progress in the sidebar. This way, users can still resolve any errors and see the historical record of their account data while an import is in progress. To prevent any interruptions to the import, the ‘Sync data’ and ‘Back to Badger’ buttons are disabled so users can’t leave the page.


Sync data and Back to Badger buttons disabled to prevent users from interrupting the sync and going back to the application.

With the designs done, I moved onto HTML and CSS.


Beginning to code my design.

Chapter 3: HTML/CSS

This project was my first experience with any type of coding. Although I had tried to learn HTML and CSS before, I had never reached any level of proficiency. And what better way to start than with a mockup of one’s own design?

Understanding the logic of organizing an HTML document reminded me of organizing the Sketch document with symbols and overrides. However, the similarities ended there. Coding felt like a very alien thing that I was consistently trying to wrap my head around. As my mentor would say, “You’re flexing very different muscles in programming than you are in design.” With the final product in hand now, I’m fully convinced that learning to code is the coolest thing I’ve learned to do since being potty trained.

The first challenge, after setting up a document and understanding the basics, was working with Flexbox. The design I had created involved two columns side by side. The right portion was meant to scroll while the left remained static. Flexbox seemed like a clean solution for this purpose, assuming I could get it to work.

Implementing Flexbox consisted of a lot of trial and error and blind copying of code while I scrambled through various websites, reading tutorials and inspecting code. With guidance from my mentor through this whole process, we eventually got it to work. I will never forget the moment when I finally understood that by using flex-direction: column I would get all of the elements into a single column, and flex-direction: row helped placed them in one row.

It makes so much sense now, although my initial understanding of it was the exact opposite (I thought flex-direction: column would put elements in columns next to each other). Surprisingly, I didn’t even come to this realization until after the code was working. I was reviewing my code and realized I didn’t understand it at all. What tipped me off? In my CSS, I had coded flex-direction: row into the class I named column. This scenario was pretty indicative of how the rest of my first coding experience went. My mental model was rarely aligned with the logic of the code, and they often clashed and went separate ways. When this happened, I had to go back, find my misconceptions, and correct the code.

After setting up Flexbox, I needed to figure out how to get the left column to stay fixed while the right portion scrolled. Turns out this couldn’t be achieved with a single line of code as I had hoped. But working through this helped me understand the parent-child relationship that aided me immensely with the rest of the process.


Vertical timeline with calendar icons.

Coding the vertical timeline and the dial was also a process. The timeline was simpler than I had originally anticipated. I was able to create a thin rectangle, set an inner shadow and a gradient filling to it, and assign it to the width of each activity log.

The dial was tricky. I tried implementing it with pure CSS with very little success. There were a few times I considered changing the design for something simpler (like a progress bar) but I’m quite happy I stuck with it.


Original and final dial designs.

A major struggle was getting the outside progress dial to overlap the background circle along the border. This was where I changed the design a little bit — instead of having the unloaded portion of the progress dial cut out, it overlaps all around. It was a compromise between my design and the code that I was initially unwilling to make. As it turns out, however, I was satisfied with the final result, and once I realized this, I was happy to make that compromise. The final dial was implemented via JavaScript.

There was a moment in my coding process where I threw every line of code I’d ever written into every class to try to make it work. To make up for this lack of foresight, I needed to spend quite a while going through and inspecting all the elements to remove useless code. I felt like a landlord kicking out the tenants who weren’t paying rent. It was most definitely a lesson learned in maintaining a level of housekeeping and being judicious and thoughtful with code.

The majority of the experience felt like blind traversing and retrospective learning. However, nothing was more satisfying than seeing the finished product. Going through the process made me interact with my work in a way I had never done before and gave me insight into how design is implemented. In all of my expectations for the internship, I never anticipated being able to code and create one of my own designs. Even after being told I would be able to do so on my first day, I didn’t believe it until after seeing this page completed.

Chapter 4: Working With Baby Badgers

As part of the process of integrating Badger users with their CRM accounts, we needed our users to sign into their CRM — requiring us to redirect them out of Badger to the native CRM website. To prevent a sudden, jarring switch from one website to another, I needed to design intermediate loading pages.


One of the first mockups of a sample static redirection page. It was simple and fulfilled its purpose but did little else.

I started out with run-of-the-mill static redirection pages. They were simple and definitely fulfilled their purpose, but we weren’t quite happy with them.

The challenge was to create something simple and interesting that informed the user they were leaving our website, all in the few seconds it was visible. The design would need to introduce itself, explain why it was there, and leave before anyone got tired of looking at it. It was essentially an exercise in speed dating. With that in mind, I decided to try animations — specifically that of a cheeky little badger, inspired by the existing logo.


The evolution of “baby badger”.

Using the badger logo as a starting reference point, I created different badger characters in Adobe Illustrator. The original logo felt a little too severe for a loading animation, so I opted for something a little cuter. I kept the red chest and facial features from the original logo for consistency and worked away at creating a body and head around these elements. The head and stripes took a while to massage into shapes that I was happy with. The body took form a little more easily, but it took a while longer to find the right proportion between the size of the head and the body. Once I nailed that down, I was ready to move on to animating.


My attempt at stop-motion animation.

My first instinct was to try a stop-motion animation. I figured it was going to be great — à la Wallace and Gromit. But after the first attempt and then the second, and all the ensuing ones, it became clear that watching that show as a child had not fully equipped me with the skills required to do a stop-motion animation.

I just wasn’t able to achieve the smoothness I wanted, and there were small inconsistencies that felt too jarring for a very short loading animation. Animation typically runs at 24 frames per second, and my badger animation only had about 15. I considered adding more frames but, upon suggestion from my mentor, decided to try character animation instead.

This was the first time I had animated anything that was more than 5 moving parts and there was definitely a learning curve to understanding how to animate a two-dimensional character in a visually satisfying way. I needed to animate the individual elements to move by themselves independent of the whole in order to make the motion believable. As I worked on the animation, the layers I imported became increasingly granular. The head went from being one layer to five as I learned the behavior of the program and how to make the badger move.

I anchored each limb of the body and set each body part as a child of the parent layer of the body. I set the anchor points accordingly at the top of the thighs and shoulders to make sure they moved appropriately and then, using rotations and easing, simulated the movement of the body parts. The head was a tad bit tricky and required some vertical movement independent of the body. To make the jump seem more realistic, I wanted the head to hang in space a little before being pushed up by the rest of the body, and to come down just slightly after the rest of him. I also adjusted the angle of the head: I tried to make it seem as if he were leading with his nose, pointing up during the jump and straight forward while he ran.

The overly anthropomorphic feet were abandoned from the original designs. They were one of the last changes made to baby badger. I hadn’t considered how odd human toes would look on a badger.

The animation featured on the page redirecting the user back to Badger showed the baby badger running back to Badger with a knapsack full of information from the CRM.

Animation of baby badger running back to the badger application.

And finally: the confused badger. This was done for the last page I needed to create: an error page notifying the user of unexpected complications in the integration process. And what better way to do that than with a sympathetic, confused badger?


Design exploration of the baby badger face.

The tricky part here was combining the side profile of the existing cartoon badger and the logo to create a front-facing head shape. Before beginning this project, I had never once seen a real live badger. Needless to say, badgers have found their way into my Google image searches this month. I was surprised to see how flat the head of a badger actually is. In my first few designs, I tried to mimic this but wasn’t satisfied with the result. I worked with the shape some more, adjusting the placement of the nose, the stripes, and the ears to achieve the final result:

Swirly eyes inspired by the possum from the movie Fantastic Mr. Fox.

This animation process has forced me to take my preexisting knowledge to a higher level. I needed to push myself beyond what I knew rather than limiting myself with what I thought I could do. I originally started with the stop-motion animation because I didn’t trust myself to do character animation. By giving myself the chance to try something new and different, I was able to achieve something that exceeded my own expectations.


Four cartoon-style designs based on the baby badger animation.


Designs expanded from the original baby badger to be printed and used around the office and on marketing material. (Large preview)

Conclusion

The three months I spent at my internship were incredibly gratifying. Every single day was about learning and trying something new. There were challenges to everything I did — even with tasks I was more familiar with such as design. Every time I created something, I was very insecure and apprehensive about how it would be received. There was a lot of self-doubt and lots of discarded ideas.

For that reason, it was incredible to be part of a team and to have a mentor to lead me in the right direction. Being told to try something else was often the only encouragement I needed to go back and achieve something bigger and better. I like to picture myself as a rodent in a whack-a-mole game, being hit on the head over and over but always popping up again. Now that the struggles and challenges have come to an end, I only want to do it all over again.

I appreciate what I’ve learned and how I was pushed to go beyond what I thought I could do. It’s crazy to see how far I’ve come in a few months. My understanding of being a UX designer has grown immensely, from figuring out the features, to hammering out the design, and then writing front-end code to implement it. This internship has taught me how much more I have to learn and has motivated me to keep working. I’ve come to understand that what I can do should never be limited by what I know how to do.


badger mascot

Smashing Editorial
(mb, ra, yk, il)


From:

A Guide To Embracing Challenges And Excelling At Your UX Design Internship

Thumbnail

Flexbox: How Big Is That Flexible Box?




Flexbox: How Big Is That Flexible Box?

Rachel Andrew



This is the third part of my series on Flexbox. In the past two articles, we have looked at what happens when you create a flex container and explored alignment as it works in Flexbox. This time we are going to take a look at sizing. How do we control the size of our flex items, and what choices is the browser making when it controls the size?

Initial Display Of Flex Items

If I have a set of items, which have variable lengths of content inside, and set their parent to display: flex, the items will display as a row and line up at the start of the main axis. In the example below, my three items have a small amount of content and are able to display the content of each item as an unbroken line. There is space at the end of the flex container which the items do not grow into, because the initial value of flex-grow is 0 (do not grow).


Three items with space at the end


The flex items have room to each be displayed on one line (Large preview)

If I add more text to these items, they eventually fill the container and the text begins to wrap. The boxes are assigned a portion of the space in the container which corresponds to how much text is in each box — an item with a longer string of text is assigned more space. This means that we don’t end up with a tall, skinny column with a lot of text when the next-door item only contains a single word.


Three items, the final item has longer text and the text wraps


The space is distributed to give more space to a longer item (Large preview)
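As a rough sketch of what sits behind examples like these (the class names are my own, not taken from the demos), the container only needs display: flex, and the items are left with their default flex values:

.container {
    display: flex;
}

/* Nothing is set on the items, so the defaults apply:
   flex-grow: 0, flex-shrink: 1, flex-basis: auto.
   With little content the items sit at their content size;
   with more content they share the container and the text wraps. */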

This behavior is likely to be familiar to you if you have ever used Flexbox, but perhaps you have wondered how the browser works that sizing out, as if you look in multiple modern browsers you will see that they all do the same thing. This is down to the fact that details such as this are worked out in the specification, making sure that anyone implementing Flexbox in a new browser or other user agent is aware of how this calculation is supposed to work. We can use the spec to find this information out for ourselves.

The CSS Intrinsic And Extrinsic Sizing Specification

You fairly quickly discover, when looking at anything about sizing in the Flexbox specification, that a lot of the information you need is in another spec — CSS Intrinsic and Extrinsic Sizing. This is because the sizing concepts we are using aren’t unique to Flexbox, in the same way that alignment properties aren’t unique to Flexbox. However, for how these sizing constructs are used in Flexbox, you need to look in the Flexbox spec. It can feel a little like you are jumping back and forth, so I’ll round up a few key definitions here, which I’ll be using in the rest of the article.

Preferred Size

The preferred size of a box is the size defined by a width or a height, or by the logical aliases for these properties, inline-size and block-size. By using:

.box {
    width: 500px;
}

Or the logical alias inline-size:

.box {
    inline-size: 500px;
}

You are stating that you want your box to be 500 pixels wide, or 500 pixels in the inline direction.

min-content Size

The min-content size is the smallest size that a box can be without causing overflow. If your box contains text then all possible soft-wrapping opportunities will be taken.

max-content Size

The max-content size is the largest size the box can be to contain the contents. If the box contains text with no formatting to break it up, then it will display as one long unbroken string.
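As a quick illustration of these two sizes (my own example, not one from the specification), the keywords can be used directly as a width in supporting browsers:

/* As narrow as possible without overflow: every soft-wrapping
   opportunity in the text is taken. */
.box-min {
    width: min-content;
}

/* As wide as the content demands: unbroken text displays as one
   long line and dictates the width. */
.box-max {
    width: max-content;
}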

Flex Item Main Size

The main size of a flex item is the size it has in the main dimension. If you are working in a row — in English — then the main size is the width. In a column in English, the main size is the height.

Items also have a minimum and maximum main size as defined by their min-width and max-width, or min-height and max-height, in the main dimension.

Working Out The Size Of A Flex Item

Now that we have some terms defined, we can have a look at how our flex items are sized. The initial values of the flex properties are as follows:

  • flex-grow: 0
  • flex-shrink: 1
  • flex-basis: auto

The flex-basis is the thing that sizing is calculated from. If we set flex-basis to 0 and flex-grow to 1 then all of our boxes have no starting width, so the space in the flex container is shared out evenly, assigning the same amount of space to each item.

See the Pen Smashing Flexbox Series 3: flex: 1 1 0; by Rachel Andrew (@rachelandrew) on CodePen.

Whereas if flex-basis is auto and flex-grow: 1, only the spare space is distributed, taking the size of the content into account.

See the Pen Smashing Flexbox Series 3: flex: 1 1 auto short text by Rachel Andrew (@rachelandrew) on CodePen.

In situations where there is no spare space, for example when we have more content than can fit in a single line, then there is no space to distribute.

See the Pen Smashing Flexbox Series 3: flex: 1 1 auto long text by Rachel Andrew (@rachelandrew) on CodePen.
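A minimal sketch of the difference between those demos (the selectors are my own) looks like this:

/* With flex-basis: 0, the items start from zero width, so all of
   the space in the container is distributed, giving each item the
   same share. */
.item-equal {
    flex: 1 1 0;
}

/* With flex-basis: auto, the items start from their content size,
   and only the spare space is distributed according to flex-grow. */
.item-from-content {
    flex: 1 1 auto;
}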

This shows us that figuring out what auto means is pretty important if we want to know how Flexbox works out the size of our boxes. The value of auto is going to be our starting point.

Defining Auto

When auto is defined as a value for something in CSS, it will have a very specific meaning in that context, one that is worth taking a look at. The CSS Working Group spend a lot of time figuring out what auto means in any context, as this talk by spec editor Fantasai explains.

We can find the information about what auto means when used as a flex-basis in the specification. The terms defined above should help us dissect this statement.

“When specified on a flex item, the auto keyword retrieves the value of the main size property as the used `flex-basis`. If that value is itself auto, then the used value is `content`.”

So if our flex-basis is auto, Flexbox has a look at the defined main size property. We would have a main size if we had given any of our flex items a width. In the below example, the items all have a width of 110px, so this is being used as the main size as the initial value for flex-basis is auto.

See the Pen Smashing Flexbox Series 3: flex items with a width by Rachel Andrew (@rachelandrew) on CodePen.
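Reproduced as a sketch (the 110px comes from the example above; the class name is my own assumption about the demo’s markup):

.item {
    /* With the default flex values (0 1 auto), flex-basis: auto
       falls back to this width, so 110px is used as the item's
       main size. */
    width: 110px;
}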

However, our initial example has items which have no width; this means that their main size is auto, and so we need to move on to the next sentence: “If that value is itself auto, then the used value is content.”

We now need to look at what the spec says about the content keyword. This is another value that you can use (in supporting browsers) for your flex-basis, for example:

.item {
    flex: 1 1 content;
}

The specification defines content as follows:

“Indicates an automatic size based on the flex item’s content. (It is typically equivalent to the max-content size, but with adjustments to handle aspect ratios, intrinsic sizing constraints, and orthogonal flows.)”

In our example, with flex items that contain text, we can ignore some of the more complicated adjustments and treat content as being the max-content size.

So this explains why, when we have a small amount of text in each item, the text doesn’t wrap. The flex items are auto-sized, so Flexbox is looking at their max-content size, the items fit in their container at that size, and the job is done!

The story doesn’t end here, as when we add more content the boxes don’t stay at max-content size. If they did they would break out of the flex container and cause overflow. Once they fill the container, the content begins to wrap and the items become different sizes based on the content inside them.

Resolving Flexible Lengths

It’s at this point that the specification starts to look reasonably complex; however, the steps that need to happen are as follows:

First, add up the main size of all the items and see if it is bigger or smaller than the available space in the container.

If the container size is bigger than the total, we are going to care about the flex-grow factor, as we have space to grow.


flex items with spare space at the end


In the first case our items have available space to grow into. (Large preview)

If the container size is smaller than the total then we are going to care about the flex-shrink factor as we need to shrink.


flex items overflowing the container


In the second case our items are too large and need to shrink to fit into the container. (Large preview)

Freeze any inflexible items, which means that we can decide on a size for certain items already. If we are using flex-grow, this would include any items which have flex-grow: 0. This is the scenario we have when our flex items have space left in the container. The initial value of flex-grow is 0, so they get as big as their max-content size and then they don’t grow any more from their main size.

If we are using flex-shrink then this would include any items with flex-shrink: 0. We can see what happens in this step if we give our set of flex items a flex-shrink factor of 0. The items become frozen in their max-content state and so do not flex and arrange themselves to fit in the container.

See the Pen Smashing Flexbox Series 3: flex: 0 0 auto by Rachel Andrew (@rachelandrew) on CodePen.

In our case — with the initial values of flex items — our items can shrink. So the steps continue and the algorithm enters a loop in which it works out how much space to assign or take away. In our case we are using flex-shrink as the total size of our items is bigger than the container, so we need to take away space.

The flex-shrink factor is multiplied by the item’s inner base size; in our case, that is the max-content size. This gives a value with which to reduce space. If items removed space only according to the flex-shrink factor, then small items could essentially vanish, having had all of their space removed, while the larger item still has space to shrink.

There is an additional step in this loop to check for items which would become smaller or larger than their target main size, in which case the item stops growing or shrinking. Again, this is to avoid certain items becoming tiny, or massive in comparison to the rest of the items.

All of that is simplified in terms of the spec, as I’ve not looked at some of the more edge-case scenarios, and you can generally simplify further in your mind, assuming you are happy to let Flexbox do its thing and are not after pixel perfection. Remembering the following two facts will work in most cases.

If you are growing from auto then the flex-basis will either be treated as any width or height on the item or the max-content size. Space will then be assigned according to the flex-grow factor using that size as a starting point.

If you are shrinking from auto then the flex-basis will either be treated as any width or height on the item or the max-content size. Space will then be removed according to the flex-basis size multiplied by the flex-shrink factor, and therefore removed in proportion to the max-content size of the items.
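To make the shrinking case concrete, here is a small worked example with numbers of my own choosing (and ignoring min-content floors):

.container {
    display: flex;
    width: 600px;
}

/* The two items use the defaults (flex: 0 1 auto) and have no
   width, so each flex base size is its max-content size:
   item A: 500px, item B: 300px, total 800px.

   800px - 600px = 200px must be removed. Both flex-shrink factors
   are 1, so space is removed in proportion to the base sizes:
   item A loses 200px * (500 / 800) = 125px and ends up around 375px,
   item B loses 200px * (300 / 800) = 75px and ends up around 225px.
   The larger item gives up more space, which is why small items
   don't simply vanish. */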

Controlling Growing And Shrinking

I’ve spent most of this article describing what Flexbox does when left to its own devices. You can, of course, exercise greater control over your flex items by using the flex properties. They will hopefully seem more predictable with an understanding of what is happening behind the scenes.

By setting your own flex-basis, or giving the item itself a size which is then used as the flex-basis, you take back control from the algorithm, telling Flexbox that you want to grow or shrink from this particular size. You can turn off growing or shrinking altogether by setting flex-grow or flex-shrink to 0. At this point, however, it is worth treating the desire to control flex items as a prompt to check whether you are using the right layout method. If you find yourself trying to line up flex items in two dimensions then you might be better choosing Grid Layout.

If your flex items are ending up an unexpected size, then this is usually because your flex-basis is auto and there is something giving that item a width, which is then being used as the flex-basis. Inspecting the item in DevTools may help identify where the size is coming from. You can also try setting a flex-basis of 0 which will force Flexbox to treat the item as having zero width. Even if this isn’t the outcome that you want, it will help to identify the flex-basis value in use as being the culprit for your sizing issues.
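As a hedged sketch of taking back control in this way (the 200px value is arbitrary):

.item {
    /* Grow and shrink from an explicit 200px rather than from the
       content-based size the algorithm would otherwise use. */
    flex: 1 1 200px;
}

.item-rigid {
    /* No growing, no shrinking: the item stays at 200px. */
    flex: 0 0 200px;
}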

Flex Gaps

A much-requested feature of Flexbox is the ability to specify gaps or gutters between flex items in the same way that we can specify gaps in grid layout and multi-column layout. This feature is specified for Flexbox as part of Box Alignment, and the first browser implementation is on the way. Firefox expects to ship the gap properties for Flexbox in Firefox 63. The following example can be viewed in Firefox Nightly.

See the Pen Smashing Flexbox Series 3: flex-gaps by Rachel Andrew (@rachelandrew) on CodePen.


Three rows of items with gutter spacing between them


The image as seen in Firefox 63 (Large preview)

As with grid layout, the length of the gap is taken into account before space is distributed to flex items.
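The demo boils down to something like the following (values are my own); at the time of writing it only works in browsers that have shipped the gap properties for Flexbox, such as Firefox 63:

.container {
    display: flex;
    flex-wrap: wrap;
    /* Row gap of 10px and column gap of 20px; a single value
       would set both. */
    gap: 10px 20px;
}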

Wrapping Up

In this article, I’ve tried to explain some of the finer points of how Flexbox works out how big the flex items are. It can seem a little academic; however, taking some time to understand the way this works can save you huge amounts of time when using Flexbox in your layouts. I find it really helpful to come back to the fact that, by default, Flexbox is trying to give you the most sensible layout of a bunch of items with varying sizes. If an item has more content, it is given more space. If you and your design don’t agree with what Flexbox thinks is best, then you can take control back by setting your own flex-basis.

Smashing Editorial
(il)


Link to article: 

Flexbox: How Big Is That Flexible Box?

Thumbnail

6 Things You Need to Know About CRO & Social Login

cro-social-login

Entrepreneurs are always trying to figure out how to engage their user more and boost their website’s conversion rate. One way to do that is through social sign-on, also referred to as social login or lazy sign-in. With social login, users can access your website using the social account IDs that they already have instead of setting up new login details for your website. And they don’t have to remember a new set of login credentials. Simply put, social login enhances a website’s user experience while allowing marketers to collect more accurate user data, including gender, age, interests, relationship status and a…

The post 6 Things You Need to Know About CRO & Social Login appeared first on The Daily Egg.

Link:

6 Things You Need to Know About CRO & Social Login

Thumbnail

19 Call-to-Action Phrases That Will Convert Your Users

call-to-action-phrases

What happens when nobody clicks on your call-to-action phrases and buttons? You don’t get any leads. Nor do you generate any revenue. That’s the opposite of the point, right? Which is why I tell business owners and marketers to take the time to refine their CTAs. A poorly-written CTA negates all the hard work you do for the rest of your marketing campaign. Someone who visits your website might be with you up until that point, then decide to bail on the conversion. So, how do you write call-to-action phrases that convert? What is the Psychology Behind CTA Phrases? From…

The post 19 Call-to-Action Phrases That Will Convert Your Users appeared first on The Daily Egg.

See the original article here:  

19 Call-to-Action Phrases That Will Convert Your Users

Thumbnail

Increase Clicks with these 12 Call-to-Action Phrases

call-to-action-phrases

What happens when nobody clicks on your call-to-action phrases and buttons? You don’t get any leads. Nor do you generate any revenue. That’s the opposite of the point, right? Which is why I tell business owners and marketers to take the time to refine their CTAs. A poorly-written CTA negates all the hard work you do for the rest of your marketing campaign. Someone who visits your website might be with you up until that point, then decide to bail on the conversion. So, how do you write call-to-action phrases that convert? What is the Psychology Behind CTA Phrases? From…

The post Increase Clicks with these 12 Call-to-Action Phrases appeared first on The Daily Egg.

This article is from – 

Increase Clicks with these 12 Call-to-Action Phrases

Thumbnail

What is the Best Homepage to Have (3 Real Examples)

best-homepage

Remind me: How many chances do you get to make a good impression? Oh, that’s right. One. Just one. If you don’t have the best homepage possible, that first impression becomes negative for website visitors. You lose that first impression forever. Will the visitor come back? Maybe. But you’re playing with fire. There aren’t any new statistics on web design aesthetics and first impressions, but an older study demonstrated that 94 percent of people’s first impressions of a business were related to web design. That’s pretty illustrative. If you have a beautiful, functional, easily navigable homepage, you’re more likely to…

The post What is the Best Homepage to Have (3 Real Examples) appeared first on The Daily Egg.

Read this article: 

What is the Best Homepage to Have (3 Real Examples)

Thumbnail

How to Get Traffic to Your Website (Neil Patel’s Ultimate Guide)

how-to-get-traffic-to-your-website

I don’t think anyone would argue that I’ve written a ton of content over the years about how to get traffic to your website. Still, it’s one of the most important skills to learn. Optimizing your website for conversions won’t matter if you don’t have any traffic to begin with. It’s like expecting to sell merchandise when nobody visits your brick-and-mortar store. Once you know how to get traffic to your website, you can optimize your content and pages based on patterns among your target audience. That’s step two. But as they say, you have to walk before you can…

The post How to Get Traffic to Your Website (Neil Patel’s Ultimate Guide) appeared first on The Daily Egg.

Taken from – 

How to Get Traffic to Your Website (Neil Patel’s Ultimate Guide)

Thumbnail

How to Reduce Bounce Rate Today: 12 Advanced Techniques

how-reduce-bounce-rate

Have you ever walked into a store, turned around, and walked right back out? You bounced. Maybe you didn’t like the look of the store itself, or perhaps you realized the store didn’t sell the type of merchandise you needed. Whatever the case, you took one look and got out of dodge. The same thing happens on your website, and it’s not a good thing. Learning how to reduce bounce rate makes your site “stickier.” It’s more inviting. Visitors want to hang around for a while — and maybe even buy something or convert on one of your offers. A…

The post How to Reduce Bounce Rate Today: 12 Advanced Techniques appeared first on The Daily Egg.

Continue reading:

How to Reduce Bounce Rate Today: 12 Advanced Techniques

Thumbnail

Low Conversion Rate? 13 Reasons (And 13 Fantastic Solutions)

low-conversion-rate-9

A low conversion rate can harm your business by slowing your leads and sales to a trickle. Fortunately, there are plenty of ways to fix the problem. First, though, you need to know why you have a low conversion rate. What’s causing people to bounce from your site or read your content without any other engagement? Data and tools can help you identify the culprit, but it helps to know what red flags to consider. I’m going to take you through 13 potential reasons for your low conversion rate, then answer four common questions involving conversion rate metrics. 13 Reasons…

The post Low Conversion Rate? 13 Reasons (And 13 Fantastic Solutions) appeared first on The Daily Egg.

Visit link: 

Low Conversion Rate? 13 Reasons (And 13 Fantastic Solutions)