6 CRO Mistakes You Might Be Making (And How to Fix Them)

You just ran what you thought was a really promising conversion test. In an effort to raise the number of visitors who convert into demo requests on your product pages, you test an attractive new redesign of one of your pages using a good ol’ A/B test. Half of the people who visit that page see the original product page design, and half see the new, attractive design.

You run the test for an entire month, and as you expected, conversions are up — from 2% to 10%. Boy, do you feel great! You take these results to your boss and advise that, based on your findings, all product pages should be moved over to your redesign. She gives you the go-ahead.

But when you roll out the new design, you notice the number of demo requests goes down. You wonder if it’s seasonality, so you wait a few more months. That’s when you start to notice MRR is decreasing, too. What gives?

Turns out, you didn’t run that test long enough for the results to be statistically significant. Because that product page only saw 50 views per day, you would have needed over 150,000 page views to reach a 95% confidence level, which would take more than eight years. Because you never ran those numbers before calling the test, your company is now losing business.
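The fix is to run the numbers before you launch. Here’s a minimal sketch of that calculation using the standard two-proportion sample-size formula. It assumes scipy is available, and the baseline rate, target lift, and traffic figures below are illustrative rather than taken from the scenario above; the bigger the lift you’re trying to detect, the smaller the sample you need.

```python
# Sketch: minimum sample size for a two-proportion A/B test
# (normal approximation). All numbers below are illustrative.
from scipy.stats import norm

def sample_size_per_variant(p_control, p_variant, alpha=0.05, power=0.8):
    """Visitors needed in EACH variant to detect the difference between
    p_control and p_variant at the given significance level and power."""
    z_alpha = norm.ppf(1 - alpha / 2)   # two-sided 95% confidence
    z_beta = norm.ppf(power)            # 80% chance of catching a real lift
    p_bar = (p_control + p_variant) / 2
    top = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
           + z_beta * (p_control * (1 - p_control)
                       + p_variant * (1 - p_variant)) ** 0.5) ** 2
    return top / (p_control - p_variant) ** 2

# Example: 2% baseline conversion, and you want to catch a lift to 2.5%
n = sample_size_per_variant(0.02, 0.025)
total = 2 * n
views_per_day = 50  # the low-traffic page from the scenario above
print(f"~{n:,.0f} visitors per variant ({total:,.0f} total)")
print(f"At {views_per_day} views/day: ~{total / views_per_day / 365:.1f} years")
```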

A risky business

Miscalculating sample size is just one of many mistakes marketers make in the CRO space. It’s easy to trick yourself into thinking you’re improving your marketing when, in fact, you’re leading your business down a dangerous path by basing tests on incomplete research, small sample sizes, and so on.

But remember: The primary goal of CRO is to find the truth. Basing a critical decision on faulty assumptions and tests lacking statistical significance won’t get you there.

To save you time and help you over that steep learning curve, here are some of the most common mistakes marketers make with conversion rate optimization. As you test and tweak and fine-tune your marketing, keep these mistakes in mind, and keep learning.

6 CRO mistakes you might be making

1) You think of CRO as mostly A/B testing.

Equating A/B testing with CRO is like saying every rectangle is a square. A/B testing is a type of CRO, but it’s just one tool of many: it covers testing a single variable against another to see which performs better, while CRO encompasses all manner of testing methodologies, all with the goal of leading your website visitors to take a desired action.

If you think you’re “doing CRO” just by A/B testing everything, you’re not being very smart about your testing. There are plenty of occasions where A/B testing isn’t helpful at all — for example, if your sample size isn’t large enough to collect the proper amount of data. Does the webpage you want to test get only a few hundred visits per month? Then it could take months to round up enough traffic to achieve statistical significance.

If you A/B test a page with low traffic and then decide six weeks down the line that you want to stop the test, then that’s your prerogative — but your test results won’t be based on anything scientific.
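If you do want to sanity-check the data you’ve collected before calling a test, a two-proportion z-test is the standard tool. Here’s a minimal sketch using statsmodels; the visit and conversion counts are made up for illustration.

```python
# Sketch: check whether an A/B result clears the significance bar.
# The counts below are illustrative, not from a real test.
from statsmodels.stats.proportion import proportions_ztest

conversions = [12, 19]   # conversions on control vs. variant
visitors = [600, 590]    # visitors who saw each version

stat, p_value = proportions_ztest(conversions, visitors)
print(f"p-value: {p_value:.3f}")
if p_value < 0.05:
    print("Significant at the 95% confidence level.")
else:
    print("Not significant yet. Keep the test running.")
```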

A/B testing is a great place to start with your CRO education, but it’s important to educate yourself on many different testing methodologies so you aren’t restricting yourself. For example, if you want to see a major lift in conversions on a webpage in only a few weeks, try making multiple, radical changes instead of testing one variable at a time. Take Weather.com, for example: They changed many different variables on one of their landing pages all at once, including the page design, headline, navigation, and more. The result? A whopping 225% increase in conversions.

2) You don’t provide context for your conversion rates.

When you read that line about the 225% lift in conversions on Weather.com, did you wonder what I meant by “conversions”?

If you did, then you’re already thinking like a conversion rate optimizer.

Conversion rates can measure any number of things: purchases, leads, prospects, subscribers, users. It all depends on the goal of the page. Just saying “we saw a huge increase in conversions” doesn’t mean much if you don’t tell people what that conversion actually is. In the case of Weather.com, I was referring specifically to trial subscriptions: Weather.com saw a 225% increase in trial subscriptions on that page. Now the meaning of that conversion rate increase is a lot clearer.

But even stating the metric isn’t telling the whole story. When exactly was that test run? Different days of the week and of the month can yield very different conversion rates.

[Image: conversion-rate-fluctuation.png, showing conversion rates fluctuating from day to day]

For that reason, even if your test achieves 98% significance after three days, you still need to run it for the full week, because conversion rates can vary that much from day to day. The same goes for months: don’t run a test during the holiday-heavy month of December and expect the results to match what you’d see in March. Seasonality will affect your conversion rate.
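One quick way to see this in your own data is to break conversion rate out by day of the week. Here’s a minimal sketch assuming you’ve exported daily visit and conversion counts to a CSV; the file name and column names are hypothetical placeholders for your own analytics export.

```python
# Sketch: surface day-of-week swings in conversion rate.
# Assumes a daily export with "date", "visits", and "conversions" columns;
# the file and column names are placeholders, not a real API.
import pandas as pd

df = pd.read_csv("daily_traffic.csv", parse_dates=["date"])
df["weekday"] = df["date"].dt.day_name()

by_day = df.groupby("weekday")[["visits", "conversions"]].sum()
by_day["conversion_rate"] = by_day["conversions"] / by_day["visits"]

# A wide spread here means a three-day test can land on unusually
# good (or bad) days and mislead you.
print(by_day["conversion_rate"].sort_values(ascending=False))
```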

Other things that can have a major impact on conversion rate? Device type is one. Visitors might be willing to fill out that…
