Digital marketers have a problem: We’ve got too much data.
It sounds like a ridiculous complaint coming from a data analyst, but it’s true.
Google Analytics alone has more than 150 default metrics, which can be explored with more than 100 dimensions. And that’s without including advanced implementations.
Currently, a default Facebook export includes seven spreadsheets with 10 or more columns of data each. A default Twitter export includes 40 columns of data.
And we’re expected to choose one KPI?
Having this much data was supposed to make marketing easier. And while companies with the resources to hire data science experts have prospered from ubiquitous measurement, the small business marketer often doesn’t know where to begin.
It’s overwhelming. There’s so much of it, and it seems to be constantly changing. So most of us end up running the same reports over and over, instead of exploring and asking questions.
And when we don’t ask questions, we can miss important changes and new ideas that could help us better serve our audiences.
I’ve written before about how marketers can use prediction to improve their intuition about what works in their marketing.
But today, I’d like to talk about some smaller, more fundamental mindset shifts that can help you cut down on the overwhelm and be more curious about your data — whether it’s website analytics or Facebook ad performance.
Marketing analytics is not (necessarily) science
For a few years now, A/B testing has been the darling of the digital analytics scene.
Want to improve your landing page? Test it! Do your customers respond better to an orange or a purple button? Test it! Should your website home page use a photo of a cute puppy or a smiling little girl? Test it!
It’s a great thought.
Wouldn’t it be excellent if we could conduct conclusive experiments that would tell us exactly how to market to our audiences?
For several years, I was a strong believer in the power of A/B testing. I enthusiastically hypothesized dozens of tests. But more often than not, when I ran one, the results were inconclusive.
Usually, the problem was that the sample size was too small to reach statistical significance. Other times, there was a variable beyond my control — a traffic fluctuation or a website issue.
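To see why sample size so often kills a test, here's a rough back-of-the-envelope sketch (not from the article — a standard normal-approximation formula for a two-sided, two-proportion z-test; the function name and example rates are my own illustration):

```python
import math
from statistics import NormalDist

def ab_sample_size(p1, p2, alpha=0.05, power=0.80):
    """Approximate visitors needed *per variant* to detect a shift in
    conversion rate from p1 to p2, using the normal approximation for
    a two-sided two-proportion z-test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# Detecting a lift from a 5% to a 6% conversion rate takes on the
# order of 8,000 visitors per variant -- more traffic than many
# small-business landing pages see in months.
print(ab_sample_size(0.05, 0.06))
```

The takeaway: the smaller the effect you're hunting for, the more traffic you need, which is exactly why low-traffic tests so often end without a verdict.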
Eventually, I realized that the problem wasn’t the method.