I recently did my first post on LinkedIn’s “Pulse” blogging platform. It’s about a helpful 5-point checklist for conducting business experiments, including A/B tests. You can read all about it here:
As a practitioner of Web Analytics and Optimization, I’ve spent a fair amount of the last seven years of my career focused on tracking and improving Conversion Rates. This is a noble pursuit for any business, but intense focus on the conversion rate metric can have negative implications: people in your organization (or your clients) may obsess about changes in conversion rate, pester you about them, and even blame you for them!
To avoid this risk and/or annoyance, it helps to have simple, educational “sound bites” for stakeholders explaining why they may be worrying for no good reason. That is NOT to say you shouldn’t diligently investigate what you believe to be genuine causes for concern to your business or client, however. Strike a balance, as in all things.
Let’s say someone in your company comes to you with some data about a change in conversion rate. That is a good thing (coming to you with data), right? So, first off, don’t brush them off. Encourage data-driven behavior, even when it might be off-base!
They’re concerned because in a week-over-week report, conversion rate has dropped by 25%. That’s usually a bad thing, so it’s worth some respectful, diligent investigation. Assuming you’ve looked at various data points, and are of the opinion that it’s not a cause for immediate concern, here’s how you might frame the conversation as it continues.
Hi, [Stakeholder], thanks for bringing this to my attention. I’ve dug into the data, looking at YoY behavior, key segments, etc. and I recommend we take a “wait and see” approach. I’m attaching a trended report, so you can see beyond the week-over-week view.
The Stakeholder will almost inevitably ask some follow-up questions about why conversion rate is down. While you will likely have some data points and explanations of your own, here are 9 of my favorite reminders of why your conversion rate may fluctuate from time to time, instead of following the constant “up and to the right” trend that we all strive for:
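Before reaching for any of those reminders, it helps to show stakeholders how much week-over-week movement pure chance produces on its own. The sketch below simulates a year of weekly conversion rates for a page whose *true* rate never changes (the 3% rate and 2,000 weekly visitors are hypothetical numbers, not from any real site):

```python
import random

random.seed(42)

# Hypothetical numbers: a page with a fixed, true 3% conversion rate
# and roughly 2,000 visitors per week.
TRUE_RATE = 0.03
VISITORS_PER_WEEK = 2000

def weekly_conversion_rate():
    """Simulate one week: each visitor independently converts or not."""
    conversions = sum(
        1 for _ in range(VISITORS_PER_WEEK) if random.random() < TRUE_RATE
    )
    return conversions / VISITORS_PER_WEEK

rates = [weekly_conversion_rate() for _ in range(52)]
worst_drop = min(
    (rates[i + 1] - rates[i]) / rates[i] for i in range(len(rates) - 1)
)

print(f"Weekly rates ranged from {min(rates):.2%} to {max(rates):.2%}")
print(f"Largest week-over-week change: {worst_drop:.0%}")
```

Even though nothing about the page or its visitors changed, sampling noise alone moves the observed weekly rate around its 3% mean, and some week-over-week drops can look alarmingly large. That is exactly why a single week-over-week comparison deserves a trended, segmented look before anyone panics.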
I want to discuss a framework that’s new to me, but not that new: It’s called the Data > Information > Knowledge > Wisdom Hierarchy, or the “DIKW Pyramid.”
I was turned on to it very recently, but it’s been around since at least 1982. The model and its variants have been used in fields as diverse as Information Science, Engineering, and Geography.
Different academics in different fields of study seem to disagree about how useful or smart the framework is, but I’ve been thinking about DIKW in terms of Web Analytics and Optimization (natch). And, it’s working just fine for me!
This framework helps explain the kinds of work I’ve been focused on for the last 10 years or so in Digital Marketing. Kindly allow me to explain…
Hi Readers! I’ve not been posting lately, which is sad. I’ll try to get back on the horse in 2014.
What I have been up to is a fun video interview and podcast with Alex Harris of the Marketing Optimization podcast series. In it, we discuss my framework of the 5 ingredients of a world-class testing hypothesis. Better yet, we do live optimization on landing page designs, showing you how to use the framework on your site pages!
And, if you’re the podcast-listening sort, make sure to subscribe to the Marketing Optimization podcast on iTunes, which features a new podcast with a digital marketing expert every week!
A lot has been written about how to pursue Email Marketing Optimization, often with emphasis on tactics like subject line testing, email creative testing, and landing page optimization for email campaigns.
All great topics, but what I haven’t read about is an overarching framework to use while doing email optimization that will guide you through the process in a strategic way. In my day-to-day world of Conversion Optimization, there are many frameworks that one can follow. For example, the Persuasion Architecture™ developed in the early days at FutureNow, or the LIFT Model™ used by WiderFunnel up in British Columbia.
These frameworks give you guideposts to follow, things to think about, and ways to prioritize work in the field of Conversion Optimization. Maybe some experts have developed similar frameworks for Email Marketing that I don’t know about?
In honor of All Hallows’ Eve, I relay this tale of woe and warning about a Multivariate marketing experiment that was doomed, doomed from the start! Read and retain, lest ye tumble into the same pitfalls and traps in your Optimization work.
While case studies about conversion rate lift and increases in revenue from testing get a majority of the “press” these days, there is something far less sexy and far more important to be thinking about: The order in which you execute your test ideas.
I know, that’s way more boring than “Learn how company X increased their sales by 4,000% with one simple change!!!!!” I don’t blame anyone for wanting to share exciting test results, but what if you burn an hour on a webinar only to find out you’ve already made the change they’re talking about? Or you’ve already tested that change and it made no difference to your marketing?
That’s why we’re going to talk about something “boring” that is guaranteed to help you get results over a longer time frame. Prioritization: the way you apply resources to Optimization work to get the best ROI.
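One simple way to make prioritization concrete is a scoring model: rate each test idea on a few dimensions and rank by the combined score. The sketch below is a minimal, generic version of that idea (the idea names, dimensions, and scores are all hypothetical, and this is not any particular vendor’s proprietary framework):

```python
# Hypothetical test ideas, each scored 1-10 on:
#   potential  - expected size of the lift if the test wins
#   importance - traffic volume / business value of the page
#   ease       - how cheap and fast the test is to build
ideas = [
    {"name": "Simplify checkout form", "potential": 8, "importance": 9, "ease": 4},
    {"name": "New homepage hero image", "potential": 5, "importance": 7, "ease": 8},
    {"name": "Rewrite product copy",    "potential": 6, "importance": 5, "ease": 7},
]

def priority(idea):
    """Average the three scores; a higher score means test it sooner."""
    return (idea["potential"] + idea["importance"] + idea["ease"]) / 3

ranked = sorted(ideas, key=priority, reverse=True)
for idea in ranked:
    print(f'{priority(idea):.1f}  {idea["name"]}')
```

The scores themselves are judgment calls, but forcing every idea through the same scorecard keeps the loudest stakeholder’s pet idea from automatically jumping the queue, and it gives you a documented rationale for the test order.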
As the testing and Conversion Optimization of websites becomes more widely accepted and practiced, the technologies that enable testing (A/B and Multivariate testing software) are getting better and cheaper. There is good competition in the testing tool market, so it’s a great time to be a company looking to start a formal testing and Optimization program.
Many companies that go down this road fail after a short time due to the difficulties of setting up valid tests, getting measurable results, and coming up with new testing ideas. Even companies that are somewhat successful at taking on testing without any outside help tend to “plateau” after 6-12 months and aren’t able to get sustainably positive test results.
While I can understand a CMO’s desire to reap the touted rewards of Optimization “on the cheap,” this post will explain why you need staffed Optimization expertise to be successful in a formal testing program.
I recently had a young Optimizer ask me, “How do you turn analytics data into a hypothesis?” My answer was probably unexpected: “You don’t.” My curt answer was meant to alert this young pup that a single input isn’t enough to form the basis of a good marketing hypothesis.
Today’s post will overview what I believe are the 5 key ingredients of a great marketing hypothesis. Plenty of posts I’ve read have instructed you how to leverage the Scientific Method to write a Conversion Optimization hypothesis. They usually instruct you to make sure it’s provable/disprovable, clearly stated, based on a specific Key Performance Indicator, etc.
This is all good advice, but assuming you know all that, I want to cover the inputs. These inputs, in my opinion, are the difference between a legitimate hypothesis and a world-class hypothesis. Some of these 5 key inputs are probably obvious, but a few may have evaded you. Or perhaps you thought it was “uncool” to have them as inputs?