Measurement and accountability are hot topics these days. According to a recent study by consultancy Avidan Strategies, "When asked to name the ways in which agencies must improve, 71% of brand managers cited accountability."
Although the exponential increase in the availability of customer and business data has enhanced our ability to measure success, it has also increased our chances of misunderstanding and misinterpreting that data.
The following are five common errors to avoid when setting out to measure your marketing efforts.
1. Neglecting to Set Expectations
As Julia Glass quipped in I See You Everywhere, "I'd rather be pleasantly surprised than fatally disappointed." Without setting expectations with the team about what specific tactics can and cannot accomplish, you run the risk of skewing the perception of how your marketing efforts are truly performing.
For example, Web banners generally have low click-through rates, so unless you have a specific offer or powerful call to action, expect to use this tactic as a brand awareness tool. Facebook ads are creatively limited and also have consistently low click-through rates, but they make sense if used strategically to build a social presence or raise awareness.
By setting expectations with your team at the outset, you'll save yourself a lot of misunderstandings and explanations when the data rolls in.
2. Failing to Set Measurable Goals
Determining specific goals that will help the team achieve the overarching marketing objective can be tricky. The process usually breaks down when you look at the numbers. What is success? A 10% increase in Web traffic? A 3% increase in sales? 2,000 Facebook Likes? And who decides?
To set realistic, measurable goals, you should...
- Set benchmarks for context.
- Sit down with all of the partners involved (you may be surprised by the differing expectations).
- Adjust goals along the way if they turn out to have been set incorrectly at the outset.
3. Relying on Flawed Attribution
There is no cause-and-effect relationship in marketing. There is only inspiration and correlation. Drawing straight lines between a tactic and an outcome is dangerous, not only because there are so many variables involved but also because it encourages a mindset that devalues the combined power of integrated efforts.
A simple example is directly linking website traffic to an online advertising campaign. Yes, increased digital activity should correlate with an increase in website traffic, but drawing a cause-and-effect conclusion discounts the contributions of hiring and training the right employees, SEO, social media efforts, product packaging, in-store signage, and more.
A high-functioning marketing plan will take all of the company efforts into consideration and look for positive trends rather than cause and effect.
4. Believing in One-to-One Digital Analytics
Anyone who has spent time digging into website traffic numbers has probably come to the realization that the data isn't perfect. Because so many variables are involved, including user IP, a myriad of Web browser options, third-party plug-ins, and multiple analytics platforms, online analytics are not one to one. So, though we know website traffic is probably close to what is being reported, we can't say it is exactly what is being reported.
The problem gets even worse when you drill down into the website numbers to look for data such as referral sources. Google Analytics, for instance, relies on the Web browser to pass along referral data correctly. And the Web browser relies on all of the individual websites to track and pass along that data accurately. Not to mention that a browser plug-in or opt-out option could be throwing the entire thing off in the first place.
With all of the middlemen and variables involved, it's impossible to have perfect, one-to-one digital analytics. The only safe way to evaluate the numbers is to keep an open mind and look for changes and correlations.
5. Reducing Humans to Numbers
Unlike much of the data we analyze, human behavior is not binary. When consumers see a tactic such as a Web banner, they can choose to click on it, enter the company URL directly into the address bar, use a search engine to investigate the company or product, or simply mentally note the brand. Hopefully, all of those behaviors culminate in an uptick in Web traffic, awareness, and sales—but there is simply no binary way to evaluate it.
No matter what our measurement culture would like us to believe, humans will never be robots. When evaluating data, sometimes it's hard not to begin thinking of people as data points. The danger lies in thinking that people are static. The truth is that attitudes and behaviors constantly shift.
The job of changing hearts and minds will always be more art than science. The two go hand in hand, but be careful about putting one in front of the other. Use art to inspire and persuade, and use science to cautiously evaluate.
In the end, view data as a guide, look for trends, and make careful decisions based on a thoughtful evaluation of the best information available.