The central question of any marketing opportunity is, "What motivates customers to buy?" This is just as true in email as in any other channel. How do we know whether "two" or "blue" or "Tuesday" or "15% off" will generate more response and revenue? The answer is simple: Test it.


Yet there is altogether too much guessing and not enough proving in email marketing. With inboxes overflowing, subscriber fatigue growing and competition from social networks increasing, it's no longer enough to broadcast your best guess at what will motivate, inspire and engage.
At a panel I led at the MarketingProfs Digital Mixer a few weeks ago, with Sarah Welcome, Director of Customer Intelligence for International Data Group (IDG), and Shane Johnston, VP of Client Strategy at Brooks Bell Interactive, we talked about why email marketers in particular don't do much testing.
We think it comes down to two things: Fear and Lack of Knowledge. That includes fear of losing revenue, of making a mistake or of having to actually defend your reading of the results. It's compounded by the lack of a testing methodology, a strong starting point or basic statistical know-how.
Testing can be scary. Using objective data forces us to be accountable. That is great if you are trying to prevent internal preference or opinion from guiding your email marketing strategy. Data is wonderful when you are protecting your subscribers (and your sender reputation and ability to reach the inbox) from too frequent or poorly timed messages.
The flip side of a potential big gain from testing is a potential big loss. Sarah of IDG mentioned that it can be scary to think about spending company money on a campaign or a new design that just ends up costing the company more in lower response. It's easier sometimes to just stand still.
Most marketers seem to do just that. Jupiter Research (now part of Forrester Research) found in 2008 that only 40% of email marketers did any regular testing. Those who did, however, saw a big lift: up to a 25% improvement in response rates.
"If you test, you never really lose," says Sarah from IDG. "There are tests that are ill conceived and poorly implemented. However, if you do test well and the theory you had is proven WRONG .... that's not a failure. You gained knowledge from that."
Ah - there's the rub, right? She says it's okay to be proven wrong if your test is well conceived and implemented. Uh-oh. Now we are back to fear and lack of knowledge again!
You can do it! Shane from Brooks Bell Interactive outlined an eight-step methodology to help get everyone started:


  1. Set goals. Be clear on what is measurable.

  2. Define success. Align the test and the metrics.

  3. Know the audience.

  4. Develop likely hypotheses about what you think will happen. For example, "Changing the call to action from 'download' to 'click here' will increase clicks."

  5. Create viable tests. For example, segment out a random 10% of the file and send them the same message as everyone else, but with just the change from 'download' to 'click here' (see the sketch after this list).

  6. Identify the winner after a viable timeframe. Sometimes the immediate winner after an hour is not the winner at the end of 48 hours. Or, sometimes the winner of higher clicks is not the winner of higher conversions. Be sure to focus on the right metrics.

  7. Lather, Rinse, Repeat. Keep testing to iterate further.

  8. Trend the findings. Like all marketing, email is not consumed in a vacuum or in a snapshot of time; trend results over time to understand subscriber behavior.
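To make steps 5 and 6 concrete, here is a minimal sketch in Python (my own illustration, not from the panel; the subscriber list, click counts and 'download'/'click here' numbers are all hypothetical). It carves out a random 10% test segment and then applies a standard two-proportion z-test to check whether a click-rate difference is likely real rather than noise.

    import math
    import random

    def split_test_segment(subscribers, fraction=0.10, seed=42):
        """Randomly carve out a test segment (step 5); the rest of
        the file stays in the control group."""
        rng = random.Random(seed)              # fixed seed keeps the split reproducible
        shuffled = list(subscribers)
        rng.shuffle(shuffled)
        cut = int(len(shuffled) * fraction)
        return shuffled[:cut], shuffled[cut:]  # (test, control)

    def two_proportion_z_test(clicks_a, sends_a, clicks_b, sends_b):
        """Pooled two-proportion z-test on two click rates (step 6).
        Returns (z, two-sided p-value)."""
        p_a, p_b = clicks_a / sends_a, clicks_b / sends_b
        pooled = (clicks_a + clicks_b) / (sends_a + sends_b)
        se = math.sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
        z = (p_a - p_b) / se
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p_value

    # Hypothetical results: 'click here' went to a 10,000-subscriber test
    # segment, 'download' to the 90,000-subscriber control.
    z, p = two_proportion_z_test(clicks_a=520, sends_a=10_000,
                                 clicks_b=4_140, sends_b=90_000)
    print(f"z = {z:.2f}, p = {p:.4f}")         # p < 0.05 suggests a real difference

With those made-up numbers (520 clicks on 10,000 test sends versus 4,140 on 90,000 control sends), z comes out around 2.7 and p around 0.007, so 'click here' would be a genuine winner at the usual 5% significance threshold rather than a lucky hour-one blip.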


Sarah said that her methodology is similar, but that her team integrates testing into its daily routines. Audience questions about how all of this applies to smaller businesses, or how to test across channels such as email and social networks, indicated that attendees struggle with the same challenges.
We shared a number of great examples of tests done well and lessons learned.

  • IDG retested a concept that had been tested a year before, and the opposite now proved true. So much for consistent subscriber behavior! Lesson: Don't accept the status quo, even if it worked before.

  • AARP tested a new template and found that ugly worked better. Yikes. To their credit, however, they kept testing and uncovered the underlying issue: not beauty, but user experience.

  • Text sometimes beats HTML.

  • Behavior trumped demographics for certain mail types.


If you'd like a copy of the deck, just email me at stephanie[dot]miller[at]returnpath[dot]net. I'll also send you a testing worksheet we created to help you follow Shane's methodology and review results. Please test it out and let us know how it works for you!
I would love to feature your successful testing - including learnings you didn't expect - in a future column.



ABOUT THE AUTHOR

Stephanie Miller is the chief member officer at DMA.