Online Research Traps That Can Derail Your Marketing Strategy


It's hard to believe how different it was, just 15 years ago, to conduct secondary market research. There was no Internet (commercial, at least), no Yahoo portal, no Google search, no Web-accessible databases to tap. Almost every effort required a phone call, a trip to the library, a subscription to a third-party source, or a read-through of hardcopy reference material.

How times have changed.

But, not always for the better. The seemingly bottomless pit of content that makes up today's Web poses some distinct challenges to marketers looking for precise, credible facts on which to build a strategy.

This article points out some of the online research traps that can put your marketing strategy atop the proverbial house of cards.

Misleading Definitions


It is not uncommon for a data source to use imprecise terminology. My favorite example is the oft-quoted number of cell phone users in the US. You constantly see or hear 255 million, since that number is prominently displayed on the home page "ticker" of the wireless trade association CTIA.org. It seemed high to me, and when I contacted CTIA's head researcher, he admitted the number is for subscriptions, not subscribers.

When I later saw the 255 million number used, once again, in The New York Times April 13 "Week in Review" section, I requested a correction. The Times did its own research and two weeks later published a retraction, with a new number of 226 million.

Other data definitions to double-check include households vs. individuals, visits vs. visitors, total population vs. Internet users. These types of mistakes can really throw off your work in sizing market opportunities.

Bad Math

Most mistakes in this category come from analysts interpreting data, comparing one set of numbers to another.

For example, research reports frequently reference market growth. Be careful: an analysis might say that a company went from 10% to 11% market share, "achieving 10% growth" (1 divided by 10). That's technically true as relative growth, but misleading, and easily misinterpreted by your researcher.
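One way to keep the two figures straight is to compute both explicitly. Here is a minimal sketch (the function name is mine; the 10%-to-11% share numbers are from the example above):

```python
def growth_figures(old_share, new_share):
    """Return (absolute change in percentage points, relative growth)."""
    absolute_pts = new_share - old_share              # percentage points
    relative = (new_share - old_share) / old_share    # relative (fractional) growth
    return absolute_pts, relative

pts, rel = growth_figures(10.0, 11.0)
print(f"{pts:.0f} percentage point(s) absolute; {rel:.0%} relative growth")
```

A move from 10% to 11% share is 1 point of absolute gain but 10% relative growth; reporting only the second figure, without labeling it, is what makes the statement misleading.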

A more dangerous form is where interpretations are developed on "scaled" qualitative information. For example, if 60% of people rated factor A as a "5," and 30% rated factor B as a "5," the analyst might conclude people are twice as likely to choose A over B. Nonsense. Don't be swayed by this kind of math. (The scale may be using numbers, but it's not quantitative; it could just as easily be using "OK, good, better, awesome, and way cool...") 

Unclear Percentages

A common example in this category relates to surveys that allow respondents to check off multiple answers, where the total response can be well over 100%. If the source is not clear, or your researcher isn't paying close attention, some ludicrous conclusions might find their way into your marketing strategy.

Would you believe there are also surveys out there that add up to less than 100%? Last fall I came across a report on DVR usage, with key data shown in two bar charts. One mapped 36% of total TV viewing time as "real time," the other showed 32% as "time shifted."

I contacted the research company to find out about the remaining 32%—what other definition of time had they come up with? The answer led me to conclude that this particular study was completely unusable.
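A quick way to catch this trap is to total the reported shares before using them. A minimal sketch, assuming the categories are supposed to be exhaustive and mutually exclusive (the function name and tolerance are mine; multi-select survey questions legitimately exceed 100%, so this check does not apply to them):

```python
def check_total(shares, tolerance=1.0):
    """Flag category percentages that don't sum to roughly 100%.

    Returns (total, ok). Only meaningful for supposedly exhaustive,
    mutually exclusive categories; multi-select answers can exceed 100%.
    """
    total = sum(shares.values())
    return total, abs(total - 100.0) <= tolerance

# The DVR report's two bar charts, as described above:
dvr = {"real time": 36.0, "time shifted": 32.0}
total, ok = check_total(dvr)
print(f"total = {total:.0f}% -> {'plausible' if ok else 'missing or double-counted share'}")
```

Here the total comes to 68%, which is exactly the red flag that prompted the call to the research company.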

No Check on Reality

This trap is the most challenging, as you need some experience and a healthy dose of skepticism to spot it. It is most common with emerging topics (where almost no prior research exists) and with surveys of customer intent (what people say they will do versus how they might actually behave).

The first example comes from a story about genealogy that I read earlier this year. It cited a study claiming 75% of the US "was interested in genealogy." The skeptic in me didn't believe that 75% of the population could even define genealogy. But, being an emerging topic, it was hard to refute factually.

For these kinds of traps, I use my "rule of 20." I survey 5 friends, 5 work colleagues, 5 family members, and 5 fellow citizens (bus driver, deli clerk, lobby security guard, etc.) to see if the data is in the ballpark. Try it yourself using this genealogy example and see whether you get 75%. (I didn't come close.)
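The "rule of 20" is an informal convenience sample, so it can only tell you whether a claim is wildly off. A sketch of the tally (the function name, slack band, and responses are all hypothetical):

```python
def rule_of_20(responses, claimed_pct, slack_pct=25.0):
    """Informal ballpark check: compare a small tally to a claimed rate.

    With only ~20 respondents the result is noisy, so allow a wide
    slack band; this catches order-of-magnitude problems, not fine error.
    """
    observed = 100.0 * sum(responses) / len(responses)
    return observed, abs(observed - claimed_pct) <= slack_pct

# Hypothetical: 4 of 20 people say they're interested in genealogy.
answers = [True] * 4 + [False] * 16
obs, in_ballpark = rule_of_20(answers, claimed_pct=75.0)
print(f"observed {obs:.0f}% vs. claimed 75% -> {'ballpark' if in_ballpark else 'suspect'}")
```

An observed 20% against a claimed 75% is far outside any reasonable noise band for 20 people, which is enough to justify digging into the original study.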

A second example comes from a research company that claimed some months ago that over 40% of US Internet users had watched a full episode of a TV show online. This time I had seen plenty of other data that didn't add up, specifically for full episodes, so I contacted the source directly (see a pattern here?). Turns out the 40% was an amalgamation of several inputs, only half of them true behavioral tracking.

Conclusion

To be sure, this is not an exhaustive list of the traps awaiting us as we mine the depths of online research sources. For now, keep these simple guidelines in mind:

  • Ensure data definitions are precise.
  • Check the math, especially on data comparisons.
  • Watch those percentages (totaling more than or less than 100%).
  • If it doesn't sound right, it probably isn't: Do a reality check.
  • When all else fails, don't be afraid to contact the source.

Unfortunately, there is no quality control or Good Housekeeping seal of approval for online research. That means, of course, it's one more task the marketing department has to take on.


Kevin Horne is an independent marketing strategist working with advertising agencies and interactive firms in NYC. His blog can be viewed at lairigmarketing.typepad.com.


Comments

  • by Dan Soschin Tue Jul 22, 2008 via web

    My suggestion to those of you publishing results from your research is to also ALWAYS publish how the data was obtained and the full data set... this is especially true if your data shows something that is hard to believe.

If you were shocked by the data, then chances are your audience will be too. Cite your sources and methodology to add credibility to your data!

  • by J Geibel Wed Jul 23, 2008 via web

The article is a good start, but the initial premise ("marketers looking for precise, credible facts on which to build a strategy") is a bit off.

    Strategies aren't always "built" on facts - strategies are developed as a byproduct of a market vision, and market research is used to validate or shape the underlying assumptions.

Unfortunately (or fortunately, depending on how you look at it), marketing isn't accounting, and often there aren't "facts" to support key assumptions in a marketing strategy. For example, there weren't any "facts" to support the use of personal computers when Apple introduced theirs in the late '70s. Nor were there any "facts" to support the use of cell phones when they were introduced in the mid-'80s.

    What this means is that the unknowns in a strategy have to be clearly highlighted and monitored as the go-to-market strategy is implemented. Those assumptions will be either validated or disproved in the initial marketing efforts - and the strategy has to be adjusted in real time. That's why many market introductions are so tough.

    All this being said, the author brings out some good points about checking (and cross-checking) any "facts" that are published. There is a tremendous amount of misinformation and disinformation on the Internet and even published by major media (the fact checkers were laid off years ago). However, the Internet also makes cross-checking easier. My rule of thumb is to triangulate - I look for three sources that validate the "fact" or figure - and not ones that merely reference each other.
