The latest Emetrics Summits (June 2004), held in Santa Barbara and London, each included an annual session called the Top Ten Web Analytics Problems.

The attendees gathered into groups to discuss what was troubling them. Not counting common management problems (lack of executive sponsorship, no clear objectives, insufficient resources, internal politics), the following issues were singled out as the most troubling of the 50 or so discussed:

1. So Many Tools, So Little Time

If you want to measure Web site success, you first have to choose your measurement methods. You can analyze server logs, tag pages with JavaScript, invest in packet sniffing, conduct usability studies, survey your customers for satisfaction, and so forth. But you can't afford to do it all. So which measurement techniques are best? Sorry, this is a list of problems, not solutions. (For solutions, see note at the end of this article. —Ed.)
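To make one of those options concrete, here is a rough sketch of the oldest technique, log-file analysis. It assumes a combined-format access log named access.log (a made-up file name) and treats distinct IP addresses as a crude stand-in for visitors, an approximation Problem No. 3 will complain about.

```python
# Rough sketch of log-file analysis: count page views and approximate
# unique visitors from a combined-format access log ("access.log" is a
# made-up file name). IP addresses are only a crude proxy for visitors.
import re
from collections import Counter

LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)')

page_views = Counter()
visitor_ips = set()

with open("access.log") as log:
    for line in log:
        match = LOG_LINE.match(line)
        if not match:
            continue
        ip, path = match.groups()
        page_views[path] += 1
        visitor_ips.add(ip)

print("Top pages:", page_views.most_common(5))
print("Approximate unique visitors:", len(visitor_ips))
```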

2. Dependence on IT to Get Things Done

Implementing Web analytics tools requires a team of people from marketing, Web services and IT, a department that is busy doing "real" work (like payroll; it's hard to argue with that one). But once the tools are in place, they require a bit of tweaking to get them well-tuned. Once they are well-tuned, the site itself should be updated in a controlled way to test the value of various modifications. That again involves the IT department, which is fine in a firm with highly motivated individuals working in concert toward common goals. I have met several such organizations, but no more than can be counted on two hands.

3. Expectations of Accuracy

As Matthew Berk of Jupiter Research put it, "We will spend many years trying to recover from the myth of accurate Web analytics numbers." Web analytics tools capture data in multiple ways. They cleanse that data in multiple ways. They report out that data in multiple ways. While each is internally consistent, they are never in agreement with one another. The business side of the house demands precise numbers. You'll have to show them that tracking trends matters more than knowing specifics to three decimal places.
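A trivial illustration, with invented numbers: two tools that disagree on every absolute count can still tell the same story about the trend, and the trend is what the business should act on.

```python
# Illustration only: two hypothetical tools that never agree on absolute
# counts can still agree on the week-over-week trend.
tool_a = [10_200, 10_950, 11_840, 12_700]   # weekly visits per Tool A (made-up)
tool_b = [8_900, 9_560, 10_310, 11_080]     # same weeks per Tool B (made-up)

def week_over_week(series):
    """Percentage growth from each week to the next."""
    return [round((b - a) / a * 100, 1) for a, b in zip(series, series[1:])]

print("Tool A growth %:", week_over_week(tool_a))  # [7.4, 8.1, 7.3]
print("Tool B growth %:", week_over_week(tool_b))  # [7.4, 7.8, 7.5]
```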

4. Went Shopping for Answers, but Only Got Data

After you wear down upper management for funding, finally get the tools you need and finally get them installed, you discover that those tools produce data rather than insights. Comb through reams of reports, and three or four anomalies point to places where your Web site needs serious help.

A few HTML tweaks later, the Web analytics tool has earned a significant return on investment (this time). Subsequent Web site improvements come only from laborious analysis of the data. Deriving useful, actionable information from Web analytics tools requires dedication, resources and an individual not found at most firms, as lamented in Problem No. 5.

5. Got Tools and Lots of Reports, but No Analysts

Terry Lund, Web intrapreneur at Kodak and an independent consultant before joining a Web analytics vendor, put it succinctly in his presentation: "Marketing expects IT to figure out the meaning of Web analytics reports, and IT figures the Marketing Department can handle it themselves. Neither is correct."

There is a third person needed for this task: the analyst. This is a person dedicated to higher mathematics. This is a person who loves numbers and lives for multivariate regression analysis: a quant wonk.

This person is an absolute necessity because so much of Web analytics data can be interpreted in so many ways. Knowing what is statistically significant and knowing the difference between lies, damn lies and statistics will keep you on a path to continuous improvement.
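Here is a small taste of the analyst's work, using made-up numbers: a two-proportion z-test that asks whether a lift in conversion rate is statistically significant or just noise.

```python
# Is a lift in conversion rate statistically significant, or just noise?
# A two-proportion z-test on invented numbers.
from math import sqrt, erf

def z_test(conv_a, visits_a, conv_b, visits_b):
    """Two-tailed z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / visits_a, conv_b / visits_b
    pooled = (conv_a + conv_b) / (visits_a + visits_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-tailed
    return z, p_value

z, p = z_test(conv_a=180, visits_a=9_000, conv_b=220, visits_b=9_100)
print(f"z = {z:.2f}, p = {p:.3f}")
# Here z is about 1.91 and p about 0.06: suggestive, but not significant
# at the usual 0.05 cutoff. That judgment call is the analyst's job.
```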

6. No Benchmarks

You've got tools, data, analysis and even information! Now what are you going to compare it all to? If your only interest is in continuous improvement, then you have nothing to worry about; you are your own benchmark. But if you want to know where you stand in the grand scheme of things, or within your own industry, you'll be disappointed.

So many senior executives want to know whether their Web efforts are up to global standards. Sorry, pal, we're still trying to agree on how "conversion" is calculated. We'll get back to you on the whole benchmarking thing when we can distinguish a visit from a visitor.
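To see why, try a back-of-the-envelope calculation with invented numbers: the same site produces three different "conversion rates" depending on which perfectly defensible definition you pick.

```python
# Why benchmarking stalls: one site, three defensible "conversion rates."
# All numbers are invented.
orders = 500             # completed purchases
visits = 40_000          # sessions
unique_visitors = 25_000
buyers = 430             # distinct people who ordered at least once

print(f"orders / visits:          {orders / visits:.2%}")           # 1.25%
print(f"orders / unique visitors: {orders / unique_visitors:.2%}")  # 2.00%
print(f"buyers / unique visitors: {buyers / unique_visitors:.2%}")  # 1.72%
```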

7. Normalizing Metrics Across Design Changes

Web analytics is not at its best solving Big Questions like benchmarking. It is at its most valuable when measuring whether specific changes have produced a shift in specific online behavior. It's excellent for controlled, continuous improvement.

Did you catch that? The word "controlled"? Aye, there's the rub.

To get the best information from Web analytics tools, you must limit the number of changes you make at any given time. So many organizations upgrade their Web sites en masse or create whole new versions, and then sit back to see whether there's a positive effect on visitor behavior.

OK, so let's say there is. Let's say conversion doubled. Great! But what, specifically, caused that improvement? Do you know which change is worth repeating?

In a laboratory situation, you make one change and one change only, measure the results, and then go on to the next experiment. Few companies out there can do this sort of formal testing; the rest of the world can only aspire.
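For what it's worth, the mechanics of a controlled test are not the hard part. A minimal sketch, with invented visitor IDs and an invented experiment name, might split traffic deterministically so a single change can be measured in isolation:

```python
# Minimal sketch of "one change at a time": deterministically split visitors
# into control and test groups so a single page change can be measured.
# Visitor IDs and the experiment name are illustrative, not from any tool.
import hashlib

def bucket(visitor_id: str, experiment: str = "headline-test") -> str:
    """Assign a visitor to the same group every time they return."""
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "test" if int(digest, 16) % 2 else "control"

for vid in ["v1001", "v1002", "v1003", "v1004"]:
    print(vid, "->", bucket(vid))
```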

8. Data Integration

Problem No. 1 (So Many Tools, So Little Time) bemoaned the large number of different tools to choose from. That creates a data integration problem right off the bat. But it gets worse. Think about data from multiple Web sources. Data from the content side, from the applications side, from the e-commerce side.

Now try getting all that into a normalized dataset with keys linking all the individuals to their behavior and transactions. Tough, right? You ain't seen nothin' yet.

How about integrating all that with offline data? Data from lead generation programs, cooperative marketing programs, in-store purchases, catalog sales, contact center customer service calls, distribution channel CRM systems and field sales contact management systems? Yes, your head should be hurting right about now.
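A toy version of the problem, with invented customer IDs and columns, looks harmless enough; the real world adds millions of rows, mismatched keys and duplicate identities.

```python
# Toy version of data integration: three sources that only become useful
# once they share a customer key. Column names and data are invented.
import pandas as pd

web_sessions = pd.DataFrame({
    "customer_id": ["C1", "C2", "C3"],
    "visits": [12, 3, 7],
})
online_orders = pd.DataFrame({
    "customer_id": ["C1", "C3"],
    "web_revenue": [250.0, 90.0],
})
catalog_sales = pd.DataFrame({          # the offline side of the house
    "customer_id": ["C2", "C3"],
    "catalog_revenue": [140.0, 60.0],
})

customer_view = (
    web_sessions
    .merge(online_orders, on="customer_id", how="left")
    .merge(catalog_sales, on="customer_id", how="left")
    .fillna(0)
)
print(customer_view)
```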

9. Putting Web Analytics Data to Use

This one got the most votes as the biggest problem. You have to get all of the above right before you even find out what a challenge this is.

Your job will be to convince dozens of business managers that they can better manage their business if they look at, internalize and make use of Web analytics information. While all managers (in the abstract) would be delighted to have a new way to measure their ability to reach their goals, real-life people have a little trouble with this level of heretofore-unapplied accountability.

Why? Most Web analytics professionals find that the sticking point comes when you ask business managers about their goals. You can start with corporate goals, divisional goals, departmental goals, product line goals; I've even had occasion to inquire about an individual's compensation plan in order to bring specific, measurable goals to the surface.

You will succeed in this if you have backing from upper management, proper resources, clear objectives and a knack for education.

10. TCO Surprises

New software that tracks a lot of data needs more hardware—no surprise there. But new systems need integration and training and then education and then more training. Everyone at the Emetrics Summit agreed that getting an inexpensive Web analytics tool to cut your teeth on was a good idea.

Once your team understands what Web measurement is all about and how it might really be put to use, they'll start wanting it to do things it can't. That's when to consider a more sophisticated system that can answer the questions they didn't know to ask before.

The majority of participants were on their third analytics tool and considering switching again. So don't forget to include the cost of getting acquainted with the genre and developing some experience in your total cost of ownership calculations. Then set your sights on continuous improvement against your own benchmark figures.

My question is whether the problems outlined at the 2005 Emetrics Summit will be more of the same, or whether we will be facing a whole new set of challenges.

Editor's Note: Jim Sterne is heading the upcoming MarketingProfs.com virtual seminar on Web Activity Analysis: Best Practices in the Real World (including solutions to the above problems).

ABOUT THE AUTHOR


Jim Sterne founded the Marketing Analytics Summit in 2002 and co-founded the Digital Analytics Association in 2004. He now advises companies on analytics strategy planning at Data Driven Leaders Studio and teaches AI and machine learning to marketers.