If you're not "living and breathing" search engine optimization, it can be easy to latch onto old SEO trends and metrics and focus obsessively on them, especially those few hot-button issues that get the most attention from the press or from your CEO.
It takes time and experience to stay on the cutting edge of SEO, and more than likely you don't have that kind of time, considering your other marketing efforts. So here's a quick update on what's hot and what's not in the world of search engine optimization.
What's hot:
- Becoming a trusted contributor on social news/content sites like Digg, Propeller, Reddit, Mixx, StumbleUpon, Wikipedia, and Knol
- Building your personal and professional network in online communities like Facebook, LinkedIn, MySpace, Flickr, YouTube, Bebo, MyBlogRoll, and the blogosphere in general, and then taking advantage of the residual network effect
- Link baiting—posting humorous/fascinating/contentious/controversial content that is a magnet for links
- Truly understanding and leveraging the power of "Long Tail" dynamics
What's not:
- Obsessively watching search engine indexation numbers and rankings on trophy keywords (like the one you know the CEO always checks first thing in the morning)
- Worrying yourself sick over duplicate-content penalties
- Relying on XML sitemaps to fix your indexation problems
- The old-fashioned link exchange
Speaking of what's hot, a new generation of SEO metrics exists so you can keep track of your progress once you've abandoned the old thinking and adopted more modern strategies. Gauging your success solely on your positions in the search engine results is old hat.
New SEO paradigms, such as the "Long Tail," universal search, and personalized search, call for new key performance indicators (KPIs).
In addressing "Long Tail SEO," consider the following KPIs:
Brand-to-Nonbrand Ratio
This is the percentage of your natural search traffic that comes from brand keywords versus nonbrand keywords. If the ratio is high and most of your traffic is coming from searches on your brand name, your SEO efforts are fundamentally broken. The lower the ratio, the more of the long tail of natural search you are likely capturing. This metric is an excellent gauge of the success of your optimization initiatives.
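As a rough sketch of the arithmetic, the ratio can be computed from a keyword-level analytics export. The brand term list and traffic figures below are hypothetical, purely for illustration:

```python
# Hypothetical sketch: brand-to-nonbrand ratio from a
# keyword -> natural-search-visits mapping (e.g., an analytics export).
BRAND_TERMS = {"acme", "acme widgets"}  # assumed brand keyword list

def brand_to_nonbrand_ratio(keyword_visits):
    """Return the share of natural search traffic driven by brand keywords."""
    brand = sum(visits for kw, visits in keyword_visits.items()
                if any(term in kw.lower() for term in BRAND_TERMS))
    total = sum(keyword_visits.values())
    return brand / total if total else 0.0

visits = {"acme widgets": 400, "buy blue widgets": 250, "widget reviews": 350}
print(f"{brand_to_nonbrand_ratio(visits):.0%} of traffic is brand-driven")
# -> 40% of traffic is brand-driven
```

The substring match on brand terms is a simplification; in practice you'd maintain a vetted brand/nonbrand classification of your top queries.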
Unique Pages Crawled
This is the number of unique (non-duplicate) Web pages crawled by search engine spiders such as Googlebot. Your Web site is your virtual sales force, bringing in prospects from search engines, and each unique page is one of your virtual salespeople. The more unique pages you have, the more virtual salespeople you have out there in the engines selling on your behalf.
Pages Yielding Search Traffic
This is the percentage of unique pages that yield search-delivered traffic in a given month. This ratio is a key driver of the length of your Long Tail of natural search. The more pages that yield traffic from search engines, the healthier your SEO program. If only a small portion of your Web site delivers searchers to your door, then most of your pages—your virtual salespeople—are standing around the water cooler instead of working hard for you.
Keyword Yield
This is the average number of keywords each traffic-yielding page attracts in a given month. Put another way, it's the ratio of keywords to pages yielding search traffic. The higher your keyword yield, the greater the part of the Long Tail of natural search your site will capture.
In other words, the more keywords each yielding page attracts or targets, the longer your tail. So an average of eight search terms per page indicates pages with much broader appeal to the engines than, say, three search terms per page.
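To make the arithmetic concrete, here is a minimal sketch; the page-to-keywords data structure and URLs are hypothetical stand-ins for what you would pull from analytics:

```python
# Hypothetical sketch: keyword yield = total keywords / pages that
# received any search traffic (pages with zero keywords are excluded).
def keyword_yield(page_keywords):
    """page_keywords maps URL -> set of search terms that drove visits."""
    yielding = {page: kws for page, kws in page_keywords.items() if kws}
    if not yielding:
        return 0.0
    return sum(len(kws) for kws in yielding.values()) / len(yielding)

pages = {
    "/widgets/blue": {"blue widgets", "buy blue widgets", "cheap widgets"},
    "/widgets/red": {"red widgets"},
    "/about": set(),  # no search traffic: not a yielding page
}
print(f"Keyword yield: {keyword_yield(pages):.1f} keywords per page")
# -> Keyword yield: 2.0 keywords per page
```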
In a research study done by my company (Netconcepts), called "Chasing the Long Tail of Natural Search," the average merchant had a keyword yield of 2.4 keywords per page.
Visitors per Keyword
This is the ratio of search engine-delivered visitors to search terms. This metric indicates how much traffic each keyword drives and is a function of your rankings in the search engine result pages. Put another way, this metric determines the height or thickness of your Long Tail. The average merchant in the aforementioned study obtained 1.9 visitors per keyword.
Index-to-Crawl Ratio
This is the ratio of pages indexed to unique crawled pages. If a page gets crawled by Googlebot, that doesn't guarantee it will show up in Google's index. A low ratio can mean your site doesn't carry much weight in Google's eyes.
Traffic per Crawled Page
Calculated separately for each search engine, this is how much traffic the engine delivers for every page it crawls. Because each engine has a different audience size, this metric lets you fairly compare the referral traffic you get from each. The Netconcepts study found that Live Search and Yahoo tended to crawl significantly more pages, but Google typically delivered far more traffic per crawled page.
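Both crawl-side ratios can be sketched the same way. The per-engine figures below are hypothetical examples, not numbers from the study:

```python
# Hypothetical sketch: per-engine index-to-crawl ratio and
# search-delivered visitors per crawled page.
engines = {
    # engine: (unique pages crawled, pages indexed, visitors delivered)
    "Google":      (8_000, 6_500, 12_000),
    "Yahoo":       (15_000, 9_000, 4_000),
    "Live Search": (14_000, 7_000, 1_500),
}

for name, (crawled, indexed, visitors) in engines.items():
    index_to_crawl = indexed / crawled      # share of crawled pages indexed
    traffic_per_page = visitors / crawled   # visitors per crawled page
    print(f"{name}: index-to-crawl {index_to_crawl:.0%}, "
          f"{traffic_per_page:.2f} visitors per crawled page")
```

In this made-up example Google crawls the fewest pages yet delivers the most visitors per crawled page, which is the pattern the study described.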
Hopefully you're now more up-to-date on your SEO tactics, but keep in mind that any of these trends can change at the drop of a hat. Search engine optimization is a process, not a project, so as you optimize your site through multiple iterations, watch the above-mentioned KPIs to ensure you're heading in the right direction. Marketers who are not privy to these metrics will have a much harder time reaching qualified prospects.
If you'd like to hear more about these search trends and metrics, attend the Marketing Profs Digital Mixer, a multi-channel online marketing conference coming up in October. Stephan is program chair for the search marketing track. Check out the full program.