Search engine optimization (SEO) is a rapidly changing and exciting arena. In fact, the competition is heating up as companies awaken to a universe of marketing opportunities—providing their customers can find them, that is!

Where is SEO heading? What are the trends and new opportunities? And what are the real issues facing the industry?

To get some answers, MarketingProfs recently convened a Thought Leaders Summit with some of the best minds in the field of search engine optimization. In the 90-minute session, I (as the panel leader) tapped into the collective wisdom of panelists Cam Balzer (Performics), Christine Churchill (KeyRelevance), Mike Grehan (Smart Interactive), Ammon Johns (Propellernet), Brian Klais (Netconcepts), Barry Lloyd (MakeMeTop), Ian McAnerin (McAnerin Networks), Alan Rimm-Kaufman (Rimm-Kaufman Group), Eric Ward (EricWard.com) and Jill Whalen (High Rankings).

Here is the second in a two-part series on what our experts had to say about the sticky topics of SEO: What can be done to get rid of search engine spam? What are the ethical issues facing the industry? And how can you ascertain whether an SEO vendor is ethical?

We also asked our panelists for their top optimization tactics and discussed the pros and cons of links and other off-page factors.

Let the games begin.

Can the Spam!

I hit our panel with a tough question that even the mighty search engines haven't been able to solve: What can, or should, be done to get rid of search engine spam? I focused the discussion on what isn't already being done.

To date, trying to get everyone in the SEO industry to agree on any sort of guidelines or standards around search engine spam has been almost impossible. The greatest difficulty is that those who control content—the content authors and editors—have a vested interest in getting as highly ranked as possible.

Thus, the objectives of the search engines and Web content authors are not aligned. With blogs, however, it's in the blogger's best interest to keep comment spam to a minimum, because it erodes trust in the value of their content. The rel="nofollow" link attribute—now supported by Google, Yahoo, and MSN—is an easy-to-implement solution that ameliorates the problem for both bloggers and search engines. Finding similar common ground between the search engines and content authors is the way forward.
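To see how simple the fix is, here is a minimal Python sketch of what a blog platform might do to comment markup before publishing it. The function name is made up, and the regex approach is deliberately naive: it ignores anchors that already carry a rel attribute and does no other sanitizing.

```python
import re

def nofollow_comment_links(comment_html: str) -> str:
    """Add rel="nofollow" to every anchor in user-submitted comment
    HTML so search engines won't count the links as endorsements.
    Naive sketch: real platforms also sanitize markup and handle
    anchors that already have a rel attribute."""
    return re.sub(r"<a\s+", '<a rel="nofollow" ', comment_html,
                  flags=re.IGNORECASE)

comment = 'Nice post! Visit <a href="http://example.com/pills">my site</a>.'
print(nofollow_comment_links(comment))
# Nice post! Visit <a rel="nofollow" href="http://example.com/pills">my site</a>.
```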

Search engines could, and should, be doing more to let webmasters know that spamming derails their own future. The engines are also ignoring a great PR opportunity to rally the industry around the cause. They aren't helping themselves: other than at conferences and in the online forums, they aren't publicly bringing clarity to the issues around spam for webmasters. For example, when exactly is cloaking spam, and when is it not?

As search engines mature, we are likely to see a combination of spam solutions emerge, involving technology, the community and the marketplace. First, though, the engines need to determine what makes a "reputable site." PageRank-style algorithms do that today, but better technology—assessing who is referencing whom, and who can be trusted—will emerge.

From a technological point of view, we have already seen search engines adapt and move on. Future developments are likely to include analyzing which search results users click on and incorporating that data into the determination of relevance.

On the community side, we are likely to see more in the way of whitelisting services, professional certification, professional organizations and Better Business Bureau stamps of approval. Longer-term, we wouldn't be surprised if there were even ways to purchase credibility, like a Better Business seal, a bonded program (like IronPort's Bonded Sender) or an SSL certificate—from a central authority that might say these people are transparent.

We have to remember that, ultimately, the problem belongs to the search engines. It is up to them to do something about it. Certainly, as search engines head into the third phase (i.e., personalization), they will push for wide adoption of the "nofollow" tag on links. But that will only go some of the way toward solving the problem, with so many millions of abandoned blogs out there collecting comment spam continuously.

No-No and No-Go Areas

One man's spam is another's clever strategy, particularly in Europe, where the environment is more liberal. So one has to determine, for oneself, what is ethical and what is not.

Tim Mayer from Yahoo admitted at a Search Engine Strategies conference that he understood there are different expectation levels when people get involved in this game. Applying classic SEO tactics if you are aiming to be number one for Viagra is a bit like turning up at a gunfight with a sword. It's not viable.

Cloaking is supposedly unethical, right? But it was applied successfully in a recent campaign for a TV media company, where the site being optimized was completely audio-visual. If you typed the name of the main character into Google, the site in question came up first.

So we have a situation where Google served up the most relevant, and most appropriate, result. The end user was absolutely delighted. Some people would say that was unethical. But if everybody involved was happy, how could it be?

Cloaking, by the way, is a method of showing search engine crawlers a different page from the one you would see in your browser. If you have a list of the IP addresses of all the search engine crawlers, then when one of those addresses shows up on your site you can serve a different page—one that's text-heavy and full of the keywords you want—whereas a human visitor would get the page full of Flash movies.
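To make the mechanism concrete, below is a deliberately toy Python sketch of cloaking. The article describes matching crawler IP addresses; this version keys off the User-Agent header instead because it is simpler to demonstrate (and even easier to spoof). The crawler tokens and page contents are made up, and, to be clear, serving engines different content than users violates most engines' published guidelines.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

CRAWLER_TOKENS = ("googlebot", "slurp", "msnbot")  # illustrative list

SPIDER_PAGE = b"<html><body><h1>Keyword-rich text version</h1></body></html>"
HUMAN_PAGE = b"<html><body><object>Flash movie here</object></body></html>"

class CloakingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Decide which page to serve based on who appears to be asking.
        ua = (self.headers.get("User-Agent") or "").lower()
        body = SPIDER_PAGE if any(t in ua for t in CRAWLER_TOKENS) else HUMAN_PAGE
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), CloakingHandler).serve_forever()
```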

Cloaking is totally acceptable to search engines if they allow you to do it through a commercial arrangement. To be frank, trusted feeds are cloaking: you put content into an XML format and send it to a search engine, asking it to rank your pages based on the XML-formatted text you have submitted.

Even Google states that cloaking is OK in certain circumstances, particularly if you are fixing URLs to make them more spiderable and not really playing games with the content itself.

In the end, it's all about client disclosure. The client needs to be fully aware of the tactics the SEO vendor will use in optimizing the site, and of the potential benefits and pitfalls that may follow. Failing to disclose the risks of an optimization strategy is possibly the most unethical thing an SEO vendor can do.

The consensus of the expert panel was to adhere to the search engines' published guidelines and optimize your site accordingly. If you work in a very competitive industry and consequently don't care what tactics are used, you really should rethink that approach. You are treading on dangerous ground.

Is It Unethical? Or Is It Just 'Tactics' (and Therefore Acceptable)?

Search engines like to take the hard line and categorize things as black or white. In some cases, though, they are actually grey. A pet peeve of panelists was the engines' opposition to link buying.

Say you are an artist and have a local frame shop you like to recommend. In turn, the frame shop might give you a small referral fee for sending all your wonderful clients their way. The online version of this is a link from the artist's Web site to the frame shop's Web site. It makes sense. It's good marketing. It's good business, and that is what link building and link buying is all about.

Many times, a link is a great lead generator. Take the artist and the frame shop example again: the frame shop might get tons of traffic from the artist's site. It is a great business link.

You might also buy links for credibility. Taken to the extreme, even your link from the local Chamber of Commerce could be considered link buying—after all, membership dues paid for it.

Some techniques are misconstrued as always unethical, whereas in actuality they can have legitimate uses.

For example, the noscript tag is a perfectly acceptable way to assist search engines and browsers that can't read script or JavaScript. There's nothing wrong with putting your links in the noscript tag if you have, say, a DHTML menu that can't be read. The search engines have no problem with that. Of course, the noscript tag can be abused in unethical ways, such as by hiding links within a noscript tag that you intend only for search engines to find and never the users.

The same logic applies to the noframes tag. If you don't have a framed site, inserting a noframes tag full of keywords is spamming. But if you have a framed site and you're summarizing the content contained within that frameset, then you are just helping the search engines get to the information they are trying to get to anyway.
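For concreteness, here is what legitimate fallback markup looks like, shown as Python string constants so the snippets stay in one language (the page names and links are invented). Both blocks become spam only when the fallback content differs from what script-capable or frame-capable visitors actually see.

```python
# A noscript block that mirrors the links a DHTML menu draws with
# JavaScript, so spiders and script-less browsers can still navigate.
DHTML_MENU_FALLBACK = """
<noscript>
  <a href="/products/">Products</a>
  <a href="/services/">Services</a>
  <a href="/contact/">Contact</a>
</noscript>
"""

# A noframes block that honestly summarizes the frameset's content
# for spiders and browsers that can't render frames.
FRAMESET_FALLBACK = """
<noframes>
  <p>Acme Widgets: catalog, pricing and support for industrial
  widgets. <a href="/main.html">Enter the site</a>.</p>
</noframes>
"""
```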

How to Find Out Whether an SEO Vendor Is Ethical

It is an interesting sign of the times when you have to ask a vendor whether what they are doing is ethical. Obviously, they are going to say: "Yes, of course what we do is ethical."

Luckily, there are things you can do to connect the dots. For starters, it is really important to extensively talk with potential SEO vendors. They should explain the techniques they will be using. Are they telling you they use proprietary techniques? That's a warning sign. Do their techniques involve any kind of deception? Again, a warning sign.

Don't be afraid to ask really hard questions of the potential vendor, such as "Do you do doorway pages? Do you do deceptive redirects? Have you ever had sites banned?"

If they start backpedaling and talking about how SEO tactics are short-term ones, you might want to reconsider. SEO is a long-term investment. Done right, the benefits of SEO should last for years.

Another way to recast the issue of ethical behavior is to evaluate whether the SEO vendor's business practices are sustainable—i.e., likely to deliver long-term results to clients. One question to ask is how long, on average, they have worked with their clients. High client turnover indicates an SEO firm may not be taking a long-term approach.

Talk to both previous and current clients if you can. Are they happy with the vendor? Have their traffic and sales gone up?

Avoid SEO vendors that offer rank guarantees; you can't guarantee something you have no control over. The only way to get a guaranteed position is through pay-per-click. While we would all agree that search rankings can't be guaranteed, sometimes the workmanship can be—i.e., that the work promised will be delivered to specification, and that if the client is not satisfied... the vendor will either refund the money or make it right.

Another warning sign is if you receive unsolicited email (spam) from an SEO vendor. An ethical SEO vendor who generates long-term results doesn't need to advertise in such a fashion.

Practice due diligence to ensure that the SEO vendor is going to deliver on its promises. If you're told "We don't do anything deceptive," and then you visit a client site and see a bunch of hidden text and links and so forth, then, obviously, the vendor is already lying to you.

There are ways to take a look under the hood and verify the SEO vendor's claims. For example, examine what their clients' Web sites are serving to the search engines.

There are a couple of ways to view a site through the eyes of a search engine spider: one is a Firefox browser extension called User Agent Switcher; the other is the cached version of the page that the engine indexed, available from the cache link in the search results.

Compare the page meant for the search engines with the corresponding page on the live Web site as seen by a normal visitor. If the content served to the search engines is completely different from what visitors get, the site is spamming. Typical things to look for in your comparison: a significantly different title tag, and keywords stuffed into the meta tags and into parts of the page to help the spider-only version rank better.
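If you would rather script the comparison than click through a browser extension, the sketch below fetches the same page twice with different User-Agent strings and prints a diff. The URL is a placeholder. Note that this catches only user-agent-based cloaking; IP-based cloaking will still show you the visitor page, which is where the engine's cache link comes in.

```python
import difflib
import urllib.request

def fetch(url: str, user_agent: str) -> str:
    """Fetch a page while presenting the given User-Agent string."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

URL = "http://www.example.com/"  # hypothetical client page to audit

as_browser = fetch(URL, "Mozilla/5.0 (Windows NT 5.1)")
as_spider = fetch(URL, "Googlebot/2.1 (+http://www.google.com/bot.html)")

# A large diff (rewritten title tag, stuffed meta keywords, extra
# text blocks) suggests spiders get a page human visitors never see.
for line in difflib.unified_diff(as_browser.splitlines(),
                                 as_spider.splitlines(), lineterm=""):
    print(line)
```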

Do a quick search on the Internet to see whether people have posted complaints about the company. You can also check the online forums, as well as sites like SEOPros.com and SEOConsultants.com, which take a closer look at firms' Web sites and clients prior to inclusion.

In short, do your research before you spend money on SEO. Get educated, perhaps by attending one or two industry conferences, to really appreciate the complexities involved. Not only will you go into the engagement with eyes wide open about ethical boundaries, you'll also be more receptive to the vendor's good ideas.

Optimizing a Site for Both SEO and Usability

Usability is a term often talked about in relation to Web-site best practices. Are these twin objectives of usability and optimization at odds, or are they harmonious? Are they two processes, or are they one?

Think of it this way: search engine spiders are just another visitor to your Web site—one that happens to refer a lot of other visitors to your site. That particular visitor also happens to be "disabled": it can't read your graphics, view your Java applets and Flash movies, or fill out your forms. So usability really does go hand in hand with search engine optimization; you are making your Web site usable to the search engine spider.

The same accommodations you would make for a visitor who is blind, or who has difficulty with high-tech ways of navigating a site, should also apply to a search engine spider.

A first view of a Web site should be from a usability standpoint. You are wasting your money if you spend a lot on SEO and achieve wonderfully high rankings while your site is so unusable it can't sell anything. You can, and should, construct a site for both human usability and spider readability.

The Most Effective Optimization Tactics

The panel was in agreement that linking is where the action is. Putting together a really comprehensive linking strategy is one of the most important things you are ever going to do to garner high rankings. Getting a site indexed is not that difficult. Ranking is. And remember that we are talking about quality links.

SEO boils down to making the best possible site you can—which means good content—and then telling as many people as possible about it, which means links.

Get the content right so your Web site deserves to be in the position that you want it to be in, and then get the links to let the search engines know that you are there and that there is some content there worth visiting. Put the two together, and you end up with a powerful package.

When looking at your site content, remember that every page on your Web site is a potential entry point. So make sure each page is usable both by surfers who land on it and need to navigate onward, and by search engines.

Treat every page as a separate potential entry point, and hire a copywriter to ensure that the copy is effective. Your site is like a virtual sales force in the search engines, and it's those content pages and product pages that can really drive a lot of traffic.

Cast a very broad net by tuning the templates of those pages, especially if your site is database driven, so that every page of content is performing at its maximum in the search engines. That broad net can catch a lot more fish than just a couple of rods and reels—i.e., focusing only on optimizing a handful of your pages for high-volume keywords.

Links and the RSS Secret Weapon

Up until a couple of years ago, nobody ever really asked about links in relation to search engines. People wanted links only because they knew links would bring them traffic from partner Web sites. Now the majority of linking inquiries run along the lines of "What will you charge us to get 100 links?" or "Our site has a PageRank of three. What will you charge us to boost that to a seven?"

This has created a climate of obsession over links to the detriment, in the long term, of the Web site itself.

Now, thanks to RSS (Really Simple Syndication), you can address both simultaneously. RSS enhances the value of your Web site by offering a new channel through which end users can regularly pull your most recent and most valuable content into their favorite RSS news reader or Web-based aggregator.

An example is the My Yahoo service (https://my.yahoo.com). Millions of people now have their own personal Yahoo pages, where they get their news, weather, stock quotes and so on. By offering your content as an RSS feed—with a little "Add to My Yahoo" button on your site—you let visitors make your content part of their regular My Yahoo visit with just one click. You can offer similar one-click subscribe links for whatever service people prefer, whether My Yahoo, Bloglines, My MSN or others.
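As an illustration, here is a small Python sketch that builds such one-click subscribe links. The feed URL is hypothetical, and the endpoint patterns are the ones these services used around this era; treat them as assumptions and check each service's own instructions before relying on them.

```python
from urllib.parse import quote

FEED_URL = "http://www.example.com/news.xml"  # hypothetical feed

# Historical endpoint patterns, shown for illustration only.
subscribe_links = {
    "My Yahoo": "http://add.my.yahoo.com/rss?url=" + quote(FEED_URL, safe=""),
    "Bloglines": "http://www.bloglines.com/sub/" + FEED_URL,
}

for service, href in subscribe_links.items():
    print(f'<a href="{href}">Add to {service}</a>')
```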

But, in addition—and of primary importance to SEO—RSS offers a syndication channel that you can use to get links. If you have an RSS feed with fresh, valuable content, webmasters will syndicate that content, and the associated links, onto their own sites. For example, nanodot.org displays Slashdot's latest 10 headlines on its home page, giving Slashdot 10 links.
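Here is a sketch of the syndicating webmaster's side of that exchange: fetching another site's RSS 2.0 feed and rendering its newest items as an HTML list, links included. The feed URL is a placeholder, and the code assumes plain RSS 2.0 (it does not handle Atom or RSS 1.0 feeds).

```python
import urllib.request
import xml.etree.ElementTree as ET

def latest_headlines(feed_url: str, limit: int = 10) -> str:
    """Render a feed's newest items as an HTML list. This is what a
    site does when it syndicates another site's headlines, and with
    them, its links."""
    with urllib.request.urlopen(feed_url) as resp:
        root = ET.parse(resp).getroot()  # <rss><channel><item>...
    items = root.findall("./channel/item")[:limit]
    rows = [
        f'<li><a href="{item.findtext("link")}">{item.findtext("title")}</a></li>'
        for item in items
    ]
    return "<ul>\n" + "\n".join(rows) + "\n</ul>"

print(latest_headlines("http://www.example.com/news.xml"))  # placeholder feed
```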

Furthermore, you can get links direct to your feed, just like you get links to your home page or to other site pages. A feed is just another element of a site that's linkable.

There are at least 50 to 75 venues to which you can submit your RSS feed, and they link directly to your feed URLs. You can also submit your feed to the feed search engines, such as Feedster, which happily accept submissions of RSS feeds—and, really, that is the same thing as accepting a link submission.

Tracking readership probably hasn't crossed the minds of most Web site owners already offering RSS. With RSS, search engines and people are out there pinging your content every five minutes, every 10 minutes, every hour, every two hours—and each hit shows up in your Web server logs. It takes more sophistication to answer the question: how many accesses to my RSS feed came from people who actually wanted to read it, and how many from search engines pinging it to see whether it has been updated?
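One way to start answering that question is to split feed requests in your server log by user agent. Below is a rough Python sketch, assuming an Apache combined-format access log and a hypothetical feed path. User-agent strings are self-reported, so the split is approximate at best.

```python
import re
from collections import Counter

# Matches the request, status, bytes, referer and user-agent fields
# of an Apache combined-log-format line.
LOG_LINE = re.compile(
    r'"(?:GET|HEAD) (?P<path>\S+)[^"]*" \d{3} \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

BOT_TOKENS = ("bot", "slurp", "crawler", "feedster", "bloglines")  # rough heuristic

def feed_audience(logfile: str, feed_path: str = "/news.xml") -> Counter:
    """Tally feed hits, split into bots/aggregators vs. everything else."""
    counts = Counter()
    with open(logfile) as fh:
        for line in fh:
            m = LOG_LINE.search(line)
            if not m or m.group("path") != feed_path:
                continue
            ua = m.group("ua").lower()
            counts["bot/aggregator" if any(t in ua for t in BOT_TOKENS) else "other"] += 1
    return counts

print(feed_audience("access.log"))  # hypothetical log file
```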

As marketers become more savvy with RSS technology, personalized and individually tracked RSS feeds will become common.

Interestingly enough, we aren't sure yet whether the major search engines are indexing the underlying XML and following the links. RSS content is by nature very dynamic; it changes so quickly that by the time an engine indexes it and somebody searches on it, the underlying content has changed. What the major search engines will have to decide is: "Do we want to index feed content, considering how often it will potentially update?"

It promises to be interesting times ahead for the search engines, for their users and for us marketers who are trying to stay on the forefront of SEO.



ABOUT THE AUTHOR


Stephan Spencer is the founder of Science of SEO and an SEO expert, author, and speaker.

LinkedIn: Stephan Spencer

Twitter: @sspencer