There is a new round of “SEO Myth” discussion this month and some of the favorite old windmills of SEO gurudom are being targeted for dragon-slaying once again.

Here’s a quick run-down of the nonsense and bad advice you’ll get from your friendly neighborhood SEO debunkers (and there is more than one guilty party so let’s not worry about who got nailed this time).

SEO Myth: One good link outweighs content – You can easily outrank any site that depends on one link from any of the following: CNN, Time, Whitehouse.gov, Apple.com, Google.com, etc. by repeating a keyword on the page often enough.

The SEO community loves this myth because the SEO mindset is trapped on the link building treadmill.

The fact that you CAN rank through link anchor text in no way implies that links carry more weight than content. Or, to put it another way, just because you rely on links for your rankings doesn’t mean you are optimizing for search. All it means is that you’re relying on links for your rankings.

SEO Myth: Links are the most important ranking factor – This nonsense should have died long ago, when Rand Fishkin published his last ranking factors survey, which showed that a lot of people favor the title tag over links.

Are links a close second in the community wisdom? Sure they are. But maybe it’s time to see if the SEO community wants to place links above the title tag.

Frankly, in any algorithm that considers 200+ “signals”, arguing that one signal is more important than another is a waste of time. The fact of the matter is that most SEOs invest their resources in link anchor text — which only proves that links are the most important factor to THE SEO COMMUNITY (rather than to search ranking algorithms).

SEO Myth: Meta tags don’t matter – I don’t know which is worse: the claim that meta tags don’t matter or the rebuttals of the claim, which usually focus on title and meta description. Here is a quick list of page header/meta tags you SHOULD use in your search engine optimization:

  • Title – If you want to target a page for a specific expression, make sure the expression (or its most important words) appear in the title. Make your title an elevator pitch to searchers.
  • Meta Description – Use your meta description to reiterate your elevator pitch to searchers. Your most important keywords should appear in this tag.
  • Meta Keywords – Because Ask and Yahoo! still index this tag’s content, and Google is NOT the only search engine out there, use this tag. It helps with some site search tools, too.
  • Robots – If your page is listed in DMOZ or Yahoo! include “noodp,noydir” in this tag. If you’re retiring a page and don’t have the ability to implement 301 redirects, use this tag with “noindex,nofollow,noarchive”.
  • seomoz (conditional) – If you don’t want people to know what you link to, block Linkscape (but keep demanding a robots.txt-level solution that actually works). And if your linking privacy really matters to you, don’t stop there: also block Majestic SEO (which honors robots.txt exclusion) and any other link indexers you can.
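For anyone who does want to block the link indexers at the robots.txt level, a sketch follows. The user-agent names are assumptions based on what these crawlers have published about themselves; verify each one against the index’s own documentation before relying on it:

```
# Hypothetical robots.txt exclusions.
# The user-agent names below are assumptions; check each
# link index's published crawler documentation.

# SEOmoz / Linkscape crawler (assumed name)
User-agent: rogerbot
Disallow: /

# Majestic SEO crawler (assumed name)
User-agent: MJ12bot
Disallow: /
```

Majestic honors robots.txt exclusion; whether every link crawler does is another matter, which is why a true robots.txt-level standard is worth demanding.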

You, your grandma, and your neighbor’s barking dog will all roll your eyes over my inclusion of the keywords meta tag but, frankly, it still helps with search engine optimization. Google is not the only search engine. You’re not optimizing for search if you only focus on Google.
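For reference, here is what the header of a page optimized along these lines might look like. The site name, title, and keywords are invented for illustration:

```html
<head>
  <title>Blue Widget Repair | Acme Widgets</title>
  <meta name="description" content="Fast, affordable blue widget repair. Free estimates on all widget models.">
  <meta name="keywords" content="blue widget repair, widget repair, widget service">
  <!-- Only if the page is listed in DMOZ or the Yahoo! Directory: -->
  <meta name="robots" content="noodp,noydir">
</head>
```

A retired page you cannot 301-redirect would instead carry a robots meta tag with content="noindex,nofollow,noarchive".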

In fact, you’ll realize more benefit from using the keywords meta tag than from …

SEO Myth: Sculpting Your PageRank – Despite the fact that no one has ever shown this actually works (yes, several people have claimed success but they haven’t provided any independently verifiable data), a lot of people still talk about it as if it’s a proven, fundamental principle of search engine optimization.

I still find people linking to Rand’s pro-PageRank sculpting post and “quoting” Matt Cutts’ alleged support for the idea (Rand added Matt’s exact words after his own interpretation). Matt disagreed with Rand’s interpretation in a comment on the post and took the position (adopted by several other Googlers, including JohnMu and Adam Lasnik) that “most regular webmasters don’t need to worry about link-level PageRank flow within their site”.

In another comment on that same post, Matt said: “… if your site architecture is at all reasonable … then internal PageRank flow is more of a second-order effect. So I’d worry about other things at a higher priority.”

Frankly, since you cannot measure PageRank, you have no hope of controlling or sculpting it. This will never be a fundamental SEO principle — rather, it’s just fundamental nonsense.

SEO Myth: You Need Latent Semantic Analysis Optimization – I’m still seeing this old tune on a number of SEO Web sites. You’d think it would have been dropped by now. Technically, the search engines ARE developing new semantic engineering tools and methods, but if you’re going to optimize for Latent Semantic Analysis you’ll have to create a statistically broad set of Web sites that all use similar language.

Start with, say, 1,000 sites and see where that gets you.

The whole point of latent semantic analysis is that it looks at how language is used by many different documents, building a map through structure and function to help determine relevance. Individual keywords become less important in this kind of analysis.

In other words, there are a thousand ways to write about search engine optimization. You can call it search engine marketing, search market position, search rankings engineering, Web site promotion technique number 5 — whatever. You’ll end up saying many things similar to what other people say about the topic, and you won’t even know you’re doing it. LSA is a holistic approach to analyzing a document, not a keyword-manipulation technique that can be offered for sale.

SEO Myth: Search engine submission doesn’t work – I noticed this in Rand’s article about rewriting his Beginner’s Guide to SEO.

He’s right. And wrong.

Rand is right because using a “Submit URL” form has no effect (at least, that is what the last couple of tests I’ve participated in determined).

Rand is wrong because people are submitting URLs in volume to the search engines through XML sitemaps.

It’s not that search engine submission doesn’t work; it’s that the method of search engine submission has evolved. Frankly, who wants to submit a single URL to search engines anyway? I’d rather submit thousands at a time (and not get in trouble for doing so).
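For the record, submitting thousands of URLs through an XML sitemap requires nothing more exotic than a file like this (URLs and dates invented):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2009-02-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about/</loc>
  </url>
</urlset>
```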

Of course, only this week I mentioned on SE Roundtable that I no longer submit XML sitemaps to search engines. Registering a Web site and uploading an XML file is a time-consuming task. For the past six months or so I’ve just been linking to sitemaps in robots.txt. The sites are getting crawled and indexed at about the same rate as before. NOTE: Only do this if you work with substantial linking resources to ensure the search engines visit new sites.
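The robots.txt shortcut I’m describing is a single line (the sitemap URL is invented):

```
Sitemap: http://www.example.com/sitemap.xml
```

The engines that support the sitemaps protocol pick the file up on their next visit to robots.txt, no registration or uploading required.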

Search engine submission is alive and well. It just doesn’t look like its daddy did.

SEO Myth: Toolbar PR is tied to crawl rate – This myth actually sounds plausible (and for all I know there may be some fact buried beneath the fictional analysis). However, were it not for non-disclosure agreements, I could show you sites with PR 1 values that get crawled and indexed every day, and sites with PR 4 and 5 values that are crawled and indexed (by Google) less frequently.

Your internal PageRank undoubtedly has something to do with how often your site is crawled, but your site’s behavior has something to do with it, too. Remember, search engines (well, Google in particular) are evaluating how sites behave. They’re not just looking at who you link to or how you link, they’re also looking at your XML content, your internal link structures, how often you publish, and more. Everything that a Web site can do — from holding a gadget in place to swapping links on a random basis — constitutes the aggregate behavior that is unique to each site.

In other words, don’t use Toolbar PR to guess how often a site’s linking page will be crawled. You’re wasting your time.

The average blog, once Google knows it is active, should be crawled and indexed within minutes of posting a new article as long as it is pinging major ping servers.

SEO Myth: You should use nofollow for any links you sell – This myth, of course, comes from Google itself. Matt Cutts advises people who sell links to disclose those links to Google.

The SEO community generally equates disclosure with the use of disclaimers.

Disclaimers are for people, not search engines. If you want to disclaim your links, do so in your footer (or put disclaiming text near the links). Use language like this:

The links on this page are not endorsements. Links are provided only for reference and do not in any way constitute editorial opinion or preference.

You get the idea.
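To make the contrast concrete, here is a disclaimer a visitor can actually read, next to the machine-only “disclosure” Google asks for (the URL and link text are invented):

```html
<!-- A disclosure people can see: -->
<p>The links on this page are not endorsements; they are
provided for reference only.</p>
<a href="http://www.example.com/">Example Widget Vendor</a>

<!-- Google's version, invisible to the visitor: -->
<a href="http://www.example.com/" rel="nofollow">Example Widget Vendor</a>
```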

However, it’s misleading for Google to tell people they are “disclosing” paid links by using “rel=’nofollow’”. The casual visitor has no clue as to what is going on with the links and is as likely to click on followable links as on nofollowed links. And, frankly, if using nofollow means a link is paid for, I’ll never use nofollow again.

You’re not disclaiming anything if people cannot see the disclaimer. You’re not disclosing anything by telling a search engine not to follow a link. Are PageRank Sculptors disclosing that they paid themselves for their own internal links?

Google needs to stop promoting the use of hidden content for its sole benefit because clearly you’re not disclosing or disclaiming anything by slapping “rel=’nofollow’” on a link.

Keep in mind that I’m not advising people to buy or sell links. You have to figure that out for yourself. You should already be promoting your site to the other 65% of searchers who don’t use Google — AND you should be promoting your site through means other than mere search engine optimization.

The buying and selling of links is a perfectly legal and acceptable practice that predates the search engines’ current algorithmic use of links. Just because Google objects to the practice doesn’t mean it’s bad. It just means that Google wants to control how the Web works for its own financial gain. Is that what you want?

SEO Myth: Google is the only search engine that matters (because it controls 70/80/90% of search traffic) – You have a lot to learn about search market share if you still believe this nonsense.

As of this week, Microsoft is the only search engine openly sharing any recent data about how many queries it processes: it claims to be serving 3 billion queries a month through its search API. Put that in your search market share article the next time you want to tell people Google dominates Web search.

We don’t know how many queries the search engines process each month. All the metric services buy data from ISPs and use statistical modeling to estimate traffic. The real measure of search engine market share, however, is not what those estimates come out to, and it certainly has nothing to do with what you see in your server log (after all, if you only optimize for Google, why should you expect much traffic from Yahoo!, Ask, and Microsoft?).

The real measure of search engine market share is how much traffic the search engines send to other sites (as opposed to their own destinations). A lot of Google’s search traffic only leads people to other Google destinations. That’s not search market share — that’s just being a sticky Web site.

The Power of Repetition
So what does this article have to do with its title? I’ll admit I was inspired by something Rand wrote on SEOmoz. He said: “Studies looking at thousands of the top search results across different queries have found that keyword repetitions (or keyword density) appear to play an extremely limited role in boosting rankings, and have a low overall correlation with top placement.”

That would be because most top-ranking sites don’t repeat their keywords very much — which tells us nothing about whether repetition works.

In other words, to show there is any scientific and statistical credibility to the claim that repetition doesn’t work, these “studies” must look at sites that are actually repeating keywords on the page rather than through anchor text. If a study fails to exclude sites ranking through link anchor text, any claims that “on-page repetition doesn’t work” are bogus and naive.

I find sites every day that rank because of the on-page repetition of keywords. It’s not hard to do when you look past the obviously hyperoptimized verticals where everyone is mired in the link muck.

Most SEOs implement their repetition through link anchor text, but for most active queries you can just as easily (actually, more easily) rank by embedding all your keywords in your copy (and using irrelevant anchor text in your links). It sometimes, though rarely, works even in hyperoptimized queries (probably only thanks to universal search injection).

You don’t need to study “thousands of search results” to prove that repetition works. You can test it yourself.

The real power of repetition is demonstrated every time an SEO blogger repeats a myth or “debunks” one myth with another myth. The more these ideas are repeated, the more credible they appear to be. But no matter how many times you repeat a myth, it’s still just a myth.