Tuesday, July 28, 2009

SEO No-Nos That'll Get Your Site Slammed by Google

It's Not Nice To Mess With Google


Search Engine Traps: Don’t Get Snared

Most of us know a bit about search engine optimization (SEO). We know the importance of a solid list of keywords. We know the importance of certain HTML tags. Maybe we’ve even added a site map and submitted it through Google Webmaster Tools – the fast track to being recognized by Google.

There’s plenty of information on how to optimize your site, but nowhere near as much on what not to do – on what search engines don’t want to see. Many of these search engine “traps” are common practice, so you may actually have taken steps to improve your site’s optimization only to diminish its quality in the “eyes” of search engine spiders.

SEO rumors run rampant on the web and spread like wildfire. One blogger posts that Google doesn’t like the color red, and the next thing you know, 100,000 webmasters are reworking their color schemes. But there are some activities that Google itself will tell you to avoid. Here are some tips from Matt Cutts, the head of Google’s web spam team – and if anyone should know what not to do, it’s this guy.

Duplicate Content

Cutts mentioned a problem site that he’d encountered. It was owned by an online entrepreneur who also owned almost two dozen other websites, all selling more or less the same products.

There’s no problem with owning and managing 20-something websites (good luck with that), and certainly nothing about it that would raise suspicions on the part of a Googlebot. But bots (also called spiders or crawlers) have become much more than letter-string munchers. Today, they’re little automated detectives.

Cutts pointed out that this site owner used the same content on many of his sites. The bot even discovered the same pages appearing on different sites. A money saver, to be sure, but, man, did this site get slammed.

Another example (not from Matt Cutts): a doctor put up a website on women’s health topics with a strong emphasis on good pregnancy health. The site was well designed, the owner ran a back-end maternity store, and he was getting between 10,000 and 12,000 hits a day. He also ranked number two or three on Google’s SERPs, so he was generating a lot of organic traffic. Sweet.

Then, one morning, the doc logs on and discovers that his PageRank has all but vanished, he’s no longer on Google’s first page, and organic sales have all but disappeared. Devastating.

So he hires an SEO expert, who analyzes the site top to bottom and can’t find anything unusual. But when the SEO talks with his client, he discovers that the doctor had been writing informational articles for his website and had decided to syndicate them to other sites through services like ClickBank and GoArticles. His beautiful original content lost all of its value to the home site because it was now duplicate content, and it took months to correct the problem. Who knows how many dollars were lost.

The point? Something as simple as syndicating content with links pointing back to your site can get you squashed like a Googlebug – even though you thought your SEO strategy was sound. In this case it wasn’t.

Bottom line: original content scores points. If you own more than one site, don’t recycle content, and never use the same pages on different sites. SE bots will notice quickly, and your moves to cut costs will also cut your chances for SEO success.
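And if a page really must live at more than one address, one defensive measure worth knowing about is the rel="canonical" link element the major engines announced support for earlier this year. It’s a hint to the engine, not a guarantee, and the URL below is only a placeholder:

    <!-- in the <head> of the duplicate page, pointing at the original -->
    <link rel="canonical" href="http://www.example.com/original-article.html" />

The engines can then consolidate ranking credit onto the original page instead of splitting it across copies.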

Elephantine Site Maps

In general, creating a site map is a good thing to do. Submitting that site map to search engines is also a good thing to do. It’s like sending a search engine spider a personal invitation to stop by and check out your cool self.

Problem. If your site has hundreds or thousands of pages, that site map is going to be huge – way huge. The sitemap protocol itself caps a single file at 50,000 URLs (and 10 MB uncompressed), and frankly, bots ain’t bright, so a gigantic site map may confuse these crawlers to the point where they don’t know what you’re about.

Problem solved. Break that huge site map into spider-sized portions. You can then list all of those smaller site maps in a single XML sitemap index file for easy spidering, without overloading the limited capacities of SE bots.
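Here’s what a sitemap index file looks like under the sitemaps.org protocol – a minimal sketch, with placeholder file names:

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <!-- each <sitemap> entry points to one spider-sized file -->
        <loc>http://www.example.com/sitemap-articles.xml</loc>
        <lastmod>2009-07-01</lastmod>
      </sitemap>
      <sitemap>
        <loc>http://www.example.com/sitemap-products.xml</loc>
        <lastmod>2009-07-15</lastmod>
      </sitemap>
    </sitemapindex>

Each file listed in the index stays under the protocol’s size limits, and the crawlers take it from there.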

Yes, create a site map. (Check out site map generators.) It’s good for visitors and for spiders that follow links when crawling a site – links that appear on your site map. Just keep the size of the site map manageable for the minimal “minds” of site crawlers.

What Works for Google Gives Yahoo a Headache

True. Each search engine has its own weighting algorithms and its own set of indexing protocols. Example: your site might have a problem with Yahoo’s spiders because uppercase URLs on your pages won’t sync with the lowercase URLs that Yahoo’s Site Explorer expects.

There are two ways to go here. First, if you’re a coder, create separate submissions for the big three search engines: Google, Yahoo and Microsoft (whose new Bing replaces the old Inktomi-powered MSN results). This requires that you carefully review each engine’s submission guidelines, rework the site, submit it for consideration (or re-evaluation) and keep your fingers crossed.

The other alternative is to buy a site map generator. These software packages won’t break the bank (many cost under $100, and you can even find open-source versions, though they’re not recommended), and they take the hassle out of site submission.

You or your designer creates one generic site map. The software then formats that single site map to meet the demands of Google, Bing, Ask and literally dozens of other search engines, large and small.

If you wait long enough, spiders will find you, but (for real) there are stories of sites being overlooked for months, even years. So make sure your site map is properly formatted for each specific search engine before submission.
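One low-effort supplement to manual submission: since 2007 the major engines have supported sitemap autodiscovery through robots.txt. A single line (the path below is a placeholder) points any compliant crawler at your sitemap or sitemap index:

    Sitemap: http://www.example.com/sitemap-index.xml

That way, even engines you never got around to submitting to can find the map on their own.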

Link But Not Too Fast

Google’s recent patent application for its new algorithm (before Orion comes online) indicates that the search engine is taking a closer look at inbound links to a site.

In the past, lots of inbound links indicated a quality site – one that other, similar sites would reference for their own visitors. Of course, this was abused: links would be added for a month or two, then disappear. Google’s new algorithm also watches the growth rate of inbound links. If it sees too many too quickly, it raises a flag and tags your site as suspect.

Site link growth should happen naturally over a period of time. Vary anchor text to give links a more “natural” look and avoid what Google calls “burst link growth” which smacks of search engine spam. On the web, no one grows that fast. Slow and steady wins the race.
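To make “vary anchor text” concrete, here’s a hypothetical sketch – three inbound links to the same placeholder site that read like natural references rather than one keyword phrase stamped out by machine:

    <a href="http://www.example.com/">pregnancy health tips</a>
    <a href="http://www.example.com/">this guide to maternity nutrition</a>
    <a href="http://www.example.com/articles/">the doctor's women's health articles</a>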

Avoid Session IDs

Eliminate session IDs from URLs. A session ID changes with every visit, so a crawler can see the same page under an endless parade of long URLs – a recipe for bot confusion and either partial indexing or, worse, mis-indexing by the search engine. Here’s how Google puts it in its own guidelines:

“If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a simple text browser, then search engine spiders might have trouble crawling your site.”

- Google Site Guidelines
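As an illustration, assume a PHP-based site (a hypothetical example – other platforms have equivalent settings). Out of the box, PHP can append a session ID to every URL, so a crawler sees the first line below when all it needs is the second:

    http://www.example.com/products.php?PHPSESSID=a1b2c3d4e5f6
    http://www.example.com/products.php

Two php.ini settings keep PHP from ever putting the session ID into a URL:

    ; keep the session in a cookie, never in the URL
    session.use_only_cookies = 1
    ; don't rewrite links to carry the session ID
    session.use_trans_sid = 0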

Too Many Websites

There are people earning a living by building websites, adding a bit of informational content to get the attention of crawlers and loading these sites up with Google AdSense – those contextual ads you see on some web pages. Ads by Goooogle.

If all you’re doing is building sites to generate AdSense revenue, you aren’t really providing a service to visitors, and search engine spiders don’t like that. Spiders don’t want a bunch of links to other sites; they want visitors to reach information quickly and, if possible, right off the SERPs without any secondary clicks. Even Googlebots don’t like to see AdSense blocks plastered on every page of every site you own. Spiders want relevant, useful content, helpful to the visitor – not a page with a bunch of links, even if they’re Google-sponsored links.

Avoid Private Whois Listings

Whois is the directory of site ownership – who owns what domain. You can choose to keep that information private, but it looks suspicious to spiders that crawl this directory regularly.
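Curious what your own listing reveals? The standard whois command line tool shows exactly what anyone – spider or human – can see (the domain below is a placeholder, and the output format varies by registrar):

    whois example.com

If the registrant fields come back as a privacy service rather than a real name and address, that’s the “private listing” in question.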

The perception is that owners with numerous private listings have something to hide. Maybe yes, maybe no. But as far as spiders are concerned, too many private Whois listings can hurt you: the search engine may apply extra filters to your site when it’s indexed back at HQ.

The key to SEO is recognizing that it’s a process. There are some shortcuts, but not many, and unless you know precisely what you’re doing, you could end up hurting your SEO rather than helping it.

Read the webmaster guidelines that Google and Yahoo publish. (Microsoft’s aren’t as readily available.) Make sure that your HTML is properly formatted, that your content is informational and useful to visitors, that inbound links don’t accumulate too quickly, and that every link has some relevance to the subject of your site.

It takes time to optimize a site – it’s a process, not a goal. Be patient and play by the search engine rules. To do otherwise is simply shooting yourself in the foot – and you know that’s got to hurt.


Looking for some understandable advice about search engine optimization for your new web biz? Drop me a line and let's have a look.

Webwordslinger.com
