Recovering Site Ranking After A Major SEO Mistake

Synopsis – There are multiple dangers awaiting those who market their products online, but one of the most frustrating is the feeling of a lack of control over the search engines you depend upon to spread the news of your products and services throughout the market. One day you can go online and check your keywords to gloat over your number-one ranking on Google, only to find that you have not only lost that spot but have disappeared from the search engine results entirely.

Chris Boggs, in his article “Recovering Site Ranking After A Major SEO Mistake,” discusses the problem, pointing out the three categories of reasons why a site may suddenly lose position – the “oops,” “duh,” and “dang, got caught.” With examples of each and ways to avoid and mitigate the resulting problems, Chris provides a succinct look at a webmaster’s worst nightmare.

The complete article follows …


Recovering Site Ranking After A Major SEO Mistake

Have you ever had the sinking feeling of checking your favorite search term one day to see if your site has climbed a spot or two, only to find a complete disaster has occurred? Although I’ve been lucky in this regard, over the years I have had my share of Alka-Seltzer afternoons. I think the reasons sites suddenly plummet within organic search results fit into three classifications: the “oops,” the “duh,” and the “dang, got caught.” Most mistakes can be completely avoided by paying more attention to two things: SEO technical best practices and the search engine “rules.”

In terms of how difficult it is to get back in the game, the “oops,” “duh,” and “dang, got caught” mistakes rank from easiest to hardest. The search engines do a great job of providing most of the tools needed to resubmit an adversely affected site, sub-domain, or directory level, and the quickest fixes typically follow “oops” mistakes. “Duh” mistakes can be a little more severe and take additional time to fix, both internally and in terms of regaining trust within the algorithms. “Dang, got caught” mistakes can take a long time to recover from, to the point where the domain should perhaps be written off as a business loss.

Oops And Duh

Accidents can happen when redesigning a site, and they may have anywhere from a minor to a moderate adverse effect on search engine results. For example, redirects may be improperly mapped, resulting in a negative user and search engine experience, including a large increase in 404 error responses. Many developers like to redirect missing pages to an error page with a temporary 302 redirect, where the error page itself then answers with a 200. In English (LOL), this tells a search engine that the page has only temporarily moved, and then serves an error page that claims everything is OK. I have seen Sitelinks (the extra links below a branded search result) lead to 404 pages for weeks and longer in some engines.
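
To make the contrast concrete, here is a minimal sketch of the correct behavior, written in Python with the Flask framework (the framework choice and the URL map are illustrative, not anything prescribed in the article): permanently moved pages should answer with a 301 pointing at their new home, and genuinely missing pages should answer with a real 404 status rather than a 302 to an error page that returns a 200.

    # Minimal sketch: permanent 301 redirects for moved URLs, and a
    # genuine 404 status for everything else. The URL map is hypothetical.
    from flask import Flask, abort, redirect

    app = Flask(__name__)

    # Hypothetical mapping of retired URLs to their replacements.
    REDIRECT_MAP = {
        "/old-products": "/products",
        "/old-contact": "/contact-us",
    }

    @app.route("/<path:old_path>")
    def legacy(old_path):
        new_url = REDIRECT_MAP.get("/" + old_path)
        if new_url:
            # 301 tells crawlers the move is permanent, so the old
            # URL's standing transfers to the new one.
            return redirect(new_url, code=301)
        # A real 404 status, not a 200 "error page," tells crawlers
        # the page is gone.
        abort(404)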

Other “oops” and “duh” mistakes include (among dozens) improper use of the robots.txt file and scheduling long server downtime without properly informing the search engines. A friend once told me that the robots.txt file is like a ninja sword, and thus must be handled very carefully. Long server downtime can cause the site to be removed from the index. If you have accidentally blocked your entire site, or even some major directories, from being indexed, this can be a relatively painless fix (measured in days to weeks) via resubmission within search-engine-provided toolsets such as Google Webmaster Tools. Without being set up and verified in these toolsets at Google and Bing (and Yahoo! for the rest of 2010), you put your ability to mitigate such problems at serious risk.
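
Before pushing a robots.txt change live, it is worth testing it against the URLs you actually care about. Here is a quick sketch of such a check in Python, using the standard library’s robotparser (the domain and paths are hypothetical):

    # Check which URLs a robots.txt actually blocks.
    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser("http://www.example.com/robots.txt")
    parser.read()  # fetch and parse the live robots.txt

    # Pages you expect crawlers to reach; substitute your own.
    important_urls = [
        "http://www.example.com/",
        "http://www.example.com/products/widget",
    ]

    for url in important_urls:
        # can_fetch() applies the rules as a crawler would ("*" = any bot).
        allowed = parser.can_fetch("*", url)
        print(("OK      " if allowed else "BLOCKED ") + url)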

Dang, Got Caught

Unscrupulous SEO practitioners and high-risk-taking marketers are the most likely to fall victim to the sometimes severe backhand of the algorithms. There is not much to say in this area other than a number of clichés having to do with playing with fire. If you engage in tactics such as hiding text by making it the same color as the background, or buying thousands of links at a time from unrelated domains, you will be caught. Large brands may be able to win their way back relatively quickly, but for others it can take months or years of reinclusion requests. Search engine toolsets can help with this too.

Rules, like laws, are open to interpretation. One last thing to always keep in mind is that search engine crawlers can’t see; they rely on what they can understand from the code presented on the page. If automated text readers can clearly understand your content, and it doesn’t differ from what is on the page, you are probably safe. There are also many free toolsets available on the Internet that show how search engines see your site, including the redirects and response codes they are given. Being prepared and knowing your site’s performance is the only way to keep yourself truly safe from disaster.
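
As a sketch of the kind of check those free toolsets perform, the short Python script below (using the third-party requests library; the URL is hypothetical) prints the full redirect chain and the response codes a crawler would be given:

    # Print the redirect chain and status codes for a URL.
    import requests

    resp = requests.get("http://www.example.com/old-page",
                        allow_redirects=True, timeout=10)

    # resp.history holds each intermediate redirect response, in order.
    for hop in resp.history:
        print(hop.status_code, hop.url)  # e.g., 301 or 302 hops
    print(resp.status_code, resp.url)    # final status and landing URL

A chain that ends in a 200 on an error page, or a hop that answers 302 where a 301 was intended, is exactly the kind of “oops” described above.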

About the Author

Chris Boggs of Rosetta is a specialist in search engine optimization and paid search advertising. Chris joined Brulant in 2007 as the Manager of the SEO team, and Rosetta acquired Brulant in 2008.
