Is Keyword Density Becoming Less Important?

However little or much time you’ve spent optimizing your website, you’ll almost certainly have heard the following pearls of wisdom: despite popular belief, spending hours tweaking meta description and meta keyword lists will not result in a rankings spurt. There are no quick fixes in SEO. Content is king. And keyword density is a major consideration when writing content. Right? Wrong! As with most things in SEO, it seems that just as you get to grips with what can initially be quite a tricky concept, the landscape shifts again.

For a while now, it’s been known that content is gathering steam as a powerful ranking factor. The more relevant content you have, the more regularly you add new content, and the more successful your content is in attracting inbound links, the better your site will rank. It’s also been universally accepted that when producing this content, you not only need to write for the human reader, but must also factor in the needs of a computer algorithm, requiring a scientific and calculated approach to when and how often keywords are used. The proportion of a page’s words taken up by a given keyword is known as its keyword density, and managing it makes SEO copywriting a craft that few can confidently claim to master.
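To make the arithmetic concrete: keyword density is simply the share of a page’s words taken up by a given keyword. A minimal sketch in Python (the function name and sample text are illustrative, not taken from any particular SEO tool):

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in `text` that match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

sample = "SEO tips: good SEO content beats keyword stuffing."
print(keyword_density(sample, "SEO"))  # 2 of 8 words -> 25.0
```

A density that high would itself look like stuffing; the article’s point is precisely that chasing any particular percentage is increasingly wasted effort.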

However, we also need to consider that if content is indeed king, then weaving in one particular word or phrase a certain number of times can’t be all there is to it. In fact, it isn’t: the complexities of language, the way language is used, and changing search trends all need to be factored in.

Managing keyword density badly is very easy, as it’s simply a case of inserting a large number of keywords into titles, descriptions, and text. This makes the keyword density system as open to abuse today as meta description and meta keyword elements were five or six years ago. As search engine algorithms become more complex and more intelligent, keyword density becomes less relevant: after all, if ranking were as easy as keyword stuffing a page, how would that aid the user experience?

A far more interesting and useful concept is that of latent semantic indexing (LSI). As search engines become ever more lifelike in their ability to extract meaning from a web page, consideration of latent semantic indexing must become more prominent.

But what does all this mean at a practical level? The first big takeaway is that it’s neither useful nor productive to get too hung up on finessing a particular keyword into each on-page element as often as possible. Likewise, trying to adopt the mentality of a computer while retaining the sensibilities of a human when writing website copy is not a productive use of allocated online marketing hours.

A simple example: a page talking about AdWords, Chrome, and Gmail is about Google. You don’t need to use the actual word ‘Google’ at every opportunity for that to be understood. Not all examples are this straightforward. A page including Windows, Excel, and Word is probably about Microsoft, but the keyword ‘Windows’ could also refer to panes of glass, a totally different context in which the example page would not be relevant. Making that determination is instinctive for a human reader, but difficult for a computer program. That is where latent semantic indexing comes into its own: it compares the language of one page with that of other pages on the same subject to ascertain meaning, based on the principle that words used in similar contexts will have similar meanings. Essentially, this takes sentence structure and language into consideration by allowing for synonyms, in order to understand what a page is about and how relevant it is to a search query.
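LSI proper rests on a singular value decomposition of a term-document matrix, which is beyond a blog snippet, but the distributional principle above (pages about the same thing share context words) can be sketched with a plain bag-of-words cosine similarity. The page snippets below are invented purely for illustration:

```python
from collections import Counter
from math import sqrt

def bag(text):
    """Bag-of-words representation of a snippet."""
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Hypothetical snippets: two software pages and one about glazing.
ms_page    = bag("windows excel word office spreadsheet software license")
tech_page  = bag("windows office software update excel install")
glass_page = bag("windows glass panes frames double glazing installation")

# Shared context words pull the two software pages together, even though
# all three pages contain the ambiguous keyword 'windows'.
print(cosine_similarity(ms_page, tech_page) > cosine_similarity(ms_page, glass_page))  # True
```

Real LSI additionally compresses the matrix so that true synonyms, which may never co-occur on the same page, still end up close together, but the ranking intuition is the same.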

While the complexities of LSI may lead you to believe that content writing just got harder, the opposite is actually true. Trying to write web copy with keyword density in mind as an important ranking factor is difficult because you are forced to write against your natural instinct not to repeat a keyword again and again. It makes the task of writing tedious, as you must shoehorn a particular word in a set number of times, regardless of what would otherwise be the natural progression of the text. Writing with LSI in mind is much easier, as you’re free to write naturally, without keyword stuffing. The search engine can see through the use of synonyms to extract meaning from a text, allowing for a more creative and interesting use of language. Invariably, without tedious keyword repetition, the content you create will be more interesting and read better to a search user, aiding the user experience.

About the Author

Rebecca is the managing director of search engine optimization agency Dakota Digital, a full-service agency offering SEO, online PR, web copywriting, media relationship management, and social media strategy. Rebecca works directly with each client to increase online visibility, brand profile, and search engine rankings. She has headed a number of international campaigns for large brands.

5 Comments

  1. Rebecca, nice article. Keyword density should not be employed as the primary optimization method; you are correct that LSI is a much stronger SEO methodology. I wrote about LSI back in 2007. Here is a link to help your readers appreciate how it works: http://www.webmetro.com/blog/Search_Engine_Optimization/Importance_of_Latent_Semantic_Indexing_in_SEO.aspx About this time we switched from the keyword density optimization paradigm to LSI and we have not looked back. LSI helped support not only the main keywords but also the tail. In fact, our clients’ rankings and traffic increased with the Mayday update and Caffeine releases from Google. Once again, nice article!

  2. Has LSI been proven as a usable, working, effective strategy by search engines? I will do my own online research of course, but I was wondering if you have done some already and could provide your findings in say how Google has implemented LSI into their algorithms and how successful (or not) it has been. Is there a way for webmasters and content writers to gauge the effectiveness of their content against LSI? Thank you very much for a great article and introduction into the ever changing environment of SEO! Happy Searching.

  3. Bort

    Discussing keyword density in 2010? Good lord.

  4. John

    This is yet another instance of too many buzzwords and not enough understanding. Keyword density hasn't been on any serious SEO's radar for the past three years, maybe longer. Additionally, content isn't growing in value; it has always been valuable. But considering Google's current algorithm has over 400 variables, saying anything is king is pure jackassery. I have had nearly empty sites, from a content perspective, rank for massive numbers of competitive terms. I will say that fresh content on a regular basis increases attention and crawl rates, and therefore traffic and rankings, but the majority of this traffic is of the informational sort, which is rarely competitive and only occasionally conversion quality.

  5. I personally don't see keyword density ever going out of discussion, because a time might come when it becomes a criterion for a possible ban. You never can tell what these Google guys will do. I have come across LSI before, but the truth remains that content alone is not the only criterion; there are more than 499 others.