LSI – Three letters that spell big changes

The methods for ensuring that a website performs well in search rankings have changed significantly over the last few years, with the major engines (most notably Google) continually working to improve the relevancy and quality of their results.
This has meant that search engine optimisation professionals have (rightly) been forced to adopt new ways of working, and can no longer rely on ‘one-size-fits-all’ solutions. It has also put much of the power in the hands of content producers: front-end copy has never been so important.
Before we go any further, let’s go back a few years…
Keyword stuffing – a brief history
Keyword stuffing is a completely outdated method of chasing high search engine rankings. It involves cramming keywords into a page’s text and metadata: sometimes in the main copy, sometimes as hidden text, and sometimes by creating a mass of content that is more concerned with being readable by search engines than by actual human visitors.
When the algorithms used by Google and the other engines (the complex equations that decide which websites appear where in the search results) were less sophisticated, they treated the sheer number of these keyword occurrences and descriptions as a major factor in deciding how well a page should rank. But things have changed, and keyword stuffing will now result in penalties to a page’s ranking.
Google’s algorithms have been revised and refined many times, and are now extremely adept at understanding not only the content of a page itself, but also the context of that content.
What is LSI?
LSI stands for Latent Semantic Indexing, and the concept is much easier to understand than it sounds. Essentially, it refers to Google’s ability to view the content of a web page (and website) as a whole, focusing on themes, related words and relevancy as opposed to keyword density.
LSI closely resembles the thought process an actual user would go through to judge the relevancy of a page. It doesn’t rely on a thesaurus to pick related words; it draws on patterns in how people actually use language.
To see this in action, you can perform a search in Google with a tilde (~) in front of the keyword to do a ‘semantic search’. So if you search for ~clothes you will see (among others) the following terms returned in bold:
clothes – dresses – fashion – t-shirts – clothing store – outfits – shirts – suits – wear – dress
Google also uses this technique to understand the context in which a keyword is being used on a web page, as illustrated by the example in this video.
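For the more technically minded, the idea behind LSI comes from a technique called latent semantic analysis: build a term-document matrix and reduce it to a handful of latent ‘topics’, so that pages using related but different words end up grouped together. Below is a minimal, illustrative sketch in Python using scikit-learn. The toy documents are invented for this example, and the code demonstrates the general technique only, not how Google’s own systems are implemented.

# A minimal sketch of latent semantic analysis (the idea behind LSI).
# The documents below are invented examples: three are about clothing
# and two are about baking.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "new season dresses and outfits",
    "outfits and shirts in our clothing store",
    "shirts, suits and other clothing to wear",
    "baking bread with strong flour and fresh yeast",
    "how to bake a simple white loaf of bread",
]

# Build a term-document matrix, then compress it to two latent 'topics'.
tfidf = TfidfVectorizer(stop_words="english")
term_doc = tfidf.fit_transform(docs)
lsa = TruncatedSVD(n_components=2, random_state=0)
doc_topics = lsa.fit_transform(term_doc)

# The first and third pages share no words at all, yet they come out as
# highly similar because they are linked through the vocabulary of the
# second page; the two baking pages form their own separate cluster.
print(cosine_similarity(doc_topics).round(2))

The interesting result is the similarity between the first and third documents: they have no words in common, but in the reduced ‘topic’ space they are treated as being about the same subject, which is exactly the behaviour the ~clothes example above hints at.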
How can you take advantage of this?
First of all, you can make sure you’re clued-up about which words and phrases are semantically linked to the content of your website. Use the Google ~search and the Google keyword tool to find ideas around which to base your content.
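If it helps to make this concrete, here is a small, hypothetical sketch in Python: given a seed keyword and the related terms your research turned up (both lists below are invented for illustration, not output from any Google tool), it simply checks which of those terms your page copy already uses and how often the seed keyword itself is repeated.

# Hypothetical example: check a snippet of page copy against a seed
# keyword and a hand-gathered list of semantically related terms.
import re
from collections import Counter

page_copy = """Our clothing store stocks dresses, suits, shirts and
casual outfits, with new season fashion arriving every week."""

seed = "clothes"
related_terms = ["clothing", "dresses", "fashion", "outfits",
                 "shirts", "suits", "wear", "t-shirts"]

# Count every word in the copy, ignoring case and punctuation.
words = Counter(re.findall(r"[a-z'-]+", page_copy.lower()))

print("Times the seed keyword appears:", words[seed])
print("Related terms the copy already uses:",
      sorted(t for t in related_terms if words[t] > 0))

Run against the example copy above, it reports that six of the related terms already appear even though the exact seed keyword never does, which is the sort of natural spread of vocabulary you are aiming for.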
Secondly, if you aren’t good at writing copy (please be honest!) then get someone who is good at it to create your content for you. Arm them with your research (or get them to do it) and let them create copy that reads well and works from a search engine perspective.
The key is to make your content sound natural and ‘real’, whilst remembering that you don’t need to plaster it with keywords.
And finally, make sure you review your web copy regularly. If your content was written a while ago it could probably use an overhaul to ensure that it continues to work effectively.