In 2012, Matt Cutts explained how Google were looking into “levelling the playing field” for businesses creating high quality websites without using aggressive SEO strategies. It was also suggested that Google would start to penalise businesses whose strategies relied on over optimisation and black hat techniques. Since then, website owners have been eagerly checking their search rankings against their on-site optimisation methods.

To clarify, search engine over optimisation refers to a website making so many SEO driven changes that Google can tell the changes are being made purely to rank. If there’s one thing you should take away from this, it is that you should always optimise your site for a human first and Google second. If you can create a site that provides the end user with a good experience, Google will recognise the relevancy of your site and place you higher in the rankings.

How to Avoid Over Optimising Your Website

Many website owners now misinterpret over optimisation as being purely down to keyword stuffing. Keyword stuffing is pretty much the devil when it comes to Google’s preferences, but it isn’t the only factor contributing towards an over optimised web page. Any website owner thinking they’re in the clear just because they don’t stuff keywords needs to have a good rethink and read the steps below.

So for those who are still not 100% sure what to look for, I have created a nice piece of visual content covering the majority of signs that point to an over optimised site.

Linking to Spammy Sites

A term frequently used in the digital world is ‘link juice’. This basically refers to the power and authority passed from one site to another through a link. When the linking site is authoritative, more positive link juice is passed on. It’s important to remember that link juice is a two way system: it is not only the links you have gone out and acquired that pass link juice, but also the outbound links from your site.

It is important to keep an eye on which websites you are linking to. To analyse whether or not you should link to a website, Google Chrome extensions such as Mozbar, SEOquake and Majestic are fantastic tools to use. 3 Great Chrome Extensions to Analyse Your Target Sites has more information on these three tools.

In short, if a website has a domain authority (DA) lower than 20, you should avoid engaging with it in terms of links. If you were to link to a number of these kinds of sites, Google is likely to see your site as lower value and place you lower in the SERPs.
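If you do have to reference a lower authority site, one widely used option (not covered above, so treat this as a side note rather than a rule from this guide) is to add a rel="nofollow" attribute to the outbound link, which asks search engines not to pass link juice through it. A minimal sketch, with the destination URL invented for illustration:

```html
<!-- A normal outbound link passes link juice to the destination -->
<a href="https://www.example-low-authority-site.com/">their article</a>

<!-- Adding rel="nofollow" asks search engines not to pass link juice through this link -->
<a href="https://www.example-low-authority-site.com/" rel="nofollow">their article</a>
```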

Using Too Many H1s on a Single Page

An H1 header is the piece of HTML code used for a page’s main heading. Think of it as a magazine’s main title: it tells the reader exactly what the page is about, and it is also used by Google when crawling and indexing your site. Used effectively in conjunction with a good meta description, keywords and keyword-targeted content, a good H1 header can help improve search engine rankings.

A fully optimised H1 tag is extremely important, as it is the main indicator to search engines of what your page is about. However, bear in mind that any page should only have one H1 header. Some webmasters believe that stuffing in an abundance of different H1 headers means they may rank for more keywords or search terms. This is not true; it is simply over optimisation.

If your website does happen to have more than one H1 header, I would highly recommend keeping the most relevant one and changing the others to H2s, H3s etc. These may not be as highly regarded by Google, but it beats being penalised every time!
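To make the heading structure concrete, here is a minimal sketch of a page that keeps a single H1 and demotes the supporting headings to H2s; the headings themselves are invented for illustration:

```html
<!-- One H1 describing the page's main topic -->
<h1>Outdoor Swimming Pools</h1>

<!-- Supporting topics become H2s rather than additional H1s -->
<h2>Choosing the Right Pool Size</h2>
<p>...</p>

<h2>Maintenance and Cleaning</h2>
<p>...</p>
```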

Keyword Rich Anchor Texts

Anchor text is the visible, clickable text of a link in your HTML, and keyword rich anchor text means that text is made up of your target keywords. For example, if you were a swimming pool company, a keyword rich anchor text would be “swimming pools” or “outdoor swimming pools”. Since the Google Penguin update a few years ago, I would highly recommend staying away from keyword rich anchor text for external linking.

You might be thinking to yourself, “but surely this is a good way of incorporating keywords into your page”. You are right, in a sense. The occasional near exact match may help improve your SERP ranking; however, to avoid over optimisation, it is best to steer clear of using this too often.

A good way to incorporate near exact match anchors is to introduce long tail anchor text into your HTML. This basically means including the keywords, but within a sentence. For example, going back to the swimming pool company, you might set your anchor text as: “[text] [text] [text] swimming pool [text] [text]”, obviously filling in the [text] contextually. This way you spread the keywords out and create a less risky way to link back to your site.
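As a rough illustration (the URL and surrounding wording are invented), here is how an exact match anchor compares with a long tail anchor in the HTML:

```html
<!-- Exact match anchor text: risky if used repeatedly -->
<a href="https://www.example-pools.co.uk/">swimming pools</a>

<!-- Long tail anchor text: the keyword sits naturally inside a sentence -->
<p>We asked the team who <a href="https://www.example-pools.co.uk/">installed our outdoor swimming pool last summer</a> for their maintenance tips.</p>
```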

Going Crazy With Ads

There is nothing more annoying to a web user than clicking on a site, only to find it plastered with advertisements. Not only does this decrease trust in your website, but since the Penguin update, Google feels the same way. Google have started to crack down on ‘ad over optimisation’ and have been giving out penalties to webmasters who rely on this technique. Google’s bots can recognise when a site is full of advertisements and lacking substance relevant to the consumer, especially in ‘above the fold’ content.

However much revenue advertisements might bring to your site, if they are not crucial to the financial side of your business, I would highly recommend taking them down, or at least reducing them so that at least 80% of your website is relevant content. It is either that or risk having your website drop seriously in the rankings, with a ripple effect on your business’s profits.

Non-Relevant Keywords

Using non-relevant keywords is a sure route to failure. For example, let’s say your website sells bikes. Over optimising with non-relevant keywords would mean targeting keywords you feel are popular in search but that have no relevance to your site. As a bike retailer, setting your keyword to “iPhone” just because a new iPhone release is dominating search will not help your site.

Using this form of black hat SEO puts you in Google’s bad books. Google continuously crawls and indexes websites to assess their relevancy to search terms, aiming to match your site’s keywords with relevant queries. Therefore, if your content and back-end meta data are stuffed with useless keywords, Google is likely to drop your relevancy in related search queries.
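As a hypothetical example for the bike retailer above, the back-end meta should stick to terms the page actually covers; popular but irrelevant terms like “iPhone” would only dilute your relevancy:

```html
<head>
  <!-- Title and meta description focused on what the page actually sells -->
  <title>Road Bikes and Mountain Bikes | Example Bike Shop</title>
  <meta name="description" content="Browse our range of road bikes, mountain bikes and cycling accessories, with free UK delivery.">
  <!-- Avoid cramming in popular but irrelevant terms such as "iPhone" -->
</head>
```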

Directing All Internal and External Links to the Home Page

I know you’re probably thinking, why would I not link to my home page? To clear this up straight away, I am not saying don’t link to your home page at all; what I am saying is that there should be a healthy ratio of links to your top level navigation pages as well as links to deep internal pages. A healthy ratio is roughly 50% either way.

The number of deep links tends to be highest on pages with extremely useful content. Think of it as link bait: if you create an awesome piece of content, people will want to link to it, providing you with deep links. One thing to make sure you avoid is pointing all the links on such a page back towards your top level navigation pages or home page. Let your deep pages have their moment!
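As a quick sketch with hypothetical URLs, let contextual links point at the deep page that actually answers the reader’s question rather than funnelling everything back to the home page:

```html
<!-- Funnelling every link back to the home page adds little value -->
<a href="https://www.example-bikes.co.uk/">Read more</a>

<!-- A contextual deep link gives the internal page its moment -->
<a href="https://www.example-bikes.co.uk/guides/choosing-a-mountain-bike">our guide to choosing a mountain bike</a>
```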

Too Much Duplicate Content

Every page on your website should serve a purpose to the user and provide relevant content. Back in the pre-Panda days, the trick was for webmasters to produce a variety of pages with very similar content, so that each page could achieve a similar ranking for similar search terms. Those days are over.

Google are now cracking down on duplicate content, as they view this kind of activity as over optimisation. As well as this, you shouldn’t optimise different pages to rank for very similar keywords or phrases. Instead, have one page that focuses on a range of similar keywords.

The way to get rid of duplicate content is very simple: use methods such as 301 redirects and canonical tagging. Identifying and Fixing Duplicate Content offers a much more in-depth look at fixing any duplicate content you may have on your site.
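As a brief illustration (the URL is hypothetical), a canonical tag placed in the head of a near-duplicate page tells Google which version to treat as the original:

```html
<head>
  <!-- Placed on the duplicate page, pointing at the preferred version -->
  <link rel="canonical" href="https://www.example-pools.co.uk/outdoor-swimming-pools/">
</head>
```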

Now that you know a bit more about over optimisation, you might want to check your website for any of these signs. Optimizely is a great tool that will scan your URL and highlight the areas of your site’s optimisation you may need to focus on more closely.