Great SEO Optimisation Needs More Than Just Good Content
Search engine optimisation is a dynamic and ever-changing field. What once worked to get your business to the top of the page rankings may now leave you with a hefty penalty from search engines like Google. Gone are the days of generating high rankings with black hat SEO tricks. Google’s recent Panda 4.0 update recognises when websites are trying to fool it with tricks such as hidden keyword stuffing and duplicate content, so new methods of ensuring strong SEO are a must if you want to stay in the search engines’ good books.
High-quality content is constantly referred to as one of the most effective ways to generate good rankings, and while that is true, the message that content is king is now so widespread that it shouldn’t drown out other factors that also need analysing to ensure great SEO.
Below I have discussed four key aspects of a website that need to be reviewed when optimising for SEO:
Building Quality Links
Having a high quantity of links is often mistaken for the perfect way to increase rankings on Google. However, as previously discussed, changes in Google’s algorithm now see a high quantity of low-quality links as spam, and penalise you accordingly. The takeaway is that the quality of your potential links matters far more than how many you can add to your list.
When searching for websites to gain links from, use tools such as SEOquake, Mozbar and MajesticSEO, as these are great for analysing a website’s authority and, ultimately, the effect it may have on your rankings. SEOquake can review anything from the number of internal and external links a site has to the value of its Google index. Mozbar looks more at domain authority and page authority, whereas MajesticSEO takes a detailed look at the website’s trust flow and citation flow. Remember, make sure you are linking to companies that are relevant and provide useful content to your target consumer.
The following article provides useful tips on how to generate links.
Optimising Meta Tags
Meta tags are snippets of HTML code within your website’s head section. These tags help search engines determine how relevant a specific page is to user searches. There are two main types to focus on: description and title tags.
Description tags are what the user sees beneath your listing when searching for keywords in Google. To optimise this tag, aim for a length of around 140-200 characters and work enough relevant keywords into the content without over-stuffing it.
Below is an example of a description tag as it appears in Google’s search results (image source: Dariopetkovic.com).
Title tags, on the other hand, are the main title included in the header of the web page. These tags are of the utmost importance, as the title often determines what shows up in the SERPs. Keyword research should be undertaken before deciding on title and description tags, to ensure your tags are relevant to user search terms.
Moz suggests that the optimal format for a title tag is:
Primary Keyword - Secondary Keyword - Brand Name
They also provide a useful article regarding how to produce effective title tags.
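Putting the two tags together, a page’s head section might look something like the sketch below. The brand, keywords and description text are placeholders for illustration, not values recommended by Moz:

```html
<head>
  <!-- Title tag following the Primary Keyword - Secondary Keyword - Brand Name format -->
  <title>Running Shoes - Trail Running Gear - Example Brand</title>

  <!-- Description tag: roughly 140-200 characters, with keywords used naturally -->
  <meta name="description" content="Shop lightweight running shoes and trail
    running gear at Example Brand. Free UK delivery on orders over £50 and a
    30-day returns policy on every pair.">
</head>
```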
Improving Site Speed
There is no use having good content and optimised keywords if your web page speed is lacking. Kissmetrics reports that 47% of consumers expect a web page to load in two seconds or less, and that a one-second delay in page response can result in a 7% reduction in conversions. Customers use the internet for quick and easy solutions to their wants and needs, so a slow site can mean less interaction and ultimately a higher bounce rate.
Google’s site speed report in Google Analytics is a good tool that measures site speed using three aspects of latency:
– Page-load time for a sample of page views on your site
– Execution speed or load time of any discrete hit, event or user interaction that you would like to track
– How quickly the browser parses the document and makes it available for user interaction
This information can be used to analyse the effectiveness of your site’s speed and what you need to improve on.
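For a quick first check before digging into the full report, you can get a rough feel for a page’s response time with a few lines of Python. This is an illustrative sketch that measures total fetch time only (server response plus HTML download); it does not cover browser parsing or rendering, which Google’s report breaks out, and the URL is a placeholder:

```python
import time
import urllib.request


def measure_load_time(url: str) -> float:
    """Return the rough time in seconds to fetch a page's HTML.

    Captures server response and download time only; browser parsing
    and rendering are not included.
    """
    start = time.perf_counter()
    with urllib.request.urlopen(url) as response:
        response.read()  # download the full HTML body
    return time.perf_counter() - start


if __name__ == "__main__":
    elapsed = measure_load_time("https://www.example.com/")  # placeholder URL
    print(f"Page fetched in {elapsed:.2f}s")
    if elapsed > 2.0:
        print("Slower than the two-second expectation cited by Kissmetrics")
```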
Create an XML Site Map
An XML site map is basically a list of the URLs on your website with additional information about each web page. XML maps offer search engines an easier route to crawl your website, flagging things such as the full list of pages, changes to pages, the date each was last modified and the relative importance of each page. Giving Google this list of URLs provides it with a head start when crawling and indexing your content. Site maps also let the webmaster flag specific types of content within the site, such as images, video and news.
Site maps are most valuable when your site has any of the following:
– Dynamic content
– Pages that aren’t easily accessible by search engines
– Large archive of content pages not sufficiently linked with each other
– New site without many internal or external links.
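A minimal site map follows the sitemaps.org protocol. A hypothetical two-page example might look like this (the URLs, dates and values are placeholders for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2014-06-01</lastmod>      <!-- date last modified -->
    <changefreq>weekly</changefreq>    <!-- how often the page changes -->
    <priority>1.0</priority>           <!-- relative importance within the site -->
  </url>
  <url>
    <loc>https://www.example.com/blog/</loc>
    <lastmod>2014-05-20</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```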
The following post by Google Support demonstrates how to create your own XML site map.
SEO isn’t about fancy tricks or fooling search engines; it is a long-term strategy that needs constant monitoring. The four techniques discussed are just a few ways of optimising your SEO. Make sure you keep up with any changes to Google’s algorithm so that you are fully equipped with an understanding of what Google will want to see from websites in the future.