Web design Carterton from Websites by Mark
01305 881027 / 01993 820005

Using on-page SEO to improve your SERPs

Google is known to use over 200 ranking factors when deciding where your webpage appears in its search results, but most of these have been designed to stop abuse by black-hat SEO companies.

Generally, SEO is split into on-page and off-page. Simply put, on-page is the content on your webpage, and it is perhaps the only part you have complete control over.

The good news is that search engines place a lot of weight on this content, so anything we can do to improve the on-page side is worth the effort.

With ranking factors being tweaked on a daily basis, it can seem impossible to make logical and well-thought-out changes, but this is far from the truth. Although there is an almost infinite number of interrelated variables, there are some well-documented changes that do have a consistent effect.

And because no one outside Google actually knows its algorithms, the only source of data we have is the results of any given search.

The method outlined below is known to work and requires no specialist tools or knowledge. That's right: this is about as close to a magic bullet as there is.

The technique revolves around looking at how the top sites for your search phrase use that phrase in the make-up of their pages. By looking at multiple high-ranking sites we get a 'feel' for the data, and it smooths out the spikes from any one site.

Give me anonymous data

Google by default tries to be helpful by weighting sites you have already seen and by taking account of other historic browsing data. In other words when you do a search on Google the results you see might not be the same as someone else conducting exactly the same search.

Google Chrome does provide an incognito mode (https://support.google.com/chrome/answer/95464?hl=en-GB), but it's been shown that Google can still track your footprint and online history even when this mode is being used.

One way around this is to use an anonymous web proxy. Although this does provide better results, most proxies are based overseas, so you may still not get local, non-personalised results.

One of the best services I have found is http://www.whatsmyserp.com/serpcheck.php, which displays the top ten for any search phrase and lets you select the country. Sometimes this service has issues, so it's worth using a couple of services to be sure of your clean top ten.

1, 3, 5 or 10 website analysis

The method can be used against as many competitors as you wish, but for sanity's sake we find the sweet spot is three competing sites.

The more sites you include in the analysis, the more reliable the data, but the longer it takes. Another acceptable route is to look at the top one and then the top three to spot the trend.

How to do the analysis

The first step is to identify the primary key phrase. This is a black art in its own right and outside the scope of this article.

Once you have the primary key phrase you need a list of high-ranking pages to analyse against.

It’s then a simple case of looking at the source of each page…

The areas you are interested in (these give the biggest bang for your buck) are:

Title tag
H tags
Body copy

Everything that is on the webpage does have an effect, but these three areas should be your primary focus.

To see the source of a page in Google Chrome, press Ctrl+U (this is also available when you right-click on a web page). You can do this in any browser, but each has its own way of displaying the information.

If done correctly you are now seeing the code which controls how the website displays.

We will now record two types of data: the number of times the whole phrase is used, and the number of times each part of the phrase (each word) is used. Google is aware of stop words and connective words (and, to, the, etc.), so these should be ignored.
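To make the counting concrete, here is a minimal sketch in Python. The stop-word list is purely illustrative (Google's actual list is not public), and the sample text is made up:

```python
import re

# Illustrative stop-word list; Google's real list is not published.
STOP_WORDS = {"and", "to", "the", "of", "a", "in", "for", "on"}

def phrase_counts(text, phrase):
    """Count the whole phrase, plus each non-stop word within it."""
    text = text.lower()
    phrase = phrase.lower()
    counts = {"whole phrase": len(re.findall(re.escape(phrase), text))}
    for word in phrase.split():
        if word not in STOP_WORDS:
            counts[word] = len(re.findall(r"\b" + re.escape(word) + r"\b", text))
    return counts

print(phrase_counts("Web design Carterton - affordable web design",
                    "web design carterton"))
# → {'whole phrase': 1, 'web': 2, 'design': 2, 'carterton': 1}
```

This is exactly the tally you would otherwise build up by hand with Ctrl+F.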

First we will look at the title tag

Press Ctrl+F and type <title; Chrome will highlight the title tag. Simply count the whole-phrase and part-phrase occurrences and record the data.

For the H tags, search for <h. For the body, simply search for the phrase or part phrase and manually check the body area of the source for instances.
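If you are checking several pages, the manual Ctrl+F routine can be automated. The sketch below uses Python's standard-library HTML parser to separate out the text inside the title tag, the H tags and the body; the markup in the example is invented:

```python
from html.parser import HTMLParser

H_TAGS = ("h1", "h2", "h3", "h4", "h5", "h6")

def _kind(tag):
    """Map a tag name to one of our three areas of interest, or None."""
    if tag == "title":
        return "title"
    if tag in H_TAGS:
        return "h"
    if tag == "body":
        return "body"
    return None

class TagTextExtractor(HTMLParser):
    """Collect the text inside <title>, <h1>-<h6>, and <body>."""
    def __init__(self):
        super().__init__()
        self.sections = {"title": [], "h": [], "body": []}
        self._stack = []

    def handle_starttag(self, tag, attrs):
        kind = _kind(tag)
        if kind:
            self._stack.append(kind)

    def handle_endtag(self, tag):
        kind = _kind(tag)
        if kind and self._stack and self._stack[-1] == kind:
            self._stack.pop()

    def handle_data(self, data):
        if self._stack:
            self.sections[self._stack[-1]].append(data)

parser = TagTextExtractor()
parser.feed("<html><head><title>Web design Carterton</title></head>"
            "<body><h1>Web design</h1><p>Affordable web design.</p></body></html>")
# parser.sections now holds the text per area, ready for phrase counting
```

You can then run your phrase counts against `parser.sections["title"]`, `["h"]` and `["body"]` separately, mirroring the three areas listed earlier.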

Normally you will only see a few instances on a page; if you find hundreds of instances of your search phrase, it usually means it has been overcooked.

Now repeat with your other reference sites.

Analysis of data

What you are looking to do is be roughly in amongst the big players. If your web page has far too many or too few instances of your key phrase or part phrases compared to your top three, it could be an indication that Google won't see your page as optimised.

This method is really useful because we know our top three pages are ones Google trusts, and basing our page technically on their key-phrase weightings is probably not a bad way to go.

After the work has been done and implemented, you will need to wait up to six weeks to see the results, and be prepared to tweak, as it's possible the top three sites rank where they do for other reasons.

Although it's not possible to guarantee results with any technique, the beauty of this one is that we have a Google-proof framework for checking whether our page is over- or under-optimised compared to well-ranking pages. We need to apply any changes intelligently, as the page still needs to convert, but in certain situations this can help you discover why your page is not ranking well and the sort of tweaks you need to consider.