Using on-page SEO to improve your SERPs
Google is known to use over 200 ranking factors when deciding where your webpage appears in its search results, but most of these were designed to stop abuse by black hat SEO companies.
Generally SEO is split into on-page and off-page. Simply put, on-page is the content on your webpage, and it is perhaps the only part you have complete control over.
The good news is that search engines place a lot of weight on this content, so anything we can do to improve our on-page SEO, the better.
With ranking factors being tweaked on a daily basis, it can seem impossible to make logical and well thought out changes, but this is far from the truth. Although there is an almost infinite number of interrelated variables, there are some well documented changes that do have a consistent effect.
And because no one actually knows Google's algorithm, the only source of data we have is the results of any given search.
The method outlined below is known to work and requires no specialist tools or knowledge. That's right: this is about as close to a magic bullet as there is.
The technique revolves around looking at how the top sites for your search phrase use that phrase in the make-up of their pages. By looking at multiple high-ranking sites we get a 'feel' for the data, and it smooths out the spikes from any one site.
Give me anonymous data
Google by default tries to be helpful by weighting sites you have already seen and by taking into account other historic browsing data. In other words, when you do a search on Google, the results you see might not be the same as someone else conducting exactly the same search.
Google does provide an incognito mode https://support.google.com/chrome/answer/95464?hl=en-GB but it's been shown that Google can still track your footprint and online history even when this mode is being used.
One way to get around this is to use an anonymous web proxy, and although this does provide better results, most proxies tend to be based abroad, so you may still not get local, non-personalised results.
One of the best services I have found is http://www.whatsmyserp.com/serpcheck.php which displays the top ten for any search phrase and lets you select the country. The service sometimes has issues, so it's worth using a couple of services to be sure your top ten is clean.
1, 3, 5 or 10 website analysis
The method we are using can be applied to as many competitors as you wish, but for sanity's sake we find the sweet spot is three competing sites.
The more sites you analyse, the more reliable the data, but the longer it takes, so another acceptable route is to look at the top one and then the top three and compare the trend.
How to do the analysis
The first step is to identify the primary key phrase. This is a black art in its own right and outside the scope of this article.
Once you have the primary key phrase, you need a list of high-ranking pages to analyse against.
It’s then a simple case of looking at the source of each page…
The areas you are interested in (these give the biggest bang for your buck):
Title tag
H tags
Body copy
Everything that is on the webpage does have an effect, but these three areas should be your primary focus.
To see the source of a page in Google Chrome (you can do this in any browser, but each has its own way of displaying this information), press Ctrl+U (this is also available when you right-click on a web page).
If done correctly you are now seeing the code which controls how the website displays.
We will now be recording two types of data: the number of times the whole phrase is used, and the number of times each part of the phrase (each word) is used. Google is aware of stop words and connectives (and, to, the, etc.), so these should be ignored…
First we will look at the title tag
Press Ctrl+F and type <title; Chrome will highlight the title tag. Simply count the whole phrase and part phrases and record the data.
For the H tags, search for <h. For the body, simply search for the phrase or part phrase and manually check the body area of the source for instances.
Normally you will only see a few instances on a page; if you find hundreds of instances of your search phrase, it usually means the page has been overcooked.
Now repeat with the remaining reference sites.
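If you would rather script the counting than use Ctrl+F by hand, the sketch below shows one way it could be done. It assumes Python with the third-party requests and beautifulsoup4 packages installed, and the URLs and phrase are placeholders, not real reference sites.

```python
# Minimal sketch: count whole-phrase and per-word usage in the title tag,
# H tags and body copy, mirroring the manual Ctrl+F method described above.
# Assumes `pip install requests beautifulsoup4`; URLs and phrase are placeholders.
import requests
from bs4 import BeautifulSoup

STOP_WORDS = {"and", "to", "the", "of", "a", "in", "for"}  # connectives to ignore

def count_usage(url, phrase):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.get_text(" ").lower() if soup.title else ""
    h_tags = " ".join(h.get_text(" ") for h in soup.find_all(
        ["h1", "h2", "h3", "h4", "h5", "h6"])).lower()
    body = soup.body.get_text(" ").lower() if soup.body else ""

    phrase = phrase.lower()
    words = [w for w in phrase.split() if w not in STOP_WORDS]

    counts = {}
    for area, text in [("title", title), ("h tags", h_tags), ("body", body)]:
        counts[area] = {
            "whole phrase": text.count(phrase),  # same as Ctrl+F on the full phrase
            "part words": {w: text.count(w) for w in words},
        }
    return counts

if __name__ == "__main__":
    for competitor in ["https://www.example.com/", "https://www.example.org/"]:
        print(competitor, count_usage(competitor, "plumbers croydon"))
```

Record the numbers it prints for each reference site alongside the counts for your own page, ready for the analysis below.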
Analysis of data
What you are looking to do is sit roughly in amongst the big players. If your web page has far too many or too few instances of your key phrase or part phrases (compared to your top three), it could be an indication that Google won't see your page as optimised.
This method is really useful because we know the top three pages are ones Google trusts, and basing our page's technical make-up on their key phrase weighting is probably not a bad way to go.
After the work has been done and implemented, you will need to wait up to six weeks to see the results, and be prepared to tweak, as it's possible the top three sites rank where they do for other reasons.
Although it's not possible to guarantee results with any technique, the beauty of this one is that we have a Google-proof framework to check whether our page is over- or under-optimised compared to well-ranking pages. We need to be careful to apply any changes intelligently, as the page still needs to convert, but in certain situations it can help you discover why your page is not ranking well and the sort of tweaks you need to consider.
Using Bing and Yahoo to gain advantages on Google
Although Google is the world’s largest search engine, does it always make sense to use its top 10 as the basis of your technical competitor analysis?
What is top 10 analysis?
Simply put, top 10 analysis lets us see the factors that Google feels are important and then implement the same rules on our own pages in order to rank highly.
Or in plain English: we don't know why Google ranks any top-ten page the way it does, but we know it likes them, so if we can be more like them, Google will treat us in the same way.
This technique does work well and accommodates any Google algorithm change, but without care we might be analysing the wrong competitors.
What is the Google flaw?
In short, Google weights inbound links and the age of an active domain in a way that greatly distorts the index. There is some logic behind this, but it results in a top ten which, in the conventional sense, is not necessarily the most optimised.
In other words, if you have a very old active domain and a lot of inbound links, you have a distinct advantage and don't have to worry so much about the other aspects of SEO.
Why are Bing and Yahoo important in competitor analysis?
Bing and Yahoo weight on-page aspects far more heavily than Google, so you could say that the top ten on either search engine gives better on-page SEO clues than Google ever can.
In addition, as both Bing and Yahoo are in a state of perpetual catch-up, they tend to be far more transparent and loose about how they rank.
Performing a top ten cross search engine analysis
If you are using a top ten tool like SEO Profiler or IBP, simply feed the top ten from Bing or Yahoo into the top ten report and run it against your target page. This will allow you to compare your page against the strongest on-page competitors for your search phrase.
To do the same thing manually, reduce the analysis to the top three and limit your analysis factors to the title tag, URL/path, H tags and body copy.
The method is really simple: take your target search phrase and break it into its component parts.
Now compare the whole phrase and its parts against the source of each of the three competitors' HTML pages for the title tag, URL, H tags and body copy.
What you are looking to do is make sure you use the phrase (or part phrase) at least as many times as the lowest count among the competitors, and no more than the highest count on any of their pages.
As an extra tweak, make sure the whole phrase is used at least once.
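To make that rule concrete, here is a minimal sketch in Python; the competitor counts in it are hypothetical placeholders rather than figures taken from a real top three.

```python
# Minimal sketch of the 'between the lowest and highest competitor count' rule.
# The counts below are hypothetical placeholders, not measured data.

def in_range(your_count, competitor_counts, require_at_least_one=False):
    """True if your usage sits between the lowest and highest competitor usage."""
    low, high = min(competitor_counts), max(competitor_counts)
    if require_at_least_one:
        low = max(low, 1)  # extra tweak: use the phrase at least once
    return low <= your_count <= high

# Example: whole-phrase counts in the title tag for three competitors.
competitors = [1, 2, 2]
print(in_range(1, competitors, require_at_least_one=True))  # True: within range
print(in_range(4, competitors))                             # False: overcooked
```

Run the same check for each area (title tag, URL, H tags, body copy) and for each part phrase as well as the whole phrase.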
A waiting game
It’s now time to wait up to six weeks for Google to re-index, look at the results and tweak.
If you are not making progress, then the sad news is that you will need more inbound links.
On Google this is annoying, but it's still very much a fact of life.
Without decent key phrase research your search marketing is useless
Perhaps the hardest idea to get across to new clients is the importance of key phrase research in SEO. Although things always change, in general terms a search engine will try to match as closely as possible to the phrase you enter.
While many businesses believe they know the key phrases that work for them, without research these are often either wrong or at the very least misguided.
People search in keywords
While it’s true that people write and communicate in natural language, when people use search engines they tend to change the interaction into a series of keywords…
So as an example: if you lived in Croydon and needed a plumber, and you were speaking to a friend, you would ask 'Do you know of any good plumbers?'. When the same person asks a search engine, this changes to 'Plumbers Croydon'. Part of this comes down to time (most people like to type in the most direct way possible) and part comes from their historic use of technology.
And indeed, the majority of on-page SEO work done today revolves around identifying these killer key phrases and writing in exact terms around them.
Google AdWords: saint and sinner
Although it now has massive flaws, the Google AdWords Keyword Planner still represents the best source of search data available.
An AdWords account can be set up for free (just go to the Google AdWords website and sign in with your Google account) and requires little skill to use.
It is important to realise how the data is collected and how it is flawed before sinking your SEO budget into a range of content that will never give you a return.
It includes historic data
By Google's own admission, the reported search volumes include historic volume data, so while large volume numbers are good, there is no guarantee that a phrase which was popular then is still being searched for now.
Buying phrases
It's important to realise that not all key phrases translate into targeted traffic. For instance, if you are a web designer, there may be a lot of searches for 'web design courses' or 'web design software', but these searches are unlikely to bring visitors who want to take up your service.
By default it’s Global
By default the Keyword Planner brings back global search volumes. Most companies will want to see more local volumes, so make sure you restrict results as appropriate (i.e. if you are based in Oxfordshire and that's where 95% of your clients come from, restrict the results to only show search volumes from within Oxfordshire).
1, 2, 3 – Don't bother
Even with good monthly search volumes, it's important to work out how many visits that actually means for you. Typically the first organic result on Google will take 20% to 30% of the monthly search volume, and this goes right down to barely 1% for the bottom result on the first page. In other words, if your search phrase gets 100 searches a month, the first result on Google will get 20 to 30 of them and the tenth result will get just one visitor a month. Make sure there is enough search volume to justify the SEO.
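To put that arithmetic in one place, here is a tiny sketch; the click-through percentages are just the rough figures quoted above, not measured data, and the positions in between are guesses.

```python
# Rough traffic estimate per organic position, using the approximate
# click-through rates quoted above (20-30% for position 1, about 1% for position 10).
APPROX_CTR = {1: 0.25, 2: 0.15, 3: 0.10, 10: 0.01}  # illustrative figures only

def estimated_visits(monthly_search_volume, position):
    # Positions not listed fall back to a 5% guess.
    return round(monthly_search_volume * APPROX_CTR.get(position, 0.05))

print(estimated_visits(100, 1))   # roughly 25 visits a month in the top spot
print(estimated_visits(100, 10))  # roughly 1 visit a month at the bottom of page one
```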
They are not all customers
One of the biggest reasons the Google data is misleading is the type of searches that inflate the figures. While some will be potential customers searching for services, some will be generated by you checking how you are ranking, some by competing companies doing the same thing, and lastly a lot by SEO companies carrying out their own SEO research. For key phrases with small volumes, these 'fake' searches may account for 75% of the searches recorded.
Tom can’t peep any more
While it is true that in Google's early days it could record nearly 100% accurate data, recent rulings and people blocking Google from recording their search data mean less accurate search volume figures. In short, more searches are better, but you can't assume the actual search volume is anywhere near the reported figure.
The most searched phrase is not always the best search phrase
While search volumes are an indication of popular phrases, they are not the end of the story. A popular search phrase is likely to have a number of other people competing for that top spot; this is referred to as how competitive the phrase is. The more competitive a phrase is, the harder it is to rank for. For new websites, or for us mere mortals, choosing less popular phrases may bring in more customers in the long term. To find your best phrase, simply take the search volume and divide it by the competition value (given in the AdWords Keyword Planner); this will help you determine which search phrases have the most traffic and are the easiest to rank well for.
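A minimal sketch of that volume-divided-by-competition calculation is below; the phrases, volumes and competition values are made-up placeholders, not a real Keyword Planner export.

```python
# Score = monthly search volume / competition value (the 0-1 figure from the Keyword Planner).
# All figures below are made-up placeholders for illustration only.
phrases = [
    {"phrase": "plumbers croydon",          "volume": 720, "competition": 0.80},
    {"phrase": "emergency plumber croydon", "volume": 170, "competition": 0.40},
    {"phrase": "boiler repair croydon",     "volume": 260, "competition": 0.55},
]

for p in phrases:
    # Guard against a zero competition value to avoid dividing by zero.
    p["score"] = p["volume"] / max(p["competition"], 0.01)

# Highest score first: the most traffic for the least competition.
for p in sorted(phrases, key=lambda x: x["score"], reverse=True):
    print(f"{p['phrase']}: {p['score']:.0f}")
```

The phrase with the highest score is the one offering the best trade-off between traffic and how hard it will be to rank for.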
Common sense
Finally, use your common sense: if the data seems wrong, it probably is, so it's acceptable to split test phrases to prove your research. If you take the advice above, you should see an improvement in your SEO and SERP efforts.