Five tips for great logo design
In the world of design, a lot of time is spent dreaming up the perfect logo, most commonly to represent a company or product. And a logo is important, as it can help form a recognisable bond with a company or product. After all, how many brands can you recognise by the logo alone? If you are trying to dream up the perfect logo design, here are some helpful considerations.
Have your logo tell a story
A logo should mean something, something relevant to your brand. Sometimes the story behind a logo is obvious, sometimes less so. Sometimes a clever detail goes unnoticed by the masses at first, then goes viral once it is spotted (think of the hidden bear in the Toblerone logo: excellent for marketing when it's spotted and shared!). For a logo with depth, don't just put shapes and colours together; construct something that makes sense.
Use negative space well
Sometimes a logo with a plain background framing it can look good. But be careful of having too much white space; sometimes it does no favours. Judge the right amount of blank or coloured space (if any) to leave, or use it wisely by having it add something to the main logo area (think of the hidden arrow in the FedEx design).
Sometimes simple is best
Yes, we all love a clever logo: one with some kind of trick in it, such as the Amazon logo having the 'A-Z' of products it sells represented in its design. But sometimes keeping the logo simple is the best option, and no tricks are needed. After all, you want to make sure everyone 'gets it'. How simple is the 'f' that represents facebook? Very, and yet it is easily recognisable as being part of the branding. You have to be careful not to confuse anyone with what your logo is saying.
Have a unique logo
One thing that you don't want is a logo that is too similar to an existing one, or one that someone else is likely to come up with themselves; if your design is too generic, that is likely to happen. So although we have just said to keep it simple, don't go too simple! If you can find a way to make your logo unique, it will be all the better for it, standing out well and making it easier for people to identify it with you. A good way to achieve this is with the styling of any imagery used, or with a font. In regards to the font, think about how well the famous Coca-Cola one works.
Have fun!
This may seem a generic point, but if you are too uptight about the designing of your logo you can sometimes restrict yourself with your output. Think outside the box and get creative; you're more likely to come up with a winner this way!
These tips will hopefully act as a helpful guide when it comes to the important process of designing a logo. And if you are struggling to come up with the goods, there are plenty of very talented designers who will be more than happy to take the project on themselves!
Adobe Business Catalyst: eggs in one basket for Witney firms
Since its inception as a company, Websites by Mark has been attracted to the seemingly professional look of the product called Adobe Business Catalyst.
What has stopped us in the past is pretty much the same reason we don't ever recommend site builders like Wix and Moonfruit. Simply put, when you use a platform like Adobe Business Catalyst, you are using a proprietary system, and although you may pay for the service, you have to ask yourself 'what happens if?'
And to be fair, when we first became interested in the platform, our research turned up a local competitor (at the time we were based near Witney in Oxfordshire) offering the platform to its customers, and we were initially impressed by how professional the platform looked. For us, one of the main blocks was the upfront cost to businesses, which we saw as a barrier too far for most of our client base.
On March 24th, 2018, many customers woke up to the news that Adobe is killing the Business Catalyst platform. Worse still, as the platform uses proprietary technology, it's not as simple as putting the site onto third-party hosting.
In short, when they finally do pull the plug, if you have not had a site rebuilt and hosted elsewhere, you will no longer have an online presence.
Although this is a real concern for anyone affected, the good news is there is time to do something about it.
Although not suitable for every sort of site, WordPress makes a fantastic alternative.
Where WordPress is different is that the source code is open source, which means you never have the danger of anyone pulling the plug on the platform. You are also not tied to a single hosting provider. In other words, in a world where there is chaos, WordPress is about as future-proof as it gets.
When you migrate from Business Catalyst, you are likely to find that at least some of your pages have a different URL. This will cause an initial drop in rankings, but the good news is that, with the right preparation, this is not a long-term issue.
Our offer to companies in Banbury, Carterton, Oxford, Oxfordshire and Witney
If you think you might have an Adobe Business Catalyst website, we will be happy to take you through your long-term options.
Not only will you get a website which performs just as well as your defunct Adobe Business Catalyst website, but it's fighting fit for the future, and your hosting is likely to be cheaper as well.
If you are interested in finding out more, please visit our contact page for our contact details.
Designing a website which ranks well in Google
Since its earliest incarnation, Google has continually evolved, both to give more relevant results to its users and to combat web spam created by individuals to game the system.
While a lot of techniques still work just as well today, Google has added some interesting changes to the mix, which make designing a great-looking site which ranks well a real challenge.
What hasn’t changed
Google's mission of providing links to relevant websites has not really changed. It relies on a combination of inbound links and well-written content, and still puts a lot of weight on inbound links. So even with the world's best-written content, someone with a better backlink profile will have a head start.
Google also likes to match search terms with content. So, if your webpage has phrases which people search for, it is likely to perform better.
The normal SEO work cycle
With SEO, the starting point is always looking for terms which people search for and which have an amount of traffic associated with them. There are several keyword tools available, but as time has gone on, these have tended to become less reliable in the accuracy of the data they provide.
So, once you have a potential list of phrases, a good SEO will first look at them on a human level and then look at competitors to see if they have come to the same conclusions. If you are targeting completely different phrases, there is a chance you might be right, but a bigger chance you are not.
When building your web pages, you would then craft copy around the phrases and, as a final touch, maybe check that you're not massively going over the top compared to your competitors with the number of times a phrase is used.
You would then wait for Google to re-index before evaluating and tweaking.
Website design in a Google 2.0 world
In more recent times, people who have followed a 'script' to apply SEO (i.e. titles must include the search phrase, H tags must include the search phrase, etc.) have found their results becoming poorer over time. Simply put, this formula no longer works.
The first major change, which has already happened, is that Google is moving away from exact matches. And clues are available when you search. As an example, if you search for web design, not only do instances of web design appear in the results, but also variations like website design, web designer and all manner of other synonyms. The other major change is the stripping of stop words and connecting words. So, in the above example, if someone were to search for web design London, Google would give equal match weight to variations such as web designer in London or even website design near London.
As designers, this gives us more scope to include 'proper' English and still have pages which rank well.
But this is not the end of the story
While much focus has been given to semantic SEO and OG tags, there is still little evidence that Google lifts pages with these tags included. In short, Google still only trusts what it can see.
But this has one more massive consideration, and that's how Google evaluates a web page for the SERPs.
Now, while it's true Google does evaluate the whole page, it does not evaluate the whole page equally.
In short, Google is only really interested in the content above the fold. In other words, if you have a large header and a large slider, and your actual content only starts below the fold, then although Google will still evaluate the whole page, it won't rank your page as effectively as if the content started immediately below a small header.
Worse still, Google is now much more interested in a mobile-first world, which means we have even less space to get our SEO copy onto the screen.
Designing mobile first
One way around this is to construct your site as a mobile-first experience, optimise it for that experience, and then use media breakpoints to improve the site design for larger formats.
While not recommended in every case, in the above example where the slider would take up the whole mobile screen, one solution would be to stop the slider showing for anyone using a small-format mobile phone, as sketched below.
This not only has the advantage of helping Google rank your site appropriately, but also gives your users a better experience in the process.
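As a minimal sketch (assuming the slider sits in an element with the class slider; the 768px breakpoint is purely illustrative), the CSS could look something like this:

```css
/* Mobile-first: hide the slider by default, so the SEO copy
   sits near the top of the page on small screens. */
.slider {
  display: none;
}

/* From tablet widths upwards, reveal the slider for the
   richer large-format experience. */
@media (min-width: 768px) {
  .slider {
    display: block;
  }
}
```

Because the default rules describe the mobile experience, each breakpoint only needs to add what changes for bigger screens.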
In conclusion
While it's possible to ignore the things Google sees as important and still rank well, this approach relies on an amazing backlink profile. But by considering how Google sees the world, we can design and build a site which naturally performs well on all search engines, including Google.
Yoast SEO – is it all bad?
For most people, the first time they come into contact with Yoast SEO is when they look for an SEO plugin to help the performance of their WordPress website.
So why do some people have success with the product and others struggle? In short, to get the best out of Yoast, you already need to know SEO.
But if you 'know' SEO, is there any real benefit in the product?
Why Yoast as a stand-alone solution is flawed
To understand the underlying flaw with Yoast, you need to understand what it is doing. In short, Yoast looks at the copy you put into the editor and makes assumptions about how the title is used.
On the face of it, it makes sense, but if you have a theme which contains widgets, or any other content on the screen which you don’t enter via the editor, then Yoast will only ever be working with a partial picture of the generated source.
Now for some websites (with little competition), this might be fine, but if you are in a competitive field, you may find Yoast by itself can’t cut the mustard.
So does this mean Yoast SEO is a waste of code?
While for pure SEO it has holes by itself, as part of your SEO armoury it does have some useful features.
In short, the edit snippet section does make life easy, as it directly relates to the title tag and Open Graph tags.
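For illustration (the titles, descriptions and URLs below are made up), the sort of head mark-up this relates to looks like this:

```html
<head>
  <!-- The title tag and meta description, as set in the snippet editor -->
  <title>Web Design in Witney | Websites by Mark</title>
  <meta name="description" content="WordPress web design for local businesses.">

  <!-- Open Graph tags, used when the page is shared on social media -->
  <meta property="og:title" content="Web Design in Witney">
  <meta property="og:description" content="WordPress web design for local businesses.">
  <meta property="og:image" content="https://www.example.com/images/share-card.jpg">
</head>
```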
But to get the best out of Yoast, ironically you need additional help.
Content may be king but only if it’s spelt right
With Yoast's green-light system, it can be easy to get caught up and stop thinking about the whole page. Using something like Grammarly is a real godsend. For those not familiar with Grammarly, it is a powerful (and free for most uses) online service which checks spelling and grammar as you type.
With Grammarly on board, you can actually make sure your copy is well written.
That's great, but what about professional-grade SEO tools?
As mentioned above, only looking at your in-editor content is flawed; you need to look at the generated source code as a whole.
Products like SEO Profiler, although not free, fill this gap; with a vast arsenal of tools, they simply take up the slack where Yoast falls short.
Many come with site audits (which pick up issues Yoast is not capable of detecting), key-phrase research tools, ranking checkers and social media monitoring functions, in addition to tools to help build inbound links.
All these areas are vital if you wish to climb Google for serious search terms.
In conclusion
If you are not an SEO professional and have a website with a lot of competition, Yoast SEO is unlikely to get you to the top of your chosen search engine, but by using additional tools and putting the effort in, you too can rule the rankings.
Bringing the 80/20 rule to SEO
When something has been around for the best part of a century and it won’t go away, you know there must be something in it.
The Pareto Principle (also known as the 80-20 rule) is the simple idea that 20% of what we do has 80% of the outcome.
Or in other words, of everything we do in business, 20% will produce 80% of the profit.
So would it not be better to do something different with the other 80%, which is largely wasted time?
The world’s largest marketing budget
No matter the company and its financial position, it will only ever command a finite amount of resources. In short, there will always be more to do than you have time or money for.
So whether you are a large or small company, you face the same limitations. The only real difference is that with a larger company and budget, you have more room to do more of it (both the 80 and the 20).
Making a start
Before we can talk about harnessing 80/20 we need to understand what we are doing at the moment. How close are we to the 80/20 rule? Perhaps we are ahead of the curve or behind, but without understanding where we are, we can’t possibly make a sensible change.
For a traditional company, we might look at our marketing budget, and see how each type of advert performs. How can we do this? The simplest solution is to tie in an offer code.
In other words, each time someone claims an offer, we know that form of advertising has been successful.
By comparing the results with different types of advertising, we can craft, test and change our outcomes.
The online world
In the online world the platform may be different, but we can use the same methodology; instead of an offer code, we can tie different forms of advertising into different landing pages on our website.
In short, if we run a promotion on Facebook, for instance, we will get click-through statistics from Facebook and from looking at Analytics we can see how many people have made the journey from the ‘landing page’ to conversion.
For this to be successful it’s worth setting up a new page for each offer. Without this, it becomes harder to track which method has actually generated our sales.
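For example (the URLs below are purely illustrative), each channel points at its own page, so Analytics can separate them:

```text
https://www.example.com/offers/facebook-spring/      <- Facebook promotion
https://www.example.com/offers/local-paper-spring/   <- print advert
https://www.example.com/offers/newsletter-spring/    <- email newsletter
```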
SEO and 80/20
With SEO the execution changes, but not the methodology. In short, we want to concentrate our SEO on the 20% of pages generating 80% of our sales…
Or in plain language, we might want landing pages for every location or product we cover, but because we only have a finite budget and time, it makes sense to prioritise the pages that create the largest wealth.
Wealth is the important factor here. If our grab was purely for customer numbers, we might end up having to serve more customers for less turnover. This means not only are we making less money, but the cost of supporting that larger customer base is higher.
You just need to look at the recent chaos at Ryanair to see what happens when you don’t make enough money per customer and something goes wrong.
Picking your fights
With SEO, it's about finding the key phrases that have enough traffic to justify the work and also don't have a massive amount of competition.
Or in other words, if you are a sole trader with no real budget to build a substantial number of links, pay for SEO professionals and write lots of creative content, then trying to promote a page which is already being marketed by companies like Amazon and eBay means you're never going to win.
Local SEO tends to work well for companies who can identify a few phrases and locations which describe what they do or the services they provide.
An example
John is a plumber who lives in a small village near a larger town just outside a large city.
John’s website has only been going for a few months and has no real track record with Google.
Although the small village won’t really provide any work, he will find it easy to rank on the first page.
If John tries to target the town, there is more competition but also more searches; maybe he gets on the second page.
If John tries to target the city, it's worse still, and he will be lucky to hit the top 50.
John knows that Google does not like multiple pages with the same content, and he has heard that the fewer phrases he optimises for, the more success he will have.
So for John at the moment, his ambitions should stay at the village level. As he becomes more established and can afford to spend more on marketing he can then think of marketing at the town and city level.
In this example, the village won't bring in enough work, so some of his marketing budget needs to go on other forms of marketing.
Tom, who owns a larger company which has been going for years and is turning a good profit, will look to target the locations where he can get the most work. (The methodology will be the same, but he will have more factors to consider.)
In summary
The 80/20 rule can be a useful tool to allow us to think in terms of what we can do to get the biggest effect with the smallest effort.
It won’t fix everything but will allow us breathing space to understand the journey we are all on.
Quickly ranking on search engines like Google
So you just created the best website on the planet, not only featuring a cutting-edge visual design, but also penned by the modern equivalent of Shakespeare.
You followed all the rules, and it is interesting and relevant. Two months on, you still have no traffic, and worse still, Google won't index it.
How search engines work
With terms like world wide web, spiders, crawlers and bots, it's easy to think the internet was created by oversized geeks whose fixation with science fiction and fantasy trumps the need for something practical.
But the truth is really the opposite: these jargon names merely mirror the function of the things they describe.
Although we could use more jargon, World Wide Web (or the web) is a term that expresses the idea of interlinked items (or pages), and a spider or crawler follows these links to find its prey (the website pages).
So, in short, a spider or crawler is a piece of code which follows links on websites to find other pages.
Google then takes all this information, organises and sorts it, and then displays it depending on your personal search requirements.
That’s great but I’m still not ranked
So for a new website the challenge is to lay the threads to allow the spiders and crawlers to find us.
So the first way is to use Google's webmaster service (Search Console) to 'tell' Google any time your website changes.
And to be honest, regardless of whether you're ranking or not, the ability to tell Google any time something has changed is a great idea. No matter what the spiders and crawlers are doing, you are on their list of sites to crawl.
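One common way of doing this is to submit an XML sitemap, which gives Google a fresh list of your pages and when they last changed (the URLs and dates below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; lastmod tells Google when it changed -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-03-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2018-02-14</lastmod>
  </url>
</urlset>
```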
The second way, which is in some ways more important, is to make sure you have inbound links coming to your website.
The more sites which link to you, the more times a spider will come and visit.
Getting banned (when backlinks go wrong)
Before you spend thousands on buying or building links, there is a caveat. In short, link building is against Google's T&Cs. Google is constantly looking for ways to detect unnatural ranking techniques, and as inbound links are a major ranking factor, this will always be an area of abuse.
So the rule of thumb is that if all the inbound links have the same anchor, or you suddenly get more links than you normally would, you could come under Google's spotlight. In addition, Google considers the reputation of your website, so if you have always played by the rules, dodgy links will not be treated the same as if your domain has only just been registered.
Fantastic, but I’m still not ranking
In short, Google does not have to take account of inbound links; it chooses to do so, and many of the places where you build links, Google will just ignore out of hand.
There are a few places where links will help with getting indexed, and in short it's social media.
All the major social networking sites will help with link building. The only caveat is that for a link to be seen, it needs to be in a 'public place'.
So for Facebook, this means that if you're posting to your timeline and your privacy is set so only your friends can see your posts, Google won't see you either.
Facebook business pages do not have the same issue, but Google needs to be aware of your page, so if you are thinking of this route, some Facebook marketing could be the way to go.
Platforms like Twitter and YouTube are also public-facing, so Google will pick up any links posted.
Remember, these links are not going to help raise your rank, but they are a method of helping Google index your site.
How many links?
In short, you need two inbound links that Google trusts for your website to be indexed.
Although more links mean quicker indexing, as we have already discussed, if you overcook it too quickly it can go massively wrong.
How quickly should I see myself indexed on Google?
For most sites, expect a page to be indexed within a month. For sites with a good reputation and lots of backlinks, the indexing process can be less than a day.
I have personally seen pages indexed in ten minutes before.
Still nothing is working
For people in this situation it can be incredibly frustrating, with the feeling that everyone is doing better than you.
In truth, there may be reasons why things are not working and if you have worked through the above with little success, it’s time to call in the professionals.
It does not need to be expensive and if you’re running a business, it can save you a fortune in lost sales.
HTML5 and the considerations for SEO
With the advent of HTML5, we have all been promised that semantic mark-up is the way forward, and that those who ignore this idea will be rewarded with lower rankings.
SEO in an HTML4 world
With HTML4, life was simpler: every page should have a single h1 tag, and the higher your SEO text sat in the source, the better.
In fact, HTML4 spawned a trend of bending the order of HTML mark-up to help improve rankings.
This led to two big problems…
The first is that you are re-ordering source code simply to serve the perceived needs of Google. This leads to the second issue, which affects the usability of the site for some users: specifically, blind users who use screen readers, or users who don't have CSS turned on, would see a completely different running order.
HTML5 the game-changer (well, sometime, maybe, in the future)
With HTML5 we have new elements and tags which inject meaning to our source code.
Unfortunately with this new format, there can be a lot of confusion with not only the standard, but also how it relates to SEO.
So as an example, one of the new tags is <header>. Most people assume (wrongly) that this relates to the area at the top of the screen (sometimes called a masthead), which will often be associated with the logo and strapline.
In fact, <header> has a much more generic meaning: it simply marks a heading area for any part of our content.
So yes, it could be the masthead, but equally each 'section' could have its own header, as in the sketch below.
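As a minimal sketch of that idea (all content is placeholder), the same tag can legitimately appear at several levels:

```html
<body>
  <!-- The masthead: one legitimate use of <header> -->
  <header>
    <img src="logo.png" alt="Company logo">
  </header>

  <main>
    <article>
      <!-- The article's own <header>, carrying the real SEO heading -->
      <header>
        <h1>Our web design services</h1>
      </header>
      <section>
        <!-- Each section can have its own <header> too -->
        <header>
          <h2>SEO-friendly builds</h2>
        </header>
        <p>Placeholder copy…</p>
      </section>
    </article>
  </main>
</body>
```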
Google’s view of HTML5
Google's view of indexing webpages has not really changed massively since it started.
It first considers how many quality sources link to your webpages and then looks at each page to see how relevant it is for each search term.
And to be brutal, it uses the methodology that the higher up in the source code something is, the more important it is.
So if you have a <header> or heading tag in the masthead, Google will see this as the most important part of the page.
Now for a homepage, where we are promoting 'brand', this is OK, but for internal and landing pages, should this be the case?
When the top of your page is not the most important part
For most websites there will be a masthead, followed by navigation, then content followed by a footer.
Even if we add a header and h1 into the content area, if there is a header or h tag in the masthead, then Google will consider the masthead header to be the most important area of the page.
Worse still, because there is a gap in mark-up between this and our content, Google may depress the importance of the actual content area.
Most websites want traffic from Google, and often the SEO copy will be in the main body of the page, not the masthead.
If the page is coded as HTML5 and uses <header> and <h> tags in the masthead, we may end up confusing Google as to where our most important content is.
In addition, because we now have tags like <main>, <article> and <section>, how we use these will impact how Google understands our webpage.
How to help Google (and yourself)
The simplest way to make sure your mark-up works well for Google is simply to disable the CSS and JS and check how your page 'looks'.
If it still makes sense, you are on the right track.
Next, look at the generated source and check the document outline: does it work naturally, or does it look 'awkward'?
Although Google does do some visual checks these days to prevent 'hacks', treat Google as a blind web user and you won't go far wrong.
Final thoughts
Although it's tempting to construct source code and use CSS to bend the SERPs, to future-proof your sites you are now better off making sure your sites still work even with the CSS turned off.
After all this is how Google really sees your page.
Using on-page SEO to improve your SERPs
It is known that Google has over 200 ranking factors when deciding where your webpage appears in its search, but most of these have been designed to stop abuse by black-hat SEO companies.
Generally, SEO is split into on-page and off-page. Simply put, on-page is the content on your webpage, and it is perhaps the only part you have complete control over.
The good news is that search engines place a lot of weight on this content, so anything we can do to help with on-page is all to the good.
With ranking factors being tweaked on a daily basis, it can seem impossible to make logical and well-thought-out changes, but this is far from the truth. Although there are almost infinitely many interrelated variables, there are some well-documented changes that have a consistent effect.
And because no one actually knows Google's algorithms, the only source of data we have is the results of any given search.
The method outlined below is known to work and requires no specialist tools or knowledge. That’s right this is about as close to a magic bullet as there is.
The technique revolves around looking at how the top sites for your search phrase use that phrase in the make-up of their pages. By looking at multiple high-ranking sites, we get a 'feel' for the data, and it removes the spikes from any one site.
Give me anonymous data
Google by default tries to be helpful by weighting sites you have already seen and by taking account of other historic browsing data. In other words, when you do a search on Google, the results you see might not be the same as someone else conducting exactly the same search.
Google does provide an incognito mode https://support.google.com/chrome/answer/95464?hl=en-GB but it's been shown that Google can still track your footprint and online history even when this mode is being used.
One way to get around this is by using an anonymous web proxy, and although this does provide better results, as most tend to be based globally, you may still not get local non-personalised results.
One of the best services I have found is http://www.whatsmyserp.com/serpcheck.php which allows you to display the top ten for any search phrase, with the ability to select the country. Sometimes this service has issues, so it's worth using a couple of services to be sure of your clean top ten.
1, 3, 5 or 10 website analysis
The method we are using can be applied to as many competitors as you wish, but for sanity's sake we find the sweet spot is three competing sites.
The more sites you do the analysis on, the more reliable the data, but the longer it takes; so another acceptable route is to look at the top 1 and then the top 3 to look at the trend.
How to do the analysis
The first step is to identify the primary key phrase. This is a black art in its own right and outside the scope of this article.
Once you have the primary key phrase, you need a list of high-ranking pages which you will analyse against.
It’s then a simple case of looking at the source of each page…
The areas you are interested in (these give the biggest bang for your buck) are:
Title tag
H tags
Body copy
Everything that is on the webpage does have an effect, but these three areas should be your primary focus.
To see the source of a page in Google Chrome (you can do this in any browser, but each has its own way of displaying this information), press Ctrl+U (this is also available when you right-click on a web page).
If done correctly you are now seeing the code which controls how the website displays.
We will now be recording two types of data: the number of times the whole phrase is used, and the number of times each part of the phrase (each word) is used. Google is aware of stop words and connective words (and, to, the, etc.), so these should be ignored…
First we will look at the title tag
Press Ctrl+F and type in <title; Chrome will highlight the title tag. Simply count the phrase and part phrases and record the data.
For the H tags we want to search for <h, and for the body, simply search for the phrase or part phrase and manually check the body area of the source for instances.
Normally you will only see a few instances on a page; if you find you have hundreds of instances of your search phrase, it normally means you have overcooked it.
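If counting by hand gets tedious, a short snippet pasted into the browser console (F12 in Chrome) can do the tallying for the body copy. This is only a rough sketch: the phrase is illustrative, and like Ctrl+F it also counts the phrase inside longer words such as 'designer'.

```js
// Count a key phrase, and each of its words, in the visible body copy.
// Swap in your own phrase; leave out stop words, as discussed above.
const phrase = 'web design';
const text = document.body.innerText.toLowerCase();

const countOccurrences = (haystack, needle) =>
  haystack.split(needle.toLowerCase()).length - 1;

console.log(`"${phrase}": ${countOccurrences(text, phrase)} times`);
for (const word of phrase.split(' ')) {
  console.log(`"${word}": ${countOccurrences(text, word)} times`);
}
```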
Now repeat with the other reference sites.
Analysis of data
What you are looking to do is be roughly in amongst the big players; if your web page has far too many or too few instances of your key phrase or part phrases (compared to your top three), it could be an indication that Google won't see your page as optimised.
This method is really useful as we know that our top three pages are websites that Google trusts and basing our page technically on their weights of key phrases is probably not a bad way to go.
After the work has been done and implemented, you will need to wait up to six weeks to see the results, and be prepared to tweak, as it's possible that the top three sites have other reasons why they are where they are.
Although it's not possible to guarantee results with any technique, the beauty of this one is that we have a Google-proof framework to check whether our page is over- or under-optimised compared to well-ranking pages. We need to be careful to apply any changes intelligently, as the page still needs to convert, but in certain situations it can be useful in helping discover why your page is not ranking well and the sort of tweaks you need to consider.
Using Bing and Yahoo to gain advantages on Google
Although Google is the world’s largest search engine, does it always make sense to use its top 10 as the basis of your technical competitor analysis?
What is top 10 analysis?
Simply put, top 10 analysis allows us to see the factors that Google feels are important, and then to implement the same rules on our own pages for the purpose of ranking highly.
Or in plain English, we don’t know why Google ranks any top ten pages the way it does, but we know it likes them, so if we can be more like them, Google will treat us in the same way.
This technique does work well and accommodates any Google algorithm change, but without care, we might be analysing the wrong competitors.
What is the Google flaw?
In short, Google weights inbound links and the length of time a domain has been active in such a way that it greatly distorts the index. The reasons behind this do make some sense, but the result is a top ten which, in the conventional sense, could be considered not the most optimised.
In other words, if you have a very old active domain and a lot of inbound links, you have a distinct advantage and don't have to worry so much about the other aspects of SEO.
Why are Bing and Yahoo important in competitor analysis?
Bing and Yahoo weight on-page aspects far more than Google, so you could say that the top ten on either search engine may show better on-page SEO clues than Google ever can.
In addition, as both Bing and Yahoo are in a state of perpetual catch-up, they tend to be far more transparent and loose with how they rank.
Performing a top ten cross search engine analysis
If you are using a top ten tool like SEO Profiler or IBP, simply feed the top ten from Bing or Yahoo into the top ten report and run it against your target page. This will allow you to compare your page against the strongest on page competitors for your search phrase.
To perform the same thing manually the advice would be to reduce the analysis to the top 3 and to reduce your analysis factors to title tag, URL/path, h tags and body copy.
The method is really simple: take your target search phrase and break it into its component parts.
Now simply compare the whole phrase and its parts against the source of each of the three competitors' HTML pages, looking at the title tag, URL, h tags and body copy.
What you are looking to do is make sure you use the phrase (or part phrase) at least as many times as the lowest competitor usage, and no more than the maximum usage across the competitors' pages.
As an extra tweak, make sure your phrase is used at least once as a minimum.
A waiting game
It’s now time to wait up to six weeks for Google to re-index, look at the results and tweak.
If you are not making progress, then the sad news is that you will need more inbound links.
On Google it’s annoying but still very much a fact of life.