SEO Guide to Better Web Site Traffic 2017
A website with no visitors (traffic) is like a company brochure which remains in its delivery box and never gets handed out. There is absolutely no point in spending time and money building, hosting and maintaining a website if no one ever looks at it.
So why have a website? Simply, this is 2017 and very much the digital age. People now refer to the Internet for information, research, buying comparisons and online purchases as the norm. It is no longer the domain of the ‘techy early adopters’.
Twenty-five years ago we were still using the printed Yellow Pages in the UK to find companies. As a business you would negotiate over which areas to put your advert in and how big the advert would be. Today I am not even sure there is a paper version of Yellow Pages; it has all been replaced by online marketing.
So if you need people to know about your business (which, unless you are an undercover spy, is probably every business!) you need a website, and that website must attract visitors.
How do you get people to find your site?
Well first and foremost, despite much of what is spread around by many ‘experts’, it takes TIME, much as it did 25 years ago. You do not build a business over a weekend and have a monthly income in six digits by Monday morning just because you have a website! That is complete fantasy. So think in years, not days, weeks or even months, to build a decent traffic stream that is going to convert into business.
As Zig Ziglar once said, “There is no elevator to success, you have to take the stairs”.
Let’s go back in time.
In the 1990s the Internet slowly came to life and took over from ‘bulletin boards’. The first websites were quite basic and search was provided by AltaVista and Yahoo. In those early days you could get listed in these search engines simply by promoting one or two words that you wanted to be found for, and this would get you onto page 1. As more websites came online, it became possible to ‘game’ AltaVista and Yahoo with ‘keyword stuffing’ and various other dubious techniques, so there was a continual battle to ‘out-spam’ the spam sites to stay in the top 10 rankings.
Moving on to the early 2000s, a new kid on the search engine block appeared called Google. They were different. Rather than relying solely on ‘signals’ from the website itself to work out where it should be ranked, Google added in external signals: it looked at how many links a website had from other websites and added that into the keyword mix.
Google quickly became the Search Engine everyone used, and to this day still has a huge slice of the search market. A new industry sprang up around building massive amounts of links, selling links, link networks and many more techniques all designed to seduce Google into ranking a website higher.
Through that first decade of the 21st century a continual ‘cat and mouse’ game played out between Google and the SEO industry around keywords and links. For instance, Google would improve its algorithm to look at the placement of keywords on a web page and penalise hidden keywords, or look for a keyword used too many times in the page title.
Then in 2011 Google started its journey of outlawing techniques that promoted websites above others by ‘spammy’ methods. It wanted its search results to provide the best possible experience for its users.
They developed better ways to look at site structure, words used across the site, external links, the text used in those external links, site load speed, content freshness and many other factors, all of which affect the rankings of a website.
The downside is that this meant a huge increase in the amount of work needed to rank a website well, but more importantly it also introduced a large amount of ‘unknown’ and ‘randomness’ into the ranking of a site. It is quite common for a business which relies heavily on its website for sales to suddenly find it has dropped from page 1 in Google search to some remote page which no one looks at. Result? Sales drop like a stone.
Many businesses have learned the painful way that it is ESSENTIAL not to put all your lead generation into a single source such as Google. Doing so is tantamount to business suicide.
Top tips to improve search engine traffic to your web site in 2017.
- Firstly and most importantly, remember that Google is only one source of traffic. It is an important one but do not become ‘Google Blinded’ and fail to address the other traffic sources. In the days before the Internet, no business would rely on one single source of lead generation. A business may have an advertising campaign running, mailshots going out, cold calling, exhibitions and other ways of making connections to new potential clients. Those same rules still apply.
- Understand your market on the Internet. The site must focus on Keywords that people are using to find your type of products or services. If people are NOT searching for particular words and phrases then do not use them. They may seem the correct ones to you but it is not you who is trying to find your website.
This is a common failing for many websites. Put yourself into the shoes of your customers and think about what they would type into Google. You need to be listed for those terms. Ask your customers. Brainstorm ideas with colleagues, friends and family. Google can only match what someone types into their search bar with what text you have on your website.
There are many tools to try to help identify keywords but one of the best is the Google search engine itself. Spend time trying lots of different words and see which ones return the best results for your market. You can also use tools like SEMRush to view what keywords your competitors are being found for. This often produces some surprising results.
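Google can only rank you for words that actually appear on your pages, so it is worth checking which terms a page really emphasises before worrying about rankings. Below is a minimal Python sketch, using only the standard library, that pulls the most frequent words out of a page's HTML. The sample page, the stop-word list and the length cut-off are all illustrative, not a polished keyword tool.

```python
from collections import Counter
from html.parser import HTMLParser
import re

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> blocks."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def top_terms(html, n=10,
              stop_words=frozenset({"the", "and", "for", "with", "your", "are", "that"})):
    """Return the n most frequent words on the page, ignoring short/stop words."""
    parser = TextExtractor()
    parser.feed(html)
    words = re.findall(r"[a-z]+", " ".join(parser.chunks).lower())
    counts = Counter(w for w in words if w not in stop_words and len(w) > 2)
    return counts.most_common(n)

# Illustrative page only -- in practice, feed in your saved page source.
page = ("<html><head><style>p{}</style></head><body>"
        "<h1>Vitamin supplements</h1>"
        "<p>Quality vitamin and mineral supplements for health.</p>"
        "</body></html>")
print(top_terms(page, 3))
```

If the terms this prints bear no relation to the phrases your customers type into Google, that is a sign the page copy needs rewriting around the keyword research described above.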
If you have an existing website, always start with a full audit. You need to have a good understanding of how the site is structured.
- Start by checking the setup of your domain. Search engines hate finding duplicate content on the same domain, so if your site can be accessed from both http://binaryone.eu and http://www.binaryone.eu the search engines will see two copies of all your content and penalise you. Pick one version and permanently redirect (301) the other to it.
- The page URLs should be words that reflect the keyword for that page. Do not use anything that you cannot read and understand. Examples –
- Bad URL example: http://www.example.com/product.aspx?ID=11526&IT=5f7d3d
- Good URL example: http://www.example.com/vitamin-health-supplement/
- Make sure every page on your site has a unique Title and Description tag around the keyword research you did and the page content.
- Every page should have at least one H1 header tag which reflects your Title, and preferably an H2 tag reinforcing your keyword.
- The content should be well written and grammatically correct – Google now checks this.
- Images should have alt text which reflects the keyword for your page.
- Site navigation menus should be logical and all navigation should be able to be followed by a search engine crawler.
- Make sure the footer of every page shows your company address and postcode.
- Check your site’s links with a tool such as Screaming Frog to find and fix any broken ones.
- Use an external site audit program, such as the one SEMRush offers, which will show up any further important issues you need to fix at this stage.
- Check (and create if your site does not have one) an XML sitemap (sitemap.xml).
- Check (and create if your site does not have one) a robots.txt file.
- If you do not already have one, create a Google Webmaster Tools account for your domain. Make sure to upload your sitemap.xml to this account and test the robots.txt file. Log in and look at this account daily to check on how Google sees your website.
- Repeat the above with a Bing account. (Yes, Bing is a search engine, and it does provide at least 20% of search traffic, so don’t forget it.)
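The sitemap.xml and robots.txt items in the checklist above are simple text files and easy to generate with a few lines of code. The Python sketch below builds a minimal sitemap and a permissive robots.txt that advertises its location; the example.com URLs are placeholders, and real sitemaps often also carry optional fields such as <lastmod> dates.

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a minimal sitemap.xml string listing the given page URLs."""
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>\n"
    )

def build_robots(sitemap_url):
    """Return a permissive robots.txt that advertises the sitemap location."""
    return f"User-agent: *\nDisallow:\n\nSitemap: {sitemap_url}\n"

# Placeholder URLs -- substitute the pages of your own site.
pages = [
    "http://www.example.com/",
    "http://www.example.com/vitamin-health-supplement/",
]
sitemap = build_sitemap(pages)
robots = build_robots("http://www.example.com/sitemap.xml")
print(robots)
```

Both files belong at the root of the domain, which is also where the search engine crawlers and the Webmaster Tools accounts mentioned above will look for them.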
Following these steps will give you a reasonable understanding of what your website looks like to a search engine crawler and you can then start to implement changes where they are required.
It is essential to have a well-structured website, built around the keywords people are actually using and properly registered and indexed with Google and Bing, to have any chance of improving the site’s rankings and traffic.
Your next task needs to be a full audit of your external links.
As I mentioned earlier in this article, Google started by building its ranking signals around external links. This still applies today. Fact – the best-structured site in the world with great content will never get any ranking or traffic from Google without good links in place.
It used to be possible to pay to have links built quickly and in bulk, and these would be all that was needed to increase a website’s ranking. Then Google introduced a new way of looking at links: it now penalises rankings if sites have paid links or what it calls ‘dubious link sources’. This had a major impact on sites across the globe and continues to be one of the biggest challenges in SEO and in building a website’s ranking.
To check your link profile –
- Set up your Google Webmaster Tools account as mentioned above. This will provide you with an important list of the external links which Google has found pointing to you.
- Use programs such as Ahrefs and Majestic to build a further list of links pointing to your website.
- These links then need to be analysed. There are a number of different programs for doing this, such as URL Profiler. From the output of this analysis you then need to perform the following checks.
- The first thing to look at is your anchor text profile. This is the text that you click on to activate the link. If a high proportion of your links use keyword text then this will get you penalised in Google. There are various ratios bandied about for keyword to brand to generic text. Very roughly, you want well over half of your anchor text using your domain URL and brand name, and less than 20% using your keyword.
- Check that your links are coming from different domains. If you have, for example, 100 links but they all come from 5 domains then you may need to look at this further.
- Check each domain to see that it is at a minimum a domain you do not mind being linked from.
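The anchor text ratios described above are easy to check with a short script once you have exported your link lists. The Python sketch below buckets each anchor as brand, keyword or generic and reports the share of each. The anchor list, brand terms and keyword terms here are purely illustrative; substitute your own exports from Webmaster Tools, Ahrefs or Majestic, and treat the thresholds as the rough guideline they are.

```python
from collections import Counter

def classify_anchor(text, brand_terms, keyword_terms):
    """Bucket one anchor text as 'brand', 'keyword' or 'generic'."""
    t = text.lower()
    if any(b in t for b in brand_terms):
        return "brand"
    if any(k in t for k in keyword_terms):
        return "keyword"
    return "generic"

def anchor_profile(anchors, brand_terms, keyword_terms):
    """Return each bucket's share as a fraction of all anchors."""
    buckets = Counter(
        classify_anchor(a, brand_terms, keyword_terms) for a in anchors
    )
    total = sum(buckets.values())
    return {bucket: count / total for bucket, count in buckets.items()}

# Illustrative data only -- real anchors come from your link exports.
anchors = [
    "binaryone.eu", "Binary One", "binaryone.eu", "Binary One Ltd",
    "click here", "this article",
    "vitamin health supplement",
]
profile = anchor_profile(
    anchors,
    brand_terms=["binaryone", "binary one"],
    keyword_terms=["vitamin", "supplement"],
)
print(profile)
```

In this sample the brand share is above half and the keyword share below 20%, which is within the rough ratios mentioned above; a profile dominated by the keyword bucket is the warning sign to act on.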
These are the basics to look at and correct. You should try to get bad domains removed by contacting the sites and, if that fails, you can submit a disavow file to Google asking it to ignore those domains. Anchor text ratios and domain spread can be addressed by building new links using anchor text that is missing from your profile, and by building links from fresh new domains.
Did I say this process would be quick and easy? Expect to spend considerable time and resource carrying all of these tasks out. Performed well, they will in time produce an improvement in the rankings for that website.
These are the basics for any site to have a chance of being ranked for its chosen keywords in a search engine. It is important to address all of these areas continually as the site evolves and it is especially important to keep monitoring your link profile to ensure you are not attracting a bad link profile. Remember, links are not only built by you!