This month, 4mation Technologies has teamed up with SEO experts Online Marketing Gurus to offer our top tips for improving your website’s presence in search engine results. By following these three technical recommendations from 4mation and three content tips from OMG, you’re sure to get your website off to the right SEO start this year.
Your first port of call in this sea of SEO should be to make sure that search engines are actually “indexing” all of the pages of your website. Think of search engines like phone books and each of your pages as the people listed. If they’re not in the directory, they can’t be found. Simple.
Since Google accounts for 75% or more of the overall search market share, we’ll begin there. Start by creating a Google Webmaster Tools account if you don’t already have one. Webmaster Tools contains a bunch of very useful SEO features, including some that we’ll touch on later in the article. But for now, let’s concentrate on getting your pages listed. In Webmaster Tools, go to Crawl > Sitemaps. If a sitemap is shown that Google has found automatically, you’ll also be able to see the date Google last processed it and whether any issues were found that need to be addressed. However, if there isn’t a sitemap, that’s probably because your website CMS doesn’t automatically create one or you have this feature disabled. To find out more, try Googling the name of your website CMS (e.g. “WordPress”) together with “xml sitemap”. Many popular content management systems offer this functionality as standard, or have plugins you can easily install to add it.
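For reference, an XML sitemap is just a simple list of your page URLs. A minimal sketch (the example.com URLs and date below are placeholders, not real pages) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want indexed -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2015-01-15</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about-us/</loc>
  </url>
</urlset>
```

Once a file like this is live on your site (typically at yoursite.com/sitemap.xml), you can submit its address to Google under Crawl > Sitemaps in Webmaster Tools.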
If you need to build an XML sitemap, rest assured there are a number of free online tools available to do this for you. One such tool is xml-sitemaps.com, which will crawl your website pages and build an XML sitemap of up to 500 pages for free. However, we should warn that the nature of crawling a website (that is, starting from the home page and following all of the links on all of your pages to discover every page on your website) means that any landing pages you may have created that aren’t linked to from another page of your website won’t be included. This is why getting your CMS to generate the sitemap is always the preferable option: it knows all of the pages that exist on your website without needing to crawl them. In some instances, there are pages you may not want search engines to index – for example, a landing page for a special offer that you only want to promote via email or paid advertising. To avoid these pages being crawled by Google or by a sitemap generator, you should exclude them in your website’s robots.txt file. (If Google has already indexed pages you would rather remain unlisted, you can submit them for removal in Google Webmaster Tools > Google Index > Remove URLs.)
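As a sketch, a robots.txt file (placed at the root of your site) that keeps crawlers away from a hypothetical email-only landing page might look like this – the /email-special-offer/ path is invented for illustration:

```text
User-agent: *
Disallow: /email-special-offer/
```

Keep in mind that robots.txt politely asks crawlers not to visit a page; if a page absolutely must stay out of search results, you can also add a “noindex” robots meta tag to the page itself.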
Next up: the quality of your content. Once the search engines know where to find all of your pages, they’re going to make an (algorithmically determined) judgement call on how suitable each of those pages is to people searching for your subject matter. The easiest way for them to do that is to start by checking the key indicators of each page’s content, namely: the Title tag, the Description tag, the page’s heading tags, image tags and links off to other pages and websites. If any of these are missing, lacking information or duplicated – or, in the case of links, broken (that is, linking to a page that doesn’t exist) – this can severely impact the perceived quality of your website and its overall search engine rankings.
Rather than having to go through each of these pages manually, there’s a fantastic tool that can “crawl” all of your pages for you and spit out an Excel spreadsheet highlighting the areas you need to improve. Intriguingly named Screaming Frog SEO Spider, this small desktop application will scan up to 500 of your website’s pages for free – with paid licences available for larger sites.
Once you’ve downloaded an assessment of each of your pages, you can start to tidy things up. Screaming Frog includes helpful hints, including suggestions on the maximum character length for items like your Title, Description, H1 and H2 tags, as well as warnings for when you may have inadvertently (or intentionally) duplicated content (more on that later). Go through each of your pages and make sure that all of these key content areas contain a concise and accurate description of the content of your pages – but most importantly, make sure that none of them are left empty. Once you’ve done that, check the errors that have been highlighted – particularly 404 errors on pages that have been crawled because they’re referenced somewhere on your website. These are often caused by typos in URLs, or by URLs that have been moved, changed or removed since the page content was written. Everybody hates broken links. When you click a link, you expect to be taken to where you want to go. So it’s not surprising that search engines factor broken links into their assessment of a website’s quality. They’re in the business of serving their users relevant, quality content that provides a great user experience – how do you think your website will rank in their results if it’s riddled with broken links?
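To make those key content areas concrete, here’s a sketch of where each one lives in a page’s HTML – the plumbing business, wording and file name are invented for illustration, and the character counts are rough guides rather than hard rules:

```html
<head>
  <!-- Title tag: aim for roughly 55-65 characters -->
  <title>Emergency Plumber Sydney | Acme Plumbing</title>
  <!-- Description tag: a concise summary, around 155 characters -->
  <meta name="description" content="Acme Plumbing offers 24/7 emergency plumbing across Sydney. Call now for fast, friendly service.">
</head>
<body>
  <!-- One H1 per page, describing its main topic -->
  <h1>Emergency Plumbing Services in Sydney</h1>
  <h2>Blocked Drains</h2>
  <!-- Alt text lets search engines "read" your images -->
  <img src="van.jpg" alt="Acme Plumbing service van">
</body>
```

If Screaming Frog flags a page, it’s usually because one of these elements is missing, empty, too long or identical to another page’s.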
Google’s algorithm also favours indexing and showing distinct content, so duplicating content from pages within your own website or from external websites without properly identifying it as such can cause SEO problems. Thankfully, there’s a tag for that. Let’s say that you re-publish a blog post on your website that you originally wrote as a guest post for another blog. Or you have a product that fits into multiple categories and your website generates one page per category, duplicating the product description on each page. By adding a “canonical” tag to the secondary pages containing the duplicate content, you’re telling the search engines which page of your website should be given preference and credited as the original content source – and which pages knowingly contain duplicate content. In the past, some dubious websites used duplicate content as a way to “game the system” and improve their search rankings by having more content pages containing often-searched keywords. Over time, search engines like Google have adjusted and improved their algorithms to identify these duplicate content scammers and instead favour those websites that provide quality, distinct content. Make sure that your website is one of the latter.
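In practice, the canonical tag is a single line in the `<head>` of each page containing the duplicate content, pointing at your preferred version – the URLs below are invented for illustration:

```html
<!-- Placed in the <head> of the duplicate page,
     e.g. the product listed under a second category -->
<link rel="canonical" href="http://www.example.com/widgets/blue-widget/">
```

Search engines will then consolidate ranking signals onto the canonical URL rather than splitting them across the duplicates.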
As mentioned in one of our previous articles on online marketing trends, Google only recently (last November) announced changes that mean they are now displaying the words “mobile friendly” next to each of the search results shown to their mobile users – but of course, only for websites that are mobile friendly. If you don’t have a mobile website, or a responsive website that provides a tailored experience to mobile users, it’s unlikely that Google will tag your results as “mobile friendly” – and that could lead to fewer people clicking through to your site, no matter how well you might be ranking. Over time, the fewer clicks you get, the lower your ranking will become. It can be a vicious cycle. As with many of the recommendations above, creating a great user experience for your visitors – no matter what device they’re using – is generally the key to performing well on search engines and, not surprisingly, in online business.
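If your site is responsive, one small but essential detail is the viewport meta tag in each page’s `<head>` – without it, mobile browsers render the page at desktop width and zoom out, and Google is unlikely to consider the page mobile friendly:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```

This tag alone won’t make a fixed-width design mobile friendly, but no responsive design works properly without it.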
Even though ranking for a term no longer means stuffing the same keyword in over and over again, your content still has to relate to your target audience – and therefore to Google’s users. Before you start writing, make sure you undertake the research needed to guide which keywords you should target on each page. For example, if you’re a plumber who wants to rank for “plumber sydney”, make sure you use targeted terms like that in your home page content – and save a keyword like “blocked drains” for the page about that specific service, rather than overusing it on your home page. Rather than continually writing ‘Plumber Sydney’, make sure your sentences make sense and are correctly structured. A great tool to use here is Google’s Keyword Planner: it shows you the search volume each keyword generates, which helps you decide which terms are worth targeting and helps Google understand what each page is about. It will also list different ‘ad groups’, which help you identify groupings of related terms – and therefore which keywords belong together on the same page.
Remember, you don’t need to write these terms exactly the same way every time. Make sure the content flows nicely and is a piece of content that you would be proud for your customers to read.
Content doesn’t always have to be in the form of text. There are lots of other great ways to convey a message. In saying that, we always need to make sure that our content can be shared and indexed by search engines. Here are some other ways to generate great content on your site:
E-commerce category page content: One of the most overlooked SEO strategies is writing great content for your e-commerce site’s category pages. Products alone aren’t enough – Google loves seeing content on these pages. One approach we’ve used is to add this extra content inside a collapsible “Read more” DIV, so that it doesn’t get in the way of customers looking at the products.
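A minimal sketch of that “Read more” pattern is below – the class names, wording and inline script are ours for illustration, not a specific CMS feature. The key point is that the hidden text is still present in the HTML, so search engines can crawl it even though visitors don’t see it until they click:

```html
<div class="category-intro">
  <p>A short introduction to the category that is always visible.</p>
  <!-- Hidden with CSS, but still in the HTML for search engines to read -->
  <div id="more-content" style="display: none;">
    <p>The longer, keyword-rich category description lives here.</p>
  </div>
  <a href="#" onclick="document.getElementById('more-content').style.display = 'block';
                       this.style.display = 'none'; return false;">Read more</a>
</div>
```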
Lists: Creating great lists can break up text on your site so it’s easier to read and get straight to the point.
For example: “How to bake the best muffin on earth”, with each step of the recipe as its own list item.
Videos: Videos are great for a number of reasons. They are much more likely to be shared, they tell a story and, most importantly, they elicit trust. With the number of websites that exist, written content can sometimes become overwhelming. If customers don’t like what they see, they will move on to your competitor’s site in no time. You can create videos about your business, your customers or even testimonials.
Infographics: Infographics can be pretty cool if you do them right. They are the large-format images that contain a lot of data, presented in the form of graphics and charts. There are even websites where you can make them yourself, such as piktochart.com.
If your content can’t be crawled and indexed, it’s pretty much worthless in the eyes of search engines. You can create the best content in the world, but if search engines can’t get to it or understand it, that content just won’t rank and you’ll receive little traffic as a result. Some examples of content that search engines may struggle to index include:
Content inside images: Be careful with free website builders, in particular the ones that generate lots of images where text would have been sufficient. For example, if your website generates your page headings as images or includes your captions inside the images they relate to – those images can’t be read by search engines, so the content in them will simply be ignored. To Google, they’re just seen as a picture. Always use text where possible.
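For example (file names and wording invented), compare a heading baked into an image with real text plus a descriptive alt attribute on a genuine photo:

```html
<!-- Invisible to search engines: the heading text is pixels, not words -->
<img src="heading-banner.png">

<!-- Readable: a real text heading, and an alt attribute describing the photo -->
<h1>Our Winter Menu</h1>
<img src="winter-menu-photo.jpg" alt="A selection of dishes from our winter menu">
```

Alt text won’t rescue a page whose main copy is trapped in images, but it does let search engines understand what each genuine image shows.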
Content that is only accessible after logging in: Any content that is password protected cannot be indexed. Just like a visitor to your website, if Google needs to log in to read your content and it doesn’t have the password, then it can’t read those pages – and they won’t be included in the search index.
Content that can’t be reproduced or shared: Any content that is created dynamically, on the fly, and that is likely to change each time a page is loaded is unlikely to be indexed. These pages tend to be mostly code-driven or script-based – and to search engines, they look like code rather than content.
4mation Technologies can help your business with a range of online development solutions to ensure that your website is optimised for search engine visibility, user engagement and driving customer conversions. Our SEO content partner, Online Marketing Gurus, offers a range of services covering content marketing, technical SEO and online PR.
Get in touch today to find out how we can help improve your online presence and maximise your digital marketing results.