
Lesson Six: Website Health Check

This sixth and final lesson of Module One is a website health check, something every small business website owner should carry out periodically.

Ensuring your website is performing as you expect (or hope) is vital. In this lesson we guide you through a handful of easy checks you can run every so often to stay on top of things and catch any potential issues early.

Well done on making it this far and reaching the end of Module One. There’s lots more to learn, so we hope you’re enjoying this SEO training course so far.

Watch the next lesson: Module 2 – Lesson One: Leveraging Google’s Services – Part 1

Video Transcription

Hello, and welcome to this final lesson of module one.

We have covered a huge amount of ground so far, so take a moment to congratulate yourself as everything you have learned probably puts you on a par with many so-called SEOs out there.

No, really, you’re taking control of your destiny and learning the skills you need to build your business online and that’s fantastic.

Okay, so enough congratulating yourself. We have work to do, and we’re going to look at some of the more technical aspects of website optimisation. Don’t worry, I’m going to keep it as simple as I possibly can. First, let’s talk about sitemaps. There are two types of sitemap. One is an HTML version, which is aimed at the end user; you’ve probably encountered this yourself in the footer of websites. You click through and it shows the structure of the website, listing posts and pages and helping you navigate around the site.

We then have an XML version, which is ultimately the same thing. However, this is structured for search engines to help them better crawl and understand your website. Now, if I’m losing you, don’t worry, as I want to jump onto my computer right now and show you both of these sitemaps and what we have to do with them.

Right, so let’s recap. There are two sitemaps that we need for your website: an HTML version and an XML version. The HTML version will often already be on your website and look very similar to this: just a list of your web pages ordered by hierarchy. Generally there’ll be a link in the footer, or somewhere similar, so users can view this page and use it to navigate around your site.
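To give you an idea, here is a minimal sketch of the markup behind a typical HTML sitemap page. The page names and paths are just placeholders, not anything from your own site:

    <ul>
      <li><a href="/">Home</a></li>
      <li><a href="/about/">About Us</a></li>
      <li><a href="/services/">Services</a>
        <ul>
          <li><a href="/services/consulting/">Consulting</a></li>
        </ul>
      </li>
      <li><a href="/contact/">Contact</a></li>
    </ul>

The nesting of the lists mirrors the hierarchy of the site, which is exactly what helps visitors find their way around.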

If you don’t have an HTML sitemap, you can use a plugin or a tool to generate one very easily. Once it’s there, pop a link in the footer and you’re done; you don’t need to do anything else with the HTML version. The XML version, on the other hand, is something we definitely do need, and I’m going to show you why we need it and talk you through how to get it now.

Often the XML version doesn’t look as pretty; it will just be a plain list, similar to this, of all the pages on your site. This is what helps Google crawl your website. Now, most CMSs have plugins or tools that let you install or activate an XML sitemap. WordPress, for example, has exactly this: you can use the Yoast SEO plugin or one of many other options.
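As a point of reference, if you use the Yoast SEO plugin on WordPress, the generated sitemap typically lives at a URL like https://www.example.com/sitemap_index.xml (example domain shown here; check the plugin’s settings for your exact path).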

If your website is custom built, or it’s in a CMS that you can’t find plugins for, you can generate an XML sitemap manually yourself by going onto Google and searching for ‘XML sitemap generator’. A quick search for this, and a click on the first organic result, will bring you to a website like this one.

Now, all you need to do here is pop in your URL, choose your change frequency (for most websites this is either weekly or monthly, depending on how often you update your site) and then press start. There is a maximum crawl limit of 500 pages, and once the crawl is complete you will get an XML file.

You then need to upload this to wherever your website files are hosted. You may do this yourself, or you may get someone like a web developer to do it on your behalf, but that XML file needs to sit with your website files. Once it is live, it will generate a page similar to this. It can look many different ways, but ultimately this is the type of thing you’re looking for, and crucially it will be on a URL ending in .xml.
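For illustration, here is a minimal XML sitemap following the standard sitemap protocol (sitemaps.org). The domain, dates and frequencies below are placeholders to show the structure; a generator or plugin will fill in your real pages:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2020-01-15</lastmod>
        <changefreq>weekly</changefreq>
      </url>
      <url>
        <loc>https://www.example.com/about/</loc>
        <lastmod>2020-01-10</lastmod>
        <changefreq>monthly</changefreq>
      </url>
    </urlset>

Each <url> entry holds the page address in <loc>, with optional <lastmod> and <changefreq> tags that hint to search engines how fresh the page is and how often to revisit it.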

Now, once you have this, you need to take this page. You don’t need the whole URL, only the actual name of that page, and then head to Search Console. In here, go to your dashboard and to where you have sitemaps. If you click into this, there is an option at the very top right-hand side, ‘add and test a sitemap’. Click in here and paste in the URL string of that sitemap. You can test it first or submit it straight away; the test will only take a minute or two. This forces Google to crawl your website and helps it better understand the hierarchy and structure of your site. It’s a five-minute job but it’s well worth doing, and it’s something a lot of site owners don’t do.
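As a quick example, if your sitemap is live at https://www.example.com/sitemap.xml (placeholder domain), the string you paste into that sitemaps box is simply: sitemap.xml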

So definitely get that done. And one word of warning: don’t upload or add the HTML version in here. It’s a mistake I see time and time again. If you have an HTML sitemap like this, great, but do not submit it here because you’ll just get errors galore. You need to make sure that the version you are submitting to the sitemaps section of Search Console ends in .xml. As long as you’ve got that, submit it. As ever, if you have any questions on doing this task, just let me know. Just get your sitemaps created, installed and published.

Next up is the robots.txt file. Now, this is a very small file that most websites will have by default, and the reason I bring it up is that we need to very quickly check there aren’t any disallow commands within it, because if there are, they can prevent search engines from crawling and indexing your content. It’s a very quick check and I’m going to show you how to do it right now.

Okay, so what I need you to do is hop onto your website and, at the end of the URL, add: /robots.txt. If you get served a page, great. If you don’t, and you see ‘page not found’, a 404, ‘this post doesn’t exist’ or anything like that, then you don’t need to worry.

However, if you do have it, that’s fine; we just need to make sure that the wording doesn’t just say this. If it just says ‘User-agent’ followed by an asterisk, which means all user agents, and then ‘Disallow’ with a forward slash, it is telling all search engines to ignore this website. I always like to check this, and it’s important that you do as well, because if you’ve recently had your website rebuilt, or you’ve launched a new website, or you’ve changed your website, or it went into maintenance mode, or one of many other reasons, often your website will have been left with this blanket ‘disallow everything’ rule, which basically tells Google: ignore this website, delete this website, don’t take note of this website. Basically, do not index it.
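For reference, this is the blocking version to watch out for. These two short lines tell every crawler to stay away from the entire site:

    User-agent: *
    Disallow: /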

Now, obviously, we’re trying to optimise your website and get you to the top of Google, so this is completely counterproductive. It’s a quick two-minute check; I just want to ensure that this single command isn’t in the file. If you’ve got various other disallow commands, a forward slash followed by certain folders or file names, that’s absolutely fine, because you can legitimately exclude certain things from your website. For example, I’m excluding the whole admin section, I’m excluding text files, I’m excluding any PDFs. That’s all fine. What I don’t want is to exclude every web page on the site, and this command will do exactly that.
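By contrast, a harmless robots.txt that only excludes specific areas might look something like the sketch below; the exact paths vary from site to site, and the ones shown are just examples:

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /private/
    Disallow: /*.pdf$

Note the crucial difference: each Disallow line here names a specific folder or file pattern, rather than the bare ‘/’ that blocks everything. (Google understands the * and $ wildcards in that last line, though not every crawler does.)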

So it’s a quick check that will take you two minutes. As long as you don’t have the blanket disallow, you’re fine. If you do, you need to speak to your web developer and have it removed as soon as possible. And as ever, if you run into any trouble, just let me know.

Lastly, we need to resubmit your website to both Google and Bing. This forces the search engines to re-index your website and take note of all the various on-page changes that we have made to date and throughout this module.

Now obviously, if you haven’t finished optimising all your individual pages as far as the on-page SEO goes, then don’t do this, but once you’ve completed all those on-page changes and you’re happy that your website is ticking every box that it needs to, then follow the steps I’m about to show you right now.

Go onto Google and search for ‘submit site to Google’. Enter that, and the first result will be ‘submit URL to Google’. Now, this assumes that you’re logged in to your Webmaster Tools or Search Console account, so make sure that you are; if you’re not, it will prompt you to do so. You’ll then be greeted with a URL bar and an ‘I’m not a robot’ checkbox. All you need to do is take your website URL, pop it into here, tick ‘I’m not a robot’ and submit the request.

This again forces Google to crawl and index your website. Having made all the changes that we have throughout this module, it’s really important that we get Google to re-crawl your website and take note of all these improvements as soon as it can, so it can start moving you up those search result pages.

Once you’ve done it for Google, do exactly the same thing for Bing. Just search ‘submit to Bing’ and you’ll be greeted with the same idea: pop in your homepage URL, enter the code and submit. Once you’ve done that for both, you’re finished; those are the two primary search engines, so you don’t need to worry about submitting anywhere else. With the sitemaps already done as well, within two or three weeks you’ll see the on-page changes you’ve made indexed in Google and improving your website.

Well, there you have it, Google will now re-crawl your website and take note of all the changes and improvements that you’ve made.

We have reached the end of module one, so let’s quickly recap everything that we’ve covered throughout this first module. We started with an SEO overview, where I gave you a good sense of how SEO is impacting businesses just like yours right now.

We then moved on to housekeeping, where we checked Analytics and Search Console to make sure they were set up and recording your traffic. Then we moved on to keyword research, where I showed you several ways to find and cherry-pick the very best keywords so that you stand a greater chance of appearing at the top of Google. Then we looked at learning and understanding on-page SEO, where I walked you through the 12 on-page ranking factors that are vitally important.

We then looked at optimising your website on a page-by-page basis, covering all 12 of those important tags. Lastly, we looked at a website health check to make sure there are no underlying issues that could be affecting your website’s performance or preventing it from reaching the very top of Google.

We have covered a lot of ground and hopefully, you have learned a great deal. As we move into module two and beyond, we’re going to take SEO to the next level and we’re going to look at the ranking factors beyond your website that affect your ability to perform well online. It’s going to be fun and I look forward to seeing you there!
