Specialist SEO Agency with clients in Oxford, across Oxfordshire & the Thames Valley
01865 306 889   [email protected]

Lesson Six: Website Health Check

This sixth and final lesson of Module 1 is a website health check, an essential task to carry out periodically on your small business website.

Ensuring your website is performing as you expect (or hope) is vital, and in this lesson we guide you through a handful of easy checks you can do every so often to stay on top of things and be aware of any potential issues that may arise.

Well done on making it this far and to the end of Module 1. Lots more to learn so we hope you’re enjoying this SEO training course so far.

Watch the next lesson: Module 2 – Lesson One: Leveraging Google’s Services – Part 1

Video Transcription

Hello, and welcome to this final lesson of module one.

We have covered a huge amount of ground so far, so take a moment to congratulate yourself as everything you have learned probably puts you on a par with many so-called SEOs out there.

No, really, you’re taking control of your destiny and learning the skills you need to build your business online and that’s fantastic.

Okay, so enough congratulating yourself. We have work to do and we’re going to look at some of the more technical aspects of website optimization. Don’t worry, I’m going to keep it as simple as I possibly can. First, let’s talk about sitemaps. Now, there are two types of sitemap. One is an HTML version, which is aimed at the end user. You’ve probably encountered this yourself, often found in the footer of websites. You click through and it basically shows the structure of your website. It lists posts and pages and basically helps you navigate around a website.

We then have an XML version, which is ultimately the same thing. However, this is structured for search engines to help them better crawl and understand your website. Now, if I’m losing you, don’t worry, as I want to jump onto my computer right now and show you both of these sitemaps and what we have to do with them.

Right. So let’s recap. There are two sitemaps that we need for your website, an HTML version and an XML version. The HTML version will often be found on your website and it will look very similar to this, just a list of the web pages on your website ordered by hierarchy, and generally there’ll be a link in the footer or somewhere on your website for users to view this page and assist them in navigating around your website.

If you don’t have an HTML sitemap, you can use a plugin or a tool to generate one for you very easily. And once it’s there, pop a link in the footer and you’re done. You don’t need to do anything else with the HTML version. The XML version, on the other hand, is something we definitely do need. And I’m going to show you why we need it and talk you through how to get it now.
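For illustration, an HTML sitemap is usually nothing more than a nested list of links. The page names below are hypothetical placeholders, not taken from any particular site:

```html
<!-- A minimal, hypothetical HTML sitemap: just a nested list of links -->
<ul>
  <li><a href="/">Home</a></li>
  <li><a href="/services/">Services</a>
    <ul>
      <li><a href="/services/seo/">SEO</a></li>
    </ul>
  </li>
  <li><a href="/contact/">Contact</a></li>
</ul>
```

A plugin will typically generate something along these lines automatically; all that matters is that users can reach it from a link in your footer.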

Often the XML version doesn’t look as pretty. It will just be a list similar to this, and it will simply list all the pages. And this is what helps Google crawl your website. Now, most CMSs will have plugins and tools and the ability, with a few clicks, to install or activate an XML sitemap plugin. WordPress, for example, has exactly this. You can use the Yoast SEO plugin or one of many different options.
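As a rough sketch of what that page contains, an XML sitemap follows the sitemaps.org protocol. The URLs and dates below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Your plugin or generator will produce one entry per page; you don’t need to write this by hand unless your CMS has no sitemap support at all.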

If your website is custom built or if it’s in a CMS that you can’t find plugins for, if you look up XML sitemaps and you can’t find anything, you can generate one manually yourself by going onto Google and looking up XML sitemap generator. A quick search of this and clicking on the first organic result here will bring you this website.

Now, all you need to do here is pop in your URL, choose your change frequency, which generally for most websites is either weekly or monthly depending on how often you update your website and then start. You have a maximum crawl limit of 500 pages. And once generated, you will get an XML file.
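If you’d rather script it than use an online generator, here is a minimal sketch in Python using only the standard library. The page URLs and change frequencies are placeholder assumptions you would replace with your own:

```python
# Minimal sketch: build a sitemap.xml from a hand-maintained list of pages.
# The URLs and change frequencies below are hypothetical placeholders.
from xml.sax.saxutils import escape

pages = [
    ("https://www.example.com/", "weekly"),
    ("https://www.example.com/services/", "monthly"),
]

entries = "\n".join(
    f"  <url>\n    <loc>{escape(url)}</loc>\n"
    f"    <changefreq>{freq}</changefreq>\n  </url>"
    for url, freq in pages
)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>"
)

# Write the file out; this is what gets uploaded with your website files.
with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```

Once written, the resulting sitemap.xml is exactly the file you upload to your hosting, as described next.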

You then need to upload this to your website where your website files are hosted. Now, you may do this yourself, or you may get someone like a web developer to do this on your behalf. But that XML file needs to be popped in with your website files. And once it is live, you’ll generate a page similar to this. It can look many different ways, but ultimately, this is the type of thing that you’re looking for. And crucially, it will be on a URL ending in dot XML (.xml).

Now, once you have this, you need to take this page. You don’t need the whole URL. You only need the actual name of that page and you need to go to the search console. In here, go to your dashboard into where you have sitemaps. If you click into this, there is an option at the very top right-hand side, which is add and test a sitemap. Click in here and paste in the URL string of that sitemap. You can test it or just straight submit it. The test will only take a minute or two. And this will force Google to crawl your website and it will just help Google better understand the hierarchy and structure of your website. It’s a five-minute job, but it’s well worth doing and it’s something that a lot of site owners don’t do.

So definitely, get that done. And one word of warning: don’t upload or add into here the HTML version. It’s something I see time and time again. If you have a sitemap like this, great, but do not upload it into here because you’ll just get errors galore. You need to make sure that the version you are uploading to the sitemap section of the search console is dot XML. As long as you’ve got that, submit it. As ever, if you have any questions on doing this task, just let me know. But that’s it. Just get your sitemaps created, installed and published.

Next up is the robots dot TXT (robots.txt) file. Now, this is a very small file and most websites by default will have it. And the reason I bring it up is we need to very quickly check that there aren’t any disallow commands within this file, as if there are, they can prevent search engines from crawling and indexing your content. It’s a very quick check and I’m gonna show you how to do it right now.

Okay? So what I need you to do is hop onto your website and at the end of the URL put forward slash robots dot TXT (/robots.txt). If you get served with a page, great. If you don’t, and there’s a page not found or 404 or this post doesn’t exist or anything like that, essentially if the page doesn’t exist, then you don’t need to worry, that’s fine. You don’t need this file.

However, if you have it, that’s also fine. The wording won’t necessarily be exactly what’s here, but we need to make sure that it doesn’t just say this: a user agent line with an asterisk (which means all search engines) followed by a disallow line with a forward slash. That combination tells all search engines to ignore this entire website. And I always like to check this, and it’s important that you do as well. Because if you’ve had your website recently rebuilt, or you’ve launched a new website, or you’ve changed your website, or it went into maintenance mode, or one of many different reasons, often your website will be left with this blanket disallow rule, which basically tells Google: ignore this website, don’t take note of this website. Basically, do not index it.
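To see why that single rule is so destructive, here is a small sketch using Python’s standard-library robots.txt parser. The rules string is exactly the “block everything” pattern described above:

```python
from urllib.robotparser import RobotFileParser

# The dangerous robots.txt content described above: the asterisk means
# "all user agents", and "Disallow: /" blocks every page on the site.
blocking_rules = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(blocking_rules.splitlines())

# Can a search engine crawl the homepage, or any page? It cannot.
print(parser.can_fetch("Googlebot", "/"))          # False
print(parser.can_fetch("Googlebot", "/contact/"))  # False
```

Two short lines in one small file, and every crawler that respects robots.txt will skip your entire site.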

Now, obviously, we’re trying to optimise your website and get you on top of Google, so this is completely counterproductive. So it’s a quick two-minute check, where I just want to ensure that this single command isn’t in this file. If you’ve got various other disallow commands, each a forward slash followed by certain folders or file names, that’s absolutely fine, because you can exclude certain things from your website. For example, I’m excluding the whole admin section, I’m excluding text files, I’m excluding any PDFs. That’s all fine. But what I don’t want to be doing is excluding all web pages from my website, and this command will do that.
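By contrast, a selective robots.txt like the sketch below is harmless. The paths here are hypothetical examples of the kind of thing you might legitimately exclude:

```text
# Example only: block specific areas, not the whole site
User-agent: *
Disallow: /wp-admin/
Disallow: /private-files/
Disallow: /downloads/brochure.pdf
```

The key difference is that there is no bare “Disallow: /” line on its own, so all your normal pages remain crawlable.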

So it’s a quick check that would take you two minutes. As long as you don’t have this, you’re fine. If you do have this, you need to speak to your web developer and ask them to remove it as soon as possible. And as ever, if you run into any trouble, just let me know.

Lastly, we need to resubmit your website to both Google and Bing. This forces the search engines to re-index your website and take note of all the various on-page changes that we have made to date and throughout this module.

Now obviously, if you haven’t finished optimising all your individual pages as far as the on-page SEO goes, then don’t do this, but once you’ve completed all those on-page changes and you’re happy that your website is ticking every box that it needs to, then follow the steps I’m about to show you right now.

Simply go onto Google and search for submit site to Google. Enter that, and the first result will be submit URL in Google. Now, this will assume that you’re logged in to your webmaster or your search console account, so make sure that you are. If you’re not, it will prompt you to do so. You’ll then be greeted with a URL bar and an “I’m not a robot” checkbox. All you need to do is take your website URL, pop it into here, tick “I’m not a robot”, and submit the request.

And this again forces Google to crawl and index your website. And having made all the changes that we have throughout this module, it’s really important that we get Google to re-crawl your website and take note of all of these improvements as soon as it can, so it can start moving you up those search result pages.

Once you’ve done it for Google, then do exactly the same thing for Bing. So search for submit site to Bing. And again, you’ll be greeted with the same idea. Pop in your homepage URL there, enter the code here, submit. And once you’ve done that for both, those are the two primary search engines covered. You don’t need to worry about submitting anywhere else. We’ve already done the sitemaps. So with this as well, within two or three weeks, you’ll be seeing the on-page changes that you’ve made indexed in Google and improving your website.

Well, there you have it. Google will now re-crawl your website and take note of all the changes and improvements that you’ve made. And we have reached the end of module one. So let’s quickly recap everything that we’ve covered throughout this first module. We started with an SEO overview where I gave you a good sense of how SEO is impacting businesses just like yours right now.

We then moved into housekeeping where we checked analytics and the search console to make sure they were set up and recording your traffic. Then we moved into keyword research where I showed you several ways in which to find and cherry-pick the very best keywords so that you can stand a greater chance of appearing top of Google. Then we looked at learning and understanding on-page SEO. I walked you through the 12 on-page ranking factors that are vitally important.

We then looked at optimising your website on a page-by-page basis, covering all 12 of those important tags. Lastly, we looked at a website health check to make sure that there are no underlying issues that could be affecting or preventing your website from performing and reaching the very top of Google.

We have covered a lot of ground and hopefully, you have learned a great deal. As we move into module two and beyond, we’re going to take SEO to the next level and we’re going to look at the ranking factors beyond your website that affect your ability to perform well online. It’s going to be fun and I look forward to seeing you there.
