Technical SEO for ecommerce

Last updated: December 14, 2023

Ben Hodgson

Ben Hodgson is an SEO Strategist at The Evergreen Agency, working with clients to deliver their strategy and keeping them informed of the progress made to improve their online presence through best-practice SEO.

Read Ben's bio here

Technical SEO: a crucial element of your website's user experience, and something you should definitely be taking into consideration.

In this guide, you'll find all the information you need to ensure your website is built on a solid foundation that supports future growth:

  • What is technical SEO?
  • Why is technical SEO important?
  • How do I evaluate my website’s performance?
  • Best tools to use for Technical SEO
  • Technical SEO mishaps, mistakes, and do’s and don’ts

Let’s get into it!

What is technical SEO?

Technical SEO is the umbrella term used to describe the overall technical performance of your website. It encompasses technical optimisation, website accessibility, schema markup, and the overall structure of your website.

Technical SEO looks at how well your site is optimised at a technical level. Using a plethora of tools, you can identify and examine your website's technical performance.

This includes factors such as:

  • Overall site speed and accessibility
  • How well-optimised your website pages are
  • How easy it is for search engines to find, read and understand your content.

All of these are essential factors in building a solid technical foundation for your website. If they are properly assessed, maintained and optimised, you will greatly increase your chances of performing well in search engine results and getting your website seen by the right audiences.

Why is technical SEO important?

Technical SEO builds out the technical foundation of your website, which means it's essential to get it right. This isn't just for the benefit of your customers either: accessibility requirements are set out in standards such as the Web Content Accessibility Guidelines (WCAG), and your handling of user data is governed by GDPR legislation.

Furthermore, providing a fantastic user experience and a speedy service is simply good practice. By doing so, you can win over more customers and drive conversions.

How do I evaluate my website’s performance?

As the web evolves and user behaviour changes, it is important to keep up to date with the latest technical standards. You can achieve this by conducting regular technical audits, keeping informed of the ever-changing online legislation, and following best practices.

There are a few different metrics you can use to benchmark your website’s performance over time.

The two core considerations are page speed and technical auditing.

We look into this in more detail below.

Website technical audits

  • Technical audits pick up issues by crawling your site and analysing pages against a set list of standards.
  • Audits pick up linking issues, errors, orphaned pages, accessibility issues, performance issues and more. Use the report as the foundation of your technical SEO strategy.
  • Audits can be as broad or specific as necessary. Once you’ve audited your website, you can review the report which will outline all detected issues, why they’re important, and what you can do to fix them.

Be sure to crawl with a mobile user agent and JavaScript rendering enabled to get an accurate understanding of how Google sees your site, as this mirrors how Googlebot crawls under mobile-first indexing.


Page speed

Page speed is evaluated using a number of different methods.

A great couple of tools to gauge your page speed are Lighthouse and Google PageSpeed Insights. Both tools are free to use and are published and maintained by Google.

These tools measure your Core Web Vitals and related metrics, piecing together a strong overall picture of your website's loading performance:

  • First Contentful Paint
  • Largest Contentful Paint
  • Total Blocking Time
  • Cumulative Layout Shift
  • Speed Index
  • First Input Delay (due to be replaced by Interaction to Next Paint (INP) in March 2024 – you can read more about this important Core Web Vital update here)

Don’t forget: It is important to read the published documentation and understand how you can improve each of these key performance indicators.
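If you want to track these numbers programmatically, the PageSpeed Insights API exposes the same Lighthouse report as JSON. Here is a minimal sketch using Google's public v5 endpoint; the helper names are our own, and the exact audit IDs pulled out below are assumptions worth checking against a live response:

```python
import json
import urllib.parse
import urllib.request

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_report(url: str, strategy: str = "mobile") -> dict:
    """Query the PageSpeed Insights API for a live Lighthouse report."""
    query = urllib.parse.urlencode({"url": url, "strategy": strategy})
    with urllib.request.urlopen(f"{PSI_ENDPOINT}?{query}") as resp:
        return json.load(resp)

# Audit IDs as they appear in the Lighthouse section of a PSI v5 response
METRICS = {
    "first-contentful-paint": "First Contentful Paint",
    "largest-contentful-paint": "Largest Contentful Paint",
    "total-blocking-time": "Total Blocking Time",
    "cumulative-layout-shift": "Cumulative Layout Shift",
    "speed-index": "Speed Index",
}

def summarise(report: dict) -> dict:
    """Pull the headline metrics out of a PSI/Lighthouse report."""
    audits = report["lighthouseResult"]["audits"]
    return {label: audits[audit_id]["displayValue"]
            for audit_id, label in METRICS.items()
            if audit_id in audits}
```

Running `summarise(fetch_report("https://www.example.com/"))` gives you a quick dictionary of headline metrics you can log over time.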

Tips to reduce website load time

Here are some quick wins that you can utilise to speed up your site:

  • Use next-generation image formats to reduce resource loading times – such as WebP or AVIF.

Legacy image formats are not compressed as efficiently, so they take longer to load.

  • If you use plugins, do so sparingly.

Plugins increase load time as they often rely on third-party resources hosted elsewhere. Where possible, prefer features that are available natively.

  • Avoid overusing JavaScript.

Executing JavaScript is resource-intensive. If your site uses a lot of JavaScript, consider whether you can rework existing functionality to decrease load time.
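Two of these quick wins can be seen in plain HTML: deferring non-critical scripts and serving next-generation image formats with lazy loading (the file paths below are placeholders):

```html
<!-- Defer non-critical JavaScript so it doesn't block rendering -->
<script src="/js/analytics.js" defer></script>

<!-- Serve AVIF/WebP with a fallback, and lazy-load offscreen images -->
<picture>
  <source srcset="/img/hero.avif" type="image/avif">
  <source srcset="/img/hero.webp" type="image/webp">
  <img src="/img/hero.jpg" alt="Product hero" loading="lazy" width="800" height="450">
</picture>
```

Setting explicit `width` and `height` also helps avoid layout shift while the image loads.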

Best tools to use for Technical SEO

There are plenty of tools available which claim to help with SEO. Here are a few of the ones that we use, each covering a different aspect of technical SEO.

They can be used to complement each other and to build up a larger picture of your overall technical SEO health, highlighting areas to improve as well as reaffirming good optimisations.

Google Search Console

Google Search Console is invaluable for understanding which search queries and keywords are bringing visitors to your site. As a Google product, it reports Google's own data, so the information is as accurate as it gets. You can examine the queries, keywords, and impressions you are getting on a page-by-page basis.

Moreover, a little spreadsheet manipulation will enable you to see a full breakdown per site category, so long as your URL structure is in good order! Google Search Console is completely free to use, and you should be using it to gain a better understanding of how your site is displaying in the SERP.
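As a sketch of that spreadsheet step, the same grouping can be done in a few lines of Python against a Search Console pages export. The 'Page' and 'Clicks' column names are assumptions; check them against your actual CSV:

```python
import csv
from collections import defaultdict
from urllib.parse import urlparse

def clicks_by_category(csv_path: str) -> dict:
    """Sum page-level clicks by the first URL path segment (the 'category')."""
    totals = defaultdict(int)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            segments = [s for s in urlparse(row["Page"]).path.split("/") if s]
            category = segments[0] if segments else "(homepage)"
            totals[category] += int(row["Clicks"])
    return dict(totals)
```

This only works cleanly if your URL structure is consistent, which is exactly the point made above.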


Sitebulb

Sitebulb is our auditing tool of choice, which we use to survey and assess clients' websites against Core Web Vitals and accessibility standards, and to assess crawlability. With the option to fully customise each audit or to reuse settings from previous audits, it saves time when performing regular check-ups to assess technical standards.

Since Google uses mobile-first indexing, we recommend you use this setting when conducting technical audits. If you have a desktop-only site with a separate mobile site, you can switch to desktop indexing.

PageSpeed Insights

PageSpeed Insights from Google gives you a great breakdown of your Core Web Vitals and offers solutions on how to fix any issues. Completely free to use, this is a great way to quickly get an overview of how your site is performing.


Lighthouse

Lighthouse is similar to PageSpeed Insights but provides a broader view of data and recommendations. It can be run as a browser extension, in Chrome DevTools, or as a Node module.

It is best used in conjunction with PageSpeed Insights.

Technical SEO mishaps, mistakes, and do’s and don’ts

1. Robots.txt

What is robots.txt?

Your robots.txt file instructs bots (crawlers) how they should crawl the pages on your site.

How does it work?

You can use your robots.txt to specify different instructions per crawler. Each group of rules begins with a 'User-agent' line naming the crawler it applies to, followed by 'Disallow' (and optionally 'Allow') rules listing the paths that crawler should avoid.

Note that robots.txt controls crawling, not link endorsement; 'follow' and 'nofollow' are separate, link-level attributes. A standard (followed) link indicates that you 'vouch for' the target URL. There are many instances where this is appropriate, such as linking to a high-quality guide or resource that you want to share with your readers.

No-follow links (rel="nofollow") are where you do not 'vouch for' the target URL. Again, there are many different situations where this is useful. The most common use would be applying this to all links used in comment sections on your website.

If you have commenting functionality where users can share links, you don't want to be vouching for unknown and potentially harmful websites with followed links.

Common mistakes people make when using robots.txt

Specify the crawler name, or 'User-agent', and list each path you do not want that bot to crawl. However, if you have sensitive areas that should not be accessed, you cannot simply rely on a disallow rule to keep content hidden: disallowed URLs can still be discovered if they are linked to elsewhere. Instead, use proper user authentication and a login screen to secure those areas.

Most modern Content Management Systems (CMS) will automatically populate your robots.txt file, so you will not need to worry too much about it. Nonetheless, it’s worthwhile taking a look to make sure you’re not accidentally disallowing crawling to areas you do want crawled.

Working example

Here is a segment of Facebook’s robots.txt, with the rules for Google’s crawler named Googlebot:

User-agent: Googlebot
Disallow: /ajax/
Disallow: /album.php
Disallow: /checkpoint/
Disallow: /contact_importer/
Disallow: /dialog/
Disallow: /fbml/ajax/dialog/
Disallow: /feeds/
Disallow: /file_download.php
Disallow: /job_application/
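You can sanity-check rules like these without deploying anything, using Python's standard-library robots.txt parser (here against a trimmed version of the rules above):

```python
from urllib.robotparser import RobotFileParser

# A trimmed version of the Googlebot rules above
rules = """User-agent: Googlebot
Disallow: /ajax/
Disallow: /checkpoint/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch(user_agent, url) applies the matching User-agent group
print(parser.can_fetch("Googlebot", "https://www.facebook.com/ajax/foo"))  # False
print(parser.can_fetch("Googlebot", "https://www.facebook.com/help/"))     # True
```

This is a handy way to verify you haven't accidentally blocked a section you want crawled.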

2. Navigating your sitemap

What is a sitemap?

As the name suggests, your sitemap is a map of your website. It also contains metadata; information about the URLs such as when the page was published, alternative language versions, and if/when it was last updated.

How does it work?

Sitemaps provide navigation support to bots that crawl your website for indexing. Bots will read this information and understand that URLs contained within this file are important to your site.
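For illustration, a minimal XML sitemap following the sitemaps.org protocol looks like this (the example.com URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2023-12-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/category/widgets/</loc>
    <lastmod>2023-11-20</lastmod>
  </url>
</urlset>
```

Most CMSs generate this file for you; just make sure it is referenced in your robots.txt or submitted via Google Search Console.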

3. Index or no-index?

Indexing your core pages is paramount for ranking well in search engines for topics and themes that you are targeting. Equally, you do not want pages (such as a checkout page) appearing in search results when they possess no real intrinsic value for visitors.

In most instances, your CMS will automatically no-index the pages it deems unimportant; administrative pages or pages critical to technical infrastructure provide no real value to users and, therefore, to Google.

Occasionally, however, pages are no-indexed when they should be indexed.

No-indexing a page that holds value can negatively affect your site in Google's eyes. No-indexing important pages such as category or product collection pages makes them 'invisible' to search engines, so they will not appear in search results.

If a category or product page is not indexed, you won't rank for it. So make sure your valuable pages are indexed!

When should I no-index a page?

You should no-index a page when you do not want it to be shown in search results. This can be pages that are used for administration, development or staging, or those crucial to the technical functionality of your website.

Everything else that adds value should be indexed.

An easy way to know if a page should be indexed is by asking yourself "Would I find this page useful?" If yes, then it should be indexed.
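At the template level, a no-index directive is a single meta tag in the page's head (assuming you can edit the template; most CMSs also expose this as a per-page toggle):

```html
<!-- Keep this page out of search results -->
<meta name="robots" content="noindex">
```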

4. Canonical URLs

A canonical URL is the 'main' URL that search engines should treat as authoritative when several pages contain duplicate content, or content that offers the user similar information. Duplicate content can negatively affect your rankings. If duplicate pages cannot be avoided, it is vital to canonicalise one URL as the main location for the content.

By canonicalising a URL you are explicitly stating which page is the 'main' one. This can be done in most CMSs, or directly in the page code by including a rel="canonical" link element in the head element. For more information on how to correctly set up a canonical page, check out Google's documentation.

Pages should self-canonicalise unless they have been identified as duplicate content, in which case you should canonicalise the page you wish to credit the equity to and apply a 301 redirect from the non-canonical page to the canonical one.
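As a small example, the link element mentioned above sits in the head of the duplicate page and points at the preferred URL (example.com is a placeholder):

```html
<link rel="canonical" href="https://www.example.com/widgets/blue-widget/">
```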

5. Redirects

Redirects are not universal. Surprisingly, there are nine possible redirect status codes a server can respond to requests with. The most commonly used are 301 and 302 redirects.

When you are implementing a permanent change, use a 301 redirect. If the change is temporary, for example while doing maintenance, use a 302 redirect. These are the primary types of redirect you will need to implement.

Redirects, when implemented correctly, are harmless and do not impact SEO. Make sure you do not have thousands of them, though, as this will absorb a lot of your crawl budget and may result in parts of your site not being crawled.

If you have a lot of redirects that form a redirect chain, crawlers may give up before reaching the final page, so the end content will not be crawled. If you're going from page A to page F, one redirect from A to F is all that is required. Do not redirect from A to B, C, D, E and then F.
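On an Apache server, for instance, a single-hop permanent redirect is one line in your .htaccess (the paths are placeholders; Nginx and most CMSs have equivalents):

```apache
# One hop: old URL straight to the final destination
Redirect 301 /page-a/ /page-f/
```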

6. URL formatting

There are four versions of your homepage URL. These are:

  • http://example.com
  • http://www.example.com
  • https://example.com
  • https://www.example.com

That's four different pages with duplicate content, vying for link equity and diluting your online authority.

Choose one URL standard for your site and stick with it. HTTPS is now the standard, and a must for ecommerce sites (or any site that handles financial transactions).

Canonicalise your chosen URL and redirect all requests for the other URLs to the canonical. This way you are streamlining all incoming traffic to the correct URL, maximising equity.
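For example, on Apache with mod_rewrite enabled, all four homepage variants can be funnelled to the canonical https://www version with rules like these (swap in your own domain; this is a sketch, not a drop-in config):

```apache
RewriteEngine On
# Redirect any http:// or non-www request to the canonical https://www version
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ https://www.example.com%{REQUEST_URI} [L,R=301]
```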

Go get technical

There are many different facets, all of which are individual instruments within the overall orchestra of technical SEO. A cumulative effort in each of these areas will result in a strong technical foundation and strong results in the SERP.

This is not an exhaustive list of tools that you can use, but they provide a good starting point in order to accurately assess the technical standing of your site. Use these tools in conjunction with one another to develop a strong overall understanding of where and how to improve.

Need more help understanding your technical SEO? Check out our ecommerce hub for more resources, or visit our SEO page to see how we’ve mastered technical optimisation.


👋 We are Evergreen and we grow ecommerce brands.

👉 See our case studies.

👉 Discover our story.

👉 Subscribe to our YouTube channel.

👉 Join our weekly newsletter for digital marketing that cuts through the noise.

Sign up to our newsletter

By signing up you consent to storage of your data according to the Privacy Policy

Interested in working with us?

Get in touch

Visit us

Unit 1 & 2, Willows Gate,
Stratton Audley,
Oxfordshire, OX27 9AU

Monday – Friday: 09.00 – 17.00
