Technical SEO is the part of SEO that looks at your site the way a search engine does, so that you can adjust it to show search engines the parts of your site that you want them to see. Doing technical SEO right helps your site run more smoothly, get indexed properly, and allows search engines to rank it for the search terms it should rank for.
- Included in our Technical SEO Service
Below are some of the things we do as part of our technical SEO service. Click any of the items to read more about what it involves.
Having a logical, clean URL structure is important for SEO because it allows search engines to understand your site better and helps them index it. A good URL structure is best put in place from the start, but it can also be introduced on existing sites. It particularly helps with larger sites, allowing search engines to understand the categories of your webpages and to reach pages which sit “deeper” in the structure.
It’s important to keep a good overview of the status codes of your pages and to maintain these accordingly. If you have a lot of 404 pages, and links pointing to them, this makes for a poor user experience and also sends bad signals to search engines. The same goes for redirecting pages to unrelated topics. An initial audit of your website’s pages needs to be done, and then the right actions should be taken to resolve any issues.
The robots.txt file allows you to block robots (also known as “bots” or “crawlers”) from accessing your pages, and therefore to control – to an extent – which pages search engines see. This can be used to hide pages which need to remain on the site but don’t provide valuable information to search engines. The file also needs to be checked to make sure it isn’t blocking search engines from pages you do want them to see.
Too much code on pages with written content can make it difficult for search engines to find the text, and therefore to index those pages for your chosen keywords. The code-to-text ratio should be monitored for all pages.
Sitemaps are an overview of how your site’s structure fits together. These can be submitted to search engines, to help them understand and index your site.
Site speed is extremely important to the user experience of a website. Slow-loading pages cause more people to click back to the search results. It is therefore certainly an indirect ranking signal for search engines, and quite possibly a direct one as well. The website should therefore be audited and tuned to maximise page loading speed.
Structured data markup allows you to add additional information to search results pages. This means you can add things like reviews, additional pages and address details so that they appear directly in the search results below your website name. This greatly increases the chances that searchers will click through to your website.
The above steps and more are included in our technical SEO process. Technical SEO should be constantly monitored, and is something that should be built into your SEO process, updated as you go and informing the other aspects of SEO.
We therefore do an initial assessment to see if there are any errors to fix from the outset and then continually tweak from the technical side to ensure your site is being indexed properly.
Check our Case Studies
The URL structure of your website determines how your webpages are categorised and classified. Having a good URL structure means organising the pages of your website according to a logical system; having a bad one means having no system at all. A good structure benefits SEO in two ways: it improves the user experience and it helps search engines index your website. If your webpages follow a logical pattern, search engines can read and categorise them quickly and easily.
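As an illustration, here is a minimal sketch of how a logically structured URL encodes a page’s place in the site hierarchy. The domain and path below are purely hypothetical examples:

```python
from urllib.parse import urlparse

# A hypothetical, logically structured product URL (example domain).
url = "https://example-shop.com/clothing/jackets/waterproof-hiking-jacket"

# Each path segment corresponds to one level of the site's category tree,
# so crawlers (and users) can infer where the page sits in the hierarchy.
segments = [s for s in urlparse(url).path.split("/") if s]
print(segments)  # ['clothing', 'jackets', 'waterproof-hiking-jacket']
```

A page reached via an unstructured URL like `https://example-shop.com/p?id=8371` carries none of this category information.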
- What is an HTTP status code?
An HTTP status code is a response code which helps determine any faults in the loading of a webpage. A status code of 200 (“OK”) is the normal response to a successful HTTP request. A response of 301 means that a webpage has been permanently redirected to another webpage, whereas a 404 means that the page was not found, which can be due to – among other things – the page having been deleted. Dealing with irregular status codes helps both the user experience and search engines’ understanding of your website.
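The categories above can be sketched in a few lines of code. This is a simplified illustration of how an audit tool might bucket the status codes it finds, not a full list of codes:

```python
def describe_status(code: int) -> str:
    """Map an HTTP status code to the broad category a site audit cares about."""
    if 200 <= code < 300:
        return "success"       # e.g. 200 OK: the page loaded normally
    if 300 <= code < 400:
        return "redirect"      # e.g. 301: permanently moved to another URL
    if 400 <= code < 500:
        return "client error"  # e.g. 404: the page was not found
    if 500 <= code < 600:
        return "server error"  # e.g. 500: the server failed to respond
    return "unknown"

for code in (200, 301, 404, 500):
    print(code, describe_status(code))
```

In an audit, pages in the “client error” and “server error” buckets are the ones that most urgently need fixing or redirecting.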
A sitemap is related to the URL structure of your website: it is essentially a map of how your webpages fit together. One can be generated for your website and then submitted to search engines so that they can better understand and index your site.
- What is a robots.txt file?
A robots.txt file tells bots how to interact with the site they are visiting. A bot is what search engines (among other entities) send out to “crawl” your website, in order to read what’s on it and index that information – a sort of robot that collects information en masse. Robots.txt files are important for SEO because they can stop a bot from crawling certain pages – ones that you’d rather not have indexed. This helps if you have content that needs to be on your site but doesn’t add value to it, or that wouldn’t be valued by search engines. By blocking search engines from reading this content, you can more easily curate your site’s value in their eyes.
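The effect of a robots.txt rule can be checked with Python’s standard-library `urllib.robotparser`, the same logic well-behaved crawlers apply. The file contents and URLs below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that hides an internal section from all crawlers.
robots_txt = """\
User-agent: *
Disallow: /internal/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Crawlers obeying the file may not fetch anything under /internal/,
# but the rest of the site remains open to them.
print(parser.can_fetch("*", "https://example.com/internal/report"))  # False
print(parser.can_fetch("*", "https://example.com/services/"))        # True
```

Note that robots.txt is advisory: it keeps compliant crawlers out, but it is not an access-control mechanism.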
- What is structured data markup?
Structured data markup, in SEO terms, means adding certain HTML tags to the code of your webpages to help search engines understand specific, more complex aspects of your page, such as ratings and publication dates. Tending to this properly can allow these added pieces of data to appear when your website turns up in the search results, which increases the appeal of your listing.
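As an illustration, structured data is commonly embedded as a JSON-LD script block using schema.org vocabulary. The business details below are entirely hypothetical:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Agency",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Example Town"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "27"
  }
}
```

Placed inside a `<script type="application/ld+json">` tag on the page, markup like this is what allows ratings and address details to appear directly in search results.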
- Why is site speed important for SEO?
It is highly likely that page speed is a direct ranking factor for Google. However, even if it is not, site speed is such a contributor to good user experience, that it is certainly an indirect contributing factor to your site’s overall SEO value.
- Why is page code-to-text ratio important for SEO?
Page code-to-text ratio is important because having too much code in comparison to text can make it difficult for search engines to read your page: the code you’ve added can end up obscuring what’s actually on the page. The ratio should therefore be monitored and optimised continually.
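A rough version of this check can be sketched with Python’s standard-library HTML parser. This is a simplified illustration – real audit tools are more careful about scripts, styles and whitespace:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the visible text content of an HTML document."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def code_to_text_ratio(html: str) -> float:
    """Return visible text length as a fraction of total page size."""
    extractor = TextExtractor()
    extractor.feed(html)
    text = "".join(extractor.chunks)
    return len(text) / len(html)

# A tiny hypothetical page: mostly markup, a little text.
page = "<html><body><div><p>Some actual written content.</p></div></body></html>"
print(round(code_to_text_ratio(page), 2))  # 0.39
```

A low ratio is not a problem in itself, but it is a useful prompt to check whether markup bloat is burying the page’s actual content.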
Request my Free Consultation