

The Ultimate Technical SEO Guide for Your Website

By TrendsTechBlog, in Marketing, March 27, 2022

Let’s start with the question: what is technical SEO? Technical SEO is a web page optimization method that helps search engines such as Google discover, understand, and index a site’s pages. Unlike on-page SEO, technical SEO isn’t focused on optimizing the content of the site, but on the technical factors that affect search engines’ ability to properly assess it.

Technical SEO alone won’t drastically improve a site’s position on search engines if the content isn’t of good quality and on-page optimization isn’t done. However, if a technical issue prevents search engines from indexing your site, it won’t appear in the results at all, regardless of the quality of the content and the other SEO activities you have undertaken.

By reading on, you’ll learn:

  • how to make it easier for Google to find your website and start displaying it in search results as soon as possible,
  • what Google Search Console is and which of its most important features you can use,
  • how to speed up your site so visitors don’t leave just because a page takes too long to load.

The whole guide is written from the point of view of someone who doesn’t know how to program, so you don’t need any programming (or other technical) experience to follow this text and technically optimize your site. This is especially true if the site you’re optimizing is built on WordPress.

If the site you want to optimize isn’t built on WordPress, we assume you’re working with the developer who created it, and in the text we’ll point out exactly where you’ll need the developer’s help.

What Activities Does Technical SEO Cover?

Many factors affect technical SEO, and we’ll arrange them as follows for the purposes of this article:

  • Aspects that influence site indexing
  • Duplicated content
  • So-called noindex tag
  • Robots.txt file
  • Canonical tag
  • Load speed optimization
  • Site structure
  • Responsive web design
  • SSL certificate

Some of these terms may be unfamiliar to you, but don’t worry, they aren’t overly complicated. Let’s first look at the most important factors that determine whether your site’s pages will appear in search results at all.

Indexing

Indexing basically means that search engines have found your site while crawling the Internet and added it to their database of pages. Based on whether the site is properly indexed and on what the search engines found there, they later rank it for relevant terms.

In order for a site to appear in search engine results, search engines first need to find it through crawlers (small bots that roam the Internet and discover sites). Most of the time this will happen on its own after a while, but there are a few things you can do to make their job easier and faster.

You also need to make sure that there’s nothing to prevent your site from being indexed. The Google Search Console platform allows you to verify most of these items and more.

Google Search Console

It’s a free tool that helps you understand a lot about your website and the people who visit it. You can use this platform to learn things like how many visitors reach your website through search engines, which terms people use when searching, how many organic clicks you get, etc.

With Google Search Console you can also check whether your site’s pages are properly indexed, whether the site is optimized for mobile devices, and see any potential errors. This is the part that interests us most for the purposes of this guide.

Before you can use all of these options, you must first add the site to Google Search Console. Worth noting: it’s best to do this as soon as the site is launched, because Google Search Console starts gathering data only from the moment you add your site to the platform and can’t display any data from an earlier period.

The process of adding a site to the platform is very simple.

  • First step: Log in to your Google Account – use one where Google Analytics is implemented (we’ll explain why below)
  • Second step: Visit the “About” section and click on “Start now” inside the Google Search Console page.

  • Third step: The next page shows two options for integrating the site with the platform. If you have several subdomains and need DNS verification, the first option is preferable. In the vast majority of cases, the second option, with a far simpler verification process, will suffice. Enter your domain and press “Continue”.

It’s time for verification. The easiest method, which we recommend in all cases, is Google Analytics verification. If you’re logged in to a Google Account where Google Analytics is already set up, simply select this option and then click “Verify”. After a short check, you’ll be taken to the Google Search Console platform, where you’ll be able to see a great deal of information about your site, and indexing will be faster. If you’ve only just added the site, no data will be shown yet, but it will fill in over time.

The next step is to submit an XML sitemap to Google Search Console. A sitemap is a file through which you provide information about the pages, videos, and other content on your website and the relationships between them. Search engines like Google read this file in order to crawl and index the site properly. Submitting a sitemap to Google Search Console can speed up the indexing of your site. This is especially important for new sites, but it’s also useful for sites that have been around for some time. To add a sitemap, you must first create one.
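For orientation, this is roughly what a minimal XML sitemap file looks like (the addresses and dates below are illustrative only; your own sitemap will list your real pages):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page you want search engines to know about -->
      <url>
        <loc>https://example.com/</loc>
        <lastmod>2022-03-01</lastmod>
      </url>
      <url>
        <loc>https://example.com/sample-page/</loc>
        <lastmod>2022-02-15</lastmod>
      </url>
    </urlset>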

Creating an XML Sitemap in WordPress

In this regard, we recommend the Yoast SEO WordPress plugin, which is very useful when it comes to optimizing WordPress sites. Although the option is a little hidden, this plugin also allows you to easily generate a sitemap.

In the Yoast SEO plugin, the XML sitemap option can be found under General > Features > XML Sitemaps. Open it and you’ll see all the generated sitemaps, which typically cover content types such as posts, pages, projects, categories, etc. Depending on the structure of the site, in most cases it’s enough to submit the post and page sitemaps to Google Search Console: copy their URLs and paste them into the Sitemaps tab of Google Search Console.
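The exact addresses depend on your domain and settings, but Yoast typically publishes its sitemaps at URLs like these (replace example.com with your own domain):

    https://example.com/sitemap_index.xml
    https://example.com/post-sitemap.xml
    https://example.com/page-sitemap.xml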

Create an XML Sitemap if the Site Isn’t WordPress

If your site isn’t built in WordPress or another popular CMS (Content Management System), we assume you have a developer who created the site, so ask them to create an XML sitemap. If for some reason you don’t have access to the developer, you can create a sitemap using the free Screaming Frog tool; instructions can be found at https://www.screamingfrog.co.uk/xml-sitemap-generator/. Once you get the sitemap links from the developer, or generate them with Screaming Frog, enter them into Google Search Console in the Sitemaps tab as well.

Manual URL Inspection

Another thing you can do to potentially speed up the indexing of your site’s pages is to inspect URLs manually. You can do this with the URL Inspection tool in Google Search Console. At the top of the page, type the URL you wish to index, and you’ll get the most up-to-date information Google has about that address. There you can see whether Google can find and index the address without problems, and whether the page is optimized for mobile devices. Click “Request Indexing” if the URL hasn’t been indexed yet, or if you’ve made changes to the page that you want Google to pick up more quickly.

We recommend that you request indexing each time you publish a new page on your site, so that Google finds and ranks it sooner.

Identifying Indexing Problems

In the “Coverage” section of the Google Search Console platform, you can look for indexing errors or warnings on your website. You can also see how many of the site’s pages have been correctly indexed and how many have been excluded.

Duplicated Content

Simply put, duplicated content is a situation where multiple pages on the Internet contain the same or very similar content. Duplicated content can be found within the same site (e.g. multiple pages with the same text) or on different sites (e.g. the same text is found on two different sites).

Google doesn’t penalize sites directly for duplicate content, but it can cause many other crawling issues. Duplicated content is always best eliminated and there should be original content on all pages of the site.

Sometimes you’ll want to keep duplicated content, or there simply won’t be a way to remove it from the website. For example, your site may have separate addresses for the mobile and desktop versions, addresses with and without the www prefix, etc. In that case, you can simply remove those pages from the Google index, and the duplicated content will then no longer negatively affect your SEO.

Removing Pages From the Google Index

You’re probably wondering why you’d want some pages of your site not to appear in Google results. Isn’t it true that having more pages rank is better for SEO? Not always. Some pages simply aren’t a good first point of contact with visitors and should be excluded from search results. This doesn’t mean they’ll be removed from the site (users will still be able to access them without problems), but they won’t appear on Google.

Here are some examples of pages you might want to remove from the Google index:

  • Category and tag pages – if you have a blog and don’t want to display the pages that list all texts by category or tag.
  • Comment pages – if you have a lot of comments and they are paginated, you don’t want those pages to rank in the results.
  • Carts, payment pages, thank-you pages, and similar pages are always good to remove from search engines.
  • Duplicated-content pages – avoid duplicate content whenever you can, but if you really must keep duplicated content on some pages, remove those pages from the index.

One way to remove these pages from the index is to request it in the Removals section of the Google Search Console platform. This is a great solution if you only need to temporarily (up to 6 months) remove the address from the index.

For example, you might have pages about seasonal promotions that you don’t want to appear until next season. Hit the “New request” button, enter the web address you wish to remove, and click “Next”; Google will then ask you to confirm that you want to temporarily remove the address from the index. Confirm and the request will be sent. If you want to remove some pages from search results permanently, the best option is the noindex tag.

Noindex Tag

A noindex tag is a piece of HTML code that can be added to a page and signals to search engines not to index that page. Search engine crawlers can still visit the page, but they don’t include it in the results. If the site you’re optimizing is built on WordPress, you can add the noindex tag to pages using the Yoast SEO plugin: while editing a page, you’ll find a Yoast SEO section at the bottom, and in its Advanced tab you can choose not to index that page.

This is a great method for removing individual pages from the index, but what if you want to remove categories and tags? This can be done in the Yoast SEO plugin’s Taxonomies tab in the Search Appearance section. If your site isn’t built on WordPress, we assume you have a developer who can add a noindex tag to the pages you don’t want to appear in search results. If for some reason you don’t have access to the developer, you can add the noindex tag yourself inside the <head> part of the page’s HTML code; the tag looks like this: <meta name="robots" content="noindex">.
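To make the placement clearer, here is a minimal sketch of where the tag sits on a page you don’t want indexed (the page title is just an example):

    <head>
      <title>Thank You for Your Order</title>
      <!-- Tells crawlers not to include this page in search results -->
      <meta name="robots" content="noindex">
    </head>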

In the URL Inspection tool of the Google Search Console platform, you can check that the noindex tag is working properly. Enter the URL and click the “Test Live URL” button. If you get a message that the URL isn’t available to Google, the noindex tag is set correctly. However, if you get a message that the URL is on Google, the noindex tag isn’t implemented properly.

Once you’ve placed a noindex tag on a page, it usually takes Google a few days to a few weeks to pick up the change. There is also one more way to control what search engines crawl and index on your website: the robots.txt file.

Robots.txt File

It’s a text file that tells search engines which files or sections of a website they may crawl. Most websites don’t strictly need a robots.txt file, because Google and other search engines can usually find and index all of the site’s important pages on their own.

If you want to remove a page from the index, the noindex tag solves this more easily than a robots.txt file. The robots.txt file, on the other hand, is quite handy if you wish to keep certain files (PDFs, images, and so on) out of the search index. This is a rare case and beyond the scope of this text, but you can find all the details in Google’s documentation for developers if you want to learn more.
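For illustration, a robots.txt file lives in the root of the domain (e.g. https://example.com/robots.txt) and might look like this; the folder path and sitemap address below are made-up examples:

    # Ask all crawlers to skip a folder of PDF files
    User-agent: *
    Disallow: /downloads/pdf/

    # Optionally, point crawlers to the sitemap
    Sitemap: https://example.com/sitemap_index.xml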

You’ll very likely come across the robots.txt file indirectly if you have a site built on WordPress and notice that it doesn’t appear in search results at all. While you’re still installing WordPress and working on the site, there’s no point in indexing it, because it’s under construction. WordPress has a built-in option that asks search engines not to crawl and index the site (in older versions via a rule in the virtual robots.txt file, in newer ones via a noindex directive), and it’s often enabled while the site is under construction. It’s therefore important that, once your site is ready for users, you go to Settings > Reading in the WordPress admin and uncheck the “Discourage search engines from indexing this site” option. A few days after you make this change, your site should start appearing in the search index.

We’ve seen a few ways to remove pages with duplicated content from the index. But what if for some reason you don’t want to remove those pages from the index, you don’t want to change the content, and you don’t want it to negatively affect your SEO? The solution is the canonical tag.

Canonical Tag

It’s also called “rel canonical” and it tells search engines which URL is the primary version of a specific piece of content. That address is the one that ranks in search results, while other pages with the same or similar content are crawled far less often and generally aren’t shown in the results.

One of the best ways to avoid problems caused by the same or similar content appearing on multiple pages of a site is to use the canonical tag. The easiest way to set a canonical tag on a page is, again, the Yoast SEO plugin for WordPress. In the WordPress panel, open the page to which you want to add a canonical tag. In the Advanced tab of the Yoast SEO section, enter the address of the page with the same or comparable content that you want to be the only one indexed for that content.

If your site isn’t built on WordPress, you can add a canonical tag in HTML by entering the following inside the <head> section of the page: <link rel="canonical" href="https://example.com/sample-page/" />.
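As a sketch of how this works in practice: the tag goes on the duplicate page and points to the primary version (both addresses below are illustrative):

    <!-- Placed in the <head> of https://m.example.com/sample-page/ (the duplicate) -->
    <link rel="canonical" href="https://example.com/sample-page/" />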

Site Speed Optimization

Google strives to give consumers high-quality, fast-loading information. Users want sites to load in seconds, and if they don’t, they go back to the search results to find a better, faster alternative. This reflects badly on the average time spent on the site and the bounce rate, and therefore on SEO. Therefore, page loading speed is among the most important factors of technical SEO.

Alright, it obviously makes sense to you because no one likes waiting for a website to load, but how do you speed up a website? Some of the things you should consider if you want your site to load quickly are:

  • Web Hosting
  • Caching
  • Minification of the elements
  • Image compression
  • CDN

We’ll briefly look at each of these factors.

Web Hosting

All the content on your site must be stored on a server; that server is essentially your web hosting. There are many hosting providers and packages, but let’s look at hosting from the technical SEO point of view: what matters is the location of the server and the category of hosting.

The server’s location matters because the farther the server is from the source of a request, the longer it takes for the server to respond and for the site to load. So the best move is to pick a server close to your target audience: if most of your users are in the UK, choose a hosting provider with servers there. Often you won’t be able to find servers in exactly the location you need (or they’ll be over your budget), but there’s a solution for that too: use a CDN. There are two main categories of hosting to choose from, shared and dedicated.

  • Shared hosting means a hosting package where the site shares the server with some other sites and in this case, you rent only part of the server. As a result, your site will load slightly slower, but you’ll pay significantly less for hosting.
  • Dedicated hosting means renting an entire server that will only host your site (or sites). This means that the loading time will be much faster, but this type of hosting is also much more expensive.

For small and medium-sized businesses, shared hosting will be enough in most cases, and you can expect to set aside around 30-50 euros per year for it. We recommend Hostinger. If you have more than one site, you can save money by choosing one of the hosting packages that allow multiple sites.

Even after you apply all the other speed optimization methods we’ll list here, dedicated hosting can halve the loading time. However, this benefit comes at a price of at least 800 euros per year, so we recommend it only to larger companies for which this cost wouldn’t be a significant investment.

Caching

Cache memory sits between one or more web servers and users: it watches incoming requests and saves copies of responses, such as files, images, and HTML pages. When a user requests the same URL again, the cached copy is served instead of fetching the content from the original web server.

Caching is utilized for two primary reasons:

  • Load speed – The page loads faster because the request is handled by the cache (which is closer to the user) rather than the origin server.
  • Reduced data traffic – Because content is served again from the cache, the volume of data transferred from the origin is reduced. This matters because hosting providers usually limit how much data traffic can be used per day or month.
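How long a response may be cached is typically steered by HTTP headers that the server (or a caching plugin) adds. As a rough, illustrative sketch, a cacheable page response might include headers like these, where “Cache-Control: public, max-age=3600” means any cache may store the page and reuse it for up to an hour (the values are examples, not recommendations):

    HTTP/1.1 200 OK
    Content-Type: text/html; charset=UTF-8
    Cache-Control: public, max-age=3600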

If your site isn’t made in WordPress, the developer who created the site probably implemented caching (ask just in case). If the site is made in WordPress, there are several plugins that allow you to easily implement caching:

  • WP Rocket
  • WP Super Cache
  • W3 Total Cache
  • WP Fastest Cache

Some of these plugins (WP Rocket, for example) allow for element minification in addition to caching.

Minification of the Elements

Another technique to speed up your site is to use clean, compressed HTML, JavaScript, and CSS. The very first task should be to get rid of any unnecessary code; such leftovers can come from features you no longer use on your website or from poor coding. It all comes down to this: the cleaner the code, the faster the site loads.
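Minification itself simply strips whitespace, comments, and other characters that browsers don’t need. As a tiny illustration (the class name and colors are made up), here is the same CSS rule before and after minification:

    /* Before minification */
    .button {
        color: #ffffff;
        background-color: #0066cc;
    }

    /* After minification */
    .button{color:#fff;background-color:#0066cc}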

As we’ve already mentioned, WordPress plugins like WP Rocket let you do this without coding if your site is built on that platform. After cleaning up the code, it’s also recommended to compress it, for example with Gzip.

It’s important to note that minifying elements using plugins can often cause problems loading elements on the site. Therefore, it’s recommended that after turning on the minification, make sure that everything on the site works properly. If an issue arises, simply disable the minification option in the plugin and your site will resume normal operation.

Image Compression

Large images slow down your site, which is bad for the user experience and therefore for SEO. Image optimization is the process of reducing the space images take up on the server, using a plugin or script, which speeds up page loading. All images on the site should be compressed to a size of 1-2 megabytes or less.

High-resolution images are necessary for print, but on the web there’s no need for them to be so large, because the extra quality isn’t noticeable on screens. There are two primary kinds of image compression, lossy and lossless, and it’s important to understand the difference because different tools offer different types of compression.

  • Lossy – This type of compression can significantly reduce the image size, but there is also a risk that the images will significantly lose in quality. If you opt for lossy compression, it’s very important that you don’t overdo it.
  • Lossless – This type of compression reduces the size to a lesser extent, but doesn’t cause image deterioration.

The image formats you should use on the web are PNG and JPEG. PNG files are usually higher quality and their lossless compression doesn’t affect quality, but they take up more space. PNG is, for instance, recommended for images you put inside newsletters (such as those created via the Benchmark platform, which offers responsive, mobile-friendly templates), because it’s more appropriate when you need finer transparency control and clean handling of logos and complex colored icons. JPEG uses lossy compression with an adjustable quality level, so you can find the optimal balance between quality and file size.

Some of the programs you can use for compression are:

  • Adobe Photoshop
  • Tiny PNG
  • JPEG Optimizer
  • Optimizilla

CDN

A Content Delivery Network (CDN) is a collection of servers distributed across the globe, in parts that are geographically separated, for faster delivery of web content. CDN services store site content on servers around the world and, depending on where people accessing the site are located, send content from the nearest server.

As we’ve already mentioned in the section on hosting, this leads to faster loading of the site. If your consumers are located all over the world, a CDN is very beneficial. If this isn’t the case and most users are in a narrow geographical area, it’s enough to rent hosting near that location and then you don’t need a CDN.

There are numerous CDN providers available, with varying levels of quality. Cloudflare is one of the more popular low-cost services, although we don’t recommend it. One CDN we’ve had a good experience with is BunnyCDN, especially since it’s pay-as-you-go: you pay only for the traffic you actually use, and you can choose which continents the CDN should cover and which it shouldn’t.

Site Structure

The structure of a website refers to how pages are organized and linked. The optimum site layout is one that makes it easy for search engine users and crawlers to discover the information they need on a website.

Google will have a hard time finding and indexing pages on your site that are many clicks away from your homepage or aren’t linked from anywhere at all. However, if the site is well structured and the pages are linked to one another, Google will have no trouble finding and indexing all of them. Moreover, when you link to other pages of the site from pages that already have authority and backlinks, you transfer some of that authority to them. This is a fantastic way to boost the visibility of new pages. A well-organized site also improves the user experience by allowing users to find the information they need quickly and effortlessly.
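In practice, internal linking is just ordinary HTML links with descriptive anchor text. A hypothetical example of a link placed inside an established article to point at a newly published page might look like this:

    <!-- Internal link from a well-ranking article to a newly published page -->
    <a href="https://example.com/new-service-page/">Read more about our new service</a>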

Responsive Web Design

Today, users access sites more often from mobile phones than from laptops or desktop PCs. As a consequence, there is a rising demand for responsive site design. Responsive web design means that a site must be clear and easy to use on every device visitors use.

If users have to zoom in on elements, or elements sit at the edges of or off the screen, it’s very bad for the user experience, and search engines punish it. A few years ago Google introduced mobile-first indexing: it looks at how well the mobile version of the site is optimized and, based on that, decides where the page should appear in the search results. You can make a site responsive with CSS, and if you use WordPress, your site will automatically be responsive with any of the more popular themes (just in case, read the theme’s reviews carefully).
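Under the hood, responsiveness mostly comes down to CSS media queries that adapt the layout to the screen width. A minimal, hypothetical sketch (the class names and breakpoint are made up):

    /* On screens narrower than 768px, let the content and sidebar take the full width */
    @media (max-width: 768px) {
      .main-content,
      .sidebar {
        width: 100%;
        float: none;
      }
    }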

To check whether a page is responsive, enter its URL in Google’s Mobile-Friendly Test and click “Test URL”. After a brief analysis, the tool will tell you whether the page is well optimized for mobile devices or not.

SSL Certificate

SSL (Secure Sockets Layer) adds another layer of security between the web server and the browser, making the website more secure. With SSL active, the information users supply on your site, such as payment or contact details, is far harder for attackers to intercept. Search engines reward this added security by ranking SSL-enabled sites higher in search results.

How can you tell whether the SSL certificate is working? A padlock is displayed to the left of the URL in the browser, and the web address begins with “https” rather than the old “http”. Most of the better web hosting firms either include an SSL certificate with cPanel for free or charge a small extra fee for it; a basic certificate usually costs around 10 euros per site.

If no payments are made through your site, a basic certificate is sufficient. Although it takes some technical knowledge, you can also set up SSL yourself, for free, through cPanel using the Let’s Encrypt service.
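Once the certificate is installed, sites are usually also set to redirect all “http” addresses to “https”. Assuming an Apache server with an .htaccess file (typical on cPanel shared hosting, but check with your host), a common sketch of that redirect looks like this:

    # Redirect every http:// request to the https:// version of the same address
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]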

Conclusion

Technical SEO is sometimes very complex and in this guide, we’ve listed the most important things you need to know in order for your site to be well technically optimized. If you do technical SEO correctly, research your keywords well, and write quality content regularly, SEO results won’t be lacking.

TrendsTechBlog