
Technical SEO Guide - Deepak Parmar

We all know the basic SEO fundamentals, but today I'm going to walk you through technical SEO.

So let's begin the Technical SEO Guide.

In technical SEO, the most important factor is website speed, and a good hosting provider can solve half of your website's performance problems. So here are the things you should look for while selecting a web hosting provider for your website or blog:

  1. Uptime: Always ensure that your hosting provider offers at least a 99.9% uptime guarantee. The higher the uptime percentage, the more of the time your site will be online and accessible.

  2. Server location: The farther away your server is from your visitors, the longer your site may take to load. If the majority of your visitors are from the United States, choose a server location in the United States; if they come from all over the world, use a CDN instead.

URL structure is also an important part of technical SEO. It should be friendly to both users and search engine bots. The best practices for URL structuring are:

  1. In URLs, always use hyphens, not underscores.

  2. Always use a relevant keyword as the category name in the page URL. Never choose irrelevant category names.

  3. Try to avoid using dates in URLs and instead use a simple URL structure that includes your primary keyword.

  4. Google's John Mueller once said that, apart from the homepage URL, a URL with a trailing slash and the same URL without one are treated as different web pages. So don't use trailing-slash and non-trailing-slash URLs at the same time; stick to one format so it doesn't create confusion for bots (see the example after this list).

  5. Keep your web page URLs short. This is helpful for both search engine bots and users.
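
For illustration, here's a hypothetical before-and-after (the category and post name are made up; only the pattern matters):

    Avoid:  example.com/2021/05/12/technical_seo_guide
    Better: example.com/seo/technical-seo-guide

The second URL is short, uses hyphens, has a relevant category name, includes the primary keyword, and carries no date. For the trailing slash, pick one format and redirect the other to it, e.g. 301-redirect example.com/seo/technical-seo-guide/ to example.com/seo/technical-seo-guide.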

An XML sitemap is a list of the web pages on your website that you want indexed, and it is the way a search engine bot can find and index your pages quickly. So the best practices for XML sitemaps in technical SEO are:

  1. Avoid non-canonical and robots.txt-blocked URLs in the XML sitemap, because they serve no purpose there.

  2. Always use the lastmod tag. Google cares about it (there's a sample sitemap after this list).

  3. Google ignores the priority tag and does not care much about the changefreq tag, so you can leave both out.

  4. Make sure your XML sitemap does not contain more than 50,000 URLs. If your website or blog has more than that, you will need to split the URLs across multiple XML sitemaps.
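
As a minimal sketch (the URL and date below are placeholders), a sitemap entry with the lastmod tag looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/seo/technical-seo-guide</loc>
        <lastmod>2021-05-12</lastmod>
      </url>
    </urlset>

If you cross the 50,000-URL limit, the sitemaps.org protocol also defines a sitemap index file that lists your individual sitemaps, using <sitemapindex> and <sitemap> tags in place of <urlset> and <url>.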

Crawl budget is also a part of technical SEO, and robots.txt is the way we can optimize the crawl budget of our website.

Robots.txt is a simple file that tells search engine bots which URLs should not be crawled. You can access any website's robots.txt like this: example.com/robots.txt. I'm not going to talk about how to create a robots.txt here; for that, you can watch my robots.txt video (it's in Hindi).

But here I'm going to talk about why you need robots.txt and which pages you should block in it.

Why do you need robots.txt?

  1. To block search engine bots from crawling any specific page or directory.

  2. To optimize your website's crawl budget by blocking unwanted pages from being crawled.

Pages you should block in robots.txt (a sample file follows this list):

  1. Non-Canonical Pages (Duplicate Pages)

  2. Pagination pages

  3. Thank-you pages

  4. Pages with no content

  5. Admin pages (like wp-admin)
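
Putting that list together, a rough sketch of such a robots.txt could look like this (the /thank-you/ and /page/ paths are placeholders; use whatever paths your site actually generates):

    # Block the WordPress admin area
    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    # Placeholder paths for thank-you and pagination pages
    Disallow: /thank-you/
    Disallow: /page/

    # Tell bots where the XML sitemap lives
    Sitemap: https://example.com/sitemap.xml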

The idea for this post came from the Search Engine Journal technical SEO blog. I have shortened it and written it in my own words.
