Technical SEO: A Simple Introduction

Gopal Adhikari
10 min read

You may have heard that there are three types of SEO: On-page SEO, Off-page SEO, and Technical SEO. In practice, however, there are only two types: On-page SEO and Off-page SEO, with Technical SEO being a subset of On-page SEO. On-page SEO encompasses all optimizations performed on a web page itself, including technical aspects such as HTML structure, URL structure, and more. Hence, Technical SEO is considered a part of On-page SEO.

What is Technical SEO?

Technical SEO is a significant part of On-page SEO, performed by developers while creating the website. It includes critical elements like HTML structure, mobile friendliness, and user experience. Effective Technical SEO can significantly boost website traffic by improving search engine rankings.

Importance of Technical SEO

Technical SEO is essential for ensuring better crawling and indexing of your website by search engines, which is crucial for higher rankings. Optimizing the technical aspects of a website, such as HTML structure and sitemaps, helps search engines crawl, understand, and categorize content more effectively, improving overall SEO performance.

Technical SEO Checklist and Techniques

To ensure your website is fully optimized for search engines, follow this comprehensive Technical SEO checklist. This guide covers essential elements that will help improve your website's indexing, crawling, and overall search engine performance.

  • Sitemap : Create and Submit a Sitemap to ensure you have a sitemap.xml file that lists all important pages of your website. Submit it to Google Search Console and Bing Webmaster Tools to help search engines discover and index your content efficiently.

  • Robots.txt : Create and optimize your robots.txt file to control which pages search engines can crawl. Ensure that it does not block important pages or resources that should be indexed.

  • Canonical Tag : Use canonical tags on pages with duplicate content to specify the URL of the original content. Search engines downrank duplicate pages, and in severe cases an entire website can be blacklisted; canonical tags protect you from such penalties.

  • Broken Links : Regularly check for broken links and fix broken links (404 errors) on your website. Broken links can negatively impact user experience and search engine crawling.

  • 404 Page : Create a user-friendly 404 error page that guides visitors back to relevant content on your site, improving user experience and reducing bounce rates.

  • 301/302 Redirection : Implement 301 redirects for permanent URL changes and 302 redirects for temporary changes. This preserves link equity and ensures users are directed to the correct pages.

  • Structured Data : Add structured data (schema markup) to your pages. This helps search engines understand your content better and can enhance your search results with rich snippets.

  • OG Protocols : Use Open Graph (OG) tags to improve how your content appears when shared on social media platforms like Facebook. This can increase click-through rates and social engagement.

  • Twitter Cards : Add Twitter Card tags to ensure your content is displayed attractively when shared on Twitter, enhancing visibility and engagement.

  • Webpage Speed : Improve your website's load time by optimizing images, leveraging browser caching, minifying CSS and JavaScript, and using a content delivery network (CDN). Even a small delay in page load time can hurt user engagement.

  • URL Structure : URLs should be clean, descriptive, and include relevant keywords. Avoid using complex query strings and keep URLs short and readable.

  • Silo Structure : Organize your website content into silos or categories to create a clear and logical hierarchy. This improves navigation, user experience, and search engine crawling.

  • Mobile Friendliness : Optimize your website for mobile devices by using responsive design techniques. Mobile-friendly sites rank better on mobile searches and provide a better user experience.

  • SSL and HTTPS : Use an SSL certificate to encrypt data and switch your website to HTTPS. Secure sites are preferred by search engines and provide a safer user experience.

  • SEO Tools : Use tools like Google Search Console, Bing Webmaster Tools, and various third-party SEO tools to monitor and optimize your website’s performance.

  • Monitoring Tools : Regularly use monitoring tools to track your website’s uptime, performance, and security. Tools like Google Analytics, SEMrush, and Ahrefs can provide valuable insights into your site's health and SEO performance.

By following this Technical SEO checklist, you can ensure your website is optimized for search engines, leading to better indexing, higher rankings, and increased organic traffic.

Sitemaps: Your Website's Blueprint

"Sitemap" is made up of two words, "site" and "map", meaning it contains the map of a website. It's an XML file listing all the URLs (web page addresses) on your website. Search engines use sitemaps to crawl web pages effectively.

Here's what a basic sitemap entry looks like:

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
    <loc>https://www.gopal-adhikari.com.np/</loc>
    <lastmod>2024-07-17T05:04:12.505Z</lastmod>
    <changefreq>yearly</changefreq>
    <priority>1</priority>
    </url>
</urlset>

You can check the live sitemap of this website here and the blogs sitemap here. As an application grows, people often prefer multiple sitemaps. It doesn't matter at which route your sitemap is defined; it should be submitted to the search engine for effective crawling.
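When a site splits into multiple sitemaps, the sitemaps.org protocol provides a sitemap index file that groups them so you only submit one URL. A minimal sketch follows; the child sitemap routes are illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.gopal-adhikari.com.np/sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.gopal-adhikari.com.np/blogs/sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```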

Robots.txt: Setting Boundaries for Search Engines

It is a regular text file that defines the rules search engine crawlers (bots) must follow when visiting your website. This simple text file tells the bots which pages they can and cannot access. It should be available at your /robots.txt route. There are some pages, such as a dashboard or user profile, that we don't want search engines to crawl.

Here's what a basic robots.txt looks like:

User-Agent: *
Allow: /

This says crawlers are allowed to crawl all the available web pages in the website.

To disallow search engines from crawling a certain page, you can modify it as:

User-Agent: *
Allow: /
Disallow: /dashboard

This allows search engines to crawl all web pages except /dashboard.
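The robots.txt file can also advertise where your sitemap lives via the Sitemap directive, which helps crawlers discover it without a manual submission. The sketch below assumes the sitemap sits at the site root:

```text
User-Agent: *
Allow: /
Disallow: /dashboard

Sitemap: https://www.gopal-adhikari.com.np/sitemap.xml
```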

Canonical Tags: Avoiding Duplicate Content

A canonical tag is an HTML element that points to the original version of a web page's content, helping search engines understand which content to prioritize.

For example, if you have an article on your blog and another page that summarizes it, the summary page should include a canonical tag pointing to the original blog post. This ensures search engines focus on the more valuable content.

In case of duplicate content, not providing a canonical tag can downrank your web pages, since search engines promote originality; it can even lead to the website being blacklisted. Providing a canonical tag is therefore crucial to protect your web pages from search engine penalties.

Here's what a canonical tag looks like:

<link rel="canonical" href="https://www.gopal-adhikari.com.np/blogs/technical-seo-a-simple-introduction">

The canonical tag should be placed inside your head element.

Broken Links: Fixing Dead Ends

Imagine clicking a link on a website and ending up at a dead end. Broken links frustrate users and can make your website seem outdated. Check for broken links regularly to ensure a better user experience. You can use a broken link checker to find out if there are any dead links.

404 Page: Creating a Positive User Experience for Error Pages

Even the best websites can have broken links. When this happens, it's important to have a user-friendly 404 page. Instead of just displaying an error message, offer helpful options like a search bar, links to popular content, or a sitemap. This increases user engagement and helps visitors find what they're looking for.

301/302 Redirection : Managing URL Changes

Sometimes web pages need to move to a new location. Redirects ensure users and search engines land on the correct page. There are two main types of redirects:

  • 301 Redirects (Permanent): Use a 301 redirect when a page has permanently moved. This is common when a website changes its domain name or URL structure.

  • 302 Redirects (Temporary): Use a 302 redirect for temporary situations, like a "coming soon" page before launching new content.
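The distinction can be sketched as a small redirect table plus a resolver, the way many frameworks model it internally. This is a minimal illustration in plain JavaScript; the paths and the resolveRedirect helper are hypothetical, not part of any specific framework:

```javascript
// Redirect rules as data: 301 marks a permanent move, 302 a temporary one.
const redirects = [
  { from: "/old-blog", to: "/blogs/technical-seo", status: 301 }, // permanent: domain/URL change
  { from: "/new-feature", to: "/coming-soon", status: 302 },      // temporary: "coming soon" page
];

// Look up the rule for a requested path; return null when no redirect applies.
function resolveRedirect(path) {
  const rule = redirects.find((r) => r.from === path);
  return rule ? { status: rule.status, location: rule.to } : null;
}
```

A server would use the returned status and location to set the HTTP response, so search engines see 301 (transfer ranking signals) or 302 (keep signals on the original URL) as appropriate.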

Structured Data : Enhancing Search Visibility with Rich Information

Structured data is a way to provide search engines with additional information about your web pages. This can improve the way your website appears in search results, making it more informative and visually appealing.

This is what simple structured data looks like:

<html>
  <head>
    <title>Gopal Adhikari</title>
    <script type="application/ld+json">
    {
      "@context": "https://schema.org/",
      "@type": "Person",
      "name": "Gopal Adhikari"
    }
    </script>
  </head>
  <body>
    <h2>Gopal Adhikari</h2>
    <p>
      Hi, my name is Gopal Adhikari.
    </p>
  </body>
</html>

This is a minimal example, but you can read more about structured data on schema.org. You can write your schema and validate it with the schema validator provided by schema.org.

Not all schemas supported by schema.org are supported by Google. You can check Google's official structured data documentation, and validate whether your schema works in Google with the Rich Results Test provided by Google.
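Beyond the minimal Person example, an article like this one would typically use the BlogPosting type from schema.org. A sketch follows; the date and headline values are illustrative:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "Technical SEO: A Simple Introduction",
  "author": {
    "@type": "Person",
    "name": "Gopal Adhikari"
  },
  "datePublished": "2024-07-17"
}
</script>
```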

Open Graph Protocols (OGP)

Open Graph protocols, a creation of Facebook, empower websites to include extra metadata. This metadata provides social media platforms like Facebook, Instagram, and Twitter with rich information about your content. Consequently, when your webpage is shared, these platforms can display a more engaging and informative preview, including a title, description, and image.

By implementing Open Graph tags, you control how your content appears when shared. Essential tags include:

  • og:title: The title displayed on social media.

  • og:type: The type of content (website, article, product, etc.).

  • og:url: The canonical URL of the page.

  • og:image: The image to be displayed.

  • og:description: A brief description of the content.

Example

<meta property="og:title" content="Gopal Adhikari - Front End & Back End Expertise" />
<meta property="og:type" content="website" />
<meta property="og:url" content="https://www.gopal-adhikari.com.np/" />
<meta property="og:image" content="https://www.gopal-adhikari.com.np/your-website-image.jpg" />
<meta property="og:description" content="Discover Gopal Adhikari's expertise in MERN stack development. Specializing in frontend and backend web development. Check out my projects and blogs.">

You can read more about Open Graph protocols in their documentation. You can check your social media preview on different platforms here.

Twitter Cards

While similar to Open Graph, Twitter Cards are specifically tailored for Twitter. They offer more control over how your content looks on the platform. Different card types exist, including:

  • Summary: Displays title, description, and image.

  • Summary with Large Image: Similar to Summary, but with a larger image.

Example

<meta name="twitter:title" content="Gopal Adhikari - Front end &amp; Back end Expertise">
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:site" content="@your_twitter_handle">
<meta name="twitter:image" content="https://www.gopal-adhikari.com.np/open-graph/main-og.png">
<meta name="twitter:description" content="Discover Gopal Adhikari's expertise in MERN stack development. Specializing in frontend and backend web development. Check out my projects and blogs.">

You can read more about Twitter Cards in Twitter's official documentation.

Web page speed

A website that loads faster ensures a better user experience and can increase user engagement. Search engines like Google and Bing consider web page speed a ranking factor. Users are more likely to leave a slow-loading website, negatively impacting search rankings and user engagement.

If your application is developed in Next.js, read this article to increase your Next.js speed.
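A few of the optimizations listed earlier can be sketched as HTML; the file names and CDN host below are hypothetical:

```html
<!-- Warm up the connection to a CDN before its resources are requested -->
<link rel="preconnect" href="https://cdn.example.com">
<!-- Minified CSS keeps the critical rendering path small -->
<link rel="stylesheet" href="/styles.min.css">
<!-- defer lets the page render before non-critical JavaScript runs -->
<script src="/app.min.js" defer></script>
<!-- Lazy-load images that sit below the fold -->
<img src="/gallery-photo.webp" alt="Gallery photo" loading="lazy" width="800" height="400">
```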

URL structure

A clean and well-structured URL is crucial for SEO success. Clean, organized URLs are SEO friendly because they are simpler for users to read and remember.
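A quick illustration of the difference; both URLs are hypothetical:

```text
Good: https://www.example.com/blogs/technical-seo
Bad:  https://www.example.com/index.php?id=1842&cat=7&ref=xyz
```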

Mobile friendliness

There is a high chance that the majority of people are surfing the internet on mobile, so making a website mobile friendly is very crucial, as it directly affects user experience and engagement. Search engines also consider mobile friendliness a ranking factor.
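A minimal sketch of a responsive setup: the viewport meta tag plus a media query. The .container class and breakpoint are hypothetical:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .container { max-width: 960px; margin: 0 auto; }
  /* Add side padding and use the full width on small screens */
  @media (max-width: 600px) {
    .container { max-width: 100%; padding: 0 16px; }
  }
</style>
```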

SSL and HTTPS

Using SSL and HTTPS is vital for SEO because Google prioritizes secure websites. An SSL certificate secures data transfer between client and server by encrypting the data in transit. This not only protects user information but also boosts trust and credibility, leading to better search engine rankings.
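After installing a certificate, HTTP traffic is usually 301-redirected to HTTPS at the server level. Assuming an Nginx server (Apache and others have equivalents), a minimal sketch looks like this:

```nginx
server {
    listen 80;
    server_name www.gopal-adhikari.com.np;
    # Permanently redirect every HTTP request to its HTTPS equivalent
    return 301 https://www.gopal-adhikari.com.np$request_uri;
}
```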

SEO Tools

You can use SEO tools like Semrush, Ubersuggest, and Small SEO Tools to analyze your website's performance, identify keyword opportunities, and track your search rankings. These tools provide insights into your competitors' strategies and help optimize your content for better visibility and higher rankings on search engines.

Monitoring Tools

Monitoring tools, such as Google Analytics, allow you to track and analyze website traffic, user behavior, and conversion rates. These insights help you understand what is working and what needs improvement, ensuring your website remains optimized and user-friendly, which is crucial for maintaining good SEO performance.

Summary

The article clarifies that Technical SEO is a subset of On-page SEO, involving critical elements like HTML structure, mobile friendliness, and user experience. It discusses the importance of Technical SEO for better indexing and crawling by search engines, which is essential for higher rankings. The article also provides a comprehensive Technical SEO checklist, including tasks like creating sitemaps and robots.txt files, adding canonical tags, fixing broken links, optimizing page speed, using structured data, implementing Open Graph protocols and Twitter Cards, and more, to ensure optimal website performance and search engine visibility.


Written by

Gopal Adhikari

I am a web developer with interest in mobile app development and cloud.