Technical SEO

Technical SEO ensures that search engines can crawl and index your website without any problems. Google Search Console and Bing Webmaster Tools are free tools for analysing your website and spotting technical SEO issues. Search engines have technical limitations and may not understand your content, either fully or partly. Technical SEO factors matter, and you need to address the limitations search engines come across by understanding how they work. You may have unique titles or plenty of images and text that tell the search engine what a post is all about, and search engines can recognize JavaScript code running on your website. At times, though, you may block your content from view without realizing that you have not opened up your website to search engines. Connect SEO with other business activities, and improve further with visuals such as videos or engagement with local content and news.

Introduction

It is advantageous to know the SEO fundamentals before fixing technical SEO issues on your website. Maximize your organic traffic potential by taking advantage of the free technical SEO tools available. The mix of related technical SEO challenges covers site architecture & accessibility, on-page SEO content optimization, website speed, a seamless user experience, search engine crawling, mobile-friendliness, indexing, linking, and much more. The main objective of Technical SEO is to enable search engines to crawl and index your pages. Designing an SEO-friendly website architecture that is mobile-friendly with fast load times is the key to implementing Technical SEO, which in turn improves visibility and organic search rankings. To know more about the fundamentals of SEO, click the link SEO Fundamentals.

Non-Technical v/s Technical SEO

SEO can be divided into three sections. First is On-page SEO [optimization to improve the content]. Second is Off-page SEO [links and promotion of content]. The third is Technical SEO, which focuses on how well spiders can crawl your site and index your content. The Screaming Frog SEO Spider crawls website URLs and extracts key elements to break down and audit technical and on-page SEO. Track the results using Google Analytics.

Fig1 – Non-Technical v/s Technical SEO
Related Content
* On-page & Off-page SEO
* SEO Fundamentals
* Screaming Frog
* Links in Google Search Results

Site Indexing for Technical SEO

Website indexing is the process by which search engines download data from websites and store it in their databases. Its purpose is to serve relevant results for the queries users submit. Websites can be indexed with the help of XML sitemaps, robots.txt, meta robots tags, etc.
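
As a simple illustration, a meta robots tag placed in a page's <head> tells crawlers whether to index that page and follow its links (a minimal sketch; the values shown are the common ones):

    <!-- Allow indexing and link following (also the default behaviour) -->
    <meta name="robots" content="index, follow">

    <!-- Keep the page out of the index while still letting crawlers follow its links -->
    <meta name="robots" content="noindex, follow">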

Fig2 – Google Search Console – Left bar menu

Technical SEO – Google Search Console

Go to Coverage under Index (fig 2) in the left bar menu to see how many pages have been crawled by Google. To check the status of any particular URL, enter it in the top menu box. Within the Coverage section, open the report and view a breakdown of errors (if they exist). Mobile Usability covers the mobile issues that you should optimize for your website.

Related Content
* Google Search Console

Technical SEO – Sitemap

Sitemaps point search engines to pages on your website and ensure the crawling of pages. Both XML and HTML sitemaps help search engines crawl your website.

HTML Sitemaps

An HTML sitemap helps users gain a broad understanding of your website. HTML sitemaps are easy to read and understand: they are a simple page with links to the pages on the website, and from this general overview visitors can find the pages that interest them. On a large website, the HTML sitemap content is split into categories so it stays well organized and understandable. It makes for a good user experience.

XML Sitemaps

XML sitemaps are created specifically for search engines and are intended to be read by search engine robots, so they are more relevant to SEO than to human visitors. An XML sitemap records the recent activity of each webpage and provides unique information about every URL, giving search engines access to additional data: when was the page last updated? How often does the page change? How important is the page relative to the other pages on your site? This lets search engines analyse the content more systematically, which is especially useful for new, undiscovered sites.
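
A minimal XML sitemap might look like the sketch below (the URL, date, and values are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://sapcanvas.com/technical-seo/</loc> <!-- the page URL -->
        <lastmod>2021-06-15</lastmod>                   <!-- when it was last updated -->
        <changefreq>monthly</changefreq>                <!-- how often it changes -->
        <priority>0.8</priority>                        <!-- importance relative to other pages -->
      </url>
    </urlset>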

Fig3 – Technical SEO: Screaming Frog Interface – Crawl & Sitemaps

A sitemap can be created by using the Yoast SEO plugin for WordPress websites. Another option is xml-sitemaps.com. We will create an XML sitemap using the Screaming Frog tool (fig 3) and upload it to our website.

Related Content
* Yoast SEO
* Screaming Frog

Screaming Frog

In the Screaming Frog interface, enter a website to crawl, e.g. sapcanvas.com, and click Start. The tool crawls the list of pages within the site, and the progress of the crawl can be seen at the right as it moves through the pages of the website. The results show the address of each page; the content type, i.e. whether it is an HTML file, an image, or code such as JavaScript; status codes; error codes; title tags; and more. After the crawl is completed, you can create an XML sitemap by going to Sitemaps in the top bar menu and choosing to create an XML sitemap. Screaming Frog offers several options that are best left at their defaults, with additional options available if you need something specific.

Fig4 – Technical SEO: Sitemaps – Pages Tab

The Pages tab (fig 4) lets you include page types such as noindex pages, canonicalized pages, paginated pages (i.e. pages in a series such as 1, 2, 3…), or PDFs.

Fig5 – Last Modified Tab

The Last Modified tab (fig 5) tells search engines when each page was last modified. You can go by the server response or set a custom date.

Fig6 – Priority Tab

The Priority tab (fig 6) sets the priority for specific pages within your site.

Fig7 – Change Frequency Tab

The Change Frequency tab (fig 7) indicates how often the pages are updated, such as daily, weekly, or monthly.

Fig8 – Images Tab

Choose whether or not to include images (fig 8).

Fig9 – Hreflang Tab

The Hreflang tab (fig 9) binds multiple language URLs together.

XML File Download

After selecting the information in the various tabs, create the sitemap by clicking ‘Next’. Screaming Frog will then ask where you want to save the file, and once you have selected the location you will be required to name the file. To get an idea of the sitemap file, open it by clicking the link below – ‘XML Site Map’ – and view the PDF. Note the additional information provided at the top for search engines, and view each page’s extra information, such as how often it is updated, the URL of the page, and the priority of the page. Depending on the settings you have chosen, you can see all this information along with when the page was last modified.

Related Content
* XML Site Map

Technical SEO – Schema

Schema, or schema.org, is microdata (structured data) that, when added to HTML, enhances how search engines read and present webpages in SERPs. It is the markup method preferred by search engines such as Google, Bing, etc.
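
For example, structured data is often embedded as JSON-LD inside a script tag in the page's HTML. The sketch below marks up a simple article (the headline, author, and date are placeholders):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Technical SEO",
      "author": { "@type": "Person", "name": "Jane Doe" },
      "datePublished": "2021-06-15"
    }
    </script>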

Fig10 – Google Structured Data Testing Tool

In Google’s Structured Data Testing tool (fig 10), paste your website’s URL or a code snippet. After the tool runs, it checks the structured data code and flags any errors it detects, showing whether the structured data is in the right format or not. Google support recommends checking sites with this tool during the development stage and before deploying your website. To learn more about Schema, click the link Schema SEO.

Technical SEO: Site Accessibility

Crawling tools such as Screaming Frog, SEMrush Site Audit, and DeepCrawl help us understand how search engines work and how they crawl your website. They check crawlability, indexability, HTTPS implementation, internal linking, etc. All crawling tools can combine data sources: additional data can be obtained from Google Search Console or Google Analytics, or you can manually compile URL lists and upload them. For larger websites, opt for a SaaS crawler, e.g. SEMrush Site Audit, DeepCrawl, Botify, Audisto, etc.

The crawl budget is the number of pages Googlebot crawls and indexes on a website within a given period. The factors that affect the crawl budget are the domain’s age and size, its link profile, and the amount of new content. For crawling, all links should be reachable and readable. Crawling efficiency depends on the use of correct HTTP status codes, the absence of duplicate and thin content, and page speed.

Fig11 – Google Search Console – Legacy tools and reports – Crawl stats

The Crawl Stats tab in Google Search Console (fig 11), located under ‘Legacy tools and reports’, provides information on how many pages are being crawled per day.

Fig12 – Crawl Stats

With the ‘Fetch as Google’ tab, keep an eye on daily crawling trends. Use this information (fig 12) as a baseline and follow up on the trend.

User Agent

A robots.txt file states whether certain user agents (crawling software) can or cannot crawl parts of a website. These crawl instructions are expressed by allowing or disallowing the behaviour of the specified user agents.
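
A minimal robots.txt sketch (the paths are placeholders; lines starting with # are comments):

    # Rules for all crawlers
    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    # Point crawlers at the XML sitemap
    Sitemap: https://sapcanvas.com/sitemap.xml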

Javascript SEO

A Technical SEO professional must know and understand JavaScript. JavaScript SEO makes it possible for search engines to crawl, index, and rank JavaScript-driven websites. Google can crawl JavaScript, but avoid massive use of client-side JavaScript for content that needs indexing.

The procedure for indexing JavaScript takes place in two waves and requires about a week. The key to JavaScript SEO is server-side rendering or hybrid rendering. In server-side rendering, the significant content on the website is crawled and indexed in one wave rather than two. The server-side render must include indexable URLs, and internal links must be <a href> tags with anchor text. Other SEO fundamentals still apply, such as title tags, meta descriptions, headings (h1) and subheadings (h2, h3…), image SEO, structured data, canonical tags, the robots meta tag, etc. The exceptions are onclick events and fragment identifiers, which will not be indexed. In hybrid rendering, pre-rendered versions of the pages, important metadata, and content are served to the browser.
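
To illustrate the point about links (a hedged sketch; the URLs are placeholders), Googlebot follows standard <a href> links with anchor text, but not URLs hidden in click handlers or fragment identifiers:

    <!-- Crawlable: a real URL in an href, with descriptive anchor text -->
    <a href="https://sapcanvas.com/technical-seo/">Technical SEO guide</a>

    <!-- Not reliably crawlable: the URL exists only inside an onclick handler -->
    <span onclick="location.href='/technical-seo/'">Technical SEO guide</span>

    <!-- Not indexed as a separate page: a fragment identifier -->
    <a href="#section-2">Jump to section 2</a>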

Canonical tag

A canonical tag informs search engines that a particular URL is the main copy of a page and is the URL that should appear in search results. The canonical tag, when used, puts an end to issues caused by duplicate content visible on multiple URLs. A canonical tag looks like – <link rel="canonical" href="https://sapcanvas.com">

Site Architecture & Linking in Technical SEO

Website architecture is all about the structure and linking of web pages. A good website architecture helps users find what they are looking for and helps search engine crawlers find and index pages. From an SEO point of view, a flat architecture is best, meaning users and crawlers alike can reach any page on the website within 4 clicks. In a flat architecture, Google’s spiders find all the pages on the website, thereby maximizing your crawl budget. A deep architecture, on the other hand, means that certain pages take 4 to 10 or more clicks to reach. It is complicated and bad for both SEO and UX.

Page Depth

All content should be 3 – 4 clicks from the home page. This indicates that the website is optimized for both the users and crawlers. As a rule, the more important a page is, the closer it should be to the home page.

H & alt tags

H tags are an on-page SEO factor that conveys to search engines what the page is all about. Make sure the h1 tag contains the targeted keywords relevant to your content. The h2 tag is a subheading and should carry keywords similar to those in the h1 tag. The next h3 tag becomes a subheading of the h2 tag, and so on. Ensure that the targeted keywords placed in your content stay within the norms specified by keyword density requirements.
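
A hedged sketch of a clean heading hierarchy with descriptive image alt text (the headings, keyword, and file name are placeholders; the indentation only illustrates the nesting):

    <h1>Technical SEO: A Beginner's Guide</h1>  <!-- one h1 carrying the target keyword -->
      <h2>Technical SEO and Site Indexing</h2>  <!-- subheading echoing the keyword -->
        <h3>XML Sitemaps</h3>                   <!-- sub-subheading under its h2 -->
        <h3>Robots.txt</h3>
      <h2>Technical SEO and Site Speed</h2>
    <img src="sitemap-diagram.png" alt="Diagram of an XML sitemap structure"> <!-- alt tag describing the image -->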

Internal links ensure that all documents are reachable, and they group content around what a page is supposed to rank for. Link types can be image links, text links, navigational links, content links, etc. The elements associated with links are the <a href> element, the nofollow attribute, and the anchor text of destination links. An example of navigational links can be seen in breadcrumbs, which show where you are within the domain. Google can display breadcrumbs in search results if you have applied the markup properly using schema.org. The robots.txt file manages crawlers; make sure significant URLs are not blocked and are linked with <a href> rather than JavaScript.
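
As an illustration, breadcrumbs can be marked up with schema.org’s BreadcrumbList type (a sketch; the names and URLs are placeholders):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "BreadcrumbList",
      "itemListElement": [
        { "@type": "ListItem", "position": 1, "name": "Home",
          "item": "https://sapcanvas.com/" },
        { "@type": "ListItem", "position": 2, "name": "SEO",
          "item": "https://sapcanvas.com/seo/" },
        { "@type": "ListItem", "position": 3, "name": "Technical SEO" }
      ]
    }
    </script>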

The goal of using hreflang is to serve the correct content to each user. It is a tag attribute that informs search engines of the association between pages in different languages on your site. A website can be multilingual, and you want search engines to send users to the content in their own language.
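
Hreflang annotations are typically placed in the <head> of every language version of a page (a hedged sketch; the URLs and the German version are placeholders):

    <link rel="alternate" hreflang="en" href="https://sapcanvas.com/technical-seo/">
    <link rel="alternate" hreflang="de" href="https://sapcanvas.com/de/technical-seo/">
    <link rel="alternate" hreflang="x-default" href="https://sapcanvas.com/technical-seo/">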

In SEO, link juice is a term that describes the equity passed when one page or site links to another through hyperlinks. Search engines view such links as votes by other sites that your page is significant and worth promoting. Hyperlinks can point to a complete document or to a particular element within that document. For example, a hyperlink on an image in your post can play a video inside the post when the image is clicked, instead of opening a new tab; this helps users remain focused on your post. As such, hyperlinks can be text, graphics, icons, or objects in a document that are linked to another file. Hreflang tags and nofollow tags do not pass any link juice.

Nofollow links are hyperlinks with a rel="nofollow" attribute, e.g.: <a href="https://sapcanvas.com" rel="nofollow">Technical SEO</a>. The nofollow link is a way to inform search engines not to count the link to another page as a vote in favor of that particular content.

Orphaned pages are created when you create a page and fail to link to it. These orphan pages get no views, and users may not reach the proper content because of incorrect linking. Identify the orphaned pages on your website that are no longer linked from your content. Broken links, on the other hand, occur when pages are deleted, URLs are misspelled, sites are renamed, etc. Test your links regularly and repair broken links to provide a good user experience. When you move content to a new URL, use a redirect to forward users and search engines from the existing URL to the new one. Redirects can be used when you delete a page, change domain names, or merge websites.
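
As a hedged sketch, on an Apache server a permanent redirect can be declared in the .htaccess file (the paths are placeholders):

    # .htaccess: permanently redirect an old URL to its new location
    Redirect 301 /old-page/ https://sapcanvas.com/new-page/

    # Redirect an entire retired section to the home page
    RedirectMatch 301 ^/legacy/.* https://sapcanvas.com/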

Fig13 – Technical SEO – MERKLE SEO Tools

Technical SEO – Tools

You can inspect your website with free and paid tools to measure the different components of Technical SEO that indicate site popularity.

Check whether your website is Mobile Friendly

Use tools such as Responsinator to check your mobile-friendliness, for example whether your website uses a responsive design. Tools such as the Mobile-Friendly Test in Google Search Console, Mobi Ready, etc. can check if your website is mobile-friendly. Optimize your mobile-friendliness by controlling pop-ups, making font size adjustments, and ensuring touch elements are not too close to each other.

Check for Crawling and Indexing issues

Use Screaming Frog, a desktop application, to crawl your website. Analyse your site’s structure, URLs, content, titles, headings, pages, metadata, links, images, etc. Export the crawl data to identify errors or issues that need to be fixed. Issues and fixes may be related to the following –

  • Use Atomseo to find broken links and 404 error pages. Redirect the 404 ‘Not Found’ page to the home page of your website using the pro feature of the AIOSEO WordPress plugin. The Siteliner tool will scan your site to find broken links, duplicate content, redirects, etc., so you can fix these issues. The Ahrefs broken link checker can crawl your website to examine inbound and outbound links. For WordPress sites, use the Broken Link Checker plugin to check, edit, or remove broken links from your dashboard.
  • The Sitechecker tool is used to check duplicate content and missing SEO titles. Another tool is Duplichecker for detecting plagiarism and duplicate content. SEO Review Tools – Duplicate Content Checker is also an option.
  • Follow recommendations of page speed tools to improve slow-loading pages.
  • Fix any incorrect redirects. Use a Redirection WordPress plugin.
  • Image optimization – Resize and compress images to reduce their file size without compromising quality. Large image files can result in slowing down page loading. Read more by clicking on Optimize Image Loading Speed.
  • Minify HTML, CSS, and JavaScript files to reduce file sizes. Delete unnecessary formatting & comments to optimize load times.
  • By setting cache headers you can leverage browser caching. This speeds up your website for returning visitors because static web resources are stored locally in the browser (see the sketch after this list).
  • Use the SISTRIX tool to find out the websites’ visibility index.
  • The URL Inspection tool is a feature available in Google Search Console for checking the index status of a webpage.
  • Test your robots.txt file with Ryte to find out whether crawling of a particular URL is allowed or not.
  • Implement Schema markup for your site. Use the Structured Data Test in the SEO Site Checkup tool to check whether your webpage is using structured data markup. Merkle (fig 13), Search Bloom, etc. are schema markup generator tools. XML sitemaps help search engines find, crawl, and index website content. SEOptimer is a free sitemap generator tool.
  • The SEO Browser tool will display your site as visible to search engines and crawlers. Use it to check your website structure, links, metadata, etc.
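
As a hedged sketch of the caching point above, on an Apache server browser caching can be enabled with mod_expires in the .htaccess file (the durations are placeholders):

    # .htaccess: tell browsers how long to cache static resources
    <IfModule mod_expires.c>
      ExpiresActive On
      ExpiresByType image/png "access plus 1 month"
      ExpiresByType text/css "access plus 1 week"
      ExpiresByType application/javascript "access plus 1 week"
    </IfModule>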

Check the speed of your website

Use tools like GTmetrix, Varvy Pagespeed Optimization, Pingdom, WebPageTest, Google PageSpeed Insights, etc. to test your website speed. Use the BrowserStack tool to check your site’s compatibility and usability on different devices and browsers.

Check your website security

Use Pentest tools to review your website security by scanning for vulnerabilities. Use HTTPS, regular software updates, strong passwords, etc to protect your website from external threats of malware and hacking.

Review your on-page SEO

Use tools such as Yoast SEO and the AIOSEO tool to audit your on-page SEO. Optimize your content for on-page SEO elements such as titles, focus keywords, meta descriptions, headings, and internal & external links. Ensure keywords flow naturally in your content. Check keyword density with the SEOMATOR Keyword Density tool, which calculates the number of keywords and their frequency.

Majestic SEO, SE Ranking, etc. are backlink monitoring tools. SE Ranking can also be used to check H tags, titles, and meta descriptions. The SEO Review Tools suite can also help check your HTML headers.

Analyse your off-page SEO

Use a tool like Majestic to analyze your off-page SEO. Majestic can also be used to check the domain pages and measure the quality of links by checking inbound links and the keywords used. Use the tool’s data to evaluate your backlink profile, identify your competitors, find new link opportunities, and find influencers who attract website links. Google Search Console can be used for website analysis. Tracking & measurements can be done with Google Analytics. Hotjar is another usability testing tool to find how visitors behave on your site. SEO Review Backlink Checker, SEMRUSH Backlink Checker, SERanking Backlinking Analysis tool, Moz Link Explorer, Ubersuggest Backlink Analyzer, and Ahrefs Backlink Checker tools are other options for analyzing backlinks.

Technical SEO: Conclusion

Measure various SEO metrics using Google Analytics. For WordPress websites use plugins such as Yoast for SEO and WP Super Cache for optimizing website speed and performance. Ensure your website is ‘Mobile Friendly’ and provide a great UX across all devices. Check how search engines view your website in Google Search Console.

Research keywords using the free tools Google Autocomplete, Ubersuggest & Keyword Shitter. Develop a buyer persona and evolve content around a set of keywords. Choose target keywords that you can rank for. Write a unique title tag with the target keyword and include the keyword in the URL. Meta descriptions should be written to include target keywords and can increase CTR. Include the target keyword in the filename and the alt text of images. The target keyword should appear in the first 100 words of your post, in the h1 title header, and also in the h2, h3… subheadings.

Make sure to add at least 2 inbound links and 2 outbound links (other blog posts, .edu, and .gov resources) to your article. Content should be a minimum of 300 words. Make use of the social sharing buttons on your website. Follow a comprehensive link-building strategy: use tools such as MOZ SEO, Ahrefs, and Majestic SEO, replicate your competitors’ authoritative links, and try reaching out to those sources. Publish content on high domain authority sites such as YouTube, Slideshare, and Quora, and do guest posting, to drive traffic to your website. For content ideas, use the BuzzSumo tool. Get your Google My Business page if you have a business with a physical location.