How to Add a Custom robots.txt File in Blogger (Boost SEO)

Hello everyone! Welcome back to TechiFeed. I have written a new informative post for you, so keep reading to find out more!

Every search engine crawler first checks a website's robots.txt file for its crawling rules. That makes the robots.txt file crucial to the search engine ranking of a Blogger website.

This tutorial will show you how to create a flawless custom robots.txt file in Blogger and optimize your blog for search engines.

Custom robots.txt file in Blogger


The robots.txt file tells the search engine which pages it should and should not crawl. As a result, we can manage how search engine bots operate.

Inside the robots.txt file, we specify User-agent, Allow, Disallow, and Sitemap directives for search engines such as Google, Bing, Yandex, and others. Let us first define all of these directives.
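
As a quick reference, here is a generic annotated sketch of these directives (example.com is a placeholder, and every line starting with # is just a comment):

# User-agent names the crawler the rules below apply to (* matches all bots)
User-agent: *
# Disallow lists paths the crawler must not visit
Disallow: /search
# Allow lists paths the crawler may visit, overriding broader Disallow rules
Allow: /
# Sitemap gives the full URL of the blog's sitemap
Sitemap: https://www.example.com/sitemap.xml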

Typically, we rely on robots meta tags to let all search engine crawlers index blog posts and pages across the web.

However, to save crawl budget and keep search engine bots away from specific parts of your website, you need to understand the Blogger blog's robots.txt file. For example, a minimal file that blocks search result pages looks like this:
User-agent: *
Disallow: /search?q=
Allow: /
Sitemap: Your-Website-Url/sitemap.xml

Examine the Blogger Blog's Default robots.txt File


To make an ideal custom robots.txt file for your Blogger blog, we must first understand the Blogger website structure and examine the default robots.txt file.

This file looks like this by default:
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.example.com/sitemap.xml
  • The first line declares the bot type (User-agent); here it is Mediapartners-Google, the Google AdSense crawler. The empty Disallow rule on the second line means nothing is blocked for it.
  • This implies that AdSense advertisements may show across the entire website.
  • The next User-agent is *, which applies to all search engine bots; they are not permitted to access /search pages. Because label pages share the same URL structure (/search/label/...), all search and label pages are blocked.
  • The Allow rule specifies that everything outside the disallowed section remains crawlable.
  • The last line provides the post sitemap of the Blogger blog.

Make an Ideal Custom robots.txt File for Your Blogger Blog


We have seen how the Blogger blog's default robots.txt file plays its role. Now let's make it as SEO-friendly as possible.

The default robots.txt file permits archive pages to be indexed, which causes duplicate content problems. We can avoid duplicate content by preventing bots from crawling the archive section. Therefore:

  • Use Disallow: /search* to block crawling of all search and label pages.
  • To prevent crawling of the archive section, add a Disallow: /20* rule, since Blogger archive URLs begin with the year (for example, /2023/01/).
  • Because post permalinks also begin with the year (such as /2023/01/post-title.html), the /20* rule would block every post as well; add an Allow: /*.html rule so bots can still crawl posts and pages.

Posts, not pages, are included in the default sitemap. So you must also add the pages sitemap, which is located at https://example.blogspot.com/sitemap-pages.xml, or at https://www.example.com/sitemap-pages.xml with a custom domain.

As a result, the new, flawless custom robots.txt file for the Blogger blog will look like this:
User-agent: Mediapartners-Google
Disallow:

# The lines below apply to all search engines: they block search, label,
# and archive pages, and allow all blog posts and pages.
User-agent: *
Disallow: /search*
Disallow: /20*
Allow: /*.html

# Sitemaps of the blog
Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap-pages.xml
Replace www.example.com with your own Blogger domain or custom domain name.

If the custom domain name is www.example.com, the sitemap will be located at https://www.example.com/sitemap.xml.

You can also view your current robots.txt file at https://www.example.com/robots.txt.

The configuration in the file above is optimal robots.txt practice for SEO. It will help the Blogger blog appear in search results while saving the website's crawl budget.

To appear in search results, you must provide SEO-friendly content.

However, if you wish to let bots crawl the entire website, the optimal setup pairs the robots.txt file with enhanced robots meta tag settings (Blogger's custom robots header tags).

This combination is one of the most effective ways to improve the SEO of a Blogger blog.

How to change the robots.txt file on my Blogger blog?


The robots.txt file is located at the website's root level. However, Blogger provides no root access, so how can this file be edited?

Blogger exposes all root-level file settings, such as the robots.txt and ads.txt files, in its settings area. Log in to your Blogger account and update the robots.txt file as follows:

  1. Navigate to the Blogger Dashboard and select the Settings option.
  2. Scroll down to the Crawlers and indexing section and toggle on Enable custom robots.txt.
  3. Click on Custom robots.txt to open a box where you can paste your robots.txt content and save it.

After you've updated the Blogger blog's custom robots.txt file, go to https://www.example.com/robots.txt, where www.example.com should be substituted with your domain address.

Frequently Asked Questions


What is a robots.txt file in Blogger?

A robots.txt file is a text file that tells search engine robots which pages or sections of your website should not be crawled or indexed.

How do I create a custom robots.txt file in Blogger?

To create a custom robots.txt file in Blogger, open Settings, scroll to the Crawlers and indexing section, enable Custom robots.txt, and paste your robots.txt rules into the box that appears.

What code should I include in my custom robots.txt file for Blogger?

The code for your custom robots.txt file will depend on your specific needs. However, a basic robots.txt file for Blogger might include the following:

User-agent: *
Disallow: /search
Disallow: /search?q=*

How do I check that my robots.txt file works correctly?

You can check whether your custom robots.txt file is working correctly by visiting its URL (https://www.yourblog.com/robots.txt) and confirming that it displays the correct rules.

Can I use a robots.txt file to block specific pages on my Blogger site?

You can block specific pages on your Blogger site by adding the appropriate rules to the file. For example, use the Disallow directive to tell robots not to crawl certain pages, as shown below.
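
A minimal sketch, assuming a hypothetical static page at /p/private-page.html and a hypothetical post at /2023/01/draft-post.html (replace these with the real URLs you want to block):

# Applies to all crawlers
User-agent: *
# Hypothetical static page to block crawlers from
Disallow: /p/private-page.html
# Hypothetical post to block crawlers from
Disallow: /2023/01/draft-post.html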

Can I use a robots.txt file to block specific search engines from crawling my Blogger site?

You can block specific search engines from crawling your Blogger site by adding the appropriate rules to the file. For example, use the User-agent directive to specify which robots should be blocked, as in the sketch below.
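
Here is a sketch that blocks one specific crawler from the whole site while leaving all other bots under the usual rules (Bingbot is the user-agent token of Bing's crawler; substitute the bot you actually want to block):

# Block Bing's crawler from the entire site
User-agent: Bingbot
Disallow: /

# All other bots keep the regular rules
User-agent: *
Disallow: /search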

Stay Connected

I hope you found this post informative. Please share it and follow our blog for more great content.

If you have any questions or problems, feel free to ask in the comments section.

© Copyright: TechiFeed

About the Author

Hi, I am a good student and a part-time blogger.
