Hello everyone! Welcome back to TechiFeed. I have written a new informative post for you, so keep reading to find out more! Every search engine crawling bot first interacts with a website's robots.txt file and its crawling rules. That makes the robots.txt file crucial to the search engine ranking of a Blogger website.
Custom robots.txt file in Blogger
The robots.txt file tells the search engine which pages it should and should not crawl. As a result, we can manage how search engine bots operate.
Inside the robots.txt file, we declare User-agent, Allow, Disallow, and Sitemap directives for search engines such as Google, Bing, Yandex, and others. Let us first define all of these concepts.
Typically, we use robots meta tags to let all search engine crawling bots index blog posts and pages across the web.
However, to save the crawl budget and keep search engine bots away from specific portions of your website, you must understand the Blogger blog's robots.txt file.
```
User-agent: *
Disallow: /search?q=
Allow: /

Sitemap: Your-Website-Url/sitemap.xml
```
Examine the Blogger Blog's default Robots.txt file
To make an ideal custom robots.txt file for your Blogger blog, we must first understand the Blogger website's structure and examine the default robots.txt file.
This file looks like this by default:
```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```
- The first line of this file declares the bot type (User-agent). Here it is Mediapartners-Google, the Google AdSense crawler. The empty Disallow rule on the second line means nothing is blocked for it.
- This implies that AdSense ads may be shown across the entire website.
- The next User-agent is *, which means all search engine bots are not permitted to access /search pages. This blocks all search and label pages (they share the same URL structure).
- The Allow: / directive specifies that every page other than those in the Disallow section will be crawlable.
- The last line contains the post sitemap of the Blogger blog.
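If you want to see these default rules in action, here is a minimal sketch using Python's standard `urllib.robotparser` module; the example.com domain and the sample paths are placeholders, not real URLs.

```python
from urllib.robotparser import RobotFileParser

# The default Blogger robots.txt, as shown above.
default_rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(default_rules)

base = "https://www.example.com"  # placeholder domain

# Search and label pages are blocked for ordinary crawlers...
print(parser.can_fetch("*", base + "/search/label/SEO"))            # False
# ...but regular posts remain crawlable...
print(parser.can_fetch("*", base + "/2023/05/my-post.html"))        # True
# ...and the AdSense bot (empty Disallow) may fetch everything.
print(parser.can_fetch("Mediapartners-Google", base + "/search"))   # True
```

Note that `urllib.robotparser` uses simple prefix matching, which is enough for this default file but not for wildcard rules like `/search*`.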
Make an ideal custom robots.txt file for your Blogger Blog.
We have seen what role the Blogger blog's default robots.txt file plays. Now let's make it as SEO-friendly as possible.
- Disallowing /search* prevents bots from crawling all search and label pages.
- To prevent crawling of the archive section, add a Disallow: /20* rule to the robots.txt file (Blogger archive and post URLs both start with the year).
- Because the /20* rule also stops bots from crawling all posts, we must add an Allow: /*.html rule that permits bots to crawl posts and pages again.
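To see why Allow: /*.html wins over Disallow: /20* for a post URL, here is a rough sketch of Google-style rule evaluation (RFC 9309: the longest matching pattern wins, and Allow wins ties). The helper names and sample paths are made up for illustration; real crawlers implement this more completely.

```python
import re

def pattern_to_regex(pattern: str) -> str:
    """Translate a robots.txt path pattern to a regex: '*' matches anything, '$' anchors the end."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    body = ".*".join(re.escape(part) for part in pattern.split("*"))
    return "^" + body + ("$" if anchored else "")

def is_allowed(path: str, rules) -> bool:
    """Sketch of RFC 9309 evaluation: longest matching pattern wins; Allow wins ties."""
    best = None  # (pattern length, is_allow)
    for directive, pattern in rules:
        if pattern and re.match(pattern_to_regex(pattern), path):
            candidate = (len(pattern), directive == "Allow")
            if best is None or candidate > best:
                best = candidate
    return True if best is None else best[1]

rules = [
    ("Disallow", "/search*"),
    ("Disallow", "/20*"),
    ("Allow", "/*.html"),
]

print(is_allowed("/search?q=seo", rules))          # False: /search* matches
print(is_allowed("/2023/05/my-post.html", rules))  # True: /*.html (7 chars) beats /20* (4 chars)
print(is_allowed("/p/contact.html", rules))        # True: only /*.html matches
```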
The default sitemap includes posts, not pages. So you must add a sitemap for pages, located at https://example.blogspot.com/sitemap-pages.xml, or at https://www.example.com/sitemap-pages.xml with a custom domain.
As a result, the new optimized custom robots.txt file for the Blogger blog will look something like this:

```
User-agent: Mediapartners-Google
Disallow:

# The lines below apply to all search engines: they block search and archive pages and allow all blog posts and pages.
User-agent: *
Disallow: /search*
Disallow: /20*
Allow: /*.html

# Sitemaps of the blog
Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap-pages.xml
```
If the custom domain name is www.example.com, the sitemap will be located at https://www.example.com/sitemap.xml.
You may also look at the current robots.txt file at https://www.example.com/robots.txt.
The settings in the file above are optimal robots.txt practice for SEO. They will help the Blogger blog appear in search results while saving the website's crawl budget.
However, if you wish to let bots crawl the entire website, the best setup is to pair the robots.txt file with properly configured robots meta tags.
How to change the robots.txt file on my Blogger blog?
The robots.txt file normally sits at the website's root level. However, Blogger offers no root access, so how can this robots.txt file be edited?
Blogger exposes all root-level file settings, such as the robots.txt and ads.txt files, in its settings area. Log in to your Blogger account to update the robots.txt file.
- Navigate to the Blogger Dashboard and select the Settings option.
- Scroll down to the Crawlers and indexing section and toggle the switch to enable custom robots.txt.
- Click on custom robots.txt to open a box where you can paste the robots.txt file and update it.
After you've updated the Blogger blog's custom robots.txt file, go to https://www.example.com/robots.txt, where www.example.com should be substituted with your domain address.
Frequently Asked Questions
What is a robots.txt file in Blogger?
A robots.txt file is a text file that tells search engine robots which pages or sections of your website should not be crawled or indexed.
How do I create a custom robots.txt file in Blogger?
To create a custom robots.txt file in Blogger, open the Blogger Settings, enable the custom robots.txt option under Crawlers and indexing, and paste your rules into the custom robots.txt box.
What code should I include in my custom robots.txt file for Blogger?
The code for your custom robots.txt file will depend on your specific needs. However, a basic robots.txt file for Blogger might include the following:

```
User-agent: *
Disallow: /search
Disallow: /search?q=*
```
How do I check that the robots.txt file works correctly?
You can check if your custom robots.txt file is working correctly by visiting the URL of your robots.txt file (http://www.yourblog.com/robots.txt) and checking that it displays the correct code.
Can I use a robots.txt file to block specific pages on my Blogger site?
You can use a robots.txt file to block specific pages on your Blogger site by adding the appropriate code to the file. For example, you can use the "Disallow" directive to tell robots not to crawl or index certain pages.
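For example, a hypothetical rule set blocking one specific page and one specific post (both paths are made up for illustration) could look like this:

```
User-agent: *
Disallow: /p/private-page.html
Disallow: /2023/01/draft-post.html
```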
Can I use a robots.txt file to block specific search engines from crawling my Blogger site?
You can use a robots.txt file to block specific search engines from crawling your Blogger site by adding the appropriate code to the file. For example, you can use the "User-agent" directive to specify which robots should be blocked.
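For instance, assuming you wanted to keep Bing's crawler (Bingbot) out while leaving every other bot unaffected, a sketch could look like this. Note that blocking a search engine's bot will eventually remove your pages from its results:

```
User-agent: Bingbot
Disallow: /

User-agent: *
Allow: /
```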
I hope you liked the above informative content. Please share this post and follow our blog for more great content.
If you have any questions or problems, feel free to ask in the comments section.
© Copyright: TechiFeed