How to create a custom robots.txt file for Blogger blogs


The robots.txt file, also known as the Robots Exclusion Protocol, is a file that gives search engines instructions on how to crawl and index your site. Many bloggers and site owners are unaware of its importance.

The robots.txt file is most often used to prevent search engines from indexing pages that are not useful to visitors, such as the login page or sections you do not want indexed.

Its main function is to keep crawlers away from sensitive files on your blog or website and to help you avoid duplicate content appearing in search engine results.

How do I create a robots.txt file?

The robots.txt file is public and anyone can view it, because it always sits at the root of the site, at the path /robots.txt (for example, yourblog.blogspot.com/robots.txt).

How to manually create a robots.txt file: we will not dwell long on definitions here because, as mentioned, this mainly matters to bloggers whose Blogger blogs have indexing problems and whose pages are not being indexed consistently.

How to add a robots.txt file to a Blogger blog correctly

First: Go to the Settings page of your Blogger blog, then to Search preferences, and choose to enable the custom robots.txt file.

Second: Add the following code in the space provided for it:

User-agent: Mediapartners-Google

User-agent: *
Disallow: /search
Disallow: /search?updated-min=
Disallow: /search?updated-max=

# Replace queen-news with your own blog's domain
Sitemap: https://www.queen-news.com/sitemap.xml
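Before relying on these rules, you can sanity-check how crawlers will interpret them with Python's standard urllib.robotparser module; the blog address below is a placeholder for your own:

```python
from urllib.robotparser import RobotFileParser

# The same rules as in the code above
rules = """\
User-agent: Mediapartners-Google

User-agent: *
Disallow: /search
Disallow: /search?updated-min=
Disallow: /search?updated-max=
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Search and label result pages are blocked for ordinary crawlers...
print(parser.can_fetch("*", "https://yourblog.blogspot.com/search/label/news"))  # False
# ...while normal post pages remain crawlable.
print(parser.can_fetch("*", "https://yourblog.blogspot.com/2023/01/post.html"))  # True
```

This is a quick local check only; search engines apply the live file at your blog's /robots.txt path.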


Don't forget to replace the queen-news domain in the code above with your own blog's address.

How do you know which pages have been indexed by Google?

Go to the Google search engine and search using the site: operator followed by your blog's address, for example:

site:yourblog.blogspot.com

Of course, replace this example address with your own blog's link, and the link must be written without http://.

Note: Some people filter Google results by the last hour or the last 24 hours to check whether their latest posts have been indexed, and do not find them there. The pages are sometimes already indexed but simply do not appear when filtering by the last hour or the last 24 hours.

You can verify whether a specific page has been indexed by adding part of the post's title after the site: query.

For example:

site:yourblog.blogspot.com Robots
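If you want to build such a check into a script, the search URL can be assembled in Python; the blog address and title fragment below are placeholders for your own:

```python
from urllib.parse import quote_plus

blog = "yourblog.blogspot.com"  # placeholder: your blog's address, without http://
title_part = "Robots"           # placeholder: part of the post title to check

# Build the same site: query you would type into Google by hand
query = f"site:{blog} {title_part}"
search_url = "https://www.google.com/search?q=" + quote_plus(query)
print(search_url)
# https://www.google.com/search?q=site%3Ayourblog.blogspot.com+Robots
```

Opening that URL in a browser shows whether the post appears among the indexed results.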

Note: Visitor engagement and frequent visits to your Blogger posts speed up their indexing, so always make sure your topics are useful to visitors and not copied from elsewhere; the quality and exclusivity of your blog's content is always the main driver for getting your pages indexed.
