Blogger Robot.txt File
A robots.txt file is a plain text file that tells web crawlers (also known as robots or bots) which pages or files on a website they should not access. This is useful for sites with pages or files that are not intended for public viewing, or for pages that are still under construction.
This is a sample robots.txt file for a Blogger website:
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://www.example.com/sitemap.xml
This robots.txt file addresses all web crawlers (the "User-agent: *" line) and tells them they may access every page on the site (the "Allow: /" line) except the "/search" pages (the "Disallow: /search" line). The "Sitemap" line gives the location of the site's sitemap, a file that lists the site's pages and helps crawlers discover new ones.
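You can check how a crawler would interpret these rules with Python's standard urllib.robotparser module. This is a small sketch: the rules string mirrors the sample file above, and example.com stands in for your blog's domain.

```python
from urllib.robotparser import RobotFileParser

# Rules mirroring the sample Blogger robots.txt above
rules = """
User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A crawler honoring these rules may fetch the homepage...
print(parser.can_fetch("*", "https://www.example.com/"))        # True
# ...but not the /search pages
print(parser.can_fetch("*", "https://www.example.com/search"))  # False
```

In practice you would point RobotFileParser at the live file with set_url() and read() instead of parsing a string, but the string form makes it easy to test rules before publishing them.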
Note that robots.txt is a suggestion, not an enforcement mechanism: well-behaved crawlers such as Googlebot follow it, but some crawlers may ignore the instructions and access the disallowed pages anyway.