Monday, 8 July 2013

How to enable a custom robots.txt file in Blogger

A custom robots.txt file is one of the best ways to instruct search engines not to crawl certain pages of your blog.

What is robots.txt?

robots.txt is a plain text file containing a few simple lines of code. With it you can restrict any particular page on your blog that you don't want search engines to crawl and index.
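For example, a minimal robots.txt looks like the sketch below (the /private/ directory is a hypothetical placeholder):

      # Apply these rules to every crawler
      User-agent: *
      # Block a hypothetical directory from being crawled
      Disallow: /private/
      # Leave everything else crawlable
      Allow: /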

How to disallow a particular post

If you want to exclude a particular post of your blog from search results, add this line to the code:

Disallow: /yyyy/mm/post-url.html

Replace yyyy with the year the post was published, mm with the month, and post-url.html with the file name from the post's URL.
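For instance, for a hypothetical post published in July 2013 at /2013/07/my-sample-post.html, the line would be:

Disallow: /2013/07/my-sample-post.html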

How to disallow a particular page

If you want to stop search engines from crawling a particular static page, copy the page's URL, remove the domain so that only the path remains, and add it like this:

Disallow: /p/page-url.html
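For example, for a hypothetical page at http://www.techppp.com/p/contact.html, the line would be:

Disallow: /p/contact.html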

How to add a custom robots.txt file to Blogger

1) Go to your Blogger blog.

2) Navigate to Settings -> Search preferences -> Crawlers and indexing -> Custom robots.txt -> Edit -> Yes.

3) Now paste the following robots.txt code into the box:

      # Blogger sitemap generated on 12.11.2013
      User-agent: *
      Disallow: /search
      Allow: /
      Sitemap: http://www.techppp.com/atom.xml?redirect=false&start-index=1&max-results=500

   In the code above, replace www.techppp.com with your own blog's address before putting it in the box.
4) Then click the Save changes button.

5) You are done.
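Before relying on these rules, you can check that they behave as intended. Below is a minimal sketch using Python's built-in urllib.robotparser module; the blog URL and post path are hypothetical placeholders:

      import urllib.robotparser

      # The same rules pasted into Blogger above (the URL is a placeholder).
      rules = """
      User-agent: *
      Disallow: /search
      Allow: /
      """

      rp = urllib.robotparser.RobotFileParser()
      rp.parse(rules.splitlines())

      # Search result pages match "Disallow: /search", so they are blocked.
      print(rp.can_fetch("*", "http://www.techppp.com/search/label/news"))       # False
      # Ordinary posts match "Allow: /", so they stay crawlable.
      print(rp.can_fetch("*", "http://www.techppp.com/2013/07/some-post.html"))  # True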

So, these are the steps by which you can enable a custom robots.txt file in Blogger.
