
Add Custom Robots.txt File in Blogger


In one of my previous posts I discussed Custom Robots Header Tags for Blogger. If you have read that post, I hope you are aware of its importance in search rankings. Today I am covering another very useful blogging term that every blogger must know: Robots.txt. In Blogger it is called Custom Robots.txt, which means you can customize this file according to your needs. In today's tutorial we will discuss this term in depth and learn about its uses and benefits. I will also show you how to add a custom robots.txt file in Blogger, so let's begin the tutorial.

What is Robots.txt?


Robots.txt is a text file that contains a few lines of simple code. It is saved on the website or blog's server and instructs web crawlers how to index and crawl your blog in the search results. That means you can restrict any page on your blog from web crawlers so that it does not get indexed in search engines, such as your blog's label pages, your demo page, or any other pages that are not important enough to be indexed. Always remember that search crawlers read the robots.txt file before crawling any web page.

Each blog hosted on Blogger has a default robots.txt file that looks something like this:

    User-agent: Mediapartners-Google
    Disallow:
    User-agent: *
    Disallow: /search
    Allow: /
    Sitemap: http://yoursite.blogspot.com/feeds/posts/default?orderby=UPDATED

Explanation

This code is divided into three sections. Let's first study each of them, and then we will learn how to add a custom robots.txt file to Blogspot blogs.

   1. User-agent: Mediapartners-Google

This section is for the Google AdSense robots, and it helps them serve better ads on your blog. Whether you use Google AdSense on your blog or not, simply leave it as it is.

   2. User-agent: *

This is for all robots, marked with an asterisk (*). In the default settings, our blog's label links are restricted from being indexed by search crawlers, which means the web crawlers will not index our label page links, because of the line below.

    Disallow: /search

That means any link containing the keyword "search" just after the domain name will be ignored. See the example below, which is the link of a label page named SEO.

    http://www.infotechseo.blogspot.com/search/label/SEO

And if we remove Disallow: /search from the code above, then crawlers will access our entire blog and index and crawl all of its content and pages.

Here, Allow: / refers to the homepage, which means web crawlers can crawl and index our blog's homepage.
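You can verify this behavior yourself. Below is a minimal Python sketch, using the standard library's urllib.robotparser module, that parses the default rules shown above and confirms that a label URL is blocked while the homepage stays crawlable. The blog address is just the demo domain used in this post.

    from urllib import robotparser

    # The default Blogger rules discussed above, as a string.
    rules = """
    User-agent: Mediapartners-Google
    Disallow:

    User-agent: *
    Disallow: /search
    Allow: /
    """

    rp = robotparser.RobotFileParser()
    rp.parse(rules.splitlines())

    # Label pages live under /search, so they are blocked for every
    # bot matched by the * entry.
    print(rp.can_fetch("*", "http://www.infotechseo.blogspot.com/search/label/SEO"))  # False

    # The homepage (and ordinary post URLs) remain crawlable.
    print(rp.can_fetch("*", "http://www.infotechseo.blogspot.com/"))  # True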

Disallow a Particular Post
Now suppose we want to exclude a particular post from indexing. Then we can add the lines below to the code.

    Disallow: /yyyy/mm/post-url.html

Here yyyy and mm refer to the publishing year and month of the post, respectively. For example, if we published a post in March 2013, then we have to use the format below.

    Disallow: /2013/03/post-url.html

To make this task easy, you can simply copy the post URL and remove the blog name from the beginning.

Disallow a Particular Page
If we need to disallow a particular page, then we can use the same method as above. Simply copy the page URL and remove the blog address from it, which will leave something that looks like this:

    Disallow: /p/page-url.html
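In both cases the rule is simply the full post or page URL with the blog address removed. If you want to automate that step, here is a small hypothetical Python helper (the function name and example URLs are my own, not part of Blogger) that derives the Disallow line from any full URL:

    from urllib.parse import urlparse

    def disallow_line(full_url):
        """Hypothetical helper: build a Disallow rule from a full
        Blogger post or page URL by keeping only the path part."""
        return "Disallow: " + urlparse(full_url).path

    print(disallow_line("http://www.infotechseo.blogspot.com/2013/03/post-url.html"))
    # -> Disallow: /2013/03/post-url.html
    print(disallow_line("http://www.infotechseo.blogspot.com/p/page-url.html"))
    # -> Disallow: /p/page-url.html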

   3. Sitemap: http://yoursite.blogspot.com/feeds/posts/default?orderby=UPDATED

This line refers to the sitemap of our blog. By adding the sitemap link here, we are simply optimizing our blog's crawl rate: whenever the web crawlers scan our robots.txt file, they will find a path to our sitemap, where all the links of our published posts are present, so they will find it easy to crawl all of our posts. Hence, there is a better chance that web crawlers will crawl all of our blog posts without ignoring a single one.
Note: This sitemap only tells the web crawlers about the 25 most recent posts. If you want to increase the number of links in your sitemap, then replace the default sitemap with the one below. It will work for the 500 most recent posts.

    Sitemap: http://yoursite.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500

If you have more than 500 published posts on your blog, then you can use two sitemaps, like below:

    Sitemap: http://yoursite.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
    Sitemap: http://yoursite.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=500
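If you prefer not to build these lines by hand, here is a small Python sketch that prints one Sitemap line per block of 500 posts, following the same atom.xml paging pattern as above. The post count is an assumption; substitute your blog's real figure.

    # Print one Sitemap line per block of 500 posts, following the
    # atom.xml paging pattern shown above. total_posts is an assumed
    # value; replace it with your blog's actual number of posts.
    total_posts = 1200
    for start in range(1, total_posts + 1, 500):
        print("Sitemap: http://yoursite.blogspot.com/atom.xml"
              "?redirect=false&start-index=%d&max-results=500" % start)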

Adding Custom Robots.txt to Blogger

Now comes the main part of this tutorial: how to add the custom robots.txt file in Blogger. Below are the steps to add it.

   1. Go to your Blogger blog.
   2. Navigate to Settings >> Search Preferences >> Crawlers and indexing >> Custom robots.txt >> Edit >> Yes.
   3. Now paste your robots.txt file code in the box.
   4. Click on the Save Changes button.
   5. You are done!



How to Check Your Robots.txt File?

You can check this file on your blog by adding /robots.txt to the end of your blog URL in the browser. Take a look at the example below for a demo.

http://www.infotechseo.blogspot.com/robots.txt

Once you visit the robots.txt file URL, you will see the entire code that you are using in your custom robots.txt file.
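If you would rather check it from a script than a browser, here is a minimal Python sketch that fetches and prints a blog's live robots.txt file. It uses the demo address from this post; swap in your own blog's URL.

    import urllib.request

    # Fetch and print a blog's live robots.txt file.
    url = "http://www.infotechseo.blogspot.com/robots.txt"
    with urllib.request.urlopen(url) as response:
        print(response.read().decode("utf-8"))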




Final Words!

This was today's complete tutorial on how to add a custom robots.txt file in Blogger. I have really tried my best to make this tutorial as simple and informative as possible, but if you still have any doubt or query, feel free to ask me. Don't put any code in your custom robots.txt settings without knowing what it does; simply ask me to resolve your queries and I'll explain everything thoroughly. Thanks, guys, for reading this tutorial. If you like it, then please support me and spread the word by sharing this post on your social profiles.
                                    Happy Blogging