Blogger Custom Robots.txt File – Add It Into Blogger



The robots.txt file is very important for a site because it controls search engine crawler activity. You will find the Blogger custom robots.txt option in the settings section. Blogger makes it easy to add a custom robots.txt file, and that is what I am going to talk about today.
If you are running a website, you cannot afford to ignore any issue that can hurt your site's ranking in search engines. The robots.txt file is a major part of any site, so we should know what robots.txt is and how to use it properly.
Among the many third-party blogging platforms, Blogger (Blogspot) is the most popular. You can set up a professional blog on Blogger very quickly, its CMS is easy to use, and anyone can build and run a website on Blogger at no cost. Blogger is powered by Google, so the quality of this free blogging platform is dependable.
The only thing we need to learn is the proper use of Blogger. Many new bloggers still don't know how to use the robots.txt file in Blogger. As I have said, it is very important to know whether your blog's robots.txt file is working correctly and how you can customize it, so let's see.

What Is Robots.txt File?

Robots.txt is a simple text file that controls the activity of search engine crawlers. Let me make it clearer: robots are programs that crawl live websites and report updates from those sites back to the search engines. Have you ever wondered how Google and other search engines get your website's information and serve it to thousands of users?
The answer is that search engines send robots to crawl your site again and again. The crawler visits your site, collects the updates, and sends the data back to the search engine's database. In this way the search engine learns what information your blog provides and when to serve it for a user's query. The crawler goes by many names: some call it a bot, some a robot, and some a spider, so don't get confused.
So what is the connection between the crawler and the robots.txt file? If your website contains a robots.txt file, a search engine crawler will look for it first when it arrives. After reading your robots.txt file, the crawler knows which pages or URLs it should crawl and index and which it should not.
If some pages or URLs are blocked for crawlers through your robots.txt file, search engine crawlers will not crawl and index those pages. In this way a robots.txt file controls robot activity on your site, and using one helps search engine crawlers crawl and index your site more accurately.
You can also keep confidential pages from being indexed by search engines through this small robots.txt file. As I mentioned, search engines are not the only ones with crawlers; many spammy crawlers roam the web as well. You can discourage these spammy crawlers from visiting your site and save bandwidth by using a robots.txt file on your blog.
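For example, a rule like the following asks an unwanted crawler to stay away from the whole site (the name "BadBot" here is just a placeholder; also note that only well-behaved bots actually obey robots.txt):

```
User-agent: BadBot
Disallow: /
```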

About The Blogger Robots.txt File

Now that we are clear about the importance of the robots.txt file, if you are using Blogger, how do you find the Blogger robots.txt file? Does Blogger really have a robots.txt file? Can we use our own robots.txt file in Blogger?
I know these questions are playing in your mind, and the simple answer to all of them is YES. Blogger provides a robots.txt file for every generated Blogspot blog. You can check your blog's robots.txt file by visiting this URL:
http://yoursite.blogspot.com/robots.txt, or if you are using a custom domain on your Blogger blog, visit http://yourdomainname.com/robots.txt. You will see your default robots.txt file live. It will look like this:
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: http://www.blogornate.com/sitemap.xml
This is the default Blogger robots.txt file we were talking about. Now I will explain how it works and what each of these lines means.

Explanation Of The Blogger Robots.txt File

User-agent: – Looking at the default Blogger robots.txt file, the first line starts with "User-agent". We use "User-agent" in a robots.txt file to target a particular robot. Here it targets the "Mediapartners-Google" bot and sets its access limit through the "Disallow:" line that follows.
User-agent: * – We use "User-agent: *" in a robots.txt file to address all robots at once and set their access limits on the blog. Here Blogger is protecting the "/search" directory from all robots, so URLs under that directory will not be accessible to any compliant robot.
Disallow: – If you want to tell search engine crawlers not to crawl a particular folder, directory, or URL of your blog, use a Disallow: rule. To exclude a folder from being indexed, write it like this: "Disallow: /search". That means crawlers are not permitted to crawl and index that folder.
Allow: – The "Allow: /" line tells crawlers to crawl and index everything on your blog except the disallowed paths.
Sitemap: – The sitemap is another important component of a website; it lists your content so crawlers can discover it, which is very important for SEO. The Blogger robots.txt file contains a sitemap line by default to let search engine robots know about your website's content.
Mediapartners-Google – This is the crawler (user agent) of Google AdSense. By default, Blogger's robots.txt file disallows nothing for this bot, so AdSense can get complete information about your blog.
It will also be able to serve more relevant ads on your blog because the AdSense crawler has full access to it. If you disallow the AdSense bot through this robots.txt file, AdSense will not be able to serve ads on your blog.
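To see these rules in action, here is a small sketch using Python's standard-library robots.txt parser. The paths below are made-up examples, not URLs from a real blog:

```python
# Sketch: checking Blogger's default rules with Python's
# standard-library robots.txt parser.
from urllib import robotparser

rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Ordinary crawlers may fetch posts but not /search URLs.
print(rp.can_fetch("*", "/2024/01/my-post.html"))   # True
print(rp.can_fetch("*", "/search/label/SEO"))       # False
# The AdSense crawler has an empty Disallow, so it may fetch anything.
print(rp.can_fetch("Mediapartners-Google", "/search/label/SEO"))  # True
```

This confirms the explanation above: the "/search" directory is off-limits to generic crawlers, while Mediapartners-Google keeps full access.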

Can I Use Custom Robots.txt File In Blogger?

Obviously, you can use a custom robots.txt file in Blogger; Blogger lets you do this within a few clicks. But keep one important thing in mind: if you don't understand how the robots.txt file works, you should not edit the default one, because a single mistake in your robots.txt file could block robots from your entire site.
So my suggestion is, if there is no strong reason to edit the robots.txt file, leave the default as it is. Blogger provides a standard robots.txt file by default, and it will work fine for you.
Previously, Blogger's default robots.txt file did not contain a sitemap line, and it was important to edit the file to add one, but by now the Blogger robots.txt file is both robot and SEO friendly.

How To Add A Blogger Custom Robots.txt File?

For self-hosted blogs, we usually create a robots.txt file and upload it to the server's root directory. Blogger, however, provides a robots.txt file by default, and we can edit it the way we want. So if you want to set some custom rules for the robots of your Blogger blog:
1. Login to your blogger account.
2. Go to the “Settings” section of your blog.
3. Click on the “Search Preferences” section.
4. Scroll down and find “Custom Robots.txt.”
5. Enable it and add your custom rules for allowing or disallowing parts of your blog right after the "User-agent: *" line.
Now click the "Save Changes" button and you are done.
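As an illustration, here is what a custom robots.txt might look like if you kept Blogger's defaults and additionally blocked one static page; the page path and the sitemap domain are placeholders, not values to copy as-is:

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /p/contact.html
Allow: /

Sitemap: http://yourdomainname.com/sitemap.xml
```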

Disallow Files Or Pages Using The Blogger Robots.txt File

As I have already said, you can disallow pages that you don't want search engines to index; it is entirely up to you. For instance, you can block your privacy policy or contact page through the Blogger custom robots.txt file, since those pages do not need to be indexed in search engines.
You can also disallow pages of your website that contain personal or confidential data. Suppose you don't want the contact page of your site to get indexed. In that case, put a line like Disallow: /p/contact.html in your Blogger robots.txt right after the "User-agent: *" line.
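You can sanity-check such a rule before saving it. This sketch again uses Python's standard-library parser; /p/contact.html and /p/about.html are example paths only:

```python
# Sketch: verifying that "Disallow: /p/contact.html" under
# "User-agent: *" blocks only the contact page.
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /search
Disallow: /p/contact.html
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "/p/contact.html"))  # False: blocked
print(rp.can_fetch("*", "/p/about.html"))    # True: other pages still allowed
```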

How To Check Blogger Custom Robots.txt File?

If you wish to check your Blogger robots.txt file after editing, just append /robots.txt to your domain name, like this: yourdomain.com/robots.txt or yourdomain.blogspot.com/robots.txt
Does everything look right? I hope you can now successfully add a custom robots.txt file to your Blogger blog. If you have any problem completing the task, please let me know in a comment; you can also share your feedback there. If this article was helpful, don't forget to share it.
