Robots.txt plays an important role in how search engines index your website. It tells search engines how they may crawl your site and disallows search robots from crawling the pages the author has blocked. Added to your blog in the proper way, it will work well for your content; used in the wrong way, it can ruin your blog and its ranking. Many bloggers use this option these days. Its function is to give commands to the search engines for your blog: it tells the search bots which parts of the blog the author allows for indexing and which parts are disallowed. Whenever a bot crawls your blog, it first checks the Robots.txt file and then follows its instructions.
Today's post is all about the Robots.txt file. We will learn how to add this file to a Blogger blog and how to command the search engine robots, with help from the Yando SEO professionals.
How To Add Robots.txt To A Blogger Blog?
- Sign in to your Blogger blog.
- Go to the Settings tab and then to Search Preferences.
- Look for the Custom robots.txt option and enable it by clicking Yes.
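A typical custom robots.txt for a Blogger blog looks something like the sketch below. This is a common example, not a required file; the blogspot address and sitemap URL are placeholders you should replace with your own:

```
# Allow the AdSense crawler everywhere
User-agent: Mediapartners-Google
Disallow:

# For all other bots: block label/search pages, allow everything else
User-agent: *
Disallow: /search
Allow: /

# Placeholder address; substitute your own blog URL
Sitemap: http://example.blogspot.com/sitemap.xml
```

The `Disallow: /search` line keeps the duplicate label and search-result pages out of the index while your actual posts remain crawlable.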
Paste the file into the Custom robots.txt box, replace my blog address with your own blog address, and save the Robots.txt file.
You have now finished adding a Robots.txt file to your Blogger blog. Search engines will look at this file first and then start indexing your website. I hope this works well for your blog; I am using the same setup on my own blog.
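If you want to check how crawlers will interpret your rules before saving them, Python's standard `urllib.robotparser` module can parse a robots.txt and answer allow/deny questions. This is a minimal sketch; the rules and the `example.blogspot.com` address are placeholders standing in for your own file and blog:

```python
from urllib import robotparser

# Placeholder rules mirroring the sample file above
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Ordinary posts are crawlable; /search label pages are blocked
print(rp.can_fetch("Googlebot", "http://example.blogspot.com/2024/01/post.html"))
print(rp.can_fetch("Googlebot", "http://example.blogspot.com/search/label/seo"))
```

Running this prints `True` for the post URL and `False` for the `/search` URL, confirming the file blocks only the label pages.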