Maybe you've heard of the term robots.txt. What is robots.txt? Do you need to configure it? What happens if you leave it alone? You probably have many other questions too, especially if you are a newbie blogger.
To make the meaning of robots.txt easier to understand, I have put together this tutorial.
If you want your blog indexed and your pages crawled quickly, you should add this custom robots.txt file to your Blogger blog. It is also a part of search engine optimization, so you should be familiar with the term.
How Does It Work?
Robots.txt is a set of commands that tells search engine robots which pages of your blog they may explore or browse. You could say robots.txt filters your blog for search engines.
Let's say a robot wants to visit a webpage URL, for example http://www.example.com/about.html. Before it does so, it will check http://www.example.com/robots.txt, and only then will it access the particular webpage.
Every blog already has a robots.txt file provided by Blogger/Blogspot. By default, the robots.txt on a blog looks like this:
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: http://yourblogname.com/atom.xml?redirect=false&start-index=1&max-results=500
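To see how a crawler actually applies these rules, here is a small sketch using Python's standard urllib.robotparser; the example.com URLs are placeholders, not real pages:

```python
from urllib import robotparser

# The default Blogger rules from above, parsed locally instead of fetched.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Ordinary posts are crawlable; /search pages are not.
print(rp.can_fetch("*", "http://example.com/2024/01/my-post.html"))  # True
print(rp.can_fetch("*", "http://example.com/search/label/SEO"))      # False

# The AdSense bot (Mediapartners-Google) has an empty Disallow,
# so it may crawl everything, including /search pages.
print(rp.can_fetch("Mediapartners-Google", "http://example.com/search/label/SEO"))  # True
```

This is the same check real robots perform: fetch the rules once, then test each URL against them before crawling.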
What Is The Meaning Of The Above Code?
User-agent: Mediapartners-Google
This command allows the Google AdSense bot to crawl your blog. If you're not using Google AdSense on your blog, simply remove this line.
Disallow: When left empty like this, Disallow blocks nothing, so the AdSense bot may crawl every page. (A Disallow with a path, as below, prevents search engine bots from crawling those pages.)
User-agent: *
This line applies to all search engine robots.
Disallow: /search
Robots are not allowed to crawl the search folder, i.e. URLs such as /search/label/... and /search?updated-max=... Label pages fall under /search because a label is not a URL that leads to one specific page.
Example:
http://www.shoutersclub.blogspot.com/search/label/SEO
http://www.shoutersclub.blogspot.com/search/SEO
Allow: /
This command allows all pages to be crawled, except those listed under Disallow above. The single slash (/) stands for the root of the blog, i.e. everything.
Sitemap:
Sitemap: http://yourblogname.com/atom.xml?redirect=false&start-index=1&max-results=500
This line tells robots where your blog's sitemap is, so your posts can be discovered and indexed faster.
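The sitemap URL points to Blogger's Atom feed, which lists your post URLs for crawlers. As a rough illustration of what a robot reads from it, here is a sketch that parses a tiny hand-made Atom snippet with Python's standard xml.etree (the entries are made up, not real posts):

```python
import xml.etree.ElementTree as ET

# A miniature Atom feed shaped like Blogger's /atom.xml (entries are made up).
feed = """<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <entry><link rel="alternate" href="http://example.com/2024/01/first-post.html"/></entry>
  <entry><link rel="alternate" href="http://example.com/2024/02/second-post.html"/></entry>
</feed>"""

ns = {"atom": "http://www.w3.org/2005/Atom"}
root = ET.fromstring(feed)

# Collect each post's permalink from its rel="alternate" link.
urls = [link.get("href")
        for link in root.findall("atom:entry/atom:link[@rel='alternate']", ns)]
print(urls)
```

A crawler that finds this feed via the Sitemap line gets every post URL in one request instead of having to discover them by following links.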
How To Prevent Robots From Crawling Certain Pages?
To prevent a particular page from being crawled by Google, just disallow that page using the Disallow command. For example, if I don't want my About Me page indexed in search engines, I simply paste the code Disallow: /p/about-me.html right after Disallow: /search.
Code will look like this :
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Disallow: /p/about-me.html
Allow: /
Sitemap: http://yourblogname.com/atom.xml?redirect=false&start-index=1&max-results=500
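Before saving, you can sanity-check the edited rules with Python's standard urllib.robotparser: the About Me page should now be blocked while other pages stay crawlable. This is a sketch; the hypothetical /p/contact.html stands in for any other page on the blog:

```python
from urllib import robotparser

# The custom rules from above, with the About Me page disallowed.
rules = """\
User-agent: *
Disallow: /search
Disallow: /p/about-me.html
Allow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "http://example.com/p/about-me.html"))  # False
print(rp.can_fetch("*", "http://example.com/p/contact.html"))   # True
```

Testing locally like this is safer than finding out after publishing that a typo blocked the whole blog.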
How To Add A Custom Robots.txt File In Blogger?
Note: Before adding a custom robots.txt file in Blogger, keep one thing in mind: if you use the robots.txt file incorrectly, your entire blog can end up being ignored by search engines.
Step1
Go to your Blogger Dashboard
Step2
Now go to Settings >> Search Preferences >> Crawlers and indexing >> Custom robots.txt >> Edit >> Yes
Step3
Now paste your custom robots.txt file code in the box.
Step4
Now click on Save Changes button.
Step5
That's all, you are done! Now you can see your robots.txt file. To view the robots.txt file, type this in your browser...
http://yourblogname.com/robots.txt
It's okay if you don't add one: your blog will still be crawled by search engine robots because, as I mentioned before, every blog already has a default robots.txt.
Final Words!
So friends! This was a brief tutorial on how to add a custom robots.txt file in Blogger. If you are a newbie blogger and have been missing a custom robots.txt, create one now to make your site's SEO strong.
If you still have any queries regarding this article, please let me know. You can leave your question in the comment box.
Thanks for reading this tutorial, and I'll see you in the next one. Till then, Happy Blogging!