A robots.txt file helps search engines crawl and index your site properly. It lives in the root of your site and tells search engine crawlers which parts of the site you don't want them to access.
You can easily check the robots.txt file of any website: just type its URL into your browser's address bar, for example: http://www.yoursite.com/robots.txt
You will see some text in your browser like:
User-agent: *
Allow: /
User-agent: * means this section applies to all robots, and Allow: / tells those robots that they may visit every page on the site.
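As a quick sketch, you can test what a policy like this permits with Python's standard urllib.robotparser module; the site URL below is just a placeholder:

```python
from urllib.robotparser import RobotFileParser

# Parse the allow-everything policy shown above.
# parse() accepts the raw lines directly, so no network request is needed.
parser = RobotFileParser()
parser.parse([
    "User-agent: *",
    "Allow: /",
])

# Under this policy, any crawler may fetch any page.
print(parser.can_fetch("Googlebot", "http://www.yoursite.com/any-page.html"))  # True
```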
If you want to block all web crawlers from all content on your site, use:

User-agent: *
Disallow: /

If you remove the / from Disallow: /, search engine crawlers may visit every page on your website:

User-agent: *
Disallow:
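The same parser can confirm the difference between the two variants; this is a small sketch with a placeholder URL:

```python
from urllib.robotparser import RobotFileParser

# "Disallow: /" blocks every path for every crawler.
block_all = RobotFileParser()
block_all.parse(["User-agent: *", "Disallow: /"])
print(block_all.can_fetch("Googlebot", "http://www.yoursite.com/index.html"))  # False

# An empty "Disallow:" blocks nothing at all.
allow_all = RobotFileParser()
allow_all.parse(["User-agent: *", "Disallow:"])
print(allow_all.can_fetch("Googlebot", "http://www.yoursite.com/index.html"))  # True
```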
If you don't want Google to crawl a specific folder on your website, you can hide that folder like this:

User-agent: Googlebot
Disallow: /no-google/

In short, the robots.txt file tells search engine bots which content you want them to crawl for search and which content you don't.
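A sketch of how this Googlebot-only rule behaves (the /no-google/ folder and site URL are the example values from above; "Bingbot" is just an arbitrary other crawler name for contrast):

```python
from urllib.robotparser import RobotFileParser

# Parse the Googlebot-specific rule shown above.
parser = RobotFileParser()
parser.parse([
    "User-agent: Googlebot",
    "Disallow: /no-google/",
])

# Googlebot is blocked from the hidden folder only.
print(parser.can_fetch("Googlebot", "http://www.yoursite.com/no-google/page.html"))  # False
print(parser.can_fetch("Googlebot", "http://www.yoursite.com/other/page.html"))      # True
# Other crawlers are unaffected, since the section names only Googlebot.
print(parser.can_fetch("Bingbot", "http://www.yoursite.com/no-google/page.html"))    # True
```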