Tuesday, 29 November 2016

Block your website from search engine indexing

Hi All,

This is a common question from web administrators.
The answer is to place a robots.txt file in the root folder of your website/webserver.

robots.txt


Website admins use the /robots.txt file to give instructions about their site to web robots; this is known as the Robots Exclusion Protocol.

It works like this: a robot wants to visit a website URL, say http://www.abc.com/welcome.html. Before it does so, it first checks for http://www.abc.com/robots.txt, and finds:

User-agent: *
Disallow: /
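
The rules can also be narrowed instead of blocking everything. A few common variations (the folder name /private/ below is just a hypothetical example):

User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow: /

The first block hides only one folder from all robots; the second blocks a single named crawler (here Googlebot, Google's crawler) from the whole site while leaving other robots unaffected.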

When a crawler reads these two lines, it stops crawling the site entirely: the asterisk matches every robot, and "Disallow: /" excludes every URL, so your pages drop out of the search index over time.
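You can verify how a well-behaved robot interprets your rules with Python's standard library. This is a small sketch using urllib.robotparser; the rules and the example URL match the ones above:

```python
from urllib.robotparser import RobotFileParser

# Feed the parser the same rules shown above.
# (parse() accepts the lines of a robots.txt file.)
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# A well-behaved crawler asks before fetching any URL:
print(rp.can_fetch("*", "http://www.abc.com/welcome.html"))  # False
print(rp.can_fetch("Googlebot", "http://www.abc.com/"))      # False
```

Both checks return False because "Disallow: /" under "User-agent: *" excludes every URL for every robot.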
