Why we use Robots.txt File?
Robots.txt is a plain text file placed at the root of a website that contains instructions for search engine robots (crawlers). It lists which pages or directories crawlers are allowed or disallowed to visit.
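For example, a minimal robots.txt might look like the following (the paths and sitemap URL are illustrative, not from any real site):

```
# Rules apply to all crawlers
User-agent: *
# Ask crawlers not to visit anything under /admin/
Disallow: /admin/
# Everything else may be crawled
Allow: /

# Optional: tell crawlers where the sitemap lives
Sitemap: https://www.example.com/sitemap.xml
```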
Site owners use robots.txt to ask search engines to skip certain pages, which helps keep those pages out of search results. Note that it is a request, not an enforcement mechanism: well-behaved crawlers honor it, but it does not actually make a page private.
Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called The Robots Exclusion Protocol.
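To see how a crawler interprets these rules, Python's standard-library `urllib.robotparser` can check a URL against a robots.txt file. This is a small sketch; the rules and URLs here are made up for illustration:

```python
from urllib import robotparser

# Parse an illustrative set of robots.txt rules.
# In practice you would call rp.set_url(...) and rp.read()
# to fetch a site's real /robots.txt over HTTP.
rp = robotparser.RobotFileParser()
rp.parse(
    """User-agent: *
Disallow: /private/
Allow: /
""".splitlines()
)

# A well-behaved crawler asks can_fetch() before requesting a page.
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))         # True
```

The same check is what search engine robots perform (in their own implementations) before crawling each URL on a site.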