
Thread: Why we use Robots.txt File?

  1. #1

  2. #2
    Join Date
    Jun 2013
    Location
    Forum
    Posts
    2,096

    Default

    Robots.txt is a text file that contains instructions for search engine robots. It lists which pages or directories of a site are allowed or disallowed for crawling, as in the sketch below.
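    For example, a minimal robots.txt placed at the site root might look like this (the paths here are only placeholders):

        User-agent: *
        Disallow: /admin/
        Disallow: /tmp/
        Allow: /

    The asterisk means the rules apply to all crawlers, and each Disallow line names a path that well-behaved robots should skip.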

  3. #3

    Default

    Robots.txt is a file that tells search engines which pages they may or may not crawl, which helps keep certain pages private.

  4. #4
    Join Date
    Jan 2016
    Location
    Van Nuys
    Posts
    79

    Default Why we use Robots.txt File?

    Website owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol.
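    To see the protocol in action, here is a small sketch using Python's standard urllib.robotparser module; the example.com URL and paths are just placeholders:

        from urllib import robotparser

        # Fetch and parse the site's /robots.txt (hypothetical URL).
        parser = robotparser.RobotFileParser()
        parser.set_url("https://www.example.com/robots.txt")
        parser.read()

        # Ask whether any crawler ("*") may fetch a given page.
        print(parser.can_fetch("*", "https://www.example.com/private/page.html"))

    Compliant crawlers perform an equivalent check before requesting a page; note that the protocol is advisory, so it does not block robots that choose to ignore it.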

