
Thread: What is the use of robots.txt?

  1. #1

    What is the use of robots.txt?

  2. #2

    Website owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. The line "User-agent: *" means the section applies to all robots, and "Disallow: /" tells the robot not to visit any pages on the site.
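
    For example (the paths here are just placeholders), a minimal robots.txt could look like this:

        User-agent: *
        Disallow: /admin/
        Disallow: /tmp/

        User-agent: Googlebot
        Disallow:

    The first group blocks every robot from /admin/ and /tmp/, while the second leaves Googlebot unrestricted, because an empty Disallow value means nothing is blocked for that bot.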

  3. #3
    Join Date: Jun 2013 | Location: Forum | Posts: 2,096

    Robots.txt is a text file that contains instructions for search engine robots. It lists which pages of a site are allowed or disallowed for search engine crawling.
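
    For instance (illustrative paths only), Allow and Disallow rules can be combined so crawlers skip a folder but still crawl one page inside it:

        User-agent: *
        Disallow: /private/
        Allow: /private/press-release.html

    Allow is an extension honoured by the major search engines, and the most specific matching rule wins, so /private/press-release.html stays crawlable while the rest of /private/ is blocked.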

  4. #4
    Join Date: Jun 2016 | Location: India | Posts: 102

    Quote Originally Posted by RH-Calvin:
    Robots.txt is a text file that contains instructions for search engine robots. It lists which pages of a site are allowed or disallowed for search engine crawling.

    Yes, thanks. I completely agree with you.

  5. #5
    Join Date: May 2016 | Posts: 149

    Robots.txt is used to block website URLs from being crawled by Googlebot and other search engine bots.
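
    As a quick sketch (example.com and the path are hypothetical), Python's standard urllib.robotparser shows how a bot such as Googlebot decides whether a URL is blocked:

        from urllib import robotparser

        # Load the site's robots.txt (example.com is a placeholder domain)
        rp = robotparser.RobotFileParser()
        rp.set_url("https://example.com/robots.txt")
        rp.read()

        # True if Googlebot may crawl this URL, False if robots.txt blocks it
        print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))

    Well-behaved crawlers make this kind of check before fetching a page; robots.txt is only advisory, so it does not stop bots that choose to ignore it.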

  6. #6

    A robots.txt file sits at the root of your site and indicates which parts of the site you don't want accessed by search engine crawlers.
