
Thread: Why don't my sites get indexed on search engines?

  1. #1

    Why don't my sites get indexed on search engines?

    Hello friend,

    I would like to know why my sites don't get indexed on search engines.

  2. #2


    Hi Friends,
    8 Reasons Why Your Site Might Not Get Indexed

    -Robots.txt - This text file, which sits in the root of your website, gives search engine crawlers a set of guidelines. For instance, if your robots.txt contains the lines User-agent: * and Disallow: /, it's basically telling every crawler on the web to take a hike and not index ANY of your site's content (there's a sketch of a crawler-friendly robots.txt right after this list).
    -.htaccess - This is a hidden file which also resides in your WWW or public_html folder. You can toggle its visibility in most modern text editors and FTP clients. A badly configured .htaccess can do nasty things like create infinite redirect loops, which will never let your site load.
    -Meta tags - Make sure that the page(s) that aren't getting indexed don't have this meta tag in their source code: <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW"> (the sketch after this list shows the crawler-friendly alternative).
    -Sitemaps - Your sitemap isn't updating for some reason, and you keep feeding the old/broken one to Webmaster Tools. After you've addressed the issues flagged in the Webmaster Tools dashboard, always generate a fresh sitemap and re-submit it.
    -URL parameters - Within Webmaster Tools there's a section where you can set URL parameters, which tells Google which dynamic URLs you do not want indexed. However, this comes with a warning from Google: "Incorrectly configuring parameters can result in pages from your site being dropped from our index, so we don't recommend you use this tool unless necessary."
    -You don't have enough PageRank - Matt Cutts revealed in an interview with Eric Enge that the number of pages Google crawls is roughly proportional to your PageRank.
    -Connectivity or DNS issues - It might happen that, for whatever reason, Google's spiders cannot reach your server when they try to crawl. Perhaps your host is doing maintenance on their network, or you've just moved your site to a new home, in which case the DNS delegation can stuff up the crawlers' access.
    -Inherited issues - You might have registered a domain which had a life before you. I had a client who got a new domain (or so they thought) and did everything by the book: wrote good content, nailed the on-page stuff, had a few nice incoming links. But Google refused to index the site, even though it accepted their sitemap. After some investigating, it turned out that the domain had been used several years earlier as part of a big linkspam farm. We had to file a reconsideration request with Google.
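
    To make the robots.txt and meta tag points concrete, here's a rough sketch of a crawler-friendly setup (example.com is just a placeholder for your own domain, and the Sitemap line is optional):

        # robots.txt in the root of your site - lets every crawler index everything
        User-agent: *
        Disallow:
        Sitemap: https://www.example.com/sitemap.xml

    And if a page carries a robots meta tag at all, it should look like this rather than NOINDEX, NOFOLLOW (dropping the tag entirely has the same effect, since index/follow is the default):

        <META NAME="ROBOTS" CONTENT="INDEX, FOLLOW">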

  3. #3
    Join Date: Jun 2013 | Location: Forum | Posts: 2,096


    You can submit an XML sitemap in Webmaster Tools, and you can also share your web pages to various social profiles to get them crawled and indexed faster by search engines.
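
    For reference, a bare-bones sitemap.xml is just a list of URLs in this shape (example.com and the dates are placeholders; many CMSs and online generators will build the file for you):

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>https://www.example.com/</loc>
            <lastmod>2019-07-01</lastmod>
          </url>
          <url>
            <loc>https://www.example.com/about</loc>
            <lastmod>2019-07-01</lastmod>
          </url>
        </urlset>

    Upload it to your site's root and submit its URL in Webmaster Tools.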
    Cheap VPS Hosting | $1 VPS Hosting
    Windows VPS Hosting | Windows with Remote Desktop Access
    Cheap Dedicated Servers | Linux Dedicated Server Hosting

  4. #4
    Join Date: Jul 2019 | Posts: 264


    Sometimes it can take a week or more for a search engine to update its search results, especially when your website is new and doesn't have any inbound links yet. First, create an account on Google Webmaster Tools. Once you register and point Google to your sitemap.xml URL, you can request that your URLs be re-crawled.

  5. #5


    You have lots of duplicate content - Too much duplicate content on a site can confuse search engines and make them give up on indexing it. If multiple URLs on your site return exactly the same content, you have a duplicate content issue.
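
    One common fix (besides removing or rewriting the duplicates) is to pick one preferred URL and point the copies at it with a canonical tag in the <head> of each duplicate page - example.com here is just a placeholder:

        <link rel="canonical" href="https://www.example.com/preferred-page/">

    This tells search engines which version of the content you want indexed.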


