Googlebot stopped crawling
-
Hi all, one of my websites stopped showing in the SERPs. After analysing it in Webmaster Tools, I found that Googlebot is not able to crawl the site, although everything was working fine a few days ago. I tried to investigate whether the site had been penalised, but found no notification. I checked robots.txt for blocking rules (Disallow, etc.) and all seems to be OK. I resubmitted the sitemap in Webmaster Tools; Google crawled 250 pages out of 500, but the site is still not showing in Google's SERPs. In Bing it is fine.
Please suggest the best possible solutions to try.
Thanks
-
This might be a shot in the dark, not knowing much about your site, but can you check in Google Webmaster Tools to see if you accidentally removed your website using the Remove URLs tool? I know of someone who accidentally did this when copying and pasting a URL: they copied only their main website address instead of the full URL (oops!), and their site dropped out of Google's SERPs rather quickly. Just a thought...
-
Very hard to say without more details. Does your site have unique, high-quality content? If it's just duplicate content, Google may crawl it but won't necessarily show it in the SERPs.
Also, what does your backlink profile look like? Google allocates crawl budget partly based on your site's authority (PageRank), so if Google isn't crawling all your pages, you may want to acquire more external backlinks.
-
There may be many technical things going on with your robots.txt file, noindex tags, etc.
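To rule those two out quickly, here is a minimal sketch using only Python's standard library; the example.com URLs are placeholders for your own site:

```python
import urllib.robotparser
import urllib.request

site = "https://www.example.com"  # placeholder: replace with your domain
page = site + "/"

# 1) Is Googlebot allowed to fetch the page according to robots.txt?
parser = urllib.robotparser.RobotFileParser(site + "/robots.txt")
parser.read()
print("Googlebot allowed by robots.txt:", parser.can_fetch("Googlebot", page))

# 2) Does the page carry a noindex directive in a header or meta tag?
request = urllib.request.Request(page, headers={"User-Agent": "Mozilla/5.0"})
with urllib.request.urlopen(request) as response:
    print("X-Robots-Tag header:", response.headers.get("X-Robots-Tag"))
    html = response.read().decode("utf-8", errors="replace").lower()
    print("'noindex' found in HTML:", "noindex" in html)
```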
But I would start with your website hosting company.
My guess, not having seen your site, is that you may be hosted with a low-cost hosting provider and are experiencing downtime at random times that affects Google's ability to crawl your site.
The other clue that points me to your web hosting service is that Google tried to crawl 500 pages, but your server was only able to handle 250.
What I would do first is check whether your site is timing out under heavy load (lots of visitors).
That's most likely the culprit.
I'd subscribe to a free site-monitoring service that pings the site every 5 minutes or so and emails you if it goes down.
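In essence, such a service just does something like the following toy sketch I've written for illustration; the URL is a placeholder, and a real service would alert you by email or SMS rather than printing to a console:

```python
import time
import urllib.request

URL = "https://www.example.com"   # placeholder: your site's homepage
CHECK_INTERVAL_SECONDS = 300      # ping every 5 minutes

while True:
    try:
        with urllib.request.urlopen(URL, timeout=15) as response:
            if response.status != 200:
                print(f"ALERT: {URL} returned HTTP {response.status}")
    except Exception as error:  # timeouts, DNS failures, HTTP errors
        print(f"ALERT: {URL} unreachable: {error}")
    time.sleep(CHECK_INTERVAL_SECONDS)
```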
Hope this helps!
-- Jeff
Related Questions
-
How can I stop serious traffic loss on my website?
I need help resolving technical SEO issues on my website CamRojud. I have tried all SEO tactics but no improvement yet. Can someone in the forum guide me through, please?
White Hat / Black Hat SEO | Dawodus
-
Crawl anomaly issue on Search Console
Has anyone checked the crawl anomaly issue under the Index section on Search Console? We recently moved to a new site and I'm seeing a huge list of excluded URLs classified as crawl anomalies (they all lead to a 404 page). Does anyone know if we need to 301 redirect all the links? Is there a smarter, more efficient way to deal with them, like setting up canonical links (I thought that's what they're used for, isn't it?). Thanks!
White Hat / Black Hat SEO | greenshinenewenergy
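A note on triage here: canonical tags won't help, because Google ignores annotations on pages that return a 404; only URLs with a close equivalent on the new site need a 301. A minimal sketch for sorting an exported list (the URLs are hypothetical placeholders):

```python
import requests

# Paste the "crawl anomaly" URLs exported from Search Console here.
excluded_urls = [
    "https://www.example.com/old-page-1",   # hypothetical
    "https://www.example.com/old-page-2",   # hypothetical
]

for url in excluded_urls:
    # allow_redirects=False so we see the first status Googlebot sees.
    response = requests.head(url, allow_redirects=False, timeout=10)
    if response.status_code == 404:
        print(f"{url} -> 404: 301 it to the closest new-site equivalent")
    elif response.status_code in (301, 302):
        print(f"{url} -> already redirects to {response.headers.get('Location')}")
    else:
        print(f"{url} -> {response.status_code}: probably no action needed")
```
-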
Googlebot crawling AJAX website does not always use _escaped_fragment_
Hi, I started to investigate the Googlebot crawl log of our website, and it appears that there is no 1:1 correlation between URLs crawled with _escaped_fragment_ and without it. My expectation is that each time Google crawls a URL, a minute or so later it is supposed to crawl the same URL using an _escaped_fragment_. For example, for https://my_web_site/some_slug, Googlebot crawled the URL 17 times in July (http://i.imgur.com/sA141O0.jpg) but made only 3 additional crawls using the _escaped_fragment_ (http://i.imgur.com/sOQjyPU.jpg). Do you have any idea if this behavior is normal? Thanks, Yohay
White Hat / Black Hat SEO | yohayg
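For anyone wanting to reproduce this kind of check, here is a rough sketch that tallies Googlebot hits per URL with and without the _escaped_fragment_ parameter, assuming a combined-format access log (the log path is a hypothetical placeholder):

```python
from collections import Counter

plain, escaped = Counter(), Counter()

with open("access.log") as log:   # hypothetical log path
    for line in log:
        if "Googlebot" not in line:
            continue
        try:
            # Request line is the first quoted field: "GET /path HTTP/1.1"
            path = line.split('"')[1].split()[1]
        except IndexError:
            continue
        if "_escaped_fragment_" in path:
            escaped[path.split("?")[0]] += 1
        else:
            plain[path] += 1

for url, count in plain.most_common(10):
    print(f"{url}: {count} plain crawls vs {escaped[url]} _escaped_fragment_ crawls")
```
-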
Google suddenly stops ranking a page for a "keyword" with the same "keyword" in the title tag. Low competition.
Hi all, We have released the next version of our product, called something like "software 11", which gets thousands of searches every month. So we added the keyword "software 11" as a title suffix to one of our top-ranking pages. This page suddenly gained "software 11" in the page title, in multiple header tags, and in one mention in a paragraph. Google ranked it for 2 days and then suddenly stopped showing this page anywhere in the results for the keyword we optimised it for. Why did this happen? Does Google think we are overdoing it with this page and is therefore ignoring it? Thanks
White Hat / Black Hat SEO | vtmoz
-
Excluding Googlebot From AB Test - Acceptable Sample Size To Negate Cloaking Risk?
My company uses a proprietary AB testing platform. We are testing out an entirely new experience on our product pages, but it is not optimized for SEO. The testing framework will not show the challenger recipe to search bots. With that being said, to avoid any risks of cloaking, what is an acceptable sample size (or percentage) of traffic to funnel into this test?
White Hat / Black Hat SEO | edmundsseo
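For context on what "not showing the challenger to search bots" typically looks like, here is a hedged sketch (not the poster's actual framework); the bot list and the 10% traffic share are illustrative assumptions, not recommendations:

```python
import hashlib

BOT_SIGNATURES = ("Googlebot", "bingbot", "DuckDuckBot", "YandexBot")
CHALLENGER_TRAFFIC_SHARE = 0.10   # hypothetical cap on test traffic

def assign_recipe(user_agent: str, visitor_id: str) -> str:
    # Bots always get the control, so crawl and index see a single version.
    if any(bot in user_agent for bot in BOT_SIGNATURES):
        return "control"
    # A deterministic hash keeps each visitor in the same bucket across visits.
    bucket = int(hashlib.sha256(visitor_id.encode()).hexdigest(), 16) % 100
    return "challenger" if bucket < CHALLENGER_TRAFFIC_SHARE * 100 else "control"

print(assign_recipe("Mozilla/5.0 (compatible; Googlebot/2.1)", "visitor-42"))
```
-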
Why have bots (including Googlebot) categorized my website as adult?
How do bots decide whether a website is adult? For example, I have a gifting portal, but strangely it is categorized as 'Adult'. Also, my Google AdSense application to run ads on my site got rejected; I have a feeling this is because Googlebot categorized my site as adult. And there is a good chance that other bots also consider it an adult website rather than a gifting website. Can anyone please go through the site and tell me why this is happening? Thanks in advance.
White Hat / Black Hat SEO | rahulkan
-
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and AhrefsBot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized higher than bot traffic.
I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems.
Now I have come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated dynamically (at runtime) from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your expert opinions...
White Hat / Black Hat SEO | internetwerkNU
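To make the idea concrete, here is a minimal sketch of that approach, assuming a Flask app on a Unix host; the load threshold, bot list, and 50% reject share are illustrative assumptions, not the poster's actual rules:

```python
import os
import random

from flask import Flask, request

app = Flask(__name__)

LOAD_THRESHOLD = 4.0      # hypothetical 1-minute load average limit
BOT_REJECT_SHARE = 0.5    # share of bot requests to defer under load
BOT_SIGNATURES = ("bingbot", "AhrefsBot", "Googlebot")

@app.before_request
def throttle_bots():
    user_agent = request.headers.get("User-Agent", "")
    if not any(bot in user_agent for bot in BOT_SIGNATURES):
        return None  # user traffic is never throttled
    load_1min = os.getloadavg()[0]  # total server load, not per-site
    if load_1min > LOAD_THRESHOLD and random.random() < BOT_REJECT_SHARE:
        # 503 + Retry-After tells well-behaved crawlers to come back later.
        return "Service temporarily unavailable", 503, {"Retry-After": "120"}
    return None
```
-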
Can Google penalize a site and stop it ranking for a keyword permanently?
Hi all, we recently took on a new client asking us to improve their Google ranking for the term "letting agents glasgow". They told us they used to rank in the top 10 but are now on page 14, so it looks like Google has slapped them. My question is: can Google block you permanently from ranking for a keyword, or disadvantage you? We went through the client's links and removed the ones that looked strange, keeping the links that looked OK, but then their ranking dropped to 21. Is it worth gaining new links for their main keyword even though it looks like Google is punishing them for having some bad links? The site is www. fine..lets...ltd...co....uk all one word. Cheers
White Hat / Black Hat SEO | willcraig