Bots gone from site for 1-2 hours
-
Hi everyone,
Since the beginning of March, bots and crawlers have been disappearing from our site completely for 1-2 hours at a time and then coming back. This never happened before; they used to crawl our site constantly without stopping, and we haven't made any changes to the site.
Since then, pages and images have started to drop out of Google's index.
Is anyone else seeing the same problem?
-
Assuming you are talking about the website in your profile, here is what I would check: Do you have any content duplication between your multiple language subdomains? Do you have XML sitemaps for your pages? Are they submitted via the Webmaster Console? If so, what percentage of the pages you submitted do you see indexed? Have you considered creating an image XML sitemap?
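To make the last suggestion concrete, here is a minimal sketch of generating an image XML sitemap with the Python standard library. The page and image URLs are hypothetical placeholders; in practice you would feed in your real page-to-image mapping.

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMG_NS = "http://www.google.com/schemas/sitemap-image/1.1"

def build_image_sitemap(pages):
    """pages: dict mapping a page URL to the list of image URLs on that page."""
    ET.register_namespace("", NS)
    ET.register_namespace("image", IMG_NS)
    urlset = ET.Element(f"{{{NS}}}urlset")
    for page_url, images in pages.items():
        url = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(url, f"{{{NS}}}loc").text = page_url
        for img in images:
            image = ET.SubElement(url, f"{{{IMG_NS}}}image")
            ET.SubElement(image, f"{{{IMG_NS}}}loc").text = img
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical example page with one image
sitemap = build_image_sitemap({
    "https://example.com/page": ["https://example.com/img/photo1.jpg"],
})
print(sitemap)
```

The resulting XML can be saved to a file and submitted in the Webmaster Console like any other sitemap.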
Related Questions
-
GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2
Our whole website moved to the https://www. HTTP/2 version 3 years ago. When we review log files, it is clear that, for the home page, GoogleBot continues to access it only via the HTTP/1.1 protocol. What we have verified:
- The robots file is correct (it simply allows all and refers to the https://www. sitemap).
- The sitemap references the https://www. pages, including the homepage.
- The hosting provider has confirmed the server is correctly configured to support HTTP/2 and has provided evidence of HTTP/2 access working.
- 301 redirects are set up from the non-secure and non-www versions of the website to the https://www. version.
- We are not using a CDN or proxy.
- GSC reports the home page as correctly indexed (with the https://www. version canonicalised), but still shows the non-secure version of the website as the referring page in the Discovery section.
- GSC also reports the homepage as being crawled every day or so.
We totally understand it can take time to update the index, but we are at a complete loss to understand why GoogleBot continues to use only HTTP/1.1 and not HTTP/2. A possibly related issue, and of course what is causing concern, is that new pages of the site seem to index and perform well in the SERPs... except the home page. It never makes it to page 1 (other than for the brand name), despite rating multiples higher in terms of content, speed, etc. than other pages, which still get indexed in preference to the home page. Any thoughts, further tests, ideas, direction or anything will be much appreciated!
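Since the evidence here comes from log files, a small parser can make the protocol comparison systematic. The sketch below assumes a combined log format that records the protocol and user agent (exact format varies by server; some servers log HTTP/2 as "HTTP/2.0"), and the sample lines are invented for illustration.

```python
import re
from collections import Counter

# Matches the request and user-agent fields of a combined-format log line, e.g.:
# 66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] "GET / HTTP/1.1" 200 5316 "-" "...Googlebot/2.1..."
LOG_RE = re.compile(
    r'"(?P<method>\S+) (?P<path>\S+) (?P<proto>HTTP/[\d.]+)" \d+ \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def googlebot_protocols(lines):
    """Count which HTTP protocol versions Googlebot used, per request path."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.search(line)
        if m and "Googlebot" in m.group("ua"):
            counts[(m.group("path"), m.group("proto"))] += 1
    return counts

sample = [
    '66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] "GET / HTTP/1.1" 200 5316 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [10/Oct/2023:13:56:01 +0000] "GET /about HTTP/2.0" 200 2048 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
]
counts = googlebot_protocols(sample)
print(counts)
```

Running this over a full log directory would show whether the HTTP/1.1 access pattern really is specific to the home page or applies site-wide.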
Technical SEO | | AKCAC1 -
Ranking is dancing for 2 months
Hi, we recently moved our website to HTTPS, all set up correctly, and also upgraded from PHP 6 to PHP 7. These two changes shouldn't harm our ranking, but for the keyword "Virginia Beach" we used to show in the first 5 positions in Google.com, and for the past 2 months the ranking has been dancing. This is insane: we have seen ourselves at position #5 several times over the past few weeks, and then it just disappears. The website is vabeach.com. Your help will be appreciated.
Technical SEO | | irrfans40 -
I have 2 E-commerce sites - Can i cross link?
Good morning everyone, I have 2 e-commerce websites that are similar and sell the same products. The content (text/descriptions/titles) is different, so it is not duplicate. SITE A has a ton of blog posts with highly relevant information, and we frequently update the blog with posts about the types of products we carry and how they can help people in their daily lives. SITE B has no blog posts, but the content on the blog from SITE A is extremely relevant and helpful to anyone using SITE B. My question is: do you think it would be frowned upon if I added links on SITE B that point to specific posts on SITE A? For example, if you are browsing a category page on SITE B, I was thinking of adding links at the bottom that would say "For More Information, Please Check Out These Posts on our Blog": www.sitea.com/blog/relevantinfo1, www.sitea.com/blog/relevantinfo2, www.sitea.com/blog/relevantinfo3. I think this would seriously help our browsers and potential customers get all of the information they need, but what do you think Google would think about this cross-linking, and would it violate their guidelines? Thanks for any opinions and advice.
Technical SEO | | Prime850 -
Representing categories on my site
My site serves a consumer-focused industry that has about 15-20 well recognized categories, which act as a pretty obvious way to segment our content. Each category supports its own page (with some useful content) and a series of articles relevant to that category. In short, the categories are pretty focal to what we do. I am moving from DNN to WordPress as my CMS/blog, and am taking the opportunity to review and fix SEO-related issues as I migrate. One such area is my URL structure. On my existing site (on DNN), I have the following types of pages for each topic:
- /<topic>/ - essentially the landing page for the topic, which links to its articles
- /<topic>/articles/<article-name> - topics have 3-15 articles with this URL structure
With WordPress, I am considering moving articles under the root. So an article on (making this up) how to make a widget would live at /how-to-make-a-widget instead of /<widgets>/articles/how-to-make-a-widget. I will be using WordPress categories to reflect the topic taxonomy, so I can flag my articles using standard WordPress concepts. Anyway, I'm trying to get my head around whether it makes sense to "flatten" my URL structure so that the URLs for each article no longer include the topic (the article page will still link to the topic page, though). Thoughts?
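If the flattening goes ahead, every old topic-prefixed URL needs a 301 to its new root-level counterpart. Here is a minimal sketch of generating such a redirect map; the topic and slug names are made up, and the printed `Redirect 301` lines follow Apache's mod_alias syntax as one possible output format.

```python
def flatten_redirects(topic_articles):
    """Map old /<topic>/articles/<slug> URLs to flattened /<slug> URLs
    as (old_path, new_path) pairs for 301 redirects."""
    rules = []
    for topic, slugs in topic_articles.items():
        for slug in slugs:
            rules.append((f"/{topic}/articles/{slug}", f"/{slug}"))
    return rules

# Hypothetical topic with one article
rules = flatten_redirects({"widgets": ["how-to-make-a-widget"]})
for old, new in rules:
    print(f"Redirect 301 {old} {new}")
```

Generating the rules programmatically from the CMS's own topic/article data avoids missing any of the 3-15 articles per topic during the migration.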
Technical SEO | | MarkWill1 -
What do you think about my new site?
Hi everyone, I'm looking for a review of my new site, www.interlive.it. Could you please let me know what you think of the work I did on it? I'll be very happy to receive your suggestions. Regards, Mike
Technical SEO | | salvyy0 -
Seomoz Cannot Crawl My Site
Hello there, Seomoz cannot crawl my site. It's been 3 days now and not a single page has been crawled. I deleted the campaign and tried again, and still not a single page has been crawled. Any solutions??
Technical SEO | | ExpertSolutions0 -
Canonical on ecommerce site
I have read tons of guides about canonical implementation but am still confused about how best to use it. On my site, with tens of thousands of URLs and thousands of affiliates and shopping networks sending traffic, is it smart to simply add the tag to every page, pointing to that same URL? Would that solve the problem of a single page having many different entrances with different tracking codes? Is there a better way to handle this? Also, are there any potential problems with rolling out the tag to all pages if they simply reference themselves in the tag? Thanks in advance.
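One way to think about the tracking-code problem: the canonical URL is the page URL with the tracking parameters stripped. Here is a minimal sketch of that normalization; the `utm_*` and `gclid` names are common tracking parameters, while `aff_id` is a hypothetical affiliate parameter standing in for whatever your networks actually append.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# utm_* and gclid are common trackers; aff_id is a hypothetical affiliate parameter
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign",
                   "utm_term", "utm_content", "gclid", "aff_id"}

def canonical_url(url):
    """Strip known tracking parameters so every tracked entrance
    resolves to a single canonical URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

clean = canonical_url("https://example.com/product?id=42&utm_source=affiliate&aff_id=99")
print(clean)  # https://example.com/product?id=42
```

The cleaned URL is what would go into each page's `rel="canonical"` link, so a self-referencing tag on the clean URL and a tag pointing to the clean URL from every tracked variant end up being the same rule.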
Technical SEO | | Gordian0 -
Index forum sites
Hi Moz Team, somehow the last question I raised a few days ago not only wasn't answered until now, it was also completely deleted and the credit was not "refunded" - obviously there was some data loss involved with your restructuring. Can you check whether you can still find the last question and answer it quickly? I need the answer 🙂 Here is one more question: I bought a website that has a huge forum with loads of pages of user-generated content - around 500,000 threads with 9 million comments in total. The complete forum was noindex/nofollow when I bought the site, and now I am thinking about the best way to unleash its potential. The current system is vBulletin 3.6.10.
a) Shall I first update vBulletin to version 4 and use the vSEO tool to make the URLs clean and more user- and search-engine-friendly before I switch to index/follow?
b) Would you recommend having the forum in the folder structure or on a subdomain? As far as I know, a subdomain takes less strength from the TLD; however, it is safer because the subdomain is seen as a separate entity from the regular TLD. Having it in the folder makes it easier to pass strength from the TLD to the forum, but it puts my TLD at risk.
c) Would you release all forum pages at once or section by section? I think section by section looks rather unnatural, not only to search engines but also to users; however, I am afraid of blasting more than a million pages into the index at once.
d) Would you index the first page of a thread or all pages of a thread? I fear duplicate content, as the different pages of a thread contain different body content but the same title and possibly the same h1.
Looking forward to hearing from you soon! Best, Fabian
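On question c), whatever release schedule is chosen, half a million thread URLs will not fit in one sitemap anyway: the sitemap protocol allows at most 50,000 URLs per file. A minimal sketch of chunking a large URL list into sitemap-sized batches (the thread URLs below are invented placeholders), which also gives natural units to submit gradually if a staged rollout is preferred:

```python
def chunk_sitemaps(urls, per_file=50_000):
    """Split a large URL list into sitemap-sized chunks; the sitemap
    protocol allows at most 50,000 URLs per sitemap file."""
    return [urls[i:i + per_file] for i in range(0, len(urls), per_file)]

# Hypothetical thread URLs
threads = [f"https://example.com/forum/thread-{i}" for i in range(120_000)]
chunks = chunk_sitemaps(threads)
print(len(chunks), [len(c) for c in chunks])  # 3 [50000, 50000, 20000]
```

Each chunk would then be written out as its own sitemap file and listed in a sitemap index file, which the protocol provides for exactly this situation.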
Technical SEO | | fabiank0