SEOMOZ crawler is still crawling a subdomain despite disallow
-
This is for a client with a subdomain. We only want to analyze their main website, as that is the one we want to SEO. The subdomain is not optimized, so we know it's bound to have lots of errors.
We added the disallow code when we started, and it was working fine: we only saw errors for the main domain and were able to fix them. However, about a month ago, the errors and warnings spiked, and the new errors were for the subdomain.
As far as our web guys can tell, the disallow code is still there and was not touched:
User-agent: rogerbot
Disallow: /
We would like to know if there's anything we might have unintentionally changed, or anything we need to do, so that the SEOMOZ crawler stops going through the subdomain.
Any help is greatly appreciated!
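One detail worth checking here: robots.txt rules only apply to the host that serves them, so a disallow in the main domain's robots.txt does not cover the subdomain — the subdomain needs its own robots.txt with the same rogerbot block. A minimal sketch of how the rules quoted above behave, using Python's standard-library robots.txt parser (the URL path is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# The disallow rules the question says are in place.
rules = """\
User-agent: rogerbot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# rogerbot is blocked everywhere on the host serving this file...
print(parser.can_fetch("rogerbot", "/any/page.html"))   # False
# ...while other crawlers are unaffected.
print(parser.can_fetch("googlebot", "/any/page.html"))  # True
```

Because each host is fetched and parsed independently, these rules have no effect on a subdomain unless the same file is served from the subdomain's own /robots.txt.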
-
Thanks Peter for your assistance.
Hope to hear from the SEOMOZ team soon with regards to this issue.
-
John,
Thanks for writing in! I would like to take a look at the project where this is happening, so I will go ahead and start a ticket so we can better answer your questions. You should hear from me soon!
Best,
Peter
Moz Help Team
-
I have heard of this recently as well; I think the Moz crawler may now ignore the disallow, always or just sometimes, because it is not a usual search-engine crawler.
Hopefully one of the staff can provide some insight in this for you.
All the best.
Related Questions
-
Google Manual Penalty Lifted - Why is my website still decreasing on traffic?
Hi there, I was hoping somebody has a potential answer to this, or that someone else has experienced this issue. Our website was recently hit by a manual penalty (structured data wasn't matching the content on the page). After working hard to fix the issue across the site, we submitted a reconsideration request, which Google approved a few days later. I understand that not all websites recover and that approval doesn't guarantee rankings will go back to normal, but traffic seems to be dropping at an even quicker rate. A number of small technical optimisations have been briefed in to the dev team, such as redirecting duplicate versions and fixing redirects on internal links. There's also on-page work running in the background: fixing keyword cannibalization, consolidating content keyword mapping, and ensuring the internal link structure is sound. Has this happened to anyone else before? If so, how did you recover? Any suggestions/advice would be really appreciated. Thank you
Intermediate & Advanced SEO | | dbutler9120 -
Still Need to Write Title & Description Tag?
My SEO has advised me that Google has stopped using title and description tags for search results, and as such it is no longer necessary to write specific title and description tags. I see that Yoast seems to pull text to create these tags; sometimes it reflects the best elements of the content, sometimes it does not. Should I be asking our SEO team to write dedicated title and description tags, or is it best practice to leave it to the Yoast plugin? My SEO is of the opinion that writing these tags is not a productive use of time, as Google will serve results based on the user's query rather than the content of the tags. It sounds logical, but it would be reassuring to receive further confirmation of this. Thoughts?
Intermediate & Advanced SEO | | Kingalan1
Thanks, Alan0 -
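For reference, even when Google rewrites snippets, explicit tags remain the starting point it works from. A purely illustrative example of dedicated tags (the domain and copy are placeholders, not from the question):

```html
<head>
  <!-- Hand-written title and meta description; Google may rewrite either
       in the SERP, but they are still the default source for the snippet. -->
  <title>Widget Cleaning Services in Brooklyn | Example Co</title>
  <meta name="description"
        content="Same-day widget cleaning across Brooklyn. Transparent
                 pricing and free quotes. Book online in two minutes.">
</head>
```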
Why Did My Google Crawls Hit A Wall?
Hello, One of the sites I work with, http://www.oransi.com, has seen a significant decrease in Googlebot crawl activity in the last 90 days (see screenshot). This decrease in crawl stats coincides with fewer kilobytes downloaded per day and an increase in the time it takes Google to download a page. The client did just go through a redesign, but that happened on 4/16/15, which was after the decrease in Googlebot activity began, so that should not be the issue. The same could be said for the Mobilegeddon algorithm change. Any help would be greatly appreciated.
Intermediate & Advanced SEO | | BrandLabs0 -
Google can't access/crawl my site!
Hi, I've been dealing with this problem for a few days. In fact, I didn't realize it was this serious until today, when I saw most of my site de-indexed and losing most of its rankings. [URL Errors: 1st photo] On 8/21/14 there were only 42 errors, but on 8/22/14 this number went to 272, and it just keeps going up. The site I'm talking about is gazetaexpress.com (media news, custom CMS) with lots of pages. After some research I came to the conclusion that the problem is the firewall, which might have blocked Google's bots from accessing the site. But the server administrator says this isn't true and no Google bots have been blocked. Also, when I go to WMT and try to Fetch as Google, this is what I get: [Fetch as Google: 2nd photo] Out of more than 60 tries, it showed Complete only 2-3 times (and only for the homepage, never for articles). What can the problem be? Can I get Google to crawl my site properly, and is there a chance I will lose my previous rankings? Thanks a lot
Intermediate & Advanced SEO | | granitgash
Granit
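On the firewall question: Google's documented way to verify a crawler is a reverse-DNS lookup on the requesting IP, a check that the hostname ends in googlebot.com or google.com, and then a forward lookup to confirm it resolves back to the same IP. A rough sketch of that check (the helper names are mine; the DNS calls are illustrative and need network access):

```python
import socket

def has_google_suffix(hostname: str) -> bool:
    # Verified Googlebot hostnames end in googlebot.com or google.com.
    return hostname.rstrip(".").endswith((".googlebot.com", ".google.com"))

def is_verified_googlebot(ip: str) -> bool:
    """Reverse-then-forward DNS verification sketch (requires network access)."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse lookup
    except OSError:
        return False
    if not has_google_suffix(hostname):
        return False
    try:
        # Forward-confirm: the hostname must resolve back to the same IP.
        return ip in socket.gethostbyname_ex(hostname)[2]
    except OSError:
        return False
```

Running the log's blocked IPs through a check like this would show whether the firewall is dropping real Googlebot traffic or only impostors spoofing the user agent.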
Crawl Issue Found: No rel="canonical" Tags
Given that Google has stated that duplicate content is not penalised, is this really something that will give sufficient benefit for the time involved? Also, reading some of the articles on moz.com, they seem very ambivalent about its use – for example http://moz.com/blog/rel-confused-answers-to-your-rel-canonical-questions. Will any page with a canonical link normally NOT be indexed by Google? Thanks.
Intermediate & Advanced SEO | | fdmgroup0 -
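For context on the indexing question: rel="canonical" is a hint, not a directive. A page whose canonical points at a different URL is usually folded into that target rather than indexed separately, but Google can ignore the hint, and a self-referencing canonical does not prevent indexing at all. The tag itself is one line in the head (example.com is a placeholder):

```html
<!-- In the <head> of https://example.com/shoes?sort=price -->
<link rel="canonical" href="https://example.com/shoes">
```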
Website not properly listed on google organic despite SEO efforts
Hello, I have worked thoroughly on my website's tags, including HTML titles, URLs, H1 headers, and the text in each section. The problem is that despite this effort, my website does not seem to improve in terms of ranking (the on-site optimization was done 6 months ago already). We have a sitemap, we have done link building and everything, but still no tangible progress. The anomaly I am experiencing is the following: if I search on Google.com.lb for "aparment for sale in lebanon", I don't get the section of my website that is optimized for that particular query (the Buy section, located here: http://www.ramcolb.com/apartment-sale-beirut-lebanon). My site appears only on page 6, and it is the homepage that appears, which is very counter-intuitive because the homepage is not optimized for the "apartment for sale in Lebanon" keyword. This anomaly is present on almost all sections and their relevant queries: the relevant section for a given query never appears; it is always another, irrelevant section, and far down in the listings (beyond page 6). It is as if Google hasn't indexed my website properly and is mixing up the sections... Has anyone experienced this type of problem? What can be done? Thanks in advance
Intermediate & Advanced SEO | | ImadKaram0 -
SEO Implications of Moving Blog to Subdomain
Hello, We are having some issues upgrading our stack and maintaining Wordpress for our blog. So we are thinking about splitting them up. What are the SEO implications of moving our blog to a subdomain? Our blog URL structure is currently something like https://www.aplossoftware.com/blog/p/2470/fund-accounting/yearend-closing-checklist/. We would like to change to something like https://blog.aplossoftware.com/p/2470/fund-accounting/yearend-closing-checklist/
Intermediate & Advanced SEO | | stageagent0 -
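Whatever the hosting decision, most of the SEO risk in a move like this sits in the redirect mapping: every old /blog/ URL should 301 to its exact counterpart on the subdomain, with no chains. A minimal sketch of that mapping, using the URLs from the question (the helper name is mine):

```python
from urllib.parse import urlsplit, urlunsplit

def blog_to_subdomain(url: str) -> str:
    """Map a www.../blog/... URL to the same path on the blog. subdomain."""
    scheme, host, path, query, fragment = urlsplit(url)
    if not path.startswith("/blog/"):
        raise ValueError(f"not a blog URL: {url}")
    new_host = host.replace("www.", "blog.", 1)
    # Drop the "/blog" prefix; everything after it is preserved verbatim.
    return urlunsplit((scheme, new_host, path[len("/blog"):], query, fragment))

old = "https://www.aplossoftware.com/blog/p/2470/fund-accounting/yearend-closing-checklist/"
print(blog_to_subdomain(old))
# -> https://blog.aplossoftware.com/p/2470/fund-accounting/yearend-closing-checklist/
```

A table generated this way can be fed straight into the web server's 301 rules, and keeping the mapping one-to-one preserves as much of the existing link equity as a host change allows.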
SeoMoz Crawler Shuts Down The Website Completely
Recently I switched servers and was very happy with the outcome. However, every Friday my site shuts down (not very cool if you are getting 700 unique visitors per day). Naturally I was very worried and dug deep to see what is causing it. Unfortunately, the direct answer was that it was coming from "rogerbot" (see sample below). Today (Aug 5) the same thing happened, but this time the site was down for about 7 hours, which did a lot of damage in terms of SEO. I am inclined to cancel the SEOmoz service if I can't resolve this immediately. I guess my question is: would there be a way to make sure the site doesn't go down or time out like that because of rogerbot? Please let me know if anyone has an answer for this. I use your service a lot and I really need it. Here are the error lines that caused it: 216.244.72.12 - - [29/Jul/2011:09:10:39 -0700] "GET /pregnancy/14-weeks-pregnant/ HTTP/1.1" 200 354 "-" "Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)" 216.244.72.11 - - [29/Jul/2011:09:10:37 -0700] "GET /pregnancy/17-weeks-pregnant/ HTTP/1.1" 200 51582 "-" "Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)"
Intermediate & Advanced SEO | | Jury0
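To confirm how much of the load is actually rogerbot, the access log can be filtered on the user-agent field. A small sketch over the two Apache combined-format lines quoted in the question (the function name is mine):

```python
import re

# The two log lines quoted in the question above.
LOG = '''\
216.244.72.12 - - [29/Jul/2011:09:10:39 -0700] "GET /pregnancy/14-weeks-pregnant/ HTTP/1.1" 200 354 "-" "Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)"
216.244.72.11 - - [29/Jul/2011:09:10:37 -0700] "GET /pregnancy/17-weeks-pregnant/ HTTP/1.1" 200 51582 "-" "Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)"
'''

UA_RE = re.compile(r'"([^"]*)"$')  # last quoted field = user agent

def rogerbot_hits(log: str) -> list[str]:
    """Return the client IP of every request whose user agent is rogerbot."""
    hits = []
    for line in log.splitlines():
        m = UA_RE.search(line)
        if m and "rogerbot" in m.group(1).lower():
            hits.append(line.split()[0])
    return hits

print(rogerbot_hits(LOG))  # ['216.244.72.12', '216.244.72.11']
```

Run over a full day's log, a count like this shows whether rogerbot's request rate plausibly explains the outage, or whether something else is hitting the server at the same time.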